[Binary tar archive (ustar format) containing: var/home/core/zuul-output/logs/kubelet.log.gz — a gzip-compressed kubelet log collected as Zuul CI job output. The compressed payload is binary data and is not recoverable as text.]
vMO7 2=oSxK xeaAKMLVR`1tw̩As> fh < 4o2n).`H+$|k60&H)to-NAL9cz 1mq~@bd;뗇C$*,=GdzBj=`M~ neh&{> 0 ̝ZX4VYj Rt*ra_U-eJTd@#[cL1 .l2Ens ;ul#UL{VT]Fcű1$V՞u 籢cTm!i wr4lr.(Av?5]gֿ.cQ[Zr}E ic$t#p/u226k>̊5|aab\$sƋE؞`4RS(7"Xyt҄ijlLI@;.f8+ )Wⴙ؀<+pf6x]=f"MBjTh O (ɧspjZut%uS 2c@:dWm݀G 4iDArݏ|ݓ S>x2*Ǜ_x3cSHa]R'H+Q:Y;]>p5- {Wlx=BwTƢz{ڼ拟-iNg$W) } +|,^f|n7%ʌDMM<%hy-{02Ɗ̸H]y_ݷ myTBvK{CCG)`δÙBfj#¸,JoOH6I׿9? ^Ap5h6vgy( YnfJE*NUfrW:#|atG֜-s)3Bgz,ru̿-Z>GRL% J,)4G ![sΉ @c I˓s) F%L&<s1ΑX *R3SlZXmO;i|0HF\ /3e^y DVsJŴ|@Fq=۲UC㮜Acj_ۖ<<zL_C (Z[> 2eyz҃] @2p*^jG~NvjQU?6X$e4a٣ bwi'[̈́i`~-T:+;{.jKh:Ol)((."Dƈ% 8EŒbhW] TE!UӅ[,w)J5N'ėF: jO*.w^lj!0$H%"QD"քƩx.Edixz>vB<+e䉕cAVȅ7j\o̷0\#;  ݡU xDՅ#mӨ*%gb4֡QQ&O[R4;x5_L,L"IYt/=&~ ;Fj24PrU Ʒ}ƷvLTjѨpf3˰ꭈm"E%h}j ئ) PoJ8YR8pN~ۧi9I߯%F QS | (eG}vzд#h: )fuG?y˧Z S\)7rxZ p 4–2{^/ Y;*ͯul^Un^OvYMDɜZ4OJ7tᣅH3!3v$hRJF1Sm@H#yO8ED zZ%sƵ'$K V٫PbKiIA3X1KavoP+4D uɟKТHɓ,d.xwR&CgK8ڢ8Z(GzL{+ӯIQm/w8|` - eem7+0P}3;`,=*N;iLcELdT-Fhw )o]d]aq?c:z`ƒZJw/j|]*{'v,^'tɕc*ɝ|7U~Q܀M>-X[x<)cc IVs&Vú0~'Jf;85Y7p}C@c\;CVͼwu͗h#"]-VڦKӈwaq2dłkDZ\*8 M}o1]C,1F$fA Sk1FTH@WץF*HQY.c.3R4;Gl@s k:"7k)mP6:_c4 4zRs5S/h@ 1dQ$H)]*ͤ#wL) TܟxEk:Tk[=dz:|xv'>evj͘-'=AyIhHˡ טy4q)[01|64o|?OђQ u9{BW:`Mb=vE"˅;TJeƅձSҵ-asp ;oxSj xE#| 1K2:t3cE Q ^n^߬4d tח ($$Ǯmt\9O5>Օʡo nhə.fGڏpMC +Txy^D\-0>X牦a1I.VtNcrn˒hKEtZiI 8H5_XIʘCECS^9ܙdӱ4s8iA>pBS8.No峡@-ui4ndP0o|3h\ ?~dZ7)߯;7tww#KLVJi:}zf"2 RM tmPܹ5 f?n"~(̋xW7ln6W "!AIoBtN$`9҄tqp%f_ST flM JFPHq6)^=06KRڜ}B)>kI㪋Wz;$(s#.5cdVׂȡ"_hм k&N}4=͟j?כc1rє|ɔ\)Z^FWX9,˭Whh>y[RSciiwj.x\*XF<si,k0{Գy^"ݲn#Tyr%붨Wpv%58ttDU& *N_1_C{%7qt_- @2p*^jSIħ+d7;02$㧢o;cw/^~qeXSv@Mp>Q thB|f?g;oKW2Bެb})VyVZm %;&-k~P`KVې|#?P"dµ6*mV4 sſwtd<:mmD F:UuN}b U:X}jy@V9Md! {k5/o(~OM}ᛶlkm#9l[~0~8ln"EJ$-^~|iHqȑʙ_=@A:&דBI (=s//ɴɭ-^dֈ3퐦ZX˵\{A)8ԈT 76AB46[?g޺g?!0H԰qTB FЂe=5d=@]ǥ=;BTPL:Wol1$Be=Y7 - +c_C LTԼEPf \t y"LN!9VTԝQ~l@7 0GOZk A-1{oM4IrdVcE6`D! 
!*` ѨݷᏗLC- ZeR.ZA0u_Via[p[ jVoRXڴ1O/{ZVAi'û{]ZVA6-S'wR=hO &յ@ DT/MZn$Vk~tKBbjQB.?Sv[7r: "\L#ͬ@9CYU"Xt- !JC :NB%vvz 4$9eFEwpJ(GmXW; cb'i~j!V[pe]fΏٯ~|qCH%`7?SX>@v|cX`Ya,w` p 7l[2cBLWaQs;Ǚ Y -4kK<0O{X"[Lhf6%e6(Z?]9! w,Q(\xv/߽:_*k?,3;;bVݓ^D"$NzmBSMy/L龜n<Ő'ʞToހwJ3}^LO]@S3[ωҤ{[XiZ75e ^U8HyNN uK{U=jbQ˜^]Zhؘ&*[ֹew ۬R7 .22:HIյ}> J9N='W}%Q"͙tmY"XngrMyfO~hn LU.b~07L9Y$ DUVƺ߂u >|) `51D0TǢKLC H˦Ο83%:4W 0TQfwyHa9Y^QE=P|(k@Ai%4󉚯m͠Fw~N81#։'UIO=~$NJy)(:4lDOْj?YĮC8ߙvhF#RۛTKu.PUkLMygT/v^yVBcpYe, r:Gx2nt01|f?)g{|~3;?s!UDs98#gcp3HA[>sQ/{s綰s:~ػ{Fw-JT7DSfO/yŠxt`b2 j<:švc/D3bzo,GâETQ|XLȒf*QžY "LYs ?)noBwz\Z,RR*\ucĥz} ! }wۓ S4 ,{N,GEK]X9{W.h>/֎S![*$bg!7A^ O:-m!3{>H0۠ t8EVXB<5q-y^փžBw vwOg45hZ[PThoBL[,STB۷4*D:5MiRcvo )!҈TtceQs1\&2,!b-E<Es# AB qΘ'LΝѨ%PӶE!- 4^_q@r+M@JB6K [m %X\`<ݽU;j"J=^3$$ Ȓr wtp=:0檏 *s! RbOM25hL 6Z .~DESI0djZzzڬ%攕K ՇO\8UMQ.呪H4=2 .}hEe&bzkʈRv"*WZ\ Ścd'-.?C8q{Xx6 jTX㲘>ε4"G7H0EH8K|B*h $=y[ؽ#z]kH|1]wۢ:m(VG D5 E1"T ?o@NIiPFˣgH d؂eb lz\g-9 b Qo95,=2(+|D7[qB3j:-01ԝ 0Rs2)BPn!ƨzkr?Ml1%W|k\ Bk1Hp'#tY;i1[#> /!Z|Ԁ;Q "pl-V%?-0yŞ'<ڊ56K.۫8jm$LO(2(JVl`ńab?A6~'q@g(>-9_zYr ~c䊯>0ܑDž+a>&kQ% ]t6QdSo>¿ MϾ] Y)3IJcK_.[g(mg-W@퇿>:;0y=y,rV~c?ݗ񸁿ZpaH (xEjSv.;#BFg!9uT`U/aUn`w'pʟp|Y4!P?F>OZBXN8yK<U Kl_ԕ?]KoH+\|?CsX Iږ-ٖGbeMI3)J`SV2b8v%muY+TAa2ѯG>m"Ph87Oa7SG1(2rp;.חbQt ^C4,Q!a¿{|i9 =vdnUx,UJƢ『^xI`ScX/t9|ȹtDu@CAalq6r(۴Jg´`\2G$ ^RBJ3Z[>gsyj$U^n-k!2q0Y^ehQEÈLF cF\&6)JGBw jzAWK9&PVlTv̊',͵@Lgq(DT3i۞c:9zEd# r.]ddc5մ:Ƃ Ak)--SFo&Ӥ}u"NmsUx}{B)fH&l* Y^(26MgXnxjX{fE^55֤B WsSA[.~k2t_zwp7>-Oc)e " 4|Vlzy.˞*um4YFZe|85!_tEL ٦Te~X#z%`Re9llش&rHXE<;U\#d >KEWm^h2$;[T.UyZ[y P~q8MVM„S-;c̆Cd,V]@HO/vǙGڑX*%L'gT3i1{YijeoOtO"c~K{^ _)5CN$SSؙQ`q\2t;.gwXw=r ](}L#x$YG/-NeѓO&G;u%n#\l@ Z.kp%u F[*vpEƙ)ɐ^bvO;M1:}k5ΫgHs9j4Cs: ٵh"c̆Z\WTXJ˚bG^xxml.03q_t}1IF)52B*8ňӵGv*^ቕ7M4`_埿Uw郟=.=ywMT>uQ= ?.Vʕ1Y`:N)֓۟#%i2ͬRvQzF$KLIR˽stw٪ Jy7Fh,^ GN}/Mm0fpthFm9mוW5goAGѬ^A@We! KC0mQvwNm1F +:Vwfj-c6H@! 
AQ$C8 8V#gHLhQnqlx(o&ΧS#y<ǵT@K áYՓB*[$X%I #*Ay hreš3x*_'̴a K"n&xqH'8 &1NY(8!T&\J8 ȱ8Bb܎"F; "8 b3y<;l;`QA'J⬂>h|+8Kq 5uCKdu I c#HFp47',xq"eߛ!8/͏?ӄK#";P8j#<{}ieZQd("E&ՅI86;JƆ*K+-6X~df= uO[OS=@/?f~pGVa~-inNn÷q~Fxl(#Vzwqq1JU\^WVϲzxx뿿X<<ϳULaOIgx@dMO9Bq^cL?JոQjm1͏V`~g`y:O(#3!D f/DMuG(X'y{5}zO??n؛!/^2_|^c Xn^o{>~D>o ?RjiDz`xJd M VZ > FA5(J:0[~H2I& cxR$R Y/U=˓ _퓋vi]T'8Ǿ~L8 /ȗҲdRz[\?&?]4g1D0̦o˃W inGm8K/"l 5ò cXH=yx) >d׵tgהcB YL`9.,p=`~b43|^:H,Je;?ݕZ\\sz>ȨЁM幩 9l`uΪ|H-&;EbnQJ!kvPՋޱCK+[L(3a_E,gnS UAEQsC EYSS !Qy#=ezO2TZ9F6M-5#թ%(@<8YPq0T*%׮B RSZGB7K9> .;tSW; i`N3v͝S$;77dY]%fo7֚¿.UỤW]EMSjW'P ^zO :9T٣UۦyVZ.Aa]|5=.oExapJ0U&oJT+I~WѤbF>bQDSXV+R8$ J!9:yI FޟAZʟ-Ew+-̏ά-iaKZLr<(R^gd_^"w Bƪ>A`V.o6{U#ޓ:GFqǮn)JPZol'n2F d,ʐ#NDh(1TQasʬ;*tؐPQƂ)C 9}m;.dzIhv2&i'M63ꆶ+u,5S |#{ꉕD(cM޻uso4+nWtYڕk`εvNj!x_Q CaRm9{ fKWږlUQR>^zꕬ#%92˔urQl7Fcۨ,VHbkP *L]}1٫F {%LOYwjh{흔ժCYUz;޽*0Oc qiwS=XC-/mDj؜SV,FЪ"Q)ڢ'V U5[*\CV"<=$|@pL{{A}u""gU;EY/f^K2z ѹl캪VE1;}0{}jBxlkRՊ|qv6wwv&)lVW&nI۾| f_I .GjJ=0,U_:R4B"&k&r NらgrY6< 9J-!g3Zh>IadBuj8f󤈥N.AXIuU sf$ %Os,uctq$S;˜@@{1 tyN+|5)÷#ƺNJnÂJN7aN Jo9T;!:-t;̛g4)ԑ>'dGÖ6h˂dw) x X}uq7`])}Sg<-F\{{w'b 'g 2^?N.ތ}B7y|QWSq}+sg]c7϶?0$$l.߿s|Xm_:h:7_ Ĥ&Ϗl,+9{gYJxs~۳3UG3'~69K1_ײrx[϶[EwawGao&_o_(srZf/O0 }>Ǣ}jq5-S+^?X>>= ٪-I%V 1!zb2Jssi+ԏK!Iۧ+! iK&ߥCoXVYxTS uVo#5'^ˢNְޢcW[ة$ٜӟ뭏B1.%ڛQ@j3vU5cņ CURtZ!&#f]PZf/YlrzbŘ:BԥSr.M!vCXh= 'lsQ /3tTFㄢN|`[i5rHFґ\#?TH|ƛEچ; :xvmq{YM).~/x^ևWz݉wgߢ)l$}PGeEU# ]RVg ]J0Kc@7]xDgL;}Jo5ԓ`MD&,@r{(NꛦfM7(PDHn]T{E[0wl|H̱kT_  f>ѝ75<:wU#FZR|:G.Y˜xh91$Ƽ`pƖ(gŤ@%ʱЫW6CNZօ1{uܢ9MAYL>)bcꑹHPZ 2ԖIamoeƕ/8Z6ί&H;R{x@Ą`yy۶pчˌ>\f2}&5%ٹj0{Ϡq4֩,xbdO\Uu1âf=Q& ΐ(Z|ɫ.ݑq=21ݣdvdM8%^𾙘rqNfXK(_od,]Qw?iRge"/K4 ;?%oXQ ,k41}W\MdWevDó2Ij[e+fm랲2)\:{K ;Sn~J?{WWJ(R$84h,; ʱv' R%(Hqyf!b/LߣH]zT 7Y!=:r+/C*VXFT%ЄK5 ΧK(@߿uK6)Imݠ0Q BsY29lJdnDPiؑ\kŇCZ)ZCbs\ڢ=[2bq 5ʵp,<'VΓOy0Ŋ2ATv\qǜ"b5~ۑ|]?zoh0 UGﴦy1AyW#LA$fM{<.鸃4`PÎ=;NyP\%> =y1I`RnV#JahD[ 9شDB{LgVឰueh nk@?iۑ? L^ ]4%W>VTe|NO=%X5?o} z-!W]_6F!__˽|2ٿdq53G|{&oq{ik X`{)F{o#x u1! $ ԝ(e[%A ՋHpgVCTUr.5`2mc[5#mZcyyAu1' 9;hWAFL*6:<z"Tc,bUHh4+zP5S &"{/(R9X`3sV{=5.) 
νxH{Ref5U:߲csR:ጿ%HmpPeTG=ؑ]G1*W%n#MO)ToF<7aNv[{F8i)cn6 _FwbFՎ3=(2(V|->tmvn>n\+Kh !L(0`QwuW |B{60W#3;jc0h#s']`2{|;)؉! .\l>lyᬩfLn2Ldmb^RАI >bT*T)!)"#ܐ\Kx k*_? r9^LA:􊎻攄䜆`_j|ϥK 6BnJB&>?NKob=:Td-yP4tK0|#z_D،:uKC0^ 5r\9]YE$[11&}Es\9FzO{rnQpƫ 􌠶{qR&q}=B%Xa1I"fT@ 9ĽHi_;gq! 2 yq+Eݖc?Rt˘Of$qGͽ`\b+*nO )_ | 8W/ǻMoڇvü\1G*za|,BNߟl>aWpcD=8Ƙ1!k F)G\i5B>a+hLdy*g80{Z$5!ݻoȤAwc4\*> !0C `9`Ȼ cDll02W[;twM]zU46 <I47e D!=lhݗh,gD$3LeKa[tN @5>g/ȣ̤{k/.4c#j_ަ:28ġYnή3ࠈ8hvi/-e/bʐEg;sg tgQ p^ |Og`}<׋U&% Gc|f $X(f-̭DONft^sCg*l#қ 3K(6ꇓw, qwiln憩wN* ȌRE1~ 5ffGHmT} 3W'q*j!#U^|1:m0fHa.f8>)hK0"!F `cp6O)'XTE&L߁w_}4{/2&LaXRH}6mOϻ_s\b͎Ew@A,RmS;m !P1 FpMzu4Y:-a6|n?u,q梡ĥ6?h#Wm\ƨ<9Lm ͕dѭ" k 58g}$}Z5M[)ȌKF][~?ah*k }FDӦ3{ C[жaP֣b n)p-ȝ~y {c(_u1₾_]29y3 ܿ˄0HS'pل;5:߶:{U})_|x{_\Ϳz}{4 ` 8>bU8v1츅B1PSʹeB>6GF3hMu^O]No۫^Mj1FqRىfph٘r_l>nKp?m!_sS."eoCp}aƴ6}Xha!+V̑Q%I@s A= ;\;vEF]8e^+AA2Wh@;H+[1F&asB!WhBmGζNNfٔ`M0%B}\ x֧qV6h=Blyx2|^:vUl: wylw[?% s &P)!vySx+%4>@]eXxON̫ZJ*I;T?m4lG]\l>ur֙t[uo31)2JlP7AjXØ][s+ЯΑBdn",9~/+`9tM0Ī[PXCT_W"ԢI PsnFJ+9d.k}zF}<ۗCrfid4(5f!bՖtPfRPLo}qSsJ;௚4jJH;dW&+TrlTT jYdTf\Bz,dE ѫfOS̐ lT|lJqdB%9zPi 9:"bHjEUsWI 3BqգZcP]jД5r]wLŊZl\7U9ř_1L#^ V{w'"؜d 8#3䟠*'/SIQBL5rε$YSS*hj9gל/иv*ŏ2~)ǰt<'Va,m£`f-3k[̵VwkZ9g=2<rXdLlMHmbK )2@pctz Bv((}.x_ ^7C| C>7TT"1+h~ f>]/UbVp`GQzMt6B\XE( &IdhtGEP0cM_;4opU;=:!Fmۃܞ3_&sIϹ!e@M5`Bf1>01ʔP5O)OZCS-^1ձzjr7ao*b"g gHk9BUZ{ڑ@٫llϿJ$U QHX-BLSHZB)[4QTjUF'Y!!g!xJzH W5qX!'#B o`RC!D:./+t;f?q^j6$qG:TW(yZҝ ip!G**Z(۝?.v3Fsjܜ#|/15|oRvM W ]n^gcvf*hj E䤺#$jࡀ!(qX}QR5pTP"r4$QrfAh|6D/V#჏튐#sU'czTڷocEXbȱRW1Y_O(5]{hr?8OC)n̘}8:#e NrjrE*P!q1*3Y-+qjdEBO)!UK4t:YJ ! 
)S"RP,P*r4T6ޟmeyxp)B; a.nHr2GG.k-d='9#CBkh(mA6 y~XJGU]{D{(vrSXRS "t8$b '+WW'fD,rtcZרmX1_wP@NbL3aK@HDn7q2Cc,XEcEƑ)Y;=zyJxb']+ 3a!Q=PO~B`4{Kȑ9{63XfY4Z0Ҳ~ k!c*ѥ~#T}]rW&ec5k UB9IB\)V]M`T 4cl#]oEJeMRHY݁oS\2tCrbt琕apU:/MYBT[BQX _I?Fk"}!bFC"EwDbqh5&RB,dRvSRɻ" *XjlU{CSqXN~*@ws`X=u>grmgn_/?sf^u}U|29E_y͗]W|?nYd^+1g?nei8mokЛsn֬er,>-_,-%e>nna6VgKIy>zT-ٚ Fre}eՉBF!@e4G<7B0U^OPvү/65\B4K*O5 r2ȭc/.Ȓ':0+ɄHNE5mFk&2#5!,p.jy˅ͱ^/kRfQ/޽+?2Hq}kfʬ~.Рj0\zrɆإ 8ʃWBwkbXd۟`j`WׯP<֮tJw-ᥧ.3L=g$(Uzr)rCX<ËgxL"8͐ޑЗ,;a5dRװ!,AʪMa%ȜSMia9|1FT@'ɇq8*,^(t/4rIq^.ǿ_&W/Gg=F/:Co9:8/C7?BI/)^7e:t—bP+OL,}{=מgʍj_Ke޾]Yc[GoM!,TdqdL[lpR4[Oο2{5tށOuQ:ħ!'OUcZ@đ[lH%Ȝ"0T+d,Tڐ1|7@0Z&ʡa䀇3 !G3mct[؛;^/-Jp2U8ϱb8C薊#szÜDb ,>Ab]+lmʙ5L'^[]|O,!4RI@vCՐ\hýb9Xc92g.HfhLgUu!M{ZTm֢h|n)8>~G`t7??LO/שN_®;0@_]VCN8z#.Ϩf3=rdNfbcoeb5YYռQbyhanyDb|6N@c<@lGjR`;Ow-ANa};--ce[bN@ڥYt.ztJԄ*Rryևz7-4=a3 + Dc[_.</7LjsԂ{|#pc3n%,ޥ;fiN C]rZDEQeu,Wu<qŸ?Ƀ_2y^Z%#=wmLEܷZ] 6g.EG3dP!HPF[oT0 w ֖k#x[W 1K{F8~ S9f; 攋,>ȣzMBd% 1J&JDO2 z%-5zTdQ($‚y5}'ӿ#;NgE^Ws-m=@] /VDQJg/[ոL]`Rא.]9*QE)yRZf_]1R*g͏./7S/c|H ]d368D/YZdH )W:ޝ\άD<:2>t_b.e9Fl h/J(`KP *IhY $¨"& U.T[/v*|f{\ FuY_ 3baH;"X+F3RJK5ȫ԰QA 6C"M",%DR4n}fa-3zNNP& w|4ZIY!R( t>dfMD`{ſDb-4!͟q9C&闋P@cCfβ lƈM4 0*$ca]io5"Ifȟ^2%)$!f1Ml^Sa!Zb*Cku7ޟ~<>9'E u@Nc*[0 `(2Pq]n\rtfu !knxVR,B`ʙ34m;~DJ>p5gŊcfy{$kH)D#J$sawȓ "!ު23[7d$˖dغWzSB&OMZp7yA]J@/W6akk\aab.Yœ9skIh6~ T]!&UqJI]"[p3M`_KF磌h6bQ(p.X~<KEƁbhm;m9(C:' Yz27SvT my< om?=Hک=ZĒyYXZj{Hm1g7n=} +hnrXckk# &-w~!KI 8ᤅۡ02 Hshջr"CyJ ˹{.x'^Ͱ JW?JiSZj6ur6z:H#yUוW R5:taU:5::antf`hv ?oGv=T^cO[mNJpZ)K.|`N7:QmvTۿoGo/N;&ۨ|eAMٳ[y5@sUv{ef}^Z{d4: MѬև{Z?J{yv7Q3n:t&|eèZ0FбTb:'Q-9*Qo,(^秿յݻ.؟CjTدF8\8w~<3Hku1:)qh&ԩݼY>՚LtO4YM`(F1BTR1J d!䳌8b dL!q}0/{nAYd_zd`~>C1SJg:,먢zʩX5Cr> g3_Zx/X,*GxbFAWI[X(L)1-zVtn^n康z[.lڻ_g_i-]/+ U怿:uo쳖>{z*emRz1Kp7Fãy "w[%BWH&@u%;lQSS 4jwAK<;.N;zzsRNCa5OVՔSFbL9LfyN`I?-,C""d o,*ɦ]P*,BGb#ӪQ5z""Xr:6E@DŽ#uȤkU2퀦-hSˊ 24h]g3X F:h-hmw'RGzQ@8Іh"ŏ>l D?r^7Pgee#? 
jUvXW}4kɔ,E1{wqUx?{<맙22skuOM;aKQ&IL "„BÎT|Y]vj*-xmLjeoHĽ>n-?6BR93Om$kf>liAJk^#Ab+'èI 8w:JDK|~VB*rg䮊zrQ3˛°OlưgF˚]Csy!ưwaHBLcɰq^XF1H1`MB;1O J>ϯ#>B͗瑉fzSg _nzW|yRg'qxz)uѭrxJ0݃Y:[xlvf8fkvbpCFhr`kcf2f,k6yHVHGc.V}$E/I~`U2QU~K]9{ 8T]Yy,fp 韈+9^WJ:xxolɟ?Ƀ_2O-C\ ?s\rC0`z"k3R<0J7$:jތ[A >xs]2ߔk9[è|%sDEa]vйU,Y{ܠ`s'A8\7U(pqcƨ e +K0uG`$06kobjRkr&`'3@?@Y]&HU+˵sR-9YFlFR?N V#760G$\+ϭ,B#ke",:Fܳ`Mc4 +"!N^8P[4}_y@{@o[ˮ 3?E^5&\R1wsi䘷"A!F-<"MC@gFZ9AǓ+ YĆ[)@[!w3Gͯ H`-j_׵(L]MD Wac2l!iƏWt6I?ֿ*wr4Ma^͙w3i`_KWo9#&f{g6B2e4'hXTޒO/*LyzrO줨fJ1L?4( WLvmGv[}M7CM|zsJzj3 w${ksKӮ6Wsu|byop=SK᝗o_PI5R?e"ʮn D^.Lm9Y? I&I{:a96=ƃ.tX]7E{ֽfż&?`[mE~L0IX\vIv$gYvI.JVc.EAvmgO[ZWjG ##G`#eQgA1J=TR' ޶ffTπvvW,7EJ>@jai4FZ;K|m Sk2LLVδ0!Ӕb b^-Ikh 阌(xI8漰̂CaR%&KT^ Dj_sL]뫅eP Mˠ1_`9.FE@Us-т;'^B8Pe^9gia~!aĥM`DW$V-Z8ݍoM䝔ZqZa)(D*=cV^=T>_إH2};!WvRDt E|<ȥ_Rxt>8p}nx;Er>k_ck>+x"cy}a{2=ٕhZr6KX_ʕXc5u:Zp;S4nφ]r=.~rECKhUG$2 +@,ӄ0"eT2y{2#RkQnԧJ=Na7 Z$\ehG1hkjޥxC35EcGk(|ɟן(_gyǓXsBФux$>(oY7uX7RQPDojHbWv+ԢTo"=*ӷҪ.򜊖?z[SaZѰjװboHTm#cCH8-E#)[qFQTuLcx(lEL%ϧ)1R0U!r)0z)%{glзgeYqrWKJ%E7 a9c?\I۸w~s;nkPLєrB:,$t(yOOOMnFi.u)CĿ5y 8ɨq~g82)`nRvXEeN"2. ގBv~wi)m2M^jd73OM꜐F$p"Apc ,8WEkfrARyNf:9r^d]R`iثOʎ{ ~lLG}L)Q{̼@ٝ9[ [2.S 22.S ȨǥrZuyJ S"Lig@$ª18lEغwfkΤQ,8$,r0ƙ0Lk&Ȝ|ٕWEQyϜ5+w/R:8E7o^Wj9pթy /dl*y㳫tTz1at>v+3&ғqW.l\T<~U0՟'*h"xќ@p{YW|dE$߲TwMDU`aT(te4'h РJ$.z xEh-쎝khM w~<qּJ} (Ž Oa& T$홍K 1,#-lfZ4 SFQF3`2v -6(@ C +{b5D#AyCH"MZ/h:m26KQ(w5"DS?+tDM{m>NCk37YG2`&( D1hb |vma}5u5`¸h5N05(n !h> )#K &`qհrFG% `^aneɱP$K6Kđ;\xNΩ !XrهQߌ֟YЦh?v1}g9_ͣQJ_S29|qM|QFdQ_&6*^rňGퟟWwN-;$K v'pf̞h7 'BǛzmbHEh -tc/pky%wx7JR7h~Ԡϟ["myǛgꨓꊘSr>&B oܜ@R)S3a}!4C]i_Á W/c950sӂWΕvSGꦄ1h}k#F '/b=v7Ǝ:̕h]J7`h +qۯ704f+v3@Y3F+P̙ܲRn'*J}Nf6eH2)O[nh^yxYJa4>4b5NHVjaa#leQGhkj;O* ʈFQCjap|ϣUkL*Z*b~:7:! 
qN<˜URF=&QK.q,SƲnzhۈ#WoҘb5y}#PBSdcpl߫6H3mܦ [sI$|3/setlg:[jJ+  ` IPA9V2*bd▩/jZECOwi%'DȡJ{VL槦EtW5UUMw;ɁjקvNx۱+@Пߏp\ 9܅/hˆ ;ipKqtMΙmD"LFzpAs qQ2+WD=O$V/57ӯ+w<:AQ8:HX!i2%#%k̮ 7PgӌDpQњj9mzk$CQOH@r}-wopZ<sZ-YUzJʼ$X'FH `rĔ'[Kb`Ѻh*; 7r.,A/bn-eћ9޲t (Ubs5h"VP|>A'Gl6WMH=f%fjiЕh/ܷ0P%,pL*fUО 3pJξ-8!-u "a>"E]@HܣsW8#fpsv͇þFgF6ٟ"IRKvpymLDZH rsզ#Bdlfݫ7Fm=c-"7vU( ]d(#8+,& z !vF.p>~ V[d=%`~ 慱Vit6/4HR=&sjqy.P0u[@nS,5yRR~³B4#Y_܍s.|Ζ-č_rYGL?O5!=4<>dh)YvdoCGOEvvs}݇?76t5?(sjAI ,|^DBѓVh.iJx?ˤمO/BSxq:jba?;0ApҊnp&VJ bcHkۑ]c|eÅndui¼Ӂ1hgI74G/+qkYƋM= aT;i+o&_k.{6svy@1KnI [0$x>i3fk_?ḿG!(2\棐QH;|q/N9zts 4g"$>OGzv/$oZ׸? I =W"xdЕuژMhnا.f%&TNr.Kya+ʘ>h. SO{3[ yh%*ᏇR\ rKA.s)e}!YpAZ*Jk7ʁ\ى&FAFzDZ$0~6W!վ ;`OPV2R'gXFj6#OnP ӄĨ!%Zu<=1y"uk?,:-TJNEeRuUQx'9@I8>tkkH#hcaByo­3t))f=zVE(.{sӹ]_v%exÿT@Roe ZՈSQժ8HjK?\PDI*RVD,}?$@rJVe tjdm REx|X4N9Ou ,R &L#UfO2L["^j(dx{>s#B5 *ZmaUTrku&<Ӧ CIZP}V[3L3ʉA!tQ.'dTlt% Ɇ$eIrNmkzq/ЇV&/\c"' ^}ƙu:JD%u G- sQh \1u" ^tqBd2NOU;ry .+JƬ%vTC݁@/Ut*[!gTQ8RRIpHvB*]4wL\BU?%o{?W=iԜF$!r5 52_}he5jyHmno&żw\[^^;1]iۅC][:-k;sGÝDr־LL#Z˰!$p^!nr`ZG\R9zs#E^9l^rrI_KTȩx5ЄcF{CnQ zl*oi3D%eoW_~>i!O@w? 
>Y30Jti1o7^dzx=!.{ +3ͽp?-o!N?p>VHurh?w[ўK~3 \s@sݺ[kX†>MQTCz17{ރc[G0.zNUΞ~7+;ːq}X]YoBPr+m̟6jj\EDbYՃ-q,ДS,0DJ׹[0 :I#[^ FN@ҪNrwN䊩X܊ t,m xK7 |uQNJuԄbzUQ))h1Z=^ 9`/c€lk3f83-LrCy1+Z$}Ē/bu])+6?-zxN<ӊͳʵէVVzϷm/q9P}t}W놈}tg?rkid oeS{g_b^ybc[t juAK}(߮A1 (;L.WtjߓBp9cR4SN vQJK 3B?]tNu9Of Cgn 3h݉e!l, !Uyl.Z-{ԪTP#%Gy&d"6Z+q$%2m4PV(BۅVIzJ}q :wBZe6pjLp"V7.ܭZt@}Ù?1bK50:i^zT$22"D}@V+rbj p4APROLLB _6}}p99rEbdUHbNh ;(r.O:T.:u:U:Ixk%8P촦t4952?QyVNFwfjiw(1}gqXպ*'[n9ajF(#~ݏ ڪmu-7=0 ̾N'32R$T'MݧJ6 "Ь$nZw5^0A@xlݼ:Dʶ=vAyy[AvLѰ64 3n@Hm<j 6DL'Q E$.Vd]Igz}lA<KsrY^<;WLݓO\&57QM';}).Tg[Jߦ|Zm'\vj/5T Yg d ̽ Ӆ(pꂽ CmLwk;M::(^_ah @ VǢ\4&wE (g0X@uY}x,* I i 9kWhF8q̩Po;g<߸ɞ?WaL;NJQh %J T#9ޅofj7fAdNɬB@YfDFtL㭌ozy' [J'K6IJp@YB!@6R<]ţA R؍l60bX7N0߹ђyk*I=9%mq js~'*l'+y=Fϙ M%XGBq%AG 2ee1 -Ƒ40L"+PF֠-(G[To!dq$,$Dzd|}q4;kzՅE/tMLd;PMCحMs+OJ"ݩTՄ4!RffJ'`v;sшB3zX C" YI"9EkZs~ESvGB.Y}=nUSEqZ"2Iw+ "8$ r/MbxX, 2D;ZlA0/2&[Z4 =wTen\/`I)v{:>(Xu_}qijbC1 *k d-}OXe^KJ?eit[ynUw+j+V5mnKzj!N“tqBºY t>䮻$Ld+.LB\껥%إ[ Fw#.8\&h^dyP{Gjz~qSEIGz y=>="` ώ@bM(yq2 K t,tόr\hnTdtL ELK0=: wٸN ?+{R :.[ ^)MxX3w.gg:+6y?8o #zY_駭tA_g> ST=eb;[nƥQ;owp'ODŽtw [޺@;XkOڥBINHڽQNkZڡ!%"UW'o|%0: z7iDnY^j]h.c.>|Px{sW++ʲPpt*TkJb2w:7t1l'3D?^l^~s^XB <(>T'* ,CXQo ρqhyՏD)ۮ)Y[9J o+j-^=^90brw; eVu7QKrr=6XYV$ 0LTBD&P&L8'A"$Q"aI5cRb^_JTc/$24^1 %]Z}y],-DR9bQI;u.&6珵߇߉ë65 r!+`͹_C xa+Zqb滋L][:nX!kF&wnO \ -RX8׷S@鹽%:YQH7/h!w5"ԒKɍrͧ^^h 5x<K(ka2Cr{J>Uo&Xt6 QlR{60FabB;õR5&sx g )ٖ%*MP&YLMBTYCbecDDsRC Dh-G9҂͑˜Yv2c0DP i*n͆$yöMXS]g7UwۍM11$+._gR0fVㄗ!|xU8\C(,WCI\1B#Uc}htw>I1tڢX\ R[;BR-2=_F@/ן,y#Kh~,~ gr)m1[V_O?B4q-c~УwQB-W7S*^ׄ Uь'Hh@d^øyԳy c(܊AJ%oK?N7C`"?o+(FL|{ܳM#8YMKrճ?unc3:I@M]xlNM0z VGU$r<uK +ss ݬ0w/(SvjYoW {?z5poLM{όaqlt)n?-]~_4{{h~ڟ_&3'—Ïpa&<O/퇛:wlq9p.}_JﯧL[IWKsۜ.ppK`j2Ο@M^NFc.՜z1䥍n`huE(:9XâzVuWf(7XUiwUO=Q_E9M^gl^\zb)ٷvr5`hfcw8#IA?=ܚ;ӟfD崙 _؄yGTI Sho'ϟV?~gÛOcG4kw`g/GY{KǸΧd9]~niq_B~vZ PΩdYhR ɻpf O&$ܧWPQ g0&zz|nz1LetދC81}f\1WWz&ܓye*͠0UW4~nvYM佝&0(( f 48&*XAفͻb$ LM_ [[>$;!AOꇑ#w1|˧`μ9Jf Lp?Z6U7G3t7+OgYFzK41!\ Q(cƑ Z0!lZ.gV`M fQd֭G<@'z,~Ly+ټl…(traIKY:.p~i)SyN|-`_T{6+_ܿDb?'O LWݭ`]}# 
~qyg U$[^0o^[|f`(gLC(D; /`tB䪙EW~o&.ʰj{?1\ t@=idz O>R"yrLaүzR,B(lX-u3|v3L7@Oq!f_@Ļ9ӏkoc6cDIa3okry E<~Wm3V V97RQWN8Œ[{nJf2 VQMiolLvk"Hkռl9",?wOFPJ,Xlp"[ e1\rΘ %2J'0 LH)҂l~..lS| ޠ"Bj;IKA2|OJF$Qb =8 @[FgOp#/O/:1 XaQ[C֑e8тECkfcB[kp3\t `,WC"ZeChQCA4ac0cL2Cmmw%41daBꘒ0I(Yb&ݽ0U6BoIc @v*i 0đѡB"Q!@*$ :QB Abr4W"FRqxEE0=oHظ(I:lk(߿~nwЗ/4vwr9/&ܭirAqYsK?[Q.-Q./)ߐoxN5d}++\'ίX距gs1֐3k}\9w4NjD&L`̿Tp`}bK%Ft/D(n EDGJք#YOl2V65i:dNf=y:x)BfjEx30@?LXH]b_*My?\]]a ٷe%л7UȒF3v>Oӥԃ(y3=wOV8w07m .|H +v== ZpU).иS#^ײWYZAUX k]hMg]^]a-Bu-44APJS 4 "`68Fjb!a mI$Ch"I mle(kE\l:^ـY? 0SjyGY)" [$Q@WF@EF1?!,1 1F %{6 / 0RW[|Y"w_rzak#KZJ>Ґ=!-6p]U]( tѲ QGA<#QtFQ U:JB\l]K*R9:X|3(jcO%q(4A(Tk"]c0NDp?CSu*U ,V`1G1=S j&S>C+wZEbud4^/ıKC*&c%@yVY] m-P'0DٕZn!sB:P5`їA"7CD5d o}%ldE%lĬ׌ш)=Ǝ ER%AhUhQ('(uDEOq8QںVy&@ldXYUu?e>6*fV1D6q&GK ׉Y)Bp54(cDthukt5Z"a(IH1wƇHDf *oĶb2j% c;uWVz}Pd_߹ƺP9%|CqU=b``p,=Qs%GO1YQj5e!ɮ;MA7(:/QȿEM ߢn W7颢EmZXzhr8)Rz@/kO2`䩧'^ O)" GJ0hRg]<ʲ1@PTe$*ENt[Rg%T1x{ϥۿrfhJO3H oj? Í6xC^+wěz:"N0L sb\A)HZ?ŋ~%;0.pᲕ6+Qvn_޹uuwyqQ^C;F8i[QzdᱬBM۵N+.{X&qXi$mS6++m 0* 26X \P-d'M%ct V0xeW$!LT4ݗ[ l2x9@012f0( f ʯ'0u:V/rٷ :b۰NH0Tœ۱` RHCE#E19(hMq $tOGVqhv#K)IpFzynO-@m22j6X" !zyܧ]#)a7nW[{ZO63OQ3),0y'͜cCuғFb'VgΨvFd ]kR/LҲh;+m6zeW jmP\ NಧzEFdzUPWM!U(puۮwLQC香s|P:E6V2҆ʠ_ x F&H2dVE!ЍE4i{\0 nfe.;EٿD`=S1"9qgVԽ L:0iNC,C,1"ƼV5Ԯtcm +Ye='U26DgVEwB82<:]hfAx0UD0r`h(4H;]Cw_;9u)B'shvۙ.C(`bCP0'ph0@v84Hǡ&OxC$AB~<2J43R"VS)1fIw InA80@`@ (7M$`DrᲣwyG`J0 zm;n_Ai8I.d8yT` Cw;5gɏy'{wg EÎ_@a\1e9ȤONײ!0 ci1Az7a)\J Wr*U.\+@[={4CiO<Tc5QCs&?5 Qs&K*J.~0kM)l}ܸ15c4VYZqIeR)# Ya[t 2n൲}P yZh\a/Iq R@#h {Jf *#bÚPՕ7d=lxL(T!`Lsh`L(e#ajZ'W/5Am r ? 
_ VB޾âbt𑑂4n#8,GVtFk]@5jXF 1F+R`b9a ` q|8.5*e86 _\2&h %g]ܴ JWځCgJ@A sFS$PʈH*)ՕT6!41%b*Ԫh@{༳ aL-ф(sOEhfc ںΕxwU@R3u߱~ UkFeγ=< J4e*p ˩-4v~FYD1Y!ݲktB'V{L]SO=NDiTޛCߎf!ZHZ7B| JX̪ (ɘQ(kF"MjZtJ`)b KԱFШGcAq.\bQW,4FK[ܝm@e4MV5HF"DwdЁӧH+t_\}pd◀(znɄ1R`3щ뉷\nc3iqvݣЃ| l Fl YF,D|gFB.ÐtxaX_{;9'Di ߥLM+ 3)3j2ՌBOG~廌 4W`@w@jw:LK/@w1aУ)t$Jnީ5pP\d)ћJeΉ0Jzۙm W {G-8k.λ0;8kn׷3x̿Lt[ jq!L`Scu!㻳/v~qs]_,Hi٬&˵  |◀yC9Zj`VYVN @$u+ד}Ls1WwIeez]{|Z rj,I8%3b@23(Ts{K۟f|=4ٍ FE3Z }(xX%럏72zwG݋}WϬfd939~K=qߋf } 'a= { B| * aЃ~Z%JתwISDt+oV ~s?W2h~7G ga!{Kc휷Iqw4`w {.p Ԫe8G˷7hwI 'Oy9e{__,>dA9X]j&|E;ow8]}z{dv}ćń*N!ۖz'ފ4sFnrYW>UQJ>UGiww 1`E Q:"< Vq|MRj!K*_5N=t7:f9m(ښ[xG=n$7R"< =1g_h5%uO2tdL2R]*q1# ^{ڷEa1T(X#ilߎBW O©i; TJr^py<ј-RҮ׌Ig"Q½r.qRݖ˸ݶ5.+5X 3ܲmn LF [`)1bѼ-T҄ YƓk]۸oyw'Ȯ! q1T?E&xEeŖS )eZ@5B, Dh TV닻#JBܴkJjne|tJfyC!Pi B4fy4 X/ C@v:IIyB\TEB1miAc|nfog࠰s+E4@!PyC ŝnG`oA$|qS;JK[8&rVk"S*J{ljDuܧ]^GMگPBp#g$[ml[=5S2M= 5Ff\N&@=ZI1]Wi36}FCj{xPQό ;$#SϦگ# UkVqVQ7 _ "'#Vd?A5*OAK>AIU'rBӖ1ʒ&1Y TfA lNy&JgpM%B meawMi":W'9ƃT 9ϑ$Fbh:U'S0R)P\ #U-DR%"q 3D! H۰qt7NA%&Z- pcemcC$2.D/ lluQ@INs[?ћM4^pt'@bmˬ'oC)V`1tQ3)LV{K'S%X!HTT`!CT,Hh_%)C.bu|s|HZ\$K$3:ĭ|{D5Lr=W0L%{i{O %z T+/^\< F}^B^yգi2%sx-)@0+k#NGPvj :澷 H)-LbV>g펵ʦ_Y4 h D(άL,=HsZsju$~ aFComDwSƔ|PH9PP/8DU80%Tnvb4_!DnHc )[.ʢπeOF?i_^{ƐioD_4߮|[LtzEMĭ[[Ug ^x29ztu.Tu:G s+G nwӳƃ!!x!I-ƭH[D}܊4K$m-Mڸ?)Иxj@ ~фp2-‡%RWٙy@d񤗪IC X67)L_#7!Cu7{޻6iׁM} G cڜa-!Ҹ5$ZOsEV`TX`sPQDbuc5ɱչx:&8:VkIҩ߫ԍrFSW5\;)quRX@"D"3x('Jr I1RZ 10c`^oɁ 0&C3\Sk&IeB5` >=cXܺ|i.0#BP%@-b 3sXb)d3΋ E AqAq:-B)T. 
=Erݾ~Q+|aj*r--owkec۟~3ݺܸ`R& [vsyӝ67Q8fgtGK%QgI7 KLṚ̑q,`Y%iuzy꩜Ŕ:)Ց8y";!L)yHS(/'ENj9O )/.NmY 2oY&?~}3AO' U ǢLV SV77wmMMM:yT~ط~eEwj3"4k3MHܻ62E0K { wLQL& quTH"o_ȲhO-\] ^)%Rw#l)ȟ+W<+a]\}&Ig"?VKǷ}s/م8 l(JF3MB;RZ7X .#vk4[^̞5^$ok=K.~O-*̡GcmmH 4=k<6AuAV$I._ȗ9ϘL-L|y%pE9B4<<%~B-71mq{OEs,͗Kuݜ%*Ϥ]wXy>ӍGGO<,2 [{"t99) #Mİ+)@Yv3ѣDzHj1z,*v3g'd!)c" i([NUѝ(sNF]}Ęy.C#w)W3 a9 ¨[]W^M6OB(KrqT% ;C9[!8:E?{ܶ /H}J)IMmqˤT4$eQ!);N*}E$(6 4R.;At[w>pjWn[;ZRqk ) dH< r 1d$H!9kGD'1'1N%C,*'W*~ 8//t|$.,\}YĨz%f*^" Ljgij :&kI0&&A/\[pr$!Q)'8(ɑ`)B'ɁIhW\kq^|yDIN4㈊ Ye0cH1$Ž 6˫ ;|jҜ,-ęfeXSOTj ' pZTr5H&ӄ@(`0%JˉځmSDp+)qjeeTgsSo[盢zEROOja̐הR}MbM7ϧ+^i68^{xj"B޲M~|5r֧OF Wwlygzzxl[F:ftɇC&/>c@!gɧx`g66S"N)#BUl B0=w$EG0ZTl7CQB*Z ؆$y 8.` b~/^pڅl[!^W{\.^E|~p>}+:/ //8cPwv dsxzXiٔ1"3+MǶlu7'f6$+e8A$V*8GPǯz%@#eH*Fȓ-fqc2JUP$SqcDa!%)P2|[yvYN)(%%kܑŅ'fev%:pfljk`63eN5f埝!f>ߵ6m% ;]s>7OQ6[\FT{yjbf5bn'=h#|'>~0?4b1rQLVzb^' iFHrPH1J3L)KxB*Xy892d}yLp$fRr!w>~Zi3KXr^m?Td $2b(cWO˪:;MXJzxL/ Wד)eNO^z א<y~וtN 1]/:J,~=r5*05)63 [[mnZiQD%4l˨K*ړTSRҷjUq$!_v)ɩ'IJ}u=q_8mL3apr[ :AAb q 4KYLǚ@hk(P(jr] ֣ KzSN#c,,тX\(JC}eEwAUt(2: KY @C OlU7|#3;hw`SټUйkBB@mA@g8ל)l_>h&cﹺ]-/uDzC Z؏"]g=Ԁ5Fk c,⇬بy%1W'ok7 9d@0$)A-R0A |{Wu}9O6KzBB MODXz 6d,KClz=68׾5kiBO7XEQҍ_w3`K`83Z=iQ6;ޙPpe#ݸA 晅A^zoBpast5H $ {=#R{Yl=O(bK,bp۲hy=wKm4ݒ@DZcYZb_]sB-`IfǚyCsLpP 2SQ cHFNoY`BCZDT H7(Lp<\f rbXm5 KmC}ԸFJtp< '5RZ"*DHDo#Z>\/2z릕Q }U Gl د_LL9wnؤ!eIt>,\!4S|VE񛝔7HHPPS7ԇr/ MlOj/ۇ}R ]6I d O_4TF${V8 6YaC%6Ih:5hNڨ[AR+G $4@RK[K|jvxq(=I"`wXv\->c-ȷaz:*TDᐚ 1ȬN8V?&kes(Uh. 
`viEje:{|*nV/ޕkܪ<)(L/O@*f1yw2'5Ut$ q6O߫EF?ܥp*L N֋ GP hE!ϋIjCc" $VS"~֢"h/>; :G7ϵmҝ##\0l6gTm(z"B2VY&zUw߁Rr$i$P LxI&(O H-d2u'C>#xN'Q6-oz/@-j!F熧IJW E^ipiY +)[ Mr:82DSt^wR|h\|ۘ(yRǘ'ƬΠ0^ g!1x< xcrw<2%c&W(OnurLqE!3>r*`t̾qJj wjcsYqeneL?h8S8&99oi֘=VKpd{Amtb!@2ې>!w\ ,LJQ/q ou{"[񔜂@5 50Q|W󼼠ortcB5VZmw7X>nAbTHϥDU%3'aTؙ9|D}3{M.3$EliO^1Y{LӯDWSHy,%9Qq"ce@X Tp"HݢLZfmЮ8-~Wn&T$,Nx" DLRr RƘe0'TE.lYr^k<2~0ޜ徻yG Ap_o!AEpw;4z%C-\mOלR'K+B{A؝>F^?4foAEYK~a/j3qX/cf~\It1F5ۻS}Zq\j~Ԅ.Zw;ЅP­&D ()p!wum.CC=^ vpAOF$Xc i(qy]Ga+Im"Dm?'INu\3`a.%Q.sxy#-}U0vc2JOLn]Ŋ_ۓկ- ך@wd/I."Mk;cO~}ޛ_.3" )|,J(ɒ"*5[H58Mm=XVG#4b0HH Žl/dQu-6U?w%L}_N@ ?8ڶ+c 1Jj H$,|1[5@|x" "P>n D^!B!7JǸ}%u}]{\4H 6ڸXrm섔c7F{HBhmk>(57PȌ'@x > 03|F+0YF`@쐯}QCA_2y5Ǽ/o}B1'Uz|Ǜ⟢fby?KGV[h{uHlro {SܛMb6 S\wed <"AeSd AcKɿ jޕ\?&o|V@ܖt5yWO ڣ#'m(IYe1';Ulu7eփwt(atp]Z ]I{U-bby"/-,vDƂ<p7u_ P$4"G"U%MagڛƑWanj,|w[If'3Wm\ &Ȓ2K_%&+%@n4VJ_Λ{AHl8BT M;jAv`%c+Am8q'8I)6RR$ mLsdQpYgJl|pF.E(d\/.W "m ;"-a&t|NIj\+"佈C&p0{g#7-Rυ//zME>RDsyEE? @>{ݿ0ʻADtP=U'@&t0%x}N!lSWhp6fqߎJuarŚ3qLa$-gspdz\DPOބ5$g篃8FL| #V* K'/>jЏf]_ef}7!i[&8KTH:#G=7Lد_ ۈه"`ʓf46`j8 OkRx[8,cϒPttm1b_g롖:qKѮj+h+*7[P`Rsbgd:EQXsގ㜅“'XeF"צ~A\ *_V pi7Y,:DFbAU~ ۈ-V{NM*x=Q`\&S~ (C6Kp aP<`%$X}q\jZ*YQ$db lԠ^VH0"_hoC4~5jLAtFdv`Z#B҇n~V:t8Yy;+@jD@o0w&,&RaeWQ!`eйCba+7$شd-sd#Ťщ#*a3ɻhS(z,hwi;m[@ǚ"vil.,j&kMLrm</ģA?ll0 J:z +q"LzF;"K7$>8;`TqE(yX^1gmXZߊ%E "!@;1c ₂VH8IDJUپ_n㑤~IE0[5AW L' 5yG084$޵(MMXi5 gk#եv<-3-c5M|/J;K?V7`Bb\\"~{ w8?> \+7w~89jSSxl*ŭJbTBG(\Jz)1cJQӌT[hA=[DVc!Tt&=uFQg?!GAoS`ۢUSA e DOe.R6pqӴX)QK|'%f6 ^q'Ol>5KdzshA<1营GCqtW݀l W"11,P㦳r7ז<|8q!'>d-#)xCGvw-rhGuC P˙O-H13xkYlxL+)vJ£ezxҘ.>l%sj焩ĩ3jp˅#I,%WUg;Ծ.9PW Ԅ-T[(5Ζ&%h@nsh[.{^]l> m60zrۜg/<+-/gy8~G3'a~0}*gWV!|'@PؤYRѻ2߮K$'}].tۄh&KEOEjJ0%;r}gѐyEdG.5捋OF'7/ouH~2-1o^eN>iB>+eetVq+>Y/^1юʼn2F23rT%3ta022M44{0PG :hbx hKd E_A5&@,ʼnO$,qKKNXږtDW >.,֭zaݧMa<NLrںRw~2k`Ŝ5|^OF|?U ̛.|uGO0FWlllnwV&ǠD;bDvyz;?z!*,=>bQ囥E?De {])=eF(c%:3w8/pU,!a-?୷ 1X'}OfZ﫫ZQV6:5c|U@W.EH!-q"-Iaa'H'k4cG։ Rҵ^0*no]^1hR8Qo"޻VF)u$C ] Rh#tQ iZՈ_ 1ljΏCE8|ްÆE-&ȒxV_G`NJ/Z?r\ڐ+Mc 
Vj4o0svk6ic[qZt]<ܜyJ2)|r`1?7ႝp$`LNH`mEˣ<Ūq0lW7hي'40T.ŘVCufH$7vk&k-qH[uqTP&$zO/4a{b`~Ѱ,0fo7d=pԍ<|԰).5C2jk *y cIB q8 hOD.6` uQ3VJ{ͯf/EVr\SV:( |v4b.·1lO 3%MXg[`PfOKfō$Oj4}Ϣ徸,0di]r#!߹6)N^nSnM1":MݎGJv0֘Ъڭ ELRͩ$kڍ}#jT}n3O:vkLhUֆ|"D'Xng*Ӵ:xSJZU!!߹6)O>ƹ6Ġ߰ZSM6%gڂ|"DuNu/d]qOA5Š4v;Rf ?C1_>F &,hX(BN/{O#7D1s?_.A/|f'>C? l]v^ox&$^3T{Pa#|9;rO}?p4H/{]T/mn`¹_a U'?S8JB4LAէ+Dwnd*%Rx"%eR,baҌ[$@p{$galz*GKV&pқ˟aZ_4P‰?Bl]4H yi1O¹ ndQzg9-ٽ+}QK>zW7{ǣ܍Z\>d`ǃX<\;SJ_u xҮٴҫj^zI’Q,[L{bgB) Wp-tJU[RDi)n,L}7ӪgQljʏ- 9Y'*?+F5FYG-X, 9T5d@ %p)@ VmHThZ5Kp1t=Ma=Gf2wIKUeV[`P81d'" 3 e aR`> ɺDx#SXXhFSYN<K?Dqa2ZqNZl9hQ X[TDLjwMtnj XP ÿ-k|#7E m6`0&lb?}+K'D~h߯n~I}Xri?󧻚 !?.ïlz 1!Ox:Um<| mk,\]/(;] =hѪ+xFkZ3Zg$KϳQ FiUfiF ٍFVUb$9zwX{hE0i8?9v/'&+uP? t`XI]{Nͭ9Jy>(L]4`%0]pOI{3_Bzs%%x>]7YRu9@Zor0䟆h=kk d_cM)H%,YaVȎw<̚~2嬧{2#zjv`!!@na` K6IN60eP~Tl,B<2\2pO/Իhh54MjxgF&ROW7Y yY}otk 6uZ ɣؖe!uIwYM GC,*iP]?QWO-lO4>g3NL `&TpWh( R(З^՝bĿ =pQ ߝ^BuB%rdyK[βzcm3 m CBqZx&NLrlWUS2Ý..rLd($PuB)f̓QJz< pޜk|'g=Y) a_Ng]бoeLz](pNcjcUc 5:! ZB[k4i2 h8ft_„JX^HhaTn@k>Bk@$zNx3 ߘz׷fJ=0J!1 àgv()J+м'<;kxAGi|"ү{^k/6̰tHa7͜i eJ )"rQox oޛ=඼ +Z͡K?JowG&Y.9C])"-U@Q"NGM!~e |4ݫtPD>|{CdɆIbRX>eLRԫ`=g$g,;eeojYU{7&^ԝB*mͩ+RG0k ob@va2H+7E< dzb?/@W gk 52?UK+ La+Fxx8miVRSIãxo }W$C_ZEZHïZ PfZ^] e2˜ Hᄶ yə ^A[{p>k8r}poWb`g&`&V|cR(- 7Û q 6}b5#huU=w "}8/̎^he2;I*S4W"RWogcĹx)#K{l6k$*w3¦:S6C] ȞqџMcNBH&RK;T ;׿Y%iT2$ JyC  hdxGGU\!:^7G̅d"G֙ FʙCV`V07:\aJ5%'/%rizF,R1ۥW gDj ЮNEO ]g-ȹҽ}R/b87h045ܗjHi=LQ=nw.3W*OI{*(ڐt6M0g:x^\ҩ13Me,r:MQs\׃ FLKzٗ/EB'"JR艹;5.FaT.ݕQX>p&]zTxmgmWrT:AɉV'_oo6!n%]Pw3)O Bw5=\̳۹2qcsd9+'dn}=.7Qvsʩƿ|dF! ?ڽ͵'7g _x#½?t{Sv򯏓 ꖏ*nP$Ywڛ|d=\3Eo8 Boo/?uk>!9Hpi?ݙan|^${6v]u/fZ<R{bWm|n`:}q&@<; 7T؆2=MՌ6Ī݇5-q6䙮 B1cn޹}y5 >(\?=Nj=sʄ-}|:}vu(FGQŤ7" %:,.;8Şo|.=H3)/{(K]Yy:5%S@wz{xC~[۞Z1}̭]뤍c.Mb9F0} 5Y0 ۓJ= 各sw|W`XSshu\CJRIkz7XItɛdH"b5|h| saثUk7J紦v͵d|Yyɴm0ey:.zm4}_Tdc{Dww,qUwfRۧ#2iO h٤/.nG_璱k>jT4㗔KizRn&hmX;U#˯CeTH.M]l6nD:_. 
Pޚ5VdU{~_eЩ4 {s4%'ہ3ۨb|¿.Y|IX\J!JhD2P\50*bU>lzPXtrS6@)Bpm*|6.-d/՚2?#\67OMr3Z{f<<ymSK@FXJlJHipgB&Gv>8n;0I( u, |ߐ(0FNj#6*st,(p xM`<]̗냸^&48Eeh'^e &JIB*j|7!(7`z( $gF >IݟT! ;lYmIXdrF𞽾CL森y_"Kr]1F2|"_}%[7MMx5uA" zv\X&9mt׭*.rP\'Tnԏ 搩"4R/Tx)8o܏ERD~@la~t\fW~3m=xɨѤpl/a6h}q+hmgm)dMRדUT^X=5^*~]'PUzXڏi-ǩ&ymEtT'Jì!j6Pʢ4c|jlIVhVXDGJ?9-+,>?-xWAbjA78l"1yCj*x#ATq3Ts`@LP60H|1daQGW L\92W15A,:j41M3σ?9Ĝi3!D],C繛k^9~Q+qQeSntZP,;nY*PTǠoo_GoKœcb:W8x{ ߎחfH}zuiSғ6n^Qd{͋EjE-0HwgZVrb - UPɰ1ޡb;~]h{؍X*$w7a4y2Ye퍣-C8ne=u'`ё?}1ʾccwѺ8N(GFꢷU4M *帨z9آ7_y4Gg/v䋾-AX~ :,LCt27CR筅wq-<>z9|%y):.aMϛH P\ExkoUO-i#OݞcRh3g?i{u mUqS XsW /=ex"$aYC@T ~ʀZ%?uQڡ^n2*g󡚶w8_^y3 7 s~. lJ(vxg֠na,k3,1,.|u#8v ȅ]ȅmĭ 6#Y8I9=6Q¨A'KC_7!%]y4~g yLE@Z#Yd: `<_GzfNG38RΜĔ1f{MKx`<*.HZE=~!/)j0&NI^ ~>WM f|u 7spof\f7/0#Jjuv`fo-8x'zݢ+ZLCQN e#Uq]h·K6h}% XP0+dSR9?*\Eg3>ᢞ-z`ci '(BPƐ"|cUrJ>k s*(u[sV,y}'p^\?Z8f5VAxdڔ!xÒ vQT0֊?tw1ZtYMUsqp~|_]ŀ5jK cmկ+ J0\Xͻ܆a[ x>%!+W@k&bh z> HE((8ĴdXx|tUJRꭷt(KJyJU̘utw_.jfҬ|XtGqEIn7d7/Wv ۍVn03JٳzB Rz~kkq} ɥ+L™YkZLt:Ay`(DjqA;e/摚M?I;FXa 7#%U^YJ×1vW;9Bz4yz Z}:&WkŻksTϓ'3U<-RVZTE9fi}%ϓBI,Rw-Ϻ-x^"{T*_f&Z]x9#M/xwkMw1f, `Q6w P)9Ʒ3hS}KٳQ"y죺lٹȞ43)yiRHGTFrQwEi$ficoRJ6I9otJNi1*i-4DNKpDj!ڥ dRtr%C$/gBhVN^uiLkt01=Ҝԋ٬ 0%^n[S߰Xs0wyꮸnbaB2{ d6P0hj ,2>`?0[vrkD[;T a}_2Ad(x7 _I+A%dnW%p[W%i3g> _Ĥi~|-D8 ;2|qusns7p\A瑢wƓZn!J{<Ѫuvs7hJa\FOZy.<8pj`%VGvw(ޑBLsx0ض12oKovtp{ظK@(&z,Sp(=ȂsI;3z=%8 LZ}8Naz@XE{[:-pVyX3VLb~N:ńӒ$O\6)yNͤ#UR=W#KA(Q[;< F"#{+&`C=aԗs`9p^bug 9Lw͇+ѾavkǪ)F1r^Z͹Xj婅 0~eI{8+ fM],vqB]-1(8}բHnR>}իW',A1jLGEI_05^C*/ǫCgGr4Ǜpד#տBTﹰٷɻ\x'W0p.PHI1 ~3!d\~b`\8 "ims#%o;ð+`"A)h#VےrgXD[PTt}XodP5$^8}q$-PwJqVzЂI#AOW۱|WSDz-\׆tJ1`xSikkLB mVm4()}F RY[XG(z2h`;* 1u)#JiI2N2+e CE zֆ`=v{*ibz>E-mOij&m &sv\Kfw(,Qa0%Fx QPS\t:WbtҢ+i$`0chIPS_)*^FI1UlGE!*Zs>]FVTy{[Lk&S;pY_b>NJy<wuՠܟB> ] o#ר9Oyk:D> oVO"uȳL3Lmdlݻ>n侟cX uK;gGAɽc4f0Ig.)oLR'+[0Z|8򶘚|?NGg˪mnx,39 OAEjFKQ;[e;,QS}ivV->>0lCڪnQXNX E|f]a #В);Gb-} MKS&1^i4_iJ-Aޗ-@e 'reũއt}qީϫ&)-[2eH`{w]jխv̳)]D_Bգt} n:a:|IZ" ٧7U_MԹ¥XTH?rҦC TwZ=oGfVM?ĵ%:>ZwJ|yV̵Q]>)ahEWc/agrumv;ePõ,$HB`_= kj'7张:i 
Cm1&5+:d(xpOSe1h;Y\SM,Q/na8ʼnl8U?xq1ʳ(r4jV*w@'n:,HIOxn-9Sۿf8vzW 扊 j2(/4}xB}*v7jAH(qs( XnmL`y$7kơ7lPHK YYay-Waav.\ּ%K= 10'~gVyrW۔7]e_) D]jMzLHtZk>B?K%ڵ(JkT#%j}# }C6V ap7$w~Qwt44iiҤӺ&'*7Ĝ QN=Q9A(Jx1qe O ސ? O& hm_f:Q` mv3IC$b6Ի̪wfF(~#̇܈z"VlO}s~|i%ҩs.F>JY"Z\$;OJb̅HT6$h2g) 9Gh*҆4|B/r΢ݦAέgַ hiȨChcN6/Rz¢5S4osi+1mp}ӑ6zkT en2&柭H(!pICfО(TQhb-ޘ@+]"%鬖47S;嗨(2(R /ĤeKpǃUa E{MyR5oI,޵ To|RR FД)Qf=?3E`wҢysT4 ֹBJ /1XePº'HѪV ypn@gүLovʯ\e1g;QbѦ&CcJWN׳~ic`eϵH7ٯoE 4.hbc1?O#_rHrFlRlJ9 ;o~ . 3s{?>3tVpVhɴ nV )k[uOUicy{py O4L_p4ٷszqrh8UHǏg@ F f?sNUϔ0 G ծ ߺFЬpT˵Nb](_+ yÜtؒ B |p3%Y BH0%€NƂwGUTLC$|z$fU:jhc\s08G0bƌT% gsB(-Ȩ d\ףAg:rfgZUo?Dѥ= NjJMU1B0"tJ|ʻxy eܠz'b27C!0w [  DwnP3BU\jm7:wIC[?S ~Ep+xLQ"JR /A"@oimh4%/#2aB#R;bЦK 1CZ#T@0Qs׆3opJp`lLV+e@5Y*? B=˙ Ab00S D/fqZ\Yb" V}86(Zʚf@UT _ȗ&拝$)YQ`>. Tҥ:KsМ/yTkh1L$ْC\Vx1V,,B"hEGĉ&pFJ1 ar'!pԥ[HGMkg@)yߖˑ$(l7/V<,݌ϣiHbCP XG됻]:s'q;F@4j;R)jMOq?Rec31CS=>1!0bhH{Q1"џNi$yU,(6;{ج8nwZGF77cƐ4bX/NMK雯qgt=xlt&nSfF+16\~p[l]ŶAqTN8'}}i.sx /AY̍--4G0ze$G׃~Q< gwF})4$ BDŽqEn_(t8(,6T~ yHnE1趄@s[& j L_N2j%reL!LR#'hsŴB3ms^HuKՂ\<} 3sa˧<\ݛ8E5#ȋ6`[7>'Fmjr.'`62מ\d^Lt"P.Y%w.xW9S|rZ"SZN]$\d&O\w{T.w06n.N^%u+d/)T&qA*Hx*-Qt ,t_I5BkCqhIɘ~\2y=krtQym{x׽H3hzŽg?ۇi{>O?:c9_ -+tQ"my+| /Ұ&uE|BƉxpW˧|yiVQ>.ǿ㏊\jpW헵!_ 8ynR8OI.2&?$'bf_fx̙lʾ2uǺc]6Ǫ`ۇgl3L/~XGv"3tꏓ/Wwwls/Ť;2nԫe{/n͍~v)).vuԎpdȎz|‹B!?{Yku#4lPZ{僓|,JeVjMܻ2Ew˱čo*N;@lSNtF,>䔎@m6'ju6"P tQRXR$U"$Np] !X͕Mk (8 ݡI-)=4XjQ2?xQֶtlA $L_DSfWoqɌO3&@CLhFȭ?F?մăibl <;8<-^AۿtiIP8%\]kbтi=^g=̂ &Vw>NsѤECM\!T TVPEyrB0__Xj%%F B/2Rt6t''oQ2ohQ,>oz?M~>"-;IЏǯtD7?/A&-$)Z p],.Q$VI%"P^JsHF4zX $Ι%[Fh!YxRυ(Qhʧr;)3T5 RXdKO/k!)e-KKdRG5GN.6vmL~ilfV 1vw+ƻrqZPr߃GK&Kblb 4`l2\N="{fic7@:C!.AkR}=%|LA Z8 z(#Rl<FC#')M&JOH 75R/y%A6j e="@_=Y ]>Ӹ ,Axυ\'ИAA"g.rV9&F4avϱeزs4/ N$hRxH.8J0Jr(y 1 [w+Ӗ2/4XEՁBz KT A2Lu`w0 |_"1f\;T\m\+ꨐ. &XO"*Zs5ԂV[5gT tT"E|r(d((G=M6`h#yȄl^8=~N|C8Eh{ӵ j@Gw_hipĎr.Ѷz=Lc-iEl6|/}i[rN)`#mfvmٶ>|myf9*N"%G)woo1 EtQ7Aa=4O V5*{:C7<.L-PXKe Yiy*s~z+ ,Մ><ĭv_ѵN&rׁ?#aVTJ}x-fR㨘ؒp%%?\vJl1t==(Nw&1<#%d(B! 
?ھ'to_}x{^pN ,[k´o7G0A_ç JjoN)0gSw^nimݶF˜Y0Kaӊܦ~xa 5v4&Rj_]*41ۂ@lVq mM"̺(EO4I JtV5; B`Ƒ@J'Ik#wP ┈{-[ړA7Z濿h`[ScfO ^5\8CJt@”\\T`vQ4*žJk"1q 1tz9'ɄR XVcCe"VSN9=g 8EOfWFiF2z%=B*sX vE4γ;yٶ%wv-pN;tErO:ibD{)+i e'hSi)t%0;m,Mkd4Jؓ cP?dxY8[^۩t+fݒvkrr|w Y"8{K~Εv3oa8O1P~K(N9| ~]eBWʀ, Ե:Jx_J`gƫ.{)u UTdjѧ/>9^`,Myߵ^iN^s$fsW=Z]^ ݀k|;,|yx,D>^4Rziu$hO8ln,unlwn1amM)&S4m&cwE\cXTs^yab#+aU6cB鋊J_~:kDNI"UYNVeUsRa6ys`-I[ ;n:jɤcV o#QFbeUdilj<K/)^ XJxU2I5JͨD 2kˇWC|)W6Ӛw23 D\$d2A˃I-Y5plkT{ Єrufvxww<],j=>|sL~A{N*z*q~w̼h޽l7wf4,&ջNEEʁEM"cRPm(qLF2q !$X#·ha*80(8vЂ8guY^ x_^L=]Os Bk{}|aV!<ΜŽYgN3мwzt\S3kJXؙ o.]K2l~ƍug̃ypA`tdhqlUkmqmޛH`K$(⍱ {\DH0.a ^8\[`E*-Ҋ3^t$%DQ%H Dߏ_M\8ò7V+C7oz8Cݍ 4iaϡYk!edQajGSMAq,pe8rcuјUhщֈbvY$,V \%Ē!kW<tف8/CKS:xn}bM)?ZjL^{a.>f)fkGشR-nXꇱfX J^;HD ,aSowGP"sk1B\BrL&J "wD+ Ӟ`4QDN+emk'dLv(!l cw^S~:ƎוQ^8XĈ(r*#<STg#< 6,X"LDBifBc `  pƳzn[࿋P2Q&aM:;TAsK} F:, (QD WkPvc2Upۈ}cl%֑HXThst07]05*2\K5qYuWʷN{:A:#Ƽr'-g}дi{?ey_lBe7AuR`akAxE>F#U-\||n]'@QvEkf|m5r8=! C爴zp~[T.̳ :tƙpΤXR|0a8GX5러y;rdG %Zʡp‰NeWY 'z(zq]ģi#zx 9v{%rk!).W#6bH@vHeލB;HNjRʲ҇'%ׄExWtGŇ8~$0>;95fOϜ`~=2p}>Yɓy,O]6b0g혱 Ŋm9(:Ax8Bz exڃ"ʗ@)"qRpG 8:7۽M)X]|WW)H>n|g@3^_wur w27>D )o)E;_}XԿ _ngdulƷOEa9+`?,c2GNٻ6+W, _ a`O`7# ŋsA&l9¿}B8Ÿ1fivw}J%w/+$9iHr1/jTWxkz$+ykw/'ϳLE { )Bl2C%3.hKYL-QJ˫" jiǼ Tce\D'5YhTǝX!)6M物Gpn?&zh$_f.#XѡA w93zj@}[V19>P.9fiq;=xpt?U5'XJ$]d86ox_YEZ;/1*>W;|zn}s2f{9k8诊3*+Qλ9YrùUK2jUѶh|ebmr/>9م`-8h[>;1S4.sCU͇zWk_y ĈJI>Jvˍ'W$?Y6\VmZEb!z\,|V['Ґ"E*dwscYPkXWt[p)&֨KSX;Ł,z=YQ.$w$ťtNQ:\=zT^Dz зRbL!t zL|Qp}2-\ܒL(4\TRmڙL$.  
"lV:G|ͥ"YX%1F.cBb+{c 1Q0jШTB6xR +GZ<a䁳)&b&M`[>@TRüRH(N:2kbM* 6>l ;jAeVLq ;D!d o޼ o7h_|4WiQe9'kͿhYUvZk `Dq9 \>ox@_}\LCMVFkV\˪L(*dCOoa5Yвrܚ)p5/ <j-[ćgҲ2a/ ŠP "8r94u*ɽ7oڕqā-,kh+=*Z&t#m`Ӟ.苘1м$ +vx퇦dH܂FPlu_PYHec&(g[ up+b07W2T5:ˏb- kQ2˛5ÈTg"bU 9n\*Υjgk4Rf漢K cŒ(u`|0JNAjh+AĚx%“9MWsBG_VBug\nC|(isP?M^[;mPu),oz v\qéVR(|4zRjx QVTj"*Xs8ZRV(@6--] G8!="~h}Kist~KWEge߷(^7p>YoC9N+j+].dG~8*(֓ {]{rBMuGP>4ݓüufsbQ@]pb%T-Q=i=y]G2!A91._{`Ga=aj:̜ 7yP =ā/f>G*& ~Ϧ~`Eb]M-/.k&U"L-$.pk5b"@)rhO a&HyxrbЮ_K鞔~b0!6 |9Q{!4>la2?Xb, 1:,}$% G.+2r.b)M{8OZr}[ x2Wo&׷7Ozs$h}c^dn_\է_f5y዗HqYyy9-> ;^p :GQJjRZ1Uॲ:!SG[,ǎHM?TyZtN9(G-W:*M2d56Q&D eh y\e0 UM{ꭈ-!Nr&TTl-9G64Y8N!20usISъ>ؕIwـ* )r.hǿ~y[3Ep޻uppH3o ɪT+W݋h$zsL\D ckB6 ;f)/ \#K0ٹ%:i|)TL5lꔑHPAR֎o<-eoSw<׸|44{yqog)=sg&[!Z!$S%␊T;'ơ<ϱj0z ;+7;8آ9!R# )$k,A!Q(y#Sce ._]A0:V$z܃$r⧞+2eɩQL`5[&,En=RMLR}\b E_'"4iWܠ(,gaN}UxqT'dJ17|];Cc3Eԝ wHKpɗ]FΏ.`% Y+H ȩXR1u)M:S ;\O}B OZ>4+|aS]2& F{0@rXq4vx5ok8Z8(9ff}?r8hudQWRNWn}t;;r$)1_[+YqDJ<&C9L6 99:NbΡlOrj-* b yѾúr>@ww@Sh*M3, ksY%ťGX)qt2aŽ0KoN~},{%I&N$4USGREp2i |t H&1O ]3.~9fqp7~l'd68$ԍ p],Z^)!PH%5{28x_4'[gZXӮD>N^i>JJxiJ9FvP-H($qI`NvN@iN(͘EaꂷX@D(El! 9atoTl wޭeNDV# #wQ" hF9_- Xŀ( BCwJ3ʨ#Ɓr 1bi^! 
gSè 2dkh݂> ;zEl3yk\gf{|l#ޗ/%ٍ#nԀ02%</[.vWWqFքp6Y eݝ%34{u֦Q};%G# ϯxvpx[Uh]فjqᚨT]WYE*n[\ kКT F,|b\d+, Ā]zzA~qXwor.4f߿%߾y3 q៾|yuFۜ"8?"_Mnr"/zkA}Wd Cd<}x})XE MW ^ֵu_(zB%H+ۍzff|B"jU7UP!}BG3TSI5E<s6h9;µrr;aQ^k)^XsZ1ă-IPLa}GϦ-zrJؾ}W aH̎/Q-X W) QJ3R$BJa=/Ž櫲rn;;L-E6EK4\zLA؂:Or3VwF|9*ྍXi01Bɍ(b3e s/XK`S3&`D< ʕA:#DnTwf`AA0YIx:*i4ߥEOa0VrGVkg )Ա\@obK$G!LR9H#4ݪNȦTD32Lb U9X &xm8,(^R1%) >%ر$x;RbT m@KB)TLD8>$J5ITe]'LŘ G{ AAꨲN`CL  K5D!Jθ1R໇"0 QoyjLB4CAP~q)&Y1'<9*.CqA%@U}".τ vL-s2SދɗsG@&} | 䔐 KfB탧(5\Ͽ/WS|wqw7-ٴΙ,GdX&Wටv귩M4s᭙ u18 "ڻղzԶH&-VHԂ|$}߶[dG#$# מ -1[(2@[@a(Rس{@Q|efeT4az@S.Zwp4@!?6Fuy:?B:HK>BU\:d=R+j)AKkWXY1f=?.{8yx,j݊[ӽ3s;r[qT9zKӽd QɻW5Tmv2(橁Lo#oٶ[ys~x ̏h.lG׬}o"uɯyW1B.8/9H7x km#GE˞;rx .g$N`lJ'9(Vfu7c T}U,YŪ[/x@5k%SZŎEX^bVJ[%2ʁ#Xഺ(8RtSk%ח6oodTvwiU*' 4 aBR9 $5c.Wb3N{%u}խ/*ѾDI G/`lH,b06G.V鮃M/nT#ߙeHoDJ!_y7%1G|S_m9Kb6%1׻c3%cꬽd98iTKbȞUB(C>Ja9~$f {^ fdRӹی7F8~pZ͓jAYx`Qk]sk9zx,Y/a<ޏUXCo Of~0]7=J0y6{9ݾm~>ׯl>qd|@8|f %Kd G` z¯[;νny~3:3`陑k:yu2e^'󙑊#B֨ gaɸ3G=B8+(Bc(?#?8G^fe_l+?H} Fx;0YLe\|>^|WWfICm)v{UG{*{ LX.'G /^AGn Nd9+=[PNAO`_Mb*!?ZO&#dfp535ǰ( fʌ ;Fv[<^Xtj C"XH 6uA@M,WY_=ҟ֚ ;w"k{2̳ Bg23cC1$`zk\k|@\' b15tb&3La <";pPlF5UJs )Bin@,Y `r; 0ioǓڳՅ :eA#*!E[Jqdd_ uc)RB<т3 'yNQsSf_1]acgQ9OGէL$>XKkZT&FY(E'E|J9 j VSŗQ68^C˛SrI PKR#7ۍk}, //zR?8s^@oe' ;z " eSd RdzT4]<=Z0sźH~]uB+yS9>]aRLZQimh]ZWw ׸*xGή=1bbV 1$P_ʚ!r+oU q.%s519Npؠ]0w*M>euFtSΰa!|Lfg<`-h‚Lڐ"`1nMup3խd̦D_r|we3 mDE\G+ 5\-6X-Ȱ wjXX$̜`ǣ F#byUw9_^j7bfbyU٨>*Avx$Voe?\R (y2\6Ip{CU28lp9&OT vteV W LZ2 fVOr 8 3 PQ7}T 1O4L;@C4N¬IMNu,{eP$n?ˠ]VSbZ)SG}*HsIL?w zd,;;b29?.?(n~\$׬Z+pu^`ݳ>V)wrEUߍRpEB%[27WN2.>ܼ$gҊ^SVe|ka~ml߱\VmѨƳ[ϩ[VW>~v Wfhc{"(Qy״N-cbڧ)W>B\ZBIЃPi/VrBZgNP )գ-"o3/vyzń$ P@/ed{?$_H+lOV+hNY{t`h !}}FNw7%2/ }F!2`к4вCX{YwADG5H1R}pEz3F$=\8*ɽ [QвDN O#w;ƃ2;(˦l<'04bO3w-Y홛2F'ˆ־U*"XCw{H=J+mT n783OB̕f Ibh_\*rA\Syj=ahعM?I\w1R/ U_E|?]499ξeA9; (bDpQל45/riu|qJꏐZrJylO8XC˒dV:TfNIDLѽJ^%=8piq4E(L#[*2d1E]Ļ>Qtsx]쇵ҟwKa6LmLmؚrQJbaJRS)A.νzcN/0~0냟×a6-/Vp5hFCz/H>R<HK~!|~HTyLuRЯUcNUK8ݪEKڦTz8c=wxrc!' 
Պ+BBo3k$OU ?_[/Ml@]Ph@`Ν ߁"^O8' a9gų2.kFq RU3%* [K-Fkõ2z@~r88Wzę~<.c^gLc7Ŏdxr7|~Ly]xظ0@?n6?H`d8O?~Y?_\>3g_*l%3L8T[L0|t1vcNdhup>ߵL # /REcj9R njLLwi2(.NjRÕr10m8Hp zQ;|0w^QD_gXw_zq]cMEL9[ɴi-f z3IrDps(HeaFnG v-s0,S9u~߫Mc2RS!20AH!-r I92̳ T2ƜkYŋ[CF(xI)"Do 1HRnby$ImjGWGID;_d<:鵒 |}<' 2Nx@vKiWc!2EēM;ǓBxR WA)3dk`$=%yk;Jfdb*LK\^ލ"!󜙭o[~~u$;s qq~ PW -N/%cpu򋱀FOZrD XaBr]ƴjEs}x'oNpD |ȕzלp~TrTWW嚴i!s5@7CZ" %Y0, [ u*A5yafaY{F{w<7g,}ߜY; 80$ccLJ2&u*bDXNb8nW(ůc-mt?0YQىn#d>b=󉯞٬}5[ =ܤ f )=b+AM%NU ܦJSMy6Q5CUP=z~K-3 c"01!)K$FHT[h'm̈qGىJ鏲-njTuT6f^;z|\䂬ۧյ[SŀӴ qB++j%+qNcv:N[`![`DX9pJ&ķS >q$12$-+DRu6hB hV8rc0418jXKE4' r1Zaa)ǜ)jF9b F0DK줯-R>c@*~{ kSx)`P$9~SI`4XPXSpo[B.agW5M1#Ң !d*A|;`ʕG!P)6yU9g0RHw%*S^~,lybpkAQcMʠCDGׂJRҤ1xĘ#B:Bk8. =0͆x3qG?!7Ow"c}E?d:[r|8n'B6Lg FCwK>G#xLp.7bW|W'b!?xyhO0\ ruP3Ev2oonJ{R(*,:dsC&?{`.}P!Xⳁyk_rt;f\:z kfuc[Ho_@DF 8`iiGxgˉ=Ol'=MTxa(TTArp`GFT:Ř44PyrVYfoʩ,a嶭wrD8cMvfoƉC% hNAy;@ޝ5-n%RRcηZ×BҢ:*?gjxA%=&xWXZH*yvzgFEKOF! g!cnh=uT rimǟM`΍bhA;j67nU6hkuڈ4ڐ.V#ޭmĔ_Jz^/[Zeu)<+یoł-ڀ3?FoVdwsWs~|rv\"x5{˹Sh{V6ZARR[+n/O4έ'fuM˨gį1iЊ9Q&@TAPZy~qMkBBZ9d k*M҇ٯ3l^{lдQ`&m{9Y( (T, W+[?s'o[C'`/-_7yl1EqYYE:㿖[=&9`ZjE!_"'}R+84&^TiygQ+BV>P#&NR~j+"{(KHW|)t2~F?ҬdSǗLƐO^Ͼ S |)N{߿]]+?cA_Э<iOt֚D ܉4e{AׂXzʮ>t],]HOr\\\HB.\DȔOn [Y BDEsukڭ p-#S<'ښ$TΟw+Aa,9G!dSus[[r"ZJh;-&I(ڭ,!Sv"t=OsFukw[r"zL@3u|xY8OևY% oEK(kd\ixJ/BjQqar58J%ћ?}bcVp f] NeIs;X}-9%J%ﲞdLbLz.U'$m4w/C{hXj\ sr@U!19G1L6+/hLi'Th(W8@\0ඈ41JT[ΉV#~- '8E ĩ P-!]f:]s9S"ϑh"*ֲD@ugYiAJ 80Si$7Hh"RcVrA/'-11ؾQO ,UjajNC, V H"qp+6 bN, F,M4SprS߲ Uz4l &'sݥ) /(CIKCYF4FvSZ;)ӌ$TS {J O/)X(,kcD+WE5S+tثl2EׁY)+E[|8yzo*2;YoRVH>trq [kW}1E^][ 큒RNzn0G/+Q}CCoCȽ0 {|3`1UYFL55ƶ)A ŐS&Ԥ5PSTiKTϤ"* aXXp  qHUB[.nɨU`oӞ;OYsOYnAT5n*h $ji X"_߱VsEb}!QyKF=Z5xn1>d(Rc;wm;iK ~OG¦#68I-xQ$\qΙq!l,O(BaRs# 3&[/ hfP0ִO/}5][xVzu/FWdF#mlqYcHO IB\ &M)# Thx m2 l-/ ]{,FOG|1|= T{rD\J5͆x~<:C |oN&R8χӮlzi^;z{ҿPM:Lg-ߙGoXc5/G e|4v';(c}њ&03 dQv7R"^I m>EaG8)F dA/LU*Fhw߇W}<8˧&D1wݲ8P3Ǡ4V,Cb qo@^0p8X9ڎ<A V~>J(z=ΰDj'^Ʋ|uWOѐ8+PLbا\bFĴ=r֌1!w| 1úrP3ǁCew~0[#m80H@շpMW.#sBbGGJar+n^MApP_j-;i@@H7k4 /"CIg9oB  . 
WFX9#62veQɘb9R $Y}Û9=]!n歖Mr8]^f&@&;!i5*漰kBV󡥠^t֏q-ԧp]v.0!སl^\&yUt'8ˋ\ZD#)."75JobN>&{R+q4gtiwKM1tB}Cک%;pe|S۱aXYqVLaԔaWg8x&1aЃg em3SM^*h+㘶]E_jBb/U/= 1: A}/_[ե>&|fćcn3fًl,tK/S[Xz阗hhycwMTC_ κu蹜P}p!tX) ZY%SMbEosG $NhHX*v (já]z=< <]w[fJb\ `,?G0\*XdG4 _d6}㗯9&M=^]†,DYO C3qj2NTƩʸL~rr6( ADb>c<+Qpp:deF?|}{6z{ /'rr1QP˷>n:bHbGa>3>Ҥ_>.}[H.ÜOz2]iT׸tC8AS8Aee"!{xZ@Gm * M+h'8(΍֒(. wi줭: =  ѠtLg,4 ёl@RA}pD`lTrWIG"*L]y4aç-H(KYX)d\s,F s'Jb48Z'Ao\F/`uKz3崝!-Sa1yx-8$e8*$x/ DP8w-e/馚%i-CVa58"\`@cda,(3@E WA /w P:FT6XIFKo.q:y6q_dz[oE iqGnhj$9as2' ooG\^k?3@ѯ=v:ć+ ~W3Wf:rr9{~LDm 3)ӧGDSe!O>X5(,tI!Q{=31|q(=)5N&yZr^cN#wxSnoU:zLPL R"r:ųñ N/?,ˇT1܋R'$'ϙ" @"X9, `WA! BrI R`V `9.8=_y&>-xo0^ZS zeڠ4f&mL 5{ K ߜ=,)}:Ξ<"2iXRqh%JakYf^P l2r=KHj$z0k C_e8EM]4X/uQe[g: iA%rXJ25zv̉HqcS6J)Qe/U9rNc]ԡ_,#U6H ?Ңc J8 *)b}sD # TחzDAH VJDJ F lɿT'u|&'ǠMi9BeK* h+Mrl,bV QEۈX:׻X 9J~Im G rFXQE(L0)\)f73.AN逐1\=[5((NqZ3)f]ߑEGc'e#&=@Tur8[k|_޾q/Őx9)U=ͨZh,P+K-xk%B/AyK0n~TG᧻Y]Lsp ifߜ~LVtrV+|;Z|KwsuOa/w_^F#sԷB]#+zuuYx1`ȅzsN{d}XtwۙA&7~Rnޅ/.|0Oڬk}.VW?s\M&MJ|`zh;[ˊGw 6qqD,d\a[s=Ӟ&5R![[}@ֻcWgOp.PljSQpj~^TmEA3=!A❃Ȭ X@3(>(:? 
-& ꛡ%҆S/M%?/m WyTU&*SuQ{Ӏ2)RÐ T_ZRNbnӁQT<[,(툁6T|~r-nf%ݕgə5CJocAyz4 5IL-xlzYsM.I0WҗکT@vqB`j%I!_mpd0- u]Zre1HIےZ"ݭV$,>Ed=U @uES@=CXE%UdL):-^,ݍ_mv%NU9^1'EPuHT0SIܔwBe"tR_\ϫ8}꠵6<98Q yI?MlNx`o-~MLBSPҌHo-YG*jH6 fH^]A~uOq#I@?|_XeU[!1}A't_#Ս 5bN3q5KFW {PETETETES͊a@Qm,G{bi&LS!5ZiVabR9"&N%W +mbXiG3 8phP[@" ?4<<ܸ+kC"vpFaLvI˽Ήopʱ8A *y[ lz KA9븴ξ^.=L,3V/.߽9ί$!8_]y'YHƞ/Ja>PX4.%"y `TH6U@LZk#}5(Mu]j<5?IJk)p5XJZS/RN)$8 F\8p,5^E:5l$ µF sdhUN0xa;K.xc'1DIiD(%{がry#bo%햡Ja~R/K<9"NZK]r~1Ln{W^/3ZXOeX7M^&fa=;v$)7 )/hy=6i`tB!q9Al1A3EeM-mLrx067 WNSN`֚`urٴ#ur٤ے>4Gb, q~8MGrnӣ+p=ѨQ(sscbTwbmIwV 3TȃCaYge&/W GHe4( 򳏺W X9{6^QK;!HM1KD[)M2_p?wfR)&~q|bw;w{0gSN;^"gts-ƞIg9 kk5hj1730֠\ʆI׹V"vj.GN`` vzF˯˻űtgHS#sůOjdtK?w$AfblWs=Z0 Ԅ* ~h ߬$V0m^k& -&3o, Zr bqW-&x7VHhk5lT3x%I4K,!6@LbPӅ1)YS\aՈ?^1 LwxW+V9r0ȧ'"BM}N/;E'Mӝ?Al%+0[o֖\]ZmwiVݞ=v1-K=KgYki^ ~JX==-F_X01Vi\~1z!CŘ&"C4 ײojc+M5 8^HIb0GA@Re0$ R2qՠ(KDt"ɓ]r3yֽ,>nsÊ {=y`G"#g,Թ=39=nXz%IwLŶ}OOT!AL48xBʇ_לXJ:)PPGL=੯c'XrA0 Ny A$ίe*gyxM5 _ eX6db]jIEeHh's$Dɢj>lJiK1HZFN)#[Aj>kb>F %cYqضuF)#ZR RNi9툾 ČYY=fuc!Z20۟Bhu] WuuMx]g A <;iÖXРfxюoۧ3ͧh#(g)jƺ&ϯ>|7#vbZZAEi?n:dKAФ̺n-. 
rTz \ m<)c'Ьs6N/e1jHKf_}}B}>C1!* ?ⳏPA'ߜB1DW7 wW["2c̀² Wda[xϱ`* F+AHܺ)Pc0wNxo8޺0$%q*j_ C 3s&(r* B1(O+}SRx<;:8uOt0bQ`Zx)hLǷ3@.%o$rJ0!N[eIn ^gIpj}eI(,Jڔ;r޳ej OK[=dISwdIKW!2{Q}rVpz( Ǻ-á@ 4 7$\FST#"ydDJ[4[cͯv/|h ~8"_\4/owW_* yC>6)pp͍6Ex@bµNJņԠkdq [ه Y2?׵(mlͬempl5dq7"WG.;Car6Krd_);Ӛ% $I2f='yVYZ$JKS?E{̙0׵YM@J ;.-!E'PC wRaXI\k{JmԦ=Ĩ!L:H p nz`\Q GIhi4Blr!,qSCGnma'~K.e_;Z;'=p/9$`tCARuȳ0ӛ7(yaWqumL?q!*[9+BadϥT۠!(@F9uS$jqsu}ӳOwYhjG@^vUO* tB{vƫ7ͷm@LrI0è0rXv6̂V+!B˾;ܝ/a>7D#H"2!éE<Ty*d{qN¸} 5UA@IZCA,=J!qQSR[RX{.ۋ$~cXkW9|u3w:W_zG BqLȇ>x`|fƏaϷDDhL`짿 |&]4uZ!vk=_emXRe _onQ]⣞=ϘHuDX#~H~#;^* fykiUޤD&S"d2:ɴ`%SԨ׍4F~bFӴbV)Z`:hS=4 L#f"*晨 AQaw:Ş8-6b.R^tDe}>J;$~dүג4 *G>|]Hm|Ha$ZP٫ hz}Ǧ輄,@]D2)>ddzzܯg !KH#i*t" |0(Kl?h9ꝱ=|Ppup:?(cRb Q8-8X(D.Xv]Ts-H 8g)XIv[UN*I ҈HdFŘ/{\HOa$H|[ǞKEն[vKnJ eY,UOYkx[IDKq21ZD_D(艻%D/#Ă~ =fͨ& 0Z8v4K)Oxv'<ۖ*w' W>DT+`^܁ucЊPU-Jf[,K ,W sߡV$ɕ~a2i1LeqrԮD -pFTJ|*Yzڞ2{r T˥t%sPTRz 3I(dt/ ˆ)j$B/qE`UyW8JKqN8W[s]Z)Spf\^Y[ݦ_$iVB#[`đ4MkQaWIH?M[@4F/BT%^PՆ,>7i Іə F@DPN:yojR*u|֖*-ެ,"&tgT*QPęCKb9`CmV le ,#B:n2AYۚ ln-t>q@>C=/(Jė$' $: 0ŅS\h>ŅS{olW Hz`s $% Y pk/f5J^WA+DB=ʼ}чȞ4,X~#6.-(v˰ ]w6BSz8c3;vRSZNF+53$ʧ) 8(sF1K58DRyKeQ24xuxQ 0 9# <og, /MŸWW3IJD?T8TdbՎ[>V7+={zKA|KXoGN.@f@2-0RAhX7ձ́2n~*x{$i1|IlSk%iDX L0lmbx{T}nƎ#}gou훿l:lȏʓV܊֖HGDU6 &kJ3{|MI(D~;@RxqֆY'0FA(&&:2 J܊H޴!)[yTԸQG[qQYԫ 9#ssWR+*K|C-YZ<)@sex lEmȼ* ʁ[ɾF%`- ,C:* (lMP"CC 0lWRBŕP17ԅ678$&Y@e[eWB9X `9:8`.Z;;NGԇrĦISᶤf^~(= :JLZBIJLKR?|T=]A8/+,-u啬/S2ѣ:tT7 5TB$=uA7fo˪礽-=5wGHS8=lzB܇t^+ F=s)G,`ȹ'>*λ+eD0VZVAy 8}eB&֠*CwDUE Ъ&'⥩3Zk_WH dA3.C7U7!լ\ 9@n֠)*iBYU٠jMu>70BbTZZq^J` fIpu4sQ=G'"ր7XR"LT3."p.5*nj{k5[S&oXZf#f.(kOOm+~ͲT =ЀJjO#A1;U*04JM}QVfG_Ƥl 8徯~@Zf m ,3 A.١O'!IMȝ:dAU2bjXT挠ʞ9P)A)yj<5f\35V]z&he7kpy@`JS)8N|0yS%n pMݛyM׸TTfO_wZPތM>p 8ȘkFc#.TO"xW{1qt:Ic`a$A)`͹`XhjT=~ 9Eu?ꋯ._~=LZ)iҊ@WS -(>yǪͷqQ1ڕA󫚃]Z.kBwWxl-Vrwp_ۋSL|Jdޕ߾O̎_жƳr:l7IWv~A._MSZ7{u[4?,|\IJ028Y"(Fp>('4,5jkSѮI cɐ3F&.~0M̭v ]Mm%Ckl*?qOO'Kzl({@I6L[aޫkkSh+O 7H7MI6 A0˩w>|?}xKuѲs8Zs L߹{zvvQsF%,j=?}O?DyFJ ^8?КlLz 2ib:pj([;_e%sZ1!VZj51]mZqm)ur:V$#W0\.ҜAU$__,N _/Y+BhyF?$it墯؜fh^HT[,XRDQR}g̸/J4̩*ajX WBt 
w>]V ؆ș }iZs:ժyY׾[}ۚt.)]ͣ@tZsoe{ˠ7uIER[u7<ȯO떞m}:6 kA 2Hg 1b53F<)c%܇g c"*K45"IϝZj5$E` ph3pڴ×JצSsTSj@Q<ўZU]Su2(j ܂WQ.-H͟X:5 J焖@q)"ht J wM+QSIlMyhv^c+Fu5ueRт bԌI+&Yb%h6)}B!XY5BZe$ T:;V{i&0ߔ'S1"Ix`JW4Ժ8 BKrΦ**;3DF3Ťw8WQ=գQ JJ )dj(`QT{2r!TX)xTԗP:XRţ@nW49#1bPo1aDVBKU*9eS81-~)3[i+Kˑ=QE*i+ j%FVY-5A+Dwt NWWTWYA1 BסR*QMu{)WY hUKS% cB#5Y{dN@Krѡt㽪=GU#Zipڲr[j<.Ϫ.;GK--J@jbe֮Cn3.z ._m&p/`wnWg<ur׿oLm>}ǾчظGӿ읽٬Leڰ=-gmr^!|SH˫N @B*ÐF.W"8&_Scyǂ!WBǃԾ{>5m;zOp*SNBLC4*L_$Ͼ /OR!5-EpyJ+K^<@[rh{3|a1zB1Ф((N?28Jpt9BپsK(uM(n $TF - t ԌoZ.Me>Q&i9vug 8fd2Fޖ;ѿ՗Ag7Lpwcl7Hh.)̌+{׌C@M)c\~B<s3S@c^c+EHX֦^ Bƭzpb'-Aŭf8b$żt|nZr+QGg6'ebhfUV^xəꈳiYj!t] 1%rB8 \O)a\.MU6f1<Ñ@ZF)DZ*EITl&c#ЬUP s›] o^ɪi̢ o? zA)'Kaz]7{8Ş]'_aНY 1ecg[%2%Ytb[_BHşޤ9QS%Pv[SBBO O&zٝw9'R"kʟ>m=8A#-;iO ]v5($-X/e쪾3BU p'Swo& ` =So<(*f y&@x1Ĩ`GcnUjGYQ;ǙAOX} b53Ct7,*<ﳆy9ɳJtd[`2#+W6M|:W2_lM7V[ySalH2T%W~!j_yQs4r]&1|]yAWuF7-c@6J'e,;17ߐ6sXtc6 c[H* @ZL@8&FH#FwSX{;<SQ @ l:w3 }AM6j}+o=吀sEq3U LjbHy'jC Z$1R eh3!{6## s$=D_6 zE$w6}Lv_"ybL!cUdAm)/|>|PyUˋ !#Ch$Sa>BNZ M)$l1L0HoTp<`uZ j D.Hq2r(:JffI|VP8v5 GA]~o9qㅡhOj:~jY6mP50/WTAT)ç(|Ca'm-nX/6nGSlr7ŬPQݾaΩt{pA1Ωmmds~9PuemilM9 snG:{o5Re _Iw]) N?DX:ԯ"({$5(V,"-5HJ5 A!c]( A֐ t Az i{MGg쯽Wz>|UZLJӁ*Y}A%@fz:PYpԻR(`93* PmNȠX"TըUZT @ ~_-5TSDi!EuL`.khe{.D/nڤ(8C,{Ԣ0n`vu|Su|$,{Glº`F&$䍋hLOl ;UNU+s|j=aWchr%PPh8"g=՘!"FWA'D21 CJ^' qB+A2vzqC {DtV֭Ou%4*CP"zF~[4׀`-7-A~ m V. 
* e` \ߎg{?q~U`p@l:vr:/ OA ~(}Sg-PD$feh!B3G( 2#(4RT~ c} ^ ԫQ.Gjaiyfk~k7YML7G->'*O}|&ؽ^8rXY>!u,ŘpH¾UhV??o\^8Qh|G5:.by닰¢\,ゥxI9]e#X؈p:KC"?El(8\ {6 ;pV>o9 ѰZ=琱cb e{!J;) 7 h}7 RJ"lWl}Dlm=`W?\ލm{7޻]w5]M1n-HZ[ٯs:Q̬GW$&79j lTgSж2hB~*C՝r2v*)uY^[/nm2;uxEC,$ "5CPDQ]ؾ@[m5F C҈ `Bu\s9/i3 DoAfdr0 Φm OfvXmͯ1 DFrHq2r(bJ0K2T,rt &u^K@~RddES*#yЯ(YKR3h%UF Ic"gJYTxqMBx2){!#OHoTa:QHv"or-)fp0 ^^7P͓S1z:͓+Ӏp5~YY>Ļ'W?suTbcE/wg#7y͗xN'pߙa<,>XF[`Lo=oyW^ZA_p, /LMu#ECDEJ2%ղuf|6_-elZ~u|kt~\ZNai׋ j$9BP܇lfJ1κ2,4G{^K `{wz#1ΥP8C)q̶/Bh-Oͬa,/|)o PO^ّd//Vxэ_gɏ|\^\$XLC>+ R`D ̋E7B6UdDVAv$xATd7ѹS7$GӍaWP~Ok6[ټ"AFVAytl3/H2\ҜL5|_-ιDX[mr>yϋxks<ΑLh90 8Ǎ+|k ax۽&_.fp͜q{~EI3j"&.X,'`&r As d1xΌ?{x3LPlzwPMEAϧcqky*KFVLF ^uI!VC}-HR˥Tbt-S8gHQ:g ,J035%fŁwl2vO,J3}I$QL:!pwjҬX[vڍ]1/}Ϗ\&"'Zi"T/zSڟW)bTx2^5hjSW1hKh+{Fxײ/H!$o9 AP=޶j27-L[ƴЅr1pXµ ^=ӎ*c? ˊ lUوLEZ(/הc$&yQ`sj*5em[SZg^U\RӒaLYpK(WhYSJy)HsX-S9BEnrt b@@ R.{:eCXFP)-']TSO@% ك):!ACϑ&ΆDAvJzAܖ b&Rt(Ϟ\hZ+@췑3F@K/06Fox>LnjP;S=Fܛ%fyV-Ŭ̌}YC >x'Qkȕj5}w&Yuoz8i\M9*Ԃ]it1*ƱS%يa"\a.#cXs4ڻӌ@ tL]ɇ W|Xi&2".˚ު@rY+%VXZy#b6J gP$Hd%(q/#i H$u?t,N.gQ{ y2mqZ{&uOf^2;vffWd8Zw; ySputpE*"oZ9cl@&g|- %L.u{aM NK bi2LJ)Bi\ctbS>T< :3tAP ufj43#[c1p)Jj JJ 3V6nm4}}Ė]z\Q|18SqtiڠmIg9m/YB}@o'SAd̜H|8gHcJhs>+{\!yo-%ƭCAn[W_[ p~YE4礻 ӝdd mQ­#ad睳ǘh^ɤyd\!kqƅ G{s.kF*N}ꔑ;'#=;v]ffd#]_Fk[V: ?gcr۝VqɐѽCUD̟ҥ5p8~!#^\`7\mk!M`k2 \V * *Kn;tI$60Mti`@qé4ٝe,-qe\'ByyQRVǂ I%ǥ}kf3f&O:3.TlzS :_z>YDZVB35O5H|áwT3"0~0Bzi*<`D7T}Nn6aL=080s`,><+Fj/rf_[n$ XR&& ^=Cr6>xZk"_(PŲ.k(0^.hq1w 72>K#/AȗZ {Xf6}ܻ"B!vqV&$' _ > ~8$j$S3.I"CJdȥ沇IK1q/ $:ya}Z/A.Br,.h&]\](4š;dLŜ/u.Ppä́d0 .+/C["93](Q hQe*VӒG7׊wQggTB~OEwQ" =ez 8X8j tѪ0F1HQޢN?ψv.'ݠW еjˠQ{o+dAռfeӼH{rO.'!Bj Oڋ*zMԧ\_n:9YLl4u3XO:΅Zx)Vms]Nq[jNm鎯/]Ev {7qoJ" :83d3XfTgURܒX$,c1ymdͭ9S/Kœ*Ćפ`I )ҙˇ+3¡aM&_Bjp3Jsä/Jò`\\!g2$kSЯZp*Bh/'% L9*(sVCPIZkQV`D RT14:\F%tEv"`s{o'&(?ufƺB['{Q%?XгJ OS7xx?>nAÇ[LZ~rf<2.q"s_` OмйŃ/ʇ] k_]Fs 2w7edXV{K% Y(!yQ)B'A #;i<_gn`Pa^zB;E]$F;"Q{Z)G 5AKB刳R l?Dfժy $gpǟ};W;'fF ^妳?$H-:h/Ϩ}F[F-1D'%+سg'qG}~1ԼSq6n4''qVS,({%ƪ35 1Rޢhz-)\[!jeVZ{Fw]>wXZJz8[\rqTRQ&J=[nw,-Ffz8tf#8YnQMdjkj< ) 
Nm;kbαZEdh}gb"qǝ[dkvWKc$e(.)-%ՌTYSI+4Vzy%)mA5ך^&K2[f[WɳK '9E"KV#᦬*+8wESH+qťsZq-.*))L; M-c4*VBa>=;Õe(侊Z== Ç ߸#C  y7bABb 9(-Dg h:D㇧¾Da8wsalĐKIbg<=\ 67?>M&0Vؾ]R 4TJPt-|=ܽCrA|Ӕ W]@[zlգ`,svv3+ 5CvlQR-0KeA1*jr2adzu0ëU˒0 Sa1;R%\ I ,`hU98!JC *=+e% 46\32/Y{cm&4ur:v!sȩL!^y]s~=HAp*hDb41[pVVز*[bO 4f5R)لcrhMXBPE4.C>`W(, le#?`R81C 8ME)+kK0M9NKKuzLւSYBC+E*_RF+o4bq@#j0hX].Kޑȏ"qVs)|%]k]{)E2p'FyihȄ/MAn!-QdRT"- BZ<"(7 Ȥ"w(8gpԝ簮֑D!dYDŽ٣{NqA91cԳJ iQ`#EB'J%.|UZ|zAw*XEPD=Z$#WhY e>cRBf3<fO ×_&gjyٛr<fϢt'9Cb>؟؟!'Gb).!fف f;&NGEjYS=IŖS p{o0ZJMaY :̗.!lǯ Z)/NNX4{nřQDuY1 Je H#H}6??wJUi;اn.X*k <$xN&5Q]8/4E(c΄dlۗuof咇wSmt2ݢHri"7XjiO!fF`RԕЪ0՜1-0O[-SSpW4\o:zP sC:uo}# knuEJ!QCנ/j t6Ʋ^46 kУwH_K6J2nKnrHp XYHr [zlC-&fWbUXg¢r 1I\H3B$#h {_;w5Ru,LBjrMBnwxϤtcs-MDhvwhXl5Jbn]񿴼 Bv*=-H "i;fG Q,n?̧| 8Z颔ܹ1ט_َr܎zј0''̣x:ysoJ7d琾j (% 9ltHG.MTR_F Jh-ӆ qv=O \HrtC$*ǣJ۲|Z'8r#˒)+F!rIFjlJ*)E+bTBp=2oW^3%  p! 4rc%.ɟ]yXte~0K=B9j-~@SZ.t" ({e__"8}t60Mnd@{ `GE; ܊L"JIrɼDOagzg@w#p- /_6#/"umhG*=%,a}wEWvױ4zkojBzt%!m&!}15g=c YY35B Y(̞$&F_q{8Kyc3: q4:v zxv]o Qq쮗ݺ#^Ȃ{켝R/W7od=F9K6Zōz=َx{+d=<<)u-]Sjt MF+5qT'G7G9|w#pTS"DOuquۭV[[Mɻc!M 'c mdEA`im*_N\^0)im4HL)rݸ{I#*޼m"u!PF'ЄjCͩH+#\ڨL % Z-pyQATZL'G9Eظ_u՜aUn}ʻ:)iu@{=ej^r% ZEÞATl)S^Z=߻ @@ctvg`%l`ͫ`SqwDe^_LNKG)pfUΩIEO]󥒦 F:A[9ᄦ7I^x@ bh l3,i ZKA{y6RRvZ*f@5u]E~ZhI%c9Ih:C%>(6B)ܘ)FX=XiAv^=z0^ɽio R J63փ }'RՉe1CV n1uJc7N})4 NQ!:BW H&*n>/K7+QTģ7CScr0f~Y#9Lj-M~!@*4 '}Td?vKr" Mgu" RWO4_oCBpm"SPZzĉéV쬩ӵ.J.Ma&1K3ӃK·4kI|K62[HfGuQ<嘒BblOI5EQW˴!jXm(cԱ!zw<چh[iC!!_6)X>be~6t}yj;l0d8W墰L \H@kKh慦M fýN?ܔ# "OiR^B_RVWK+e^21`v{˯zpTLk{.~X }Ub5ogy^|O^*Kq\_9e G‚czX4+@B2AMA<14Zx 1R6X +N+|`q3t:e !5X^-z?.1BT$Z6 0wU;R//,El@"%r&eFaFIH)os.XgC!W԰!ـmHJm#RJKcH=#*8z *X)2EEL Jq^" _9oY&b2O{Cgxtrm\YlrAєsu"QEqtNr5 2sr˖k~d';WXX.fҼǟ>q&Y5)ՌVY ":Дn Y(R}hI ~g1qc(lu9hs^8 p2\DtA IqPNrqnq<@9A=H q*͜[ Vz6ҽQM 9um䶔iiimR*¡ժIB-SejwPG~4B3oP> 8Uojf>Xt˂D"*ŷ-` VP RzRlTw'*;TS(H*&n&9hR 7290b' [ea b>1P.ɨu5ev>9a $Zɨ+(RIFD_Uw,u9+oBzG-qmy=fX}?, DF=,]/ˋ 4|qåVJ\$6,.)O!~77S/|%pPkad?i*/Oow%sj~ y띐g3e0EBx Md3Ohy #Z tw&N-?_fHmQrݥ DvD5$S«>b X,AglBS5iHɇ`;I5e4e݇gjъ[: 
Z9YKG,Ol)H"bIȬ.4-2 9ʨaSa^ӔL8.& HsL]ySUYܧhճҳh$ ɸ{5\۱>gvS[tɄkO>՗Kty} BE?若ߑl"} ?Z~bTB|'uU]M@6#;|ĚgDF|L1xZ*1.EP`--dYLHqn!t 9) g"@߭ \0}Hunj-n3Y@6nn~ 3 bphN6GjG)?fx;8C8a;jZn#-qs!gJ{_ewvUFw{0ntۆd2)UK(nT%]%|/ AɘHsȓTBܷUOB.T1A&4Ä -BQj51vuA%%r 4P1.Z~јG4JBDKl f_/C((^:R͝)"H#h)k#% 8G j;1Ti :! %V= .J1!Xņ 4 0wQnQ/+f|%\W"w +nS O;ނc͢WEv+p:[?Ke( T1Z~t([ iGǯs}ne]Wa2wO@4Hxj5T?bx)0ծ苚&n͓[-cc"^(yG!$Y^bѭX^6լۏn3\i 8<;uh$4xV,lg.؃Vhyꗙ}l#D yff:P#WiKD˓=ϻRo5ڄ [ŠsZPdP >[qpF2W(y.lP:8%m] \_ \^vZ,@87Tp44l 1Ԅ<3F–:3w~J1x%h,; ^@yG!6D s!ZvrBJ%o]q1 :n!@xY/O˶m&ە7~XԎ$&?p&1`[To%1` ە$VI ϋ^y5Pe+5ZX"I+)#/Xs 9-Ԡ &5QܖHe0ՍSzXI,acrKϣK6Gf"D "kxE"琠5F I *Dg uf *"@W$GlW2A9B$(sF]YED0QXK5bv!$ )h+!Q $!M%H.‹ ^3mX7rWf8~6/R]gԹ, 5cTZ,n^ hUB0{^h뤕j\\%jj-1`bW  (H%SD&*;I Jqum.U]i:X(E+WB)9vBARrJˤU Lx-jpl45$8\>UbhHxdc8ǀp+~)_As@!jL2|~\zEuQm (^vKQ%)CQ& '【IJpCI&y\Á B[TX00Ӕ@H!4n15H5B RQX*Rs eMrsey>!IɝkFAm!۷A|+ɼ7T#kԏ2k,:抝3Moh":ȭ|&٬y44yI@H(u.dIyZ."h (e.^Z\/r#S"6hl5aygLV6,݆:5b,*[ ?aKvN\lLy)3+d>P5zwt:vƱDLbR5d|VI-it;2YLSaYD`CQȊ0-k~4gϺjf2ֿN?G?ʜ 3, ["ʏfOssx`o:[I_*8Q*T&ЀH+ ;ԹZ.ы ΐ#%iDTbt!+`l ݢjNS;H9JEey1_ ~?XK 7n '^:k B [[ # Pk{K A]R6G@ /VRE&%7#e?[OyR m+"`$ 2Hci,4!dTHcM)`8b1'dەZ "2t>N-.1U$ݎNӉv?.01oA'13m ƙ5Lr*cOeϞ:|nfAw,%w zpOfڟe qW"Y/quENrkk<ɒ6~4~L૞Ԟ*&Q9[_=UNQZo[~2wAVթ"㬐99uP_Vr-S\@qpݹz!Kq do hJoKXS|+.*Si`W`1Ddq?e'~& !-#|4ch@<R_ū$ k Ux^uNA\ۣyKQaťD6 ^q3J>W2"X^OE +,W 5VA: ؞@Ub'˲lk)]#$`R,&BM0S1 "XHFR3CSsms^O{uږ FUT1bs]J9ڢb\u!jy/VN] ]@}ExN/W=њG#Rz`/Ul8r% ^l * T.o>t%d49_tuP!UT-6j)ؓ|C#_6E_貑ބ1rV ڶ4+C6_VIX11lS5ϸڗWVX]s1m__b o;*8!!Żx6k CLېJ~hGkq%!%8pIZT{I$-++K `6d}$٤d !$լ[q. L>m4FC&~XH%I'i[jB(՛a<^ɟ;߼ηu駎y6;: K]l(Om9ټ=sټ=gqk@dT2eHi))A8v0 8j4L;ɵ"yJ/ U_YvA9 Ȟn!~Ru s8Wl1y$oqŋlwJgW#any(|ͻ_ȟq►7o]ZC*?KCF6o֘sKύ.=7K)Ll4`!!gS!n?ML8!) &#)+m98 @ooIZ\30ʵg>錧n]]ر6PH#F$a"RsT(fMcx?߯^}T ww+4F=e,042XɢɘHsȓTBR-J0  A)4 KH O'XiSߪ/[奕X3uoJ=3}t$N$ў?T_L6($O*u4r ?~~>x2V.ٷ۪r;鉝BR|Q*iR1P!z&@&I,AJHJ RJEBQFp7%h !ʠ;Cm=حK qfBG^6U0,I2duu'/. 
I|zv}7wl7<տXf9UGgNO, CJ!PVQD?z3l\uy|۷C .YmfOݧ,.QA8}~019uaXq* KRbʳ dLXG4"H0NDpN"$LBRS ]Yo$7+B`J͛`{ƮwbC٭q鰎{7X#+Jb3R- #2 F{0c;>]s/U I]ʋU5w_[l-G WXƺo}*տn]B3ޤN`dT:E"l7%7 $2eS )md^1I|ſf;M >_1@{>[.Lcnrv'nf͐ _c>_3ǫ^JTʱؿQ›R*IXnR.G7m#Zpzh mD_k'7"N`Ns;0C``5j H6RMuԆ^zayqD&&:H&6P,@<$W!`+85#4moϛϻoKt 98zIk眂C5\Inɤ5vș@3o` jmlzB<`JGwFs UI.4Yq2@9g:\ǤQ#ɾ  s'† `u?/S! gÙi 3rך:M)m$I%z4ɡG}uX_*.b$iQ3J.4 I]DRqS")J[RF _/#۲|bv:/O~W˚Ǝghrvc7JZWv {l_/ 5X!5US<x0F0Ӂ1rpK'0X1"'uFO{o5nLH'rYR %HɁgZ+`z5U#]yER1jJ5~x 05x B"y9y|10K5[dqlÏ oKhN6F? Zr|5V(*{-WȒq2ȝ"t)8{9^N\wv>_;8EXoƔ!\yv:IP=YnmONnU~t ۳?O7GDDWbΣ_ҵ4 ^Ļ;<}ږ8Y*ЌV-[l=6:?D' 01n<̭~tn?bb<<szv?2<-3{6131o%ףu=Z$8jp>[;(0jCFGOܒ]35=X=G`(5VW*Jrk'R3[ET$JAa4PjItI(¢Bk^'g\j b梣OKt%NmIN )4KVijբ)Arfu7tAC7jnzz' 4rjK䢴D8A `sĝ$+9$CUa={d7`m۲:i {lVX3iH @^rI@Osf"mdq\$eG1$?Ur]ȦINv=U. &G !_1 K8T&:pGKr *OfGZrA8Z+o^d/^ϏO#/ ^|΃ה>50^$RFy1 %)S"ƨ{ Ԯ.Qf˲ 3窱!2p7uԏe%~ 14;LC&h9C\)hv Kro`R$!lu7s~E}ĵܼr}z|'UJ vݟ_7˅ JVxԊyU|d%ޯ,sUT~?ۊxtOӢf pj~"F/NOΎ+:X f*mG/5.k4Rl`㞌ŀ1%nCPT6m(lfRc[Nko HjNV2fd8+q{osH,ywvjĨbX\~p-JXCه|.Usi_EmX^O7nPE^.|05/@)Z^e.}"xeKbZ݅A)WjA,Ʌ.  u}Cu9LQ(S}Ԥ B ynw(&6*!2JSƫT ULڏkm0Z}oˋšF:&$1b;i63&uEa'8ѯ`sLB[ح"4FK@~ڎ5U!&8;_ֹ=1ҺCᛝɡ(._Az>xlIp(:LOoB*x [7edάi2;t*>Ym`Fi@y#. }S,乩zQpi]ĨH:^wa8RI75LZ@6D-1IL^w!roXnoBSZ˱ވFىC0JްF*i#& yCI h,yO55Lmy+E?v[_jl=QcNS:Ge2qEvE( %j-jۮ|5K͈(B.A?ա* -!x6dH<0 Yq!UkȎ @L2!9Ir'"BDj>X! D݌v[`UY: hL/ w`OK΋DP#8u> b&R 4.LՄkKe#Ei~̩,]}h3W*_׭4ѺbP:bweN9C[ꁖZ>4䙫hN{kH: ߭թF(ŒUԺ!\EWu2nWAꍥ F}ouP~t~q8ܶEǘ."“YS%x7O98? 
1۳?Ž7kI ŤLӗP=t&W-gr:vpzP*3>qJXUf'rM;h9P.nN Q(J/Z=}o$7Qk \DR(1xтwyq(ν+k$R\)@G:tӎ^-P{v#㦯 &~} WU2Zˑ6hLx|s64%|&ƙ1+󟤐]q:4FpJuZ :A%_'4._X环N)ՐzE8=U#/ެQId5'3Y ᠢL8v?CGVvCE%PKP@HAV1tx{R 2>thXP_[q Qw,4Whk%GLyJ(t"V0($I6Q%H.++ݺOoF1̂KAsnȕL3x#9S{wE͒x@V1%6PmVomʲ$эy#أ fOo60Ma!LkECoh)䆗*T#t#|Hi[mq ݮ9O|=cйEF!W 4GXKsXF"#5g+h߾}C&ZG3vI5}HwmYhF{cA j<*9?Y7WY(.U:aRF,Itgbe}b#G۟ed|΋*Ee%t,X FL>j Sw=$+A'lvϗC<}[[~OZtEv)9 c,8і3+:% y~o(r7\QzS,,.a6=?*ĦT&v@;~sE2ac*W '/̞^y]^~_~y70J/o`m#GEN [aq7f v7;1Ʊ=blL6/j*z c|w2_RET*o eC)wKs8G/&:%my]0缆9Ϋ #q~t:B=THMpcvn7n{:9y@ώ˚\꣺ 7|`XI䀞 {y5RmyqQ/^:<;5}*lcT@]?.R=qeK*&h#/EyY}1򲽍J/lDYϗd/IO T'uÅD!;j{~T/?Wsu+Id;4~Mfe'9?s$vktqnb~=_6/~7/(.eQ t2%)r05jKaRȪN*lޯ8z[6z)5 )D|OvY*=Y lHzj.vҨ=Cj@Ó>!)̅p0Zk mݒgg+]yTƸ3ذCy4o #QGqiI'H1%[ft2p)^n,ѼQdjQwK?y)ɧ׀^(TX=( ֥9WEUVH _PYEAмf{`vK!.N5\! JkQ[" 2#jphpbRY A-* lޚ-|+4s &swmmo~8mnͫ<ᖡ)k,(k :IE޲bkSIPaAhb\eU^@IWyHG`0h]4o+K"69eeB* 0 jn,mn(TN՜r $SlQ064 tAq&~ Ă0Rs)g?p@PMm3>$.FJ4ت.P%-bN8+κ,VQZBY !EDĤ(ݢƵ1Uk<#V %QKڏ̙}^'>`j+6w!D>\p8Ǩjwj%)fz]yw.pp%^Jv0˦C#]O&g+ڱfѭ&;sÜuxWd5ij{՝žW{$'joXg%qqxeq(o([*r/CħZ { /h%A+8mgh⌯-T]3ɧC/K{-yXdװbwЋُ.֟ɿo's~L;.Uo>^\ӃscG%/G3AEojuj?eǩzʮF؜//fڗ|k9 8瓟O@<Q?p|j1MqǦ8`w\069{p1Re rA8(g#ϧ+΃dbrΨpSŖ4H s4'I"> |fߏ"@ GRP>􀋪*aL0B!T].ehĊܔnuuUUyUBw5]RhxVJVZ;s(p5NrZT5TDj-FrvG;e V~_G+8D,HUXdqi-`j| I{I.B% j+(s (kF |GX;he{u)YUxg޷̡Y91&oNu7Zf?u7=] Dy0m~N׻~6=1֔ /.#g,'U=Oqʯi[z'jh{NVn[mZz v`:"Ys5G)0&iĆ,:@FQE&`[h?o2zs1:]W{s{\oN/oW7o˹o=l [.-~'bFo"ϝgW*s :otdW)T20Ԗ9R yfuI RZqVIʤDUc kC,U%m-$Ֆٽԯ(TbO$JĚ-и=y WDڙ*m(hA9E ڙ[SKSTjOə F) 5) JA*Ep/g7EGb BT 4m}/(Zү,vOSv@`Et"du=اi[6qtJ,B;wH;(E4bo{6pp{x^ vܭg ʇ1[y A+HR$x<0 52M3`()?pZOAO!YexϝZ0k4&k4٪?5 H$~ ɋSJZs2usNf l8 O ۩6yHA-y3`nw~zp.)Q7D:8{ щnQ\N`$ 36=x00.LTt s 8T#arθ_(.Ɩn{lGE yS#)X(@o7ɽpʤf^8i<9d) TrvלX7^:i\p꣊Ӆ)؜{ ʦFX&*([=|(89ǢP^E84ރ!/%gk;!#8*h3kj!uǾ38k3zЌw|xbp= =0Нk3Q'21%f:^ty']4#@ppN:k4֢ZJP SBT@EI5)J|UL*k"W. RtxDѤyU}]/}ƛ:ɑF]C遚Ml+Lo(nH_!㗲 Wkݚl??;Yy6cx옘<[Z7-ٹבgVdI}nˍew8n}1յ߾3q?|YxOa'Nڈ.}ֲ8? 
ro,չssqsVە{ʟXHH{[ y%ѭvˋ*DZNks{ŗzxtkǷMqK:CN5.'{}VskyL̍\/JS s Tf^X ꂔ5Ƹ E42*hfD5 1CӯxM9p cKxq*[Cl6=&'=E: 2Mj4XN*WQi/NיYcsˣg)4ഽ"] o7УH"H2gp\$ہq!ҭTS[˚a9mQ nˊٺfբ%0t0: $J(؍@3K`ј (V`YB'JJp&{r2@,Hʍ,LUi+!+aXV^s 2$ +ÀLG 84.{iSq[8|1d-1fqnn7&|-1w,J,νO*]e}nBD"Y筐;sJ$ ŭf oK(7b($=v7& ZpwwkDE]҂+fj H+r1VQZBY 0\,"hߥ5n'vv>DՄF5!nj莂))Z e\ VWϟA0c`!rd/qv)lڤ=oy{,<ߓ CX*$%ÂZZRp-KDVZT uC4yJ iTd5^oWP iKcQr >f+9qdr-%wԝjNbdjtP֖K} \2X@i*ʭȂ[ ' LS[TQ-nQ+wTs$z$b!q$m5e$IB{dg.I2bvLMAb":chm/Sn n9$䙋hLŹInN1mS0\?䉆jELI-Z' :q<_rZ2=(6u'+5 = Iq^X(E iZVFAIdtM K鎫tZ&*_]@g!m;bplMTO z846kMFج!b2).0ñ.0Q՚)qC8 l,H&ftˊa&6jH&$(5JȞ98{v({k<5#|WLHIqyY(bKxױ"I9jy:Cy% Ofvg'22KznLShaLQ0SR&&ÖrLzLkT{I:?q6SQiC 3 aqD-޵5迢̑yI0UyٞVwҾ7=}AId蒴DA]0f 9m9NY^ ?<[5V-Q7j2$"/]b/VYܩQ-Y:k?r3~g. <1׻x?0EUY|!'l[up8V ǵLmѻ}(]`Hqm8.WRq:z(d\gDd´nq뀝Leol9 akc%8n B}RP=WUO|-(!9W,KE^{&u%Ʋ;˄Y]Z!I BqƉyaM|F5ff-H'TY&JAmF/1ԀCLBFV:onyC7}Rg2M5;oE)-||4Jm՝7 _|a>ō4yҤs Btސ i47kƝ:Ǜp{CyM#R 9kfߑZ.\ `€c'Fv<Ӽc'PY|4!2TC4H>^;c];:ȋz&(k>"f^o 4%@pi )yr`,$F{-a6.ck$;Ռ*~YP6@oHR=!UmG{k΄z$(*(dM3\rYɞ2FlÎO!a*{8;D;"o9P䜼mW 䯎ZɯLEo* ?rZ]3zW$;Ge2AQjG]*͛9|ܱfI%FhI3Cہj$vmUFodSek#z 3IA'Co-- Nj/!Ug5|g2%yeI0י%ɤp1h0!yB)#+D8:Vm`q"LQv6m@dzVvg{Hʭ˴힘_䠌XT.F/:ӸJ 2 *U%jȾ(5'XvU մq+ ҆r79}i=+hpG;O|T?뻡99 Y0YB=*^) E$La\?{laDo.VYr%zrusa7ˁxfia#o@֢N&AcӖuM'ws:;Gp|zʔ/Φ[n ~?e|Ԭie}% ^QQ9+K&.V6Gy,)$V:yZ,@5/QD", 3F1'UBY leRJOOƶN;Q"ʨRI6wEB™5Cɘ4& Tk`fN4 >D$ SZ /PP@Di"UDnr{ (HX#k.㦴d0=6"hQHtf%8Gn&4RJaH5Q-Թ2-ﺤƒv~Is+ ~d'N/#NI#.޼O\R̷*7*~~;@Zڧe$EyĽ=y3`^i$J%pZ RӠ6h0&Cϻ8X"s hVPX[͏ ]; @k?* /*SC 8wSuO4~f5~wԒ(`.sM ?b밺_]leDJǫCWa °-k-I,o#>@}7gB 6~5AWkv$è ڕpd|gp. sX1:@>p*2I͆g|@R{)3`U~9b#D8bCXeo6`> n^ɚO"HStx<24 <9 wF@a}̄)y齕z,*_+Z]*ikHaNǠ@dfD_ɩGPHM & 1:XSr$aS [WɿNUZM(,u=BS5H8; *Y)CB,mn^}/rlDh)lX¶La3)l 2FIa(D)l;],]R: g^~ ;K'!,,7JVJ2v¦8U? 
-:$vg)l ֆV7qRؔ̃b 7 6XGb)l53&*S}ʸ`E6&c$)#FsQ]z B^0iPm=D0 3\;eLE<̡dZ$(%g6iݟj%S=Tv;e0g@e%$3.seg$`5ejQiKiUu}22(d- ,L:b![Xq/aۓ#KM<\pydZ/Rʚ31?'Gs ߹:#AR7s/:0#t.ArCȅFa)ޙm28o(#K߶<ޗb-Nf+O_gDk8|Grzj|ޱu![:ǜXX4`X0Üz,ޭbaJT'W_n>R3_7&d&ϯI)0⧓)Qx\O/ˇݍP,Ͽ]KS⫝̸'>]!`Gɾ}YF_k^z=kbcd~ӱ~GʉK$X0t3DA?MM%KX$Z R%G`s![+]A A!K`a6D5Ok4TT+lĿO-}wT[hG- TU*YTðQnj˭=jW2J!۹AC_ϳehXSV V5xy"=P.,^ KɸqkϤXVQsgPQlr~;8w٤x}t=\^h,9vh[%/V*$ ZIlF{Z+ M xǬff4?!fVKėoacތ)WsȲ7RTYXR% 8hk5_ݚpjY_n>z?W-W1J\GDvGuQ}v /B `k!h64b==z8Ya!􊷵+s,_#fZջA]JIHwoU80\8]*=q¡tJF7w(V,3: ƕ:83G$g\cfւt即,h١\ VuA՜8MˠK9<ٕ+K,JLۿNukWP-"fC}1ݫXA1NF-EK!uHˁjF,c̀m#^b1FZPvkاPQvǮL.%ҝgwM4^Z>(;XY@nU]Jt6:d]E8<5uh|$-]UֽS 1%e-gCiK/PD {_Xg>V[x\[ [[&gr[hݟ =tKP8PW޾jʺmq(*ޞt@T;_{IZJ`1aJTr Ji23zQckK+̝' f${rN c\.d#gRfZrJ63pZJ+`0@6If&ҟ3f&qbrG5D,{= g 2|>qiq \"̲r$/\eME~0[>Eqvq;NĜ3&w7i/.tqM# Mџ;onHp?*ЪF"7F>v $peb!\S|E+drK0|0IV.i/쾯s!TV+r:vr^)m!x8BLH../HeZMRQl&Rq /aYI.5 FCZz L٤U{I$3Mc0eّ{B*-I *7,ևT+Ԭ'եKUN?{WGc7"$mn{/XgJ%Ĭd캴vzH <Z'( =fIE.K󮗤R#ji%n"xCԒkzqY*[yIsn[u^,؛$ ,}Cl:K/|vOAsDt˚rRKi$۷_-`waH0 nM} @DǓ%®P{Fx5Yx^c* Sn|#mx(}ଚ d̀u2 s'rWѤMI,n/;Y<~y>蝇Yg@D@Ov bhEp.@8YSOmSjpO/:?3K 3: ɘև}?^{پ4&†3± e,|1.T`^.HrrBMeݾіܱ9=8rkD+.9w)NBNzhܿq?W&{^Rj]@!te hާ_\Ǐ_F5Rۙoγ 6Zh)l4)yBZvQcħHF#?Sؗ>y= Bg4Oɯ96Ĭ!d]f#pgϜzDT.\=V͜-Z-9=Sԁ*8ԘTNV"FSOtnjmǰui൘TMquHfCffUu!{Vs eN 1B*Щ]bu#T"ho^U/BƳ5mo/.P9Yw&+ \}䮏{d~E7zqT_߳.9Zg)cmC8& yf)*jg;7@db Q0|pGu^uRj5T/Yj)rM-rfe >a-DC۳S|cBk i4O86{jʾw-+)e:qm&{Ml7r{*vrӕ}xb_HpqA7s,ze]ꌖϴδQ{6`8gճwJ ?5^r:;_u] ^&gjwj(\zbhCعjec}FTE)_8a2޻3?n$NvIODwDzyw9vnO' ~ag>MvgfW@m48 XO@װ_j{_ N,;/{-\ܲ@<6Yc6C< ;h86~JA+~0ce/"T ʕtg|?s_轣$OR13Ӌe%mt׵Ojo~9+7wYі=35sԼHcwq6B}3 9e0{uˌHrӦks[Zmsi:^LQM̒=)ZoSmB%wqwY OakwFxww2L<*=,ǞVVK{}t,jhGPXċ5\!>75QF9#pJ;Xh΀h53>bzAtwߖtيO)[CIqy'B)!ML&`ԑ [b&"(E\(+?A)Dk)%\#5ivbDc!>J,Bi]//.vK|ON62o[ݵKD*x &xk)Dxx[;̝zTf_(꟒5jkqMʲG:7R(1N*eeL+}\ $2ޏG(%d9f+R ^CdEƺQqBo|[u>4~eiplyxN%jNW]![Z4*o7<9oƽGJVp,#=*IE'D57M Jq{F"<іGX'$rb%$cV# i l]x }JScJ$.Z/p0Bxj4I QVl,8wV%J. 
^H%{F8&|guLwA?a%B>x><3bQJez{g#;;-5`g9oQr5CuZ{uϯ7KGwJ{᳔;gtUM6/ƃvN3|'>0:/)sm4(<];a&Z |OnĶIomč + r.UXKzI`2(gbp՗kJpjj'AMJUWSdXh"L(P%U]Z[dt E-@ŭ㉺VYPeM`k{Y]Rd%qFzjf~޼lNV"R0IiA7̏D}Ԝ Tq0\heN|(12{kLfAvBe|3h)[OGݨZ6P9D6茅=#%i_gO1e#9ȫ 6R?]PSaΖ(e'4yTAAGNQa{2̣^;CvfYTtL,KWv]ym'`K=a|ɰ3N PػP@3?X]U gb:kXrd 6;egf^b wå58g$uGgwǴ@Xfޱdd%vB ځ/!3T+\XfɨͰbN " {5xjfv<5p:u<^z([ (a"kŝSsȑGH}ǩ+|Γ=l_]*.>qSZ+roty\"_fs/Hn Le,5$ZbZt+W.1WV2Ͼ]v 0F)WHng6zO1s4m_ \ahoHk%R] ᶶx-Տ>:ӑn,qJjX?C 5Tp2E;)Ơ9f_Ru>Ӯ]5˺5lʺ'GrcfIr#D}Ԉ0l^<-k~uY5cC4J+p?jn<2`bQ["a&q zPu]~={w-Ⱥ~w3m~Cʢ.C}ޔUQ!+?'8i.+e7T5;_j^6OҸ72kof3dֵ'&'kFEK^#V")Q l݇BՒ-Kڙ`8OH3W3=`ْ{b]kqF;ϟ7za]μ8OmspimGC˜^~ӠN&*bEȍ:`NIqKNJ3XMW,nmɥ_|a ȃz. BSLK¬m^[(mU,2\kj{kCǖK%$SsRSudhoL!W2|8s)fR(vi(Wisp+iVCM˅X8Ks6л%5J/9'^X\ξw37_W8΀D)_>ݖާKFA+S޴t.ŧp76Yst|f}DVըN>ZGYg!䛃|!3'#^UX/лjNuW[x^}w!jSH8D-J]|M7->ߌ GYkJ(;kb F*೉6ED磫:`,i IkylzK|BK%vm}M.w|q>k?8=ZEܴ:%zrV2\+/}ӎmC{^.k}TI2xe#'l=_Sj*()& U(ClffN#3v(se*Z+ǻ2)ʲZkzi6laWO_#R5^X'yH~A8e sq5~t8W.ӌ8b\9[Ytb$#W^4׳[O5޾"&??V 9uU˳-ZoӋ/[RL`tCT*Ȓ:iI/w_6yo /ߩoDnDqt[Kԭ?m>W1dzzeJ̐$pR JN,mUajNEN &%[ 32HlhF?h2>ztQZTd )]6U[mi$GY]&8sNdrIA+&S Vå6ԈhZ򮢬2d$Y V_u!Zc2EW-+K񒉥v %07@.E&ˠKfTK „vX\vՋISn7 *d tCjͮs̢WI}gwk %7$˞HSQIȸ|"٧)$!<w,~)9va]{ [4z4A6g{Yo^zz^zwJyx?;紴y<-9O(lO+qϗrNq`h+`IԆN+`{lJzp`7D5mgp wzKGm~&s[H[Y;Ѥ̺/#ڭ/I+h _͊K_~;k`I^W\6Yvt0FmixCGxJݏi,tSHqA'Q@/A;B+3g(ӛa*݃tynz0D Ķu9:  2a/}4b-yՓ X}Va|NcgWj\9 wkeQ tC_r=9W+gXU=ƍG 6K~(q%\y+yqS]Mωp۝]?|T޿!!5^2'I2Ɔчudښ)WLdbN*$-fS݇Z.n8lfe(E/(k{7MN^`v$ZP0G{ }[RvRx_{ q&Px +R@JޜP3*YVțj~ӿ뭔4AHel5"KF#d*vET*Oۋ/yԆ4k?ekB8$z;eO?ϑYv+L6ۅ\xy%YrjňNY+F,(l]1Y{&) HC5BփN: ^UdS!)L!WVPr:%5Dt|;jV (oV>B%qbvuО!! 
Mar 09 09:05:55 crc systemd[1]: Starting Kubernetes Kubelet...
Mar 09 09:05:55 crc restorecon[4700]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 09 09:05:55 crc restorecon[4700]:
/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Mar 09 09:05:55 crc restorecon[4700]: 
/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Mar 09 09:05:55 crc 
restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820 Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818 Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820 Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818 Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Mar 09 09:05:55 crc restorecon[4700]: 
/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Mar 09 09:05:55 crc restorecon[4700]: 
/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850 Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850 Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 09:05:55 crc restorecon[4700]: 
/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 09 09:05:55 crc restorecon[4700]: 
/var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c22 Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 09 09:05:55 crc restorecon[4700]: 
/var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 09 09:05:55 crc restorecon[4700]: 
/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 09 09:05:55 crc restorecon[4700]: 
/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 09 09:05:55 crc restorecon[4700]: 
/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828 Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 09 09:05:55 crc restorecon[4700]: 
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c968,c969
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 09 09:05:55 crc restorecon[4700]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as
customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 09 09:05:56 crc 
restorecon[4700]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 09 09:05:56 crc restorecon[4700]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c1016 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 09 09:05:56 crc 
restorecon[4700]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 09 09:05:56 crc restorecon[4700]: 
/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 09 09:05:56 crc restorecon[4700]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 
09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]:
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 09 09:05:56 crc restorecon[4700]: 
/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 09 09:05:56 crc restorecon[4700]: 
/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 09:05:56 crc restorecon[4700]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 09:05:56 crc 
restorecon[4700]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 09:05:56 crc 
restorecon[4700]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 09:05:56 crc restorecon[4700]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 09:05:56 crc restorecon[4700]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Mar 09 09:05:56 crc 
restorecon[4700]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 09 09:05:56 crc restorecon[4700]: 
/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 09 09:05:56 crc 
restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 09:05:56 crc restorecon[4700]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 09:05:56 crc restorecon[4700]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 
09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 09:05:56 crc restorecon[4700]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 09:05:56 crc restorecon[4700]: 
/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c6 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc 
restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 09:05:56 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 09 09:05:56 crc restorecon[4700]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 09:05:56 crc restorecon[4700]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 09:05:56 crc restorecon[4700]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 09 09:05:56 crc restorecon[4700]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 09 09:05:56 crc restorecon[4700]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 09 09:05:56 crc restorecon[4700]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]:
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 
crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc 
restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to
system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 09:05:56 crc restorecon[4700]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to
system_u:object_r:container_file_t:s0
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 09 09:05:56 crc restorecon[4700]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 09 09:05:56 crc restorecon[4700]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0
Mar 09 09:05:57 crc kubenswrapper[4861]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 09 09:05:57 crc kubenswrapper[4861]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Mar 09 09:05:57 crc kubenswrapper[4861]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 09 09:05:57 crc kubenswrapper[4861]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 09 09:05:57 crc kubenswrapper[4861]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Mar 09 09:05:57 crc kubenswrapper[4861]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.373634    4861 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.382043    4861 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.382078    4861 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.382090    4861 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.382101    4861 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.382110    4861 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.382118    4861 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.382133    4861 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.382142    4861 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.382149    4861 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.382158    4861 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.382165    4861 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.382173    4861 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.382181    4861 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.382190    4861 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.382198    4861 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.382205    4861 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.382213    4861 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.382220    4861 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.382228    4861 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.382236    4861 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.382243    4861 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.382251    4861 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.382258    4861 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.382266    4861 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.382274    4861 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.382281    4861 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.382289    4861 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.382297    4861 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.382304    4861 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.382312    4861 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.382320    4861 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.382327    4861 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.382335    4861 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.382342    4861 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.382350    4861 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.382359    4861 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.382366    4861 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.382407    4861 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.382414    4861 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.382422    4861 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.382431    4861 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.382438    4861 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.382446    4861 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.382454    4861 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.382462    4861 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.382470    4861 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.382478    4861 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.382485    4861 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.382492    4861 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.382500    4861 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.382508    4861 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.382518    4861 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.382529    4861 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.382538    4861 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.382548    4861 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.382558    4861 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.382567    4861 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.382575    4861 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.382584    4861 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.382592    4861 feature_gate.go:330] unrecognized feature gate: Example
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.382600    4861 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.382609    4861 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.382616    4861 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.382624    4861 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.382631    4861 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.382643    4861 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.382655    4861 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.382664    4861 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.382673    4861 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.382681    4861 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.382689    4861 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.382863    4861 flags.go:64] FLAG: --address="0.0.0.0"
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.382893    4861 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.382910    4861 flags.go:64] FLAG: --anonymous-auth="true"
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.382922    4861 flags.go:64] FLAG: --application-metrics-count-limit="100"
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.382933    4861 flags.go:64] FLAG: --authentication-token-webhook="false"
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.382943    4861 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.382957    4861 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.382968    4861 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.382978    4861 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.382988    4861 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309
09:05:57.383037    4861 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.383047    4861 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.383057    4861 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.383065    4861 flags.go:64] FLAG: --cgroup-root=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.383074    4861 flags.go:64] FLAG: --cgroups-per-qos="true"
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.383083    4861 flags.go:64] FLAG: --client-ca-file=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.383092    4861 flags.go:64] FLAG: --cloud-config=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.383101    4861 flags.go:64] FLAG: --cloud-provider=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.383109    4861 flags.go:64] FLAG: --cluster-dns="[]"
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.383119    4861 flags.go:64] FLAG: --cluster-domain=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.383128    4861 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.383137    4861 flags.go:64] FLAG: --config-dir=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.383146    4861 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.383156    4861 flags.go:64] FLAG: --container-log-max-files="5"
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.383167    4861 flags.go:64] FLAG: --container-log-max-size="10Mi"
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.383175    4861 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.383184    4861 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.383195    4861 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.383204    4861 flags.go:64] FLAG: --contention-profiling="false"
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.383213    4861 flags.go:64] FLAG: --cpu-cfs-quota="true"
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.383225    4861 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.383235    4861 flags.go:64] FLAG: --cpu-manager-policy="none"
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.383245    4861 flags.go:64] FLAG: --cpu-manager-policy-options=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.383257    4861 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.383267    4861 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.383276    4861 flags.go:64] FLAG: --enable-debugging-handlers="true"
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.383284    4861 flags.go:64] FLAG: --enable-load-reader="false"
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.383294    4861 flags.go:64] FLAG: --enable-server="true"
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.383303    4861 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.383315    4861 flags.go:64] FLAG: --event-burst="100"
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.383324    4861 flags.go:64] FLAG: --event-qps="50"
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.383333    4861 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.383343    4861 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.383464    4861 flags.go:64] FLAG: --eviction-hard=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.383892    4861 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.383968    4861 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.383994    4861 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.384014    4861 flags.go:64] FLAG: --eviction-soft=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.384032    4861 flags.go:64] FLAG: --eviction-soft-grace-period=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.384047    4861 flags.go:64] FLAG: --exit-on-lock-contention="false"
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.384066    4861 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.384083    4861 flags.go:64] FLAG: --experimental-mounter-path=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.384099    4861 flags.go:64] FLAG: --fail-cgroupv1="false"
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.384132    4861 flags.go:64] FLAG: --fail-swap-on="true"
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.384147    4861 flags.go:64] FLAG: --feature-gates=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.384168    4861 flags.go:64] FLAG: --file-check-frequency="20s"
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.384184    4861 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.384199    4861 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.384211    4861 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.384225    4861 flags.go:64] FLAG: --healthz-port="10248"
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.384238    4861 flags.go:64] FLAG: --help="false"
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309
09:05:57.384262 4861 flags.go:64] FLAG: --hostname-override="" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.384272 4861 flags.go:64] FLAG: --housekeeping-interval="10s" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.384358 4861 flags.go:64] FLAG: --http-check-frequency="20s" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.384421 4861 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.384433 4861 flags.go:64] FLAG: --image-credential-provider-config="" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.384443 4861 flags.go:64] FLAG: --image-gc-high-threshold="85" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.384454 4861 flags.go:64] FLAG: --image-gc-low-threshold="80" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.384464 4861 flags.go:64] FLAG: --image-service-endpoint="" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.384475 4861 flags.go:64] FLAG: --kernel-memcg-notification="false" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.384497 4861 flags.go:64] FLAG: --kube-api-burst="100" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.384508 4861 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.384519 4861 flags.go:64] FLAG: --kube-api-qps="50" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.384529 4861 flags.go:64] FLAG: --kube-reserved="" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.384539 4861 flags.go:64] FLAG: --kube-reserved-cgroup="" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.384549 4861 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.384560 4861 flags.go:64] FLAG: --kubelet-cgroups="" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.384572 4861 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Mar 09 09:05:57 crc 
kubenswrapper[4861]: I0309 09:05:57.384591 4861 flags.go:64] FLAG: --lock-file="" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.384602 4861 flags.go:64] FLAG: --log-cadvisor-usage="false" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.384613 4861 flags.go:64] FLAG: --log-flush-frequency="5s" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.384624 4861 flags.go:64] FLAG: --log-json-info-buffer-size="0" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.384643 4861 flags.go:64] FLAG: --log-json-split-stream="false" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.384653 4861 flags.go:64] FLAG: --log-text-info-buffer-size="0" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.384663 4861 flags.go:64] FLAG: --log-text-split-stream="false" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.384674 4861 flags.go:64] FLAG: --logging-format="text" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.384684 4861 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.384702 4861 flags.go:64] FLAG: --make-iptables-util-chains="true" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.384712 4861 flags.go:64] FLAG: --manifest-url="" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.384722 4861 flags.go:64] FLAG: --manifest-url-header="" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.384737 4861 flags.go:64] FLAG: --max-housekeeping-interval="15s" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.384747 4861 flags.go:64] FLAG: --max-open-files="1000000" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.384761 4861 flags.go:64] FLAG: --max-pods="110" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.384772 4861 flags.go:64] FLAG: --maximum-dead-containers="-1" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.384783 4861 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Mar 09 09:05:57 crc 
kubenswrapper[4861]: I0309 09:05:57.384800 4861 flags.go:64] FLAG: --memory-manager-policy="None" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.384811 4861 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.384822 4861 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.384833 4861 flags.go:64] FLAG: --node-ip="192.168.126.11" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.384846 4861 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.384881 4861 flags.go:64] FLAG: --node-status-max-images="50" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.384894 4861 flags.go:64] FLAG: --node-status-update-frequency="10s" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.384917 4861 flags.go:64] FLAG: --oom-score-adj="-999" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.384927 4861 flags.go:64] FLAG: --pod-cidr="" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.384937 4861 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.384952 4861 flags.go:64] FLAG: --pod-manifest-path="" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.384961 4861 flags.go:64] FLAG: --pod-max-pids="-1" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.384974 4861 flags.go:64] FLAG: --pods-per-core="0" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.384983 4861 flags.go:64] FLAG: --port="10250" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.384993 4861 flags.go:64] FLAG: --protect-kernel-defaults="false" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.385013 4861 flags.go:64] FLAG: 
--provider-id="" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.385024 4861 flags.go:64] FLAG: --qos-reserved="" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.385034 4861 flags.go:64] FLAG: --read-only-port="10255" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.385044 4861 flags.go:64] FLAG: --register-node="true" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.385054 4861 flags.go:64] FLAG: --register-schedulable="true" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.385064 4861 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.385083 4861 flags.go:64] FLAG: --registry-burst="10" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.385094 4861 flags.go:64] FLAG: --registry-qps="5" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.385113 4861 flags.go:64] FLAG: --reserved-cpus="" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.385123 4861 flags.go:64] FLAG: --reserved-memory="" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.385135 4861 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.385147 4861 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.385157 4861 flags.go:64] FLAG: --rotate-certificates="false" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.385167 4861 flags.go:64] FLAG: --rotate-server-certificates="false" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.385176 4861 flags.go:64] FLAG: --runonce="false" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.385186 4861 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.385204 4861 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.385214 4861 flags.go:64] FLAG: --seccomp-default="false" Mar 
09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.385225 4861 flags.go:64] FLAG: --serialize-image-pulls="true" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.385235 4861 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.385245 4861 flags.go:64] FLAG: --storage-driver-db="cadvisor" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.385258 4861 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.385268 4861 flags.go:64] FLAG: --storage-driver-password="root" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.385278 4861 flags.go:64] FLAG: --storage-driver-secure="false" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.385289 4861 flags.go:64] FLAG: --storage-driver-table="stats" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.385308 4861 flags.go:64] FLAG: --storage-driver-user="root" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.385319 4861 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.385330 4861 flags.go:64] FLAG: --sync-frequency="1m0s" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.385340 4861 flags.go:64] FLAG: --system-cgroups="" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.385350 4861 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.385399 4861 flags.go:64] FLAG: --system-reserved-cgroup="" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.385419 4861 flags.go:64] FLAG: --tls-cert-file="" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.385429 4861 flags.go:64] FLAG: --tls-cipher-suites="[]" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.385449 4861 flags.go:64] FLAG: --tls-min-version="" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.385459 4861 flags.go:64] FLAG: 
--tls-private-key-file="" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.385488 4861 flags.go:64] FLAG: --topology-manager-policy="none" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.385621 4861 flags.go:64] FLAG: --topology-manager-policy-options="" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.385644 4861 flags.go:64] FLAG: --topology-manager-scope="container" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.385903 4861 flags.go:64] FLAG: --v="2" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.385928 4861 flags.go:64] FLAG: --version="false" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.385944 4861 flags.go:64] FLAG: --vmodule="" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.385959 4861 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.385972 4861 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.386248 4861 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.386268 4861 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.386279 4861 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.386292 4861 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.386305 4861 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.386318 4861 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.386328 4861 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.386338 4861 
feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.386346 4861 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.386354 4861 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.386362 4861 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.386406 4861 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.386414 4861 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.386422 4861 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.386430 4861 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.386438 4861 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.386445 4861 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.386455 4861 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.386463 4861 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.386471 4861 feature_gate.go:330] unrecognized feature gate: Example Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.386479 4861 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.386487 4861 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 
09:05:57.386495 4861 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.386503 4861 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.386513 4861 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.386532 4861 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.386540 4861 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.386548 4861 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.386556 4861 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.386564 4861 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.386572 4861 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.386580 4861 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.386588 4861 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.386595 4861 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.386603 4861 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.386611 4861 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.386619 4861 feature_gate.go:330] unrecognized 
feature gate: IngressControllerLBSubnetsAWS Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.386627 4861 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.386634 4861 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.386645 4861 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.386656 4861 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.386665 4861 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.386675 4861 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.386685 4861 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.386695 4861 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.386705 4861 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.386714 4861 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.386723 4861 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.386732 4861 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.386740 4861 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.386748 
4861 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.386757 4861 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.386765 4861 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.386774 4861 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.386781 4861 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.386791 4861 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.386802 4861 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.386815 4861 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.386824 4861 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.386834 4861 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.386843 4861 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.386851 4861 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.386860 4861 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.386868 4861 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 09 09:05:57 crc 
kubenswrapper[4861]: W0309 09:05:57.386876 4861 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.386884 4861 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.386891 4861 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.386900 4861 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.386908 4861 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.386915 4861 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.386923 4861 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.386948 4861 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.399296 4861 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.399769 4861 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.399939 4861 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.399957 4861 feature_gate.go:330] 
unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.399967 4861 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.399976 4861 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.399985 4861 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.399993 4861 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.400002 4861 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.400009 4861 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.400017 4861 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.400025 4861 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.400034 4861 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.400041 4861 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.400050 4861 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.400057 4861 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.400065 4861 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.400073 4861 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 09 09:05:57 crc 
kubenswrapper[4861]: W0309 09:05:57.400081 4861 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.400089 4861 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.400098 4861 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.400107 4861 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.400116 4861 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.400125 4861 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.400133 4861 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.400142 4861 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.400150 4861 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.400159 4861 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.400168 4861 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.400177 4861 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.400185 4861 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.400195 4861 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.400203 4861 feature_gate.go:330] 
unrecognized feature gate: InsightsConfig Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.400211 4861 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.400219 4861 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.400227 4861 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.400235 4861 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.400243 4861 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.400251 4861 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.400259 4861 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.400267 4861 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.400275 4861 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.400285 4861 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.400298 4861 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.400307 4861 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.400315 4861 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.400323 4861 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.400331 4861 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.400339 4861 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.400347 4861 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.400355 4861 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.400363 4861 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.400399 4861 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.400407 4861 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.400416 4861 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.400424 4861 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.400432 4861 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.400442 4861 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.400450 4861 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.400458 4861 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.400466 4861 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.400473 4861 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.400481 4861 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.400489 4861 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.400497 4861 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.400505 4861 feature_gate.go:330] unrecognized feature gate: Example
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.400513 4861 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.400521 4861 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.400532 4861 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.400543 4861 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.400554 4861 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.400564 4861 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.400573 4861 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.400587 4861 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.400841 4861 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.400854 4861 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.400863 4861 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.400871 4861 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.400880 4861 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.400890 4861 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.400898 4861 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.400907 4861 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.400915 4861 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.400923 4861 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.400931 4861 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.400940 4861 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.400947 4861 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.400955 4861 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.400963 4861 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.400971 4861 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.400980 4861 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.400988 4861 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.400996 4861 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.401003 4861 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.401011 4861 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.401019 4861 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.401027 4861 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.401034 4861 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.401042 4861 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.401050 4861 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.401058 4861 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.401068 4861 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.401079 4861 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.401099 4861 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.401109 4861 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.401117 4861 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.401125 4861 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.401134 4861 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.401141 4861 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.401149 4861 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.401157 4861 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.401165 4861 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.401172 4861 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.401181 4861 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.401188 4861 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.401196 4861 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.401206 4861 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.401216 4861 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.401226 4861 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.401235 4861 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.401244 4861 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.401253 4861 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.401261 4861 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.401269 4861 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.401277 4861 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.401285 4861 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.401294 4861 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.401304 4861 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.401314 4861 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.401324 4861 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.401333 4861 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.401341 4861 feature_gate.go:330] unrecognized feature gate: Example
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.401350 4861 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.401358 4861 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.401390 4861 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.401399 4861 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.401407 4861 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.401415 4861 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.401422 4861 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.401431 4861 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.401439 4861 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.401447 4861 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.401455 4861 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.401463 4861 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.401471 4861 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.401486 4861 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.401834 4861 server.go:940] "Client rotation is on, will bootstrap in background"
Mar 09 09:05:57 crc kubenswrapper[4861]: E0309 09:05:57.407806 4861 bootstrap.go:266] "Unhandled Error" err="part of the existing bootstrap client certificate in /var/lib/kubelet/kubeconfig is expired: 2026-02-24 05:52:08 +0000 UTC" logger="UnhandledError"
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.412679 4861 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.412862 4861 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.414892 4861 server.go:997] "Starting client certificate rotation"
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.414949 4861 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.415231 4861 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.445274 4861 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.448467 4861 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 09 09:05:57 crc kubenswrapper[4861]: E0309 09:05:57.449128 4861 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.163:6443: connect: connection refused" logger="UnhandledError"
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.470584 4861 log.go:25] "Validated CRI v1 runtime API"
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.511650 4861 log.go:25] "Validated CRI v1 image API"
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.513933 4861 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.521265 4861 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-03-09-09-00-25-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.521311 4861 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:43 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:42 fsType:tmpfs blockSize:0}]
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.549650 4861 manager.go:217] Machine: {Timestamp:2026-03-09 09:05:57.546594698 +0000 UTC m=+0.631634159 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 MemoryCapacity:33654120448 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:6cdf8bd4-67ee-426a-bd44-5025c8d84b0b BootID:c804f4c5-c5a1-4765-ad37-8a6185c798f1 Filesystems:[{Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:42 Capacity:1073741824 Type:vfs Inodes:4108169 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827060224 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827060224 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:43 Capacity:3365408768 Type:vfs Inodes:821633 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:02:b3:6a Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:02:b3:6a Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:27:83:4c Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:4f:6a:91 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:6a:68:be Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:60:be:f3 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:4e:70:a1:10:03:95 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:3a:d4:bd:62:a0:df Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654120448 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.550023 4861 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.550287 4861 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.552006 4861 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.552366 4861 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.552447 4861 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.552777 4861 topology_manager.go:138] "Creating topology manager with none policy"
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.552794 4861 container_manager_linux.go:303] "Creating device plugin manager"
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.553249 4861 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.554166 4861 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.554474 4861 state_mem.go:36] "Initialized new in-memory state store"
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.554610 4861 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.558700 4861 kubelet.go:418] "Attempting to sync node with API server"
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.558737 4861 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.558762 4861 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.558783 4861 kubelet.go:324] "Adding apiserver pod source"
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.558802 4861 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.563277 4861 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.564304 4861 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.163:6443: connect: connection refused
Mar 09 09:05:57 crc kubenswrapper[4861]: E0309 09:05:57.564449 4861 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.163:6443: connect: connection refused" logger="UnhandledError"
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.564502 4861 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.564706 4861 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.163:6443: connect: connection refused
Mar 09 09:05:57 crc kubenswrapper[4861]: E0309 09:05:57.564798 4861 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.163:6443: connect: connection refused" logger="UnhandledError"
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.567394 4861 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.569098 4861 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.569125 4861 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.569134 4861 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.569143 4861 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.569158 4861 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.569169 4861 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.569185 4861 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.569201 4861 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.569213 4861 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.569221 4861 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.569237 4861 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.569248 4861 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.571535 4861 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.572273 4861 server.go:1280] "Started kubelet"
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.572439 4861 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.572602 4861 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.573787 4861 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.574821 4861 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.163:6443: connect: connection refused
Mar 09 09:05:57 crc systemd[1]: Started Kubernetes Kubelet.
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.577444 4861 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.577658 4861 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Mar 09 09:05:57 crc kubenswrapper[4861]: E0309 09:05:57.578636 4861 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.578652 4861 volume_manager.go:287] "The desired_state_of_world populator starts"
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.578695 4861 volume_manager.go:289] "Starting Kubelet Volume Manager"
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.578876 4861 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.580012 4861 server.go:460] "Adding debug handlers to kubelet server"
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.580477 4861 factory.go:55] Registering systemd factory
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.580530 4861 factory.go:221] Registration of the systemd container factory successfully
Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.580609 4861 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.163:6443: connect: connection refused
Mar 09 09:05:57 crc kubenswrapper[4861]: E0309 09:05:57.581922 4861 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.163:6443: connect: connection refused" logger="UnhandledError"
Mar 09 09:05:57 crc kubenswrapper[4861]: E0309 09:05:57.580869 4861 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.163:6443: connect: connection refused" interval="200ms"
Mar 09 09:05:57 crc kubenswrapper[4861]: E0309 09:05:57.582854 4861 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.163:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189b20feed1bd557 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:05:57.572228439 +0000 UTC m=+0.657267840,LastTimestamp:2026-03-09 09:05:57.572228439 +0000 UTC m=+0.657267840,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.586742 4861 factory.go:153] Registering CRI-O factory
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.586796 4861 factory.go:221] Registration of the crio container factory successfully
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.587137 4861 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.591433 4861 factory.go:103] Registering Raw factory
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.591631 4861 manager.go:1196] Started watching for new ooms in manager
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.594306 4861 manager.go:319] Starting recovery of all containers
Mar 09 09:05:57 crc kubenswrapper[4861]: E0309 09:05:57.595515 4861 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.163:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189b20feed1bd557 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:05:57.572228439 +0000 UTC m=+0.657267840,LastTimestamp:2026-03-09 09:05:57.572228439 +0000 UTC m=+0.657267840,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.605432 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.605511 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.605535 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.605557
4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.605576 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.605595 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.605615 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.605635 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.605658 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.605678 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.605696 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.605716 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.605784 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.605805 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.605829 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.605850 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" 
volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.605882 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.605902 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.605924 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.605941 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.605961 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.605978 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" 
volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.605996 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.606013 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.606035 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.606054 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.606075 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.606097 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" 
seLinuxMountContext="" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.606116 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.606136 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.606191 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.606221 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.606247 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.606271 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.606290 4861 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.606310 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.606329 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.606417 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.606436 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.606472 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.606494 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.606515 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.606535 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.606557 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.606577 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.606595 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.606625 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" 
volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.606646 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.606664 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.606684 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.606727 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.606745 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.606771 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" 
seLinuxMountContext="" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.606792 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.606814 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.606833 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.606855 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.606875 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.606894 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 
09:05:57.606912 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.606929 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.606947 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.606965 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.606985 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.607002 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.607031 4861 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.607050 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.607069 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.607085 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.607104 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.607121 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.607138 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" 
volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.607156 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.607175 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.607199 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.607224 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.607251 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.607270 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" 
volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.607289 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.607310 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.607328 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.607348 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.607401 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.607421 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" 
volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.607439 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.607456 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.607474 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.607492 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.607512 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.607532 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.607551 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.607568 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.607586 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.607609 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.607630 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.607686 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.607705 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.607729 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.607745 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.607765 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.607784 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.607802 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.607820 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.607839 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.607875 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.607897 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.607915 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.607984 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.608004 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.608024 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.608045 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.608068 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.608086 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.608105 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.608124 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.608144 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.608163 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.608180 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.608201 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.608226 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.608255 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.608275 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.608294 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.608311 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.608328 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.611669 4861 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount"
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.611770 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.611922 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.611977 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.612000 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.612023 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.612078 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.612161 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.612252 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.612299 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.612318 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.612338 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.612401 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.612421 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.612471 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.612490 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.612530 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.612550 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.612572 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.612622 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.612654 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.612674 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.612694 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.612714 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.612854 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.612875 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.612968 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.613035 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.613058 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.613742 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.613838 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.613868 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.613891 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.613928 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.613953 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.613974 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.613996 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.614085 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.614108 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.614130 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.614155 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.614174 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.614193 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.614214 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.614236 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.614256 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.614277 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.614297 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.614317 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.614337 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.614361 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.614411 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.614431 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.614451 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.614473 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.614494 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.614514 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.614537 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.614557 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.614579 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.614602 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.614623 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.614644 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.614663 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.614688 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.614713 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.614742 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.614768 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.614794 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.614816 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.614835 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.614858 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.614877 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.614896 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.614915 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.614936 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.614955 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.614975 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.614995 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.615015 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.615034 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext=""
Mar 09 09:05:57 crc kubenswrapper[4861]: I0309
09:05:57.615053 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.615074 4861 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.615093 4861 reconstruct.go:97] "Volume reconstruction finished" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.615112 4861 reconciler.go:26] "Reconciler: start to sync state" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.627560 4861 manager.go:324] Recovery completed Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.647733 4861 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.651783 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.651859 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.651877 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.653497 4861 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv4" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.654692 4861 cpu_manager.go:225] "Starting CPU manager" policy="none" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.654734 4861 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.654769 4861 state_mem.go:36] "Initialized new in-memory state store" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.656583 4861 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.656645 4861 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.656684 4861 kubelet.go:2335] "Starting kubelet main sync loop" Mar 09 09:05:57 crc kubenswrapper[4861]: E0309 09:05:57.656762 4861 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 09 09:05:57 crc kubenswrapper[4861]: W0309 09:05:57.658099 4861 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.163:6443: connect: connection refused Mar 09 09:05:57 crc kubenswrapper[4861]: E0309 09:05:57.658194 4861 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.163:6443: connect: connection refused" logger="UnhandledError" Mar 09 09:05:57 crc kubenswrapper[4861]: E0309 09:05:57.679337 4861 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.681455 4861 
policy_none.go:49] "None policy: Start" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.683051 4861 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.683097 4861 state_mem.go:35] "Initializing new in-memory state store" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.744360 4861 manager.go:334] "Starting Device Plugin manager" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.744482 4861 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.744546 4861 server.go:79] "Starting device plugin registration server" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.745316 4861 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.745346 4861 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.746055 4861 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.746149 4861 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.746158 4861 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 09 09:05:57 crc kubenswrapper[4861]: E0309 09:05:57.754834 4861 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.757105 4861 kubelet.go:2421] "SyncLoop ADD" source="file" 
pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc"] Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.757352 4861 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.759024 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.759068 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.759078 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.759262 4861 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.759504 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.759563 4861 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.760229 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.760291 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.760312 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.760596 4861 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.760639 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.760664 4861 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.761645 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.761693 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.761746 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.762334 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.762416 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.762434 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.762485 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.762548 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.762580 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.762649 4861 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 09:05:57 crc 
kubenswrapper[4861]: I0309 09:05:57.762789 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.762839 4861 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.764022 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.764069 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.764085 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.764235 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.764276 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.764300 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.764306 4861 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.764671 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.764765 4861 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.765943 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.765989 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.766007 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.766096 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.766129 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.766147 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.766306 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.766355 4861 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.767690 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.767741 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.767785 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:05:57 crc kubenswrapper[4861]: E0309 09:05:57.784201 4861 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.163:6443: connect: connection refused" interval="400ms" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.817919 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.817982 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.818029 4861 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.818107 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.818253 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.818306 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.818365 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.818437 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.818482 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.818525 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.818565 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.818597 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.818632 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" 
(UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.818662 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.818694 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.845926 4861 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.847448 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.847507 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.847532 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.847586 4861 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 09 09:05:57 crc kubenswrapper[4861]: E0309 09:05:57.848323 4861 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.163:6443: connect: connection refused" node="crc" Mar 09 09:05:57 crc 
kubenswrapper[4861]: I0309 09:05:57.920071 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.920151 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.920168 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.920192 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.920213 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.920228 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.920246 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.920262 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.920280 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.920296 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.920344 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: 
\"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.920361 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.920392 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.920458 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.920405 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.920492 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.920463 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod 
\"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.920508 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.920551 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.920565 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.920570 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.920558 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.920509 4861 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.920567 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.920534 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.920489 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.920753 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.920790 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.920590 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 09:05:57 crc kubenswrapper[4861]: I0309 09:05:57.920959 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 09 09:05:58 crc kubenswrapper[4861]: I0309 09:05:58.049031 4861 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 09:05:58 crc kubenswrapper[4861]: I0309 09:05:58.050736 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:05:58 crc kubenswrapper[4861]: I0309 09:05:58.050786 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:05:58 crc kubenswrapper[4861]: I0309 09:05:58.050802 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:05:58 crc kubenswrapper[4861]: I0309 09:05:58.050838 4861 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 09 09:05:58 crc kubenswrapper[4861]: E0309 09:05:58.051302 4861 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.163:6443: connect: connection 
refused" node="crc" Mar 09 09:05:58 crc kubenswrapper[4861]: I0309 09:05:58.102545 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 09 09:05:58 crc kubenswrapper[4861]: I0309 09:05:58.126367 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 09 09:05:58 crc kubenswrapper[4861]: I0309 09:05:58.138880 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 09 09:05:58 crc kubenswrapper[4861]: W0309 09:05:58.163695 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-ca671806991200e8b07fe4d96b692513fff895a6cdbc2748b7730fd406abd4c3 WatchSource:0}: Error finding container ca671806991200e8b07fe4d96b692513fff895a6cdbc2748b7730fd406abd4c3: Status 404 returned error can't find the container with id ca671806991200e8b07fe4d96b692513fff895a6cdbc2748b7730fd406abd4c3 Mar 09 09:05:58 crc kubenswrapper[4861]: W0309 09:05:58.164626 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-813d0edc49e9e86f2621360cb27298d07992bce2d60dce3285c10a640f51d0dc WatchSource:0}: Error finding container 813d0edc49e9e86f2621360cb27298d07992bce2d60dce3285c10a640f51d0dc: Status 404 returned error can't find the container with id 813d0edc49e9e86f2621360cb27298d07992bce2d60dce3285c10a640f51d0dc Mar 09 09:05:58 crc kubenswrapper[4861]: I0309 09:05:58.167156 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 09:05:58 crc kubenswrapper[4861]: I0309 09:05:58.173242 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 09:05:58 crc kubenswrapper[4861]: W0309 09:05:58.183810 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-aca5c633ef0fe34c99521c56e024ab5e28e1537db3c3ab557ed4138dee568804 WatchSource:0}: Error finding container aca5c633ef0fe34c99521c56e024ab5e28e1537db3c3ab557ed4138dee568804: Status 404 returned error can't find the container with id aca5c633ef0fe34c99521c56e024ab5e28e1537db3c3ab557ed4138dee568804 Mar 09 09:05:58 crc kubenswrapper[4861]: E0309 09:05:58.184804 4861 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.163:6443: connect: connection refused" interval="800ms" Mar 09 09:05:58 crc kubenswrapper[4861]: W0309 09:05:58.193350 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-54965be272bf30b189748c4b62de81e3194f7ba8220ae1491d4eb6b00eef480b WatchSource:0}: Error finding container 54965be272bf30b189748c4b62de81e3194f7ba8220ae1491d4eb6b00eef480b: Status 404 returned error can't find the container with id 54965be272bf30b189748c4b62de81e3194f7ba8220ae1491d4eb6b00eef480b Mar 09 09:05:58 crc kubenswrapper[4861]: I0309 09:05:58.451765 4861 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 09:05:58 crc kubenswrapper[4861]: I0309 09:05:58.454081 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:05:58 crc kubenswrapper[4861]: I0309 09:05:58.454147 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 09 09:05:58 crc kubenswrapper[4861]: I0309 09:05:58.454167 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:05:58 crc kubenswrapper[4861]: I0309 09:05:58.454210 4861 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 09 09:05:58 crc kubenswrapper[4861]: E0309 09:05:58.454822 4861 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.163:6443: connect: connection refused" node="crc" Mar 09 09:05:58 crc kubenswrapper[4861]: I0309 09:05:58.576442 4861 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.163:6443: connect: connection refused Mar 09 09:05:58 crc kubenswrapper[4861]: W0309 09:05:58.618602 4861 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.163:6443: connect: connection refused Mar 09 09:05:58 crc kubenswrapper[4861]: E0309 09:05:58.618720 4861 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.163:6443: connect: connection refused" logger="UnhandledError" Mar 09 09:05:58 crc kubenswrapper[4861]: I0309 09:05:58.663081 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"d98f3cdc0e58b917c48936b4a638dd0e9fd310e6cafb7a520a3b8322988d7b40"} Mar 09 09:05:58 crc kubenswrapper[4861]: I0309 09:05:58.664732 4861 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"ca671806991200e8b07fe4d96b692513fff895a6cdbc2748b7730fd406abd4c3"} Mar 09 09:05:58 crc kubenswrapper[4861]: I0309 09:05:58.666295 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"813d0edc49e9e86f2621360cb27298d07992bce2d60dce3285c10a640f51d0dc"} Mar 09 09:05:58 crc kubenswrapper[4861]: I0309 09:05:58.667417 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"54965be272bf30b189748c4b62de81e3194f7ba8220ae1491d4eb6b00eef480b"} Mar 09 09:05:58 crc kubenswrapper[4861]: I0309 09:05:58.668939 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"aca5c633ef0fe34c99521c56e024ab5e28e1537db3c3ab557ed4138dee568804"} Mar 09 09:05:58 crc kubenswrapper[4861]: W0309 09:05:58.869065 4861 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.163:6443: connect: connection refused Mar 09 09:05:58 crc kubenswrapper[4861]: E0309 09:05:58.869145 4861 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.163:6443: connect: connection refused" logger="UnhandledError" Mar 09 
09:05:58 crc kubenswrapper[4861]: W0309 09:05:58.886864 4861 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.163:6443: connect: connection refused Mar 09 09:05:58 crc kubenswrapper[4861]: E0309 09:05:58.886904 4861 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.163:6443: connect: connection refused" logger="UnhandledError" Mar 09 09:05:58 crc kubenswrapper[4861]: E0309 09:05:58.986445 4861 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.163:6443: connect: connection refused" interval="1.6s" Mar 09 09:05:59 crc kubenswrapper[4861]: W0309 09:05:59.088079 4861 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.163:6443: connect: connection refused Mar 09 09:05:59 crc kubenswrapper[4861]: E0309 09:05:59.088190 4861 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.163:6443: connect: connection refused" logger="UnhandledError" Mar 09 09:05:59 crc kubenswrapper[4861]: I0309 09:05:59.255930 4861 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 09:05:59 crc kubenswrapper[4861]: I0309 
09:05:59.257138 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:05:59 crc kubenswrapper[4861]: I0309 09:05:59.257173 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:05:59 crc kubenswrapper[4861]: I0309 09:05:59.257184 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:05:59 crc kubenswrapper[4861]: I0309 09:05:59.257209 4861 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 09 09:05:59 crc kubenswrapper[4861]: E0309 09:05:59.257579 4861 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.163:6443: connect: connection refused" node="crc" Mar 09 09:05:59 crc kubenswrapper[4861]: I0309 09:05:59.528892 4861 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 09 09:05:59 crc kubenswrapper[4861]: E0309 09:05:59.530501 4861 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.163:6443: connect: connection refused" logger="UnhandledError" Mar 09 09:05:59 crc kubenswrapper[4861]: I0309 09:05:59.576819 4861 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.163:6443: connect: connection refused Mar 09 09:05:59 crc kubenswrapper[4861]: I0309 09:05:59.676004 4861 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" 
containerID="d6e8c068a0692d17703850923c04fd32155ded792305ccb241e97a4e8ec222b5" exitCode=0 Mar 09 09:05:59 crc kubenswrapper[4861]: I0309 09:05:59.676073 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"d6e8c068a0692d17703850923c04fd32155ded792305ccb241e97a4e8ec222b5"} Mar 09 09:05:59 crc kubenswrapper[4861]: I0309 09:05:59.676233 4861 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 09:05:59 crc kubenswrapper[4861]: I0309 09:05:59.678098 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:05:59 crc kubenswrapper[4861]: I0309 09:05:59.678185 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:05:59 crc kubenswrapper[4861]: I0309 09:05:59.678247 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:05:59 crc kubenswrapper[4861]: I0309 09:05:59.680445 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"eb0b88730464e9a4ff7497dce67c06e031b3adecf14a7f0879b76a120a645e6f"} Mar 09 09:05:59 crc kubenswrapper[4861]: I0309 09:05:59.680466 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"6fe2029410db45f17ff225ca929d10c3cadae5e94ecb67ae606b4b61debd62ef"} Mar 09 09:05:59 crc kubenswrapper[4861]: I0309 09:05:59.680479 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"d4c2a2ed50a34aeccc62f360ea17dfb518792c922323e2b402c8d2614525ac6a"} Mar 09 09:05:59 crc kubenswrapper[4861]: I0309 09:05:59.680487 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"35f0274e9da4b30cfc1e2d7706951fd7501832f5c6f5c4e0eb2696b1179cd3ad"} Mar 09 09:05:59 crc kubenswrapper[4861]: I0309 09:05:59.680522 4861 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 09:05:59 crc kubenswrapper[4861]: I0309 09:05:59.683144 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:05:59 crc kubenswrapper[4861]: I0309 09:05:59.683174 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:05:59 crc kubenswrapper[4861]: I0309 09:05:59.683184 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:05:59 crc kubenswrapper[4861]: I0309 09:05:59.686996 4861 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="591835de84a2d61d44c0723a30b71b9fae2661bb7db61e8938a9e3470dba5c25" exitCode=0 Mar 09 09:05:59 crc kubenswrapper[4861]: I0309 09:05:59.687147 4861 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 09:05:59 crc kubenswrapper[4861]: I0309 09:05:59.687157 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"591835de84a2d61d44c0723a30b71b9fae2661bb7db61e8938a9e3470dba5c25"} Mar 09 09:05:59 crc kubenswrapper[4861]: I0309 09:05:59.688784 4861 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:05:59 crc kubenswrapper[4861]: I0309 09:05:59.688836 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:05:59 crc kubenswrapper[4861]: I0309 09:05:59.688855 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:05:59 crc kubenswrapper[4861]: I0309 09:05:59.690609 4861 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 09:05:59 crc kubenswrapper[4861]: I0309 09:05:59.691071 4861 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="62b60ee20085e42a24141efd4944576471b367bc7d63e17e9ab0b39997b230cf" exitCode=0 Mar 09 09:05:59 crc kubenswrapper[4861]: I0309 09:05:59.691204 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"62b60ee20085e42a24141efd4944576471b367bc7d63e17e9ab0b39997b230cf"} Mar 09 09:05:59 crc kubenswrapper[4861]: I0309 09:05:59.691291 4861 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 09:05:59 crc kubenswrapper[4861]: I0309 09:05:59.691954 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:05:59 crc kubenswrapper[4861]: I0309 09:05:59.692022 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:05:59 crc kubenswrapper[4861]: I0309 09:05:59.692044 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:05:59 crc kubenswrapper[4861]: I0309 09:05:59.692992 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:05:59 crc 
kubenswrapper[4861]: I0309 09:05:59.693027 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:05:59 crc kubenswrapper[4861]: I0309 09:05:59.693044 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:05:59 crc kubenswrapper[4861]: I0309 09:05:59.694263 4861 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="b3890d561e9788da50117d5ae37e9bc3decefea7ca10d396f77a05a3b2874c0e" exitCode=0 Mar 09 09:05:59 crc kubenswrapper[4861]: I0309 09:05:59.694313 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"b3890d561e9788da50117d5ae37e9bc3decefea7ca10d396f77a05a3b2874c0e"} Mar 09 09:05:59 crc kubenswrapper[4861]: I0309 09:05:59.694454 4861 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 09:05:59 crc kubenswrapper[4861]: I0309 09:05:59.695485 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:05:59 crc kubenswrapper[4861]: I0309 09:05:59.695638 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:05:59 crc kubenswrapper[4861]: I0309 09:05:59.695781 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:06:00 crc kubenswrapper[4861]: I0309 09:06:00.575979 4861 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.163:6443: connect: connection refused Mar 09 09:06:00 crc kubenswrapper[4861]: E0309 09:06:00.587753 4861 controller.go:145] "Failed to ensure lease 
exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.163:6443: connect: connection refused" interval="3.2s" Mar 09 09:06:00 crc kubenswrapper[4861]: I0309 09:06:00.698693 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"9a138696f23970dc61461f70321b50c12dd25a9695ec721ccc37f3ae03b2faa3"} Mar 09 09:06:00 crc kubenswrapper[4861]: I0309 09:06:00.698724 4861 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 09:06:00 crc kubenswrapper[4861]: I0309 09:06:00.699630 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:06:00 crc kubenswrapper[4861]: I0309 09:06:00.699658 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:06:00 crc kubenswrapper[4861]: I0309 09:06:00.699669 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:06:00 crc kubenswrapper[4861]: I0309 09:06:00.702261 4861 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 09:06:00 crc kubenswrapper[4861]: I0309 09:06:00.702251 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"99b130a4907f6009d13971dc8a15bb15123ba30affdaf385c024d41fdd04b4a8"} Mar 09 09:06:00 crc kubenswrapper[4861]: I0309 09:06:00.702334 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"613468c4726b8267b18babb4c3aca51c340adbff1d4e02416b18dc23eef78385"} Mar 09 09:06:00 crc kubenswrapper[4861]: I0309 09:06:00.702350 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"440f2c7481c3aacd9330ccb0ac1d0af30e1051ffdcfbcab02d791c8727fc38a6"} Mar 09 09:06:00 crc kubenswrapper[4861]: I0309 09:06:00.703886 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:06:00 crc kubenswrapper[4861]: I0309 09:06:00.703921 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:06:00 crc kubenswrapper[4861]: I0309 09:06:00.703944 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:06:00 crc kubenswrapper[4861]: I0309 09:06:00.710557 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"7c14650b94b6bde491a2375ca5766d14fc93947f5b993395965a00a13a953489"} Mar 09 09:06:00 crc kubenswrapper[4861]: I0309 09:06:00.710610 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"dadb8247e79ce5a6d77850c3b946b50622b8aa99369385e0d303b7e232e97b54"} Mar 09 09:06:00 crc kubenswrapper[4861]: I0309 09:06:00.710620 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"c2f4b30f196b7900b7eed61144f32f48056374dc7e92c28cd5bffbf774945702"} Mar 09 09:06:00 crc kubenswrapper[4861]: I0309 
09:06:00.710630 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"7b02ebcfd63db720474d3acbc618d87e9b419be0847a9f96152b5037ca454a36"} Mar 09 09:06:00 crc kubenswrapper[4861]: I0309 09:06:00.714922 4861 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="169ff25d96d64c2d22eed6f6699f18b06eac5cd3e900edfbf80ea5983a8faca3" exitCode=0 Mar 09 09:06:00 crc kubenswrapper[4861]: I0309 09:06:00.715080 4861 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 09:06:00 crc kubenswrapper[4861]: I0309 09:06:00.715659 4861 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 09:06:00 crc kubenswrapper[4861]: I0309 09:06:00.715940 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"169ff25d96d64c2d22eed6f6699f18b06eac5cd3e900edfbf80ea5983a8faca3"} Mar 09 09:06:00 crc kubenswrapper[4861]: I0309 09:06:00.718004 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:06:00 crc kubenswrapper[4861]: I0309 09:06:00.718035 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:06:00 crc kubenswrapper[4861]: I0309 09:06:00.718043 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:06:00 crc kubenswrapper[4861]: I0309 09:06:00.718040 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:06:00 crc kubenswrapper[4861]: I0309 09:06:00.718086 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 09 09:06:00 crc kubenswrapper[4861]: I0309 09:06:00.718097 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:06:00 crc kubenswrapper[4861]: W0309 09:06:00.849800 4861 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.163:6443: connect: connection refused Mar 09 09:06:00 crc kubenswrapper[4861]: E0309 09:06:00.849900 4861 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.163:6443: connect: connection refused" logger="UnhandledError" Mar 09 09:06:00 crc kubenswrapper[4861]: I0309 09:06:00.857847 4861 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 09:06:00 crc kubenswrapper[4861]: I0309 09:06:00.859231 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:06:00 crc kubenswrapper[4861]: I0309 09:06:00.859278 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:06:00 crc kubenswrapper[4861]: I0309 09:06:00.859293 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:06:00 crc kubenswrapper[4861]: I0309 09:06:00.859328 4861 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 09 09:06:00 crc kubenswrapper[4861]: E0309 09:06:00.860218 4861 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.163:6443: 
connect: connection refused" node="crc" Mar 09 09:06:01 crc kubenswrapper[4861]: W0309 09:06:01.076846 4861 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.163:6443: connect: connection refused Mar 09 09:06:01 crc kubenswrapper[4861]: E0309 09:06:01.077003 4861 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.163:6443: connect: connection refused" logger="UnhandledError" Mar 09 09:06:01 crc kubenswrapper[4861]: I0309 09:06:01.722945 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"533ae5eafe96db0e7edccbd70fd11d35a762d41ae79fd6e9b4afef454dd6b35b"} Mar 09 09:06:01 crc kubenswrapper[4861]: I0309 09:06:01.723143 4861 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 09:06:01 crc kubenswrapper[4861]: I0309 09:06:01.724360 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:06:01 crc kubenswrapper[4861]: I0309 09:06:01.724482 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:06:01 crc kubenswrapper[4861]: I0309 09:06:01.724510 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:06:01 crc kubenswrapper[4861]: I0309 09:06:01.726130 4861 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" 
containerID="5e705609a1566adbe198dbc71d84d557beefb485f1ea06918432c0eb0b2197c7" exitCode=0 Mar 09 09:06:01 crc kubenswrapper[4861]: I0309 09:06:01.726231 4861 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 09:06:01 crc kubenswrapper[4861]: I0309 09:06:01.726234 4861 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 09:06:01 crc kubenswrapper[4861]: I0309 09:06:01.726315 4861 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 09 09:06:01 crc kubenswrapper[4861]: I0309 09:06:01.726231 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"5e705609a1566adbe198dbc71d84d557beefb485f1ea06918432c0eb0b2197c7"} Mar 09 09:06:01 crc kubenswrapper[4861]: I0309 09:06:01.726426 4861 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 09:06:01 crc kubenswrapper[4861]: I0309 09:06:01.727244 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:06:01 crc kubenswrapper[4861]: I0309 09:06:01.727281 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:06:01 crc kubenswrapper[4861]: I0309 09:06:01.727292 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:06:01 crc kubenswrapper[4861]: I0309 09:06:01.727338 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:06:01 crc kubenswrapper[4861]: I0309 09:06:01.727397 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:06:01 crc kubenswrapper[4861]: I0309 09:06:01.727415 4861 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:06:01 crc kubenswrapper[4861]: I0309 09:06:01.728934 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:06:01 crc kubenswrapper[4861]: I0309 09:06:01.728960 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:06:01 crc kubenswrapper[4861]: I0309 09:06:01.728970 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:06:02 crc kubenswrapper[4861]: I0309 09:06:02.733492 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"668c2f4c76d7be7df3efca555255da9608459a878aed050de5ec96ee3daae42a"} Mar 09 09:06:02 crc kubenswrapper[4861]: I0309 09:06:02.733573 4861 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 09:06:02 crc kubenswrapper[4861]: I0309 09:06:02.733579 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"a7ee6b5b9f73d927b6b15dd256a588775756aa8554739197ce7caa65a55a047f"} Mar 09 09:06:02 crc kubenswrapper[4861]: I0309 09:06:02.733602 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"45da08871eb398781e6704dd9f0c7df525a4b6dbfff113df11914c6e7cfe98ce"} Mar 09 09:06:02 crc kubenswrapper[4861]: I0309 09:06:02.733708 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 09:06:02 crc kubenswrapper[4861]: I0309 09:06:02.734504 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:06:02 
crc kubenswrapper[4861]: I0309 09:06:02.734548 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:06:02 crc kubenswrapper[4861]: I0309 09:06:02.734563 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:06:03 crc kubenswrapper[4861]: I0309 09:06:03.404623 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 09:06:03 crc kubenswrapper[4861]: I0309 09:06:03.604124 4861 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 09 09:06:03 crc kubenswrapper[4861]: I0309 09:06:03.742158 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"a684c677e19273406060d2601149eb214cb3684f22434087708a8d327ef63078"} Mar 09 09:06:03 crc kubenswrapper[4861]: I0309 09:06:03.742235 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"a7094414740b00d1bb493cf296c69fc16d1cdc1a17c7442c541a3745bf67ca01"} Mar 09 09:06:03 crc kubenswrapper[4861]: I0309 09:06:03.742245 4861 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 09:06:03 crc kubenswrapper[4861]: I0309 09:06:03.742312 4861 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 09:06:03 crc kubenswrapper[4861]: I0309 09:06:03.743935 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:06:03 crc kubenswrapper[4861]: I0309 09:06:03.743983 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:06:03 crc kubenswrapper[4861]: I0309 
09:06:03.744006 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:06:03 crc kubenswrapper[4861]: I0309 09:06:03.744454 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:06:03 crc kubenswrapper[4861]: I0309 09:06:03.744515 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:06:03 crc kubenswrapper[4861]: I0309 09:06:03.744533 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:06:04 crc kubenswrapper[4861]: I0309 09:06:04.061107 4861 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 09:06:04 crc kubenswrapper[4861]: I0309 09:06:04.063077 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:06:04 crc kubenswrapper[4861]: I0309 09:06:04.063279 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:06:04 crc kubenswrapper[4861]: I0309 09:06:04.063323 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:06:04 crc kubenswrapper[4861]: I0309 09:06:04.063455 4861 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 09 09:06:04 crc kubenswrapper[4861]: I0309 09:06:04.744541 4861 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 09:06:04 crc kubenswrapper[4861]: I0309 09:06:04.745539 4861 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 09:06:04 crc kubenswrapper[4861]: I0309 09:06:04.745881 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:06:04 crc 
kubenswrapper[4861]: I0309 09:06:04.745931 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:06:04 crc kubenswrapper[4861]: I0309 09:06:04.745949 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:06:04 crc kubenswrapper[4861]: I0309 09:06:04.747641 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:06:04 crc kubenswrapper[4861]: I0309 09:06:04.747705 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:06:04 crc kubenswrapper[4861]: I0309 09:06:04.747724 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:06:04 crc kubenswrapper[4861]: I0309 09:06:04.967548 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 09:06:05 crc kubenswrapper[4861]: I0309 09:06:05.728411 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 09 09:06:05 crc kubenswrapper[4861]: I0309 09:06:05.728700 4861 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 09:06:05 crc kubenswrapper[4861]: I0309 09:06:05.730234 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:06:05 crc kubenswrapper[4861]: I0309 09:06:05.730298 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:06:05 crc kubenswrapper[4861]: I0309 09:06:05.730321 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:06:05 crc kubenswrapper[4861]: I0309 09:06:05.747286 4861 kubelet_node_status.go:401] 
"Setting node annotation to enable volume controller attach/detach" Mar 09 09:06:05 crc kubenswrapper[4861]: I0309 09:06:05.748770 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:06:05 crc kubenswrapper[4861]: I0309 09:06:05.748806 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:06:05 crc kubenswrapper[4861]: I0309 09:06:05.748817 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:06:06 crc kubenswrapper[4861]: I0309 09:06:06.109197 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 09:06:06 crc kubenswrapper[4861]: I0309 09:06:06.109485 4861 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 09:06:06 crc kubenswrapper[4861]: I0309 09:06:06.111096 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:06:06 crc kubenswrapper[4861]: I0309 09:06:06.111149 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:06:06 crc kubenswrapper[4861]: I0309 09:06:06.111168 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:06:06 crc kubenswrapper[4861]: I0309 09:06:06.445634 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 09:06:06 crc kubenswrapper[4861]: I0309 09:06:06.452636 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 09:06:06 crc kubenswrapper[4861]: I0309 09:06:06.461978 4861 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-etcd/etcd-crc" Mar 09 09:06:06 crc kubenswrapper[4861]: I0309 09:06:06.462256 4861 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 09:06:06 crc kubenswrapper[4861]: I0309 09:06:06.464062 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:06:06 crc kubenswrapper[4861]: I0309 09:06:06.464118 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:06:06 crc kubenswrapper[4861]: I0309 09:06:06.464133 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:06:06 crc kubenswrapper[4861]: I0309 09:06:06.749712 4861 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 09:06:06 crc kubenswrapper[4861]: I0309 09:06:06.749922 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 09:06:06 crc kubenswrapper[4861]: I0309 09:06:06.751153 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:06:06 crc kubenswrapper[4861]: I0309 09:06:06.751195 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:06:06 crc kubenswrapper[4861]: I0309 09:06:06.751215 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:06:07 crc kubenswrapper[4861]: I0309 09:06:07.250746 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 09:06:07 crc kubenswrapper[4861]: I0309 09:06:07.751521 4861 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 09:06:07 
crc kubenswrapper[4861]: I0309 09:06:07.753069 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:06:07 crc kubenswrapper[4861]: I0309 09:06:07.753165 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:06:07 crc kubenswrapper[4861]: I0309 09:06:07.753197 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:06:07 crc kubenswrapper[4861]: E0309 09:06:07.756302 4861 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 09 09:06:08 crc kubenswrapper[4861]: I0309 09:06:08.754872 4861 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 09:06:08 crc kubenswrapper[4861]: I0309 09:06:08.755967 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:06:08 crc kubenswrapper[4861]: I0309 09:06:08.756002 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:06:08 crc kubenswrapper[4861]: I0309 09:06:08.756013 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:06:08 crc kubenswrapper[4861]: I0309 09:06:08.761694 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 09:06:09 crc kubenswrapper[4861]: I0309 09:06:09.052285 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Mar 09 09:06:09 crc kubenswrapper[4861]: I0309 09:06:09.052602 4861 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 09:06:09 crc kubenswrapper[4861]: I0309 09:06:09.054145 4861 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:06:09 crc kubenswrapper[4861]: I0309 09:06:09.054179 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:06:09 crc kubenswrapper[4861]: I0309 09:06:09.054188 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:06:09 crc kubenswrapper[4861]: I0309 09:06:09.109179 4861 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 09:06:09 crc kubenswrapper[4861]: I0309 09:06:09.109251 4861 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 09:06:09 crc kubenswrapper[4861]: I0309 09:06:09.759225 4861 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 09:06:09 crc kubenswrapper[4861]: I0309 09:06:09.760923 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:06:09 crc kubenswrapper[4861]: I0309 09:06:09.761009 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:06:09 crc kubenswrapper[4861]: I0309 09:06:09.761028 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:06:10 crc kubenswrapper[4861]: I0309 
09:06:10.504216 4861 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Mar 09 09:06:10 crc kubenswrapper[4861]: I0309 09:06:10.504346 4861 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Mar 09 09:06:11 crc kubenswrapper[4861]: W0309 09:06:11.359839 4861 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout Mar 09 09:06:11 crc kubenswrapper[4861]: I0309 09:06:11.359940 4861 trace.go:236] Trace[1560156380]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (09-Mar-2026 09:06:01.358) (total time: 10001ms): Mar 09 09:06:11 crc kubenswrapper[4861]: Trace[1560156380]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (09:06:11.359) Mar 09 09:06:11 crc kubenswrapper[4861]: Trace[1560156380]: [10.001861818s] [10.001861818s] END Mar 09 09:06:11 crc kubenswrapper[4861]: E0309 09:06:11.359966 4861 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Mar 09 09:06:11 crc kubenswrapper[4861]: I0309 09:06:11.576732 
4861 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Mar 09 09:06:11 crc kubenswrapper[4861]: W0309 09:06:11.827286 4861 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout Mar 09 09:06:11 crc kubenswrapper[4861]: I0309 09:06:11.827382 4861 trace.go:236] Trace[522198066]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (09-Mar-2026 09:06:01.826) (total time: 10001ms): Mar 09 09:06:11 crc kubenswrapper[4861]: Trace[522198066]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (09:06:11.827) Mar 09 09:06:11 crc kubenswrapper[4861]: Trace[522198066]: [10.001088873s] [10.001088873s] END Mar 09 09:06:11 crc kubenswrapper[4861]: E0309 09:06:11.827405 4861 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Mar 09 09:06:12 crc kubenswrapper[4861]: E0309 09:06:12.319871 4861 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:06:12Z is after 2026-02-23T05:33:13Z" 
logger="UnhandledError" Mar 09 09:06:12 crc kubenswrapper[4861]: E0309 09:06:12.322493 4861 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:06:12Z is after 2026-02-23T05:33:13Z" interval="6.4s" Mar 09 09:06:12 crc kubenswrapper[4861]: W0309 09:06:12.327225 4861 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:06:12Z is after 2026-02-23T05:33:13Z Mar 09 09:06:12 crc kubenswrapper[4861]: E0309 09:06:12.327288 4861 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:06:12Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 09 09:06:12 crc kubenswrapper[4861]: E0309 09:06:12.329758 4861 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:06:12Z is after 2026-02-23T05:33:13Z" node="crc" Mar 09 09:06:12 crc kubenswrapper[4861]: E0309 09:06:12.330930 4861 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-09T09:06:12Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189b20feed1bd557 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:05:57.572228439 +0000 UTC m=+0.657267840,LastTimestamp:2026-03-09 09:05:57.572228439 +0000 UTC m=+0.657267840,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:06:12 crc kubenswrapper[4861]: W0309 09:06:12.334082 4861 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:06:12Z is after 2026-02-23T05:33:13Z Mar 09 09:06:12 crc kubenswrapper[4861]: E0309 09:06:12.334131 4861 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:06:12Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 09 09:06:12 crc kubenswrapper[4861]: I0309 09:06:12.339813 4861 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User 
\"system:anonymous\" cannot get path \"/livez\": RBAC: [clusterrole.rbac.authorization.k8s.io \"system:openshift:public-info-viewer\" not found, clusterrole.rbac.authorization.k8s.io \"system:public-info-viewer\" not found]","reason":"Forbidden","details":{},"code":403} Mar 09 09:06:12 crc kubenswrapper[4861]: I0309 09:06:12.339877 4861 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 09 09:06:12 crc kubenswrapper[4861]: I0309 09:06:12.346756 4861 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\": RBAC: [clusterrole.rbac.authorization.k8s.io \"system:openshift:public-info-viewer\" not found, clusterrole.rbac.authorization.k8s.io \"system:public-info-viewer\" not found]","reason":"Forbidden","details":{},"code":403} Mar 09 09:06:12 crc kubenswrapper[4861]: I0309 09:06:12.346846 4861 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 09 09:06:12 crc kubenswrapper[4861]: I0309 09:06:12.579735 4861 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:06:12Z is after 2026-02-23T05:33:13Z Mar 09 09:06:12 crc kubenswrapper[4861]: I0309 09:06:12.767689 4861 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 09 09:06:12 crc kubenswrapper[4861]: I0309 09:06:12.770148 4861 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="533ae5eafe96db0e7edccbd70fd11d35a762d41ae79fd6e9b4afef454dd6b35b" exitCode=255 Mar 09 09:06:12 crc kubenswrapper[4861]: I0309 09:06:12.770204 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"533ae5eafe96db0e7edccbd70fd11d35a762d41ae79fd6e9b4afef454dd6b35b"} Mar 09 09:06:12 crc kubenswrapper[4861]: I0309 09:06:12.770488 4861 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 09:06:12 crc kubenswrapper[4861]: I0309 09:06:12.771683 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:06:12 crc kubenswrapper[4861]: I0309 09:06:12.771737 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:06:12 crc kubenswrapper[4861]: I0309 09:06:12.771754 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:06:12 crc kubenswrapper[4861]: I0309 09:06:12.772555 4861 scope.go:117] "RemoveContainer" containerID="533ae5eafe96db0e7edccbd70fd11d35a762d41ae79fd6e9b4afef454dd6b35b" Mar 09 09:06:13 crc kubenswrapper[4861]: I0309 09:06:13.579573 4861 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:06:13Z is after 2026-02-23T05:33:13Z 
Mar 09 09:06:13 crc kubenswrapper[4861]: I0309 09:06:13.776037 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 09 09:06:13 crc kubenswrapper[4861]: I0309 09:06:13.778541 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"6ce63f21c27e9fa2c6a001f46e36f979b50a113b885f4a79ab95dffa1eab11c6"} Mar 09 09:06:13 crc kubenswrapper[4861]: I0309 09:06:13.778770 4861 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 09:06:13 crc kubenswrapper[4861]: I0309 09:06:13.780084 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:06:13 crc kubenswrapper[4861]: I0309 09:06:13.780131 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:06:13 crc kubenswrapper[4861]: I0309 09:06:13.780155 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:06:14 crc kubenswrapper[4861]: I0309 09:06:14.580773 4861 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:06:14Z is after 2026-02-23T05:33:13Z Mar 09 09:06:14 crc kubenswrapper[4861]: I0309 09:06:14.783062 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 09 09:06:14 crc kubenswrapper[4861]: I0309 09:06:14.783810 4861 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 09 09:06:14 crc kubenswrapper[4861]: I0309 09:06:14.786143 4861 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="6ce63f21c27e9fa2c6a001f46e36f979b50a113b885f4a79ab95dffa1eab11c6" exitCode=255 Mar 09 09:06:14 crc kubenswrapper[4861]: I0309 09:06:14.786206 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"6ce63f21c27e9fa2c6a001f46e36f979b50a113b885f4a79ab95dffa1eab11c6"} Mar 09 09:06:14 crc kubenswrapper[4861]: I0309 09:06:14.786260 4861 scope.go:117] "RemoveContainer" containerID="533ae5eafe96db0e7edccbd70fd11d35a762d41ae79fd6e9b4afef454dd6b35b" Mar 09 09:06:14 crc kubenswrapper[4861]: I0309 09:06:14.786402 4861 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 09:06:14 crc kubenswrapper[4861]: I0309 09:06:14.787545 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:06:14 crc kubenswrapper[4861]: I0309 09:06:14.787615 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:06:14 crc kubenswrapper[4861]: I0309 09:06:14.787633 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:06:14 crc kubenswrapper[4861]: I0309 09:06:14.788630 4861 scope.go:117] "RemoveContainer" containerID="6ce63f21c27e9fa2c6a001f46e36f979b50a113b885f4a79ab95dffa1eab11c6" Mar 09 09:06:14 crc kubenswrapper[4861]: E0309 09:06:14.788957 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 09 09:06:14 crc kubenswrapper[4861]: I0309 09:06:14.976837 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 09 09:06:15 crc kubenswrapper[4861]: I0309 09:06:15.581799 4861 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:06:15Z is after 2026-02-23T05:33:13Z
Mar 09 09:06:15 crc kubenswrapper[4861]: I0309 09:06:15.791905 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log"
Mar 09 09:06:15 crc kubenswrapper[4861]: I0309 09:06:15.794828 4861 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 09:06:15 crc kubenswrapper[4861]: I0309 09:06:15.796942 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 09:06:15 crc kubenswrapper[4861]: I0309 09:06:15.797001 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 09:06:15 crc kubenswrapper[4861]: I0309 09:06:15.797018 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 09:06:15 crc kubenswrapper[4861]: I0309 09:06:15.797897 4861 scope.go:117] "RemoveContainer" containerID="6ce63f21c27e9fa2c6a001f46e36f979b50a113b885f4a79ab95dffa1eab11c6"
Mar 09 09:06:15 crc kubenswrapper[4861]: E0309 09:06:15.798173 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 09 09:06:15 crc kubenswrapper[4861]: I0309 09:06:15.801757 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 09 09:06:16 crc kubenswrapper[4861]: I0309 09:06:16.580584 4861 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:06:16Z is after 2026-02-23T05:33:13Z
Mar 09 09:06:16 crc kubenswrapper[4861]: W0309 09:06:16.738625 4861 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:06:16Z is after 2026-02-23T05:33:13Z
Mar 09 09:06:16 crc kubenswrapper[4861]: E0309 09:06:16.738733 4861 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:06:16Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 09 09:06:16 crc kubenswrapper[4861]: I0309 09:06:16.798256 4861 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 09:06:16 crc kubenswrapper[4861]: I0309 09:06:16.802111 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 09:06:16 crc kubenswrapper[4861]: I0309 09:06:16.802220 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 09:06:16 crc kubenswrapper[4861]: I0309 09:06:16.802806 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 09:06:16 crc kubenswrapper[4861]: I0309 09:06:16.803882 4861 scope.go:117] "RemoveContainer" containerID="6ce63f21c27e9fa2c6a001f46e36f979b50a113b885f4a79ab95dffa1eab11c6"
Mar 09 09:06:16 crc kubenswrapper[4861]: E0309 09:06:16.804183 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 09 09:06:17 crc kubenswrapper[4861]: W0309 09:06:17.327619 4861 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:06:17Z is after 2026-02-23T05:33:13Z
Mar 09 09:06:17 crc kubenswrapper[4861]: E0309 09:06:17.327754 4861 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:06:17Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 09 09:06:17 crc kubenswrapper[4861]: I0309 09:06:17.580024 4861 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:06:17Z is after 2026-02-23T05:33:13Z
Mar 09 09:06:17 crc kubenswrapper[4861]: I0309 09:06:17.638775 4861 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 09 09:06:17 crc kubenswrapper[4861]: E0309 09:06:17.756596 4861 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 09 09:06:17 crc kubenswrapper[4861]: I0309 09:06:17.801359 4861 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 09:06:17 crc kubenswrapper[4861]: I0309 09:06:17.803045 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 09:06:17 crc kubenswrapper[4861]: I0309 09:06:17.803109 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 09:06:17 crc kubenswrapper[4861]: I0309 09:06:17.803123 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 09:06:17 crc kubenswrapper[4861]: I0309 09:06:17.803989 4861 scope.go:117] "RemoveContainer" containerID="6ce63f21c27e9fa2c6a001f46e36f979b50a113b885f4a79ab95dffa1eab11c6"
Mar 09 09:06:17 crc kubenswrapper[4861]: E0309 09:06:17.804286 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 09 09:06:18 crc kubenswrapper[4861]: I0309 09:06:18.580960 4861 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:06:18Z is after 2026-02-23T05:33:13Z
Mar 09 09:06:18 crc kubenswrapper[4861]: E0309 09:06:18.728903 4861 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:06:18Z is after 2026-02-23T05:33:13Z" interval="7s"
Mar 09 09:06:18 crc kubenswrapper[4861]: I0309 09:06:18.729933 4861 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 09:06:18 crc kubenswrapper[4861]: I0309 09:06:18.731577 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 09:06:18 crc kubenswrapper[4861]: I0309 09:06:18.731648 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 09:06:18 crc kubenswrapper[4861]: I0309 09:06:18.731672 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 09:06:18 crc kubenswrapper[4861]: I0309 09:06:18.731714 4861 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 09 09:06:18 crc kubenswrapper[4861]: E0309 09:06:18.736980 4861 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:06:18Z is after 2026-02-23T05:33:13Z" node="crc"
Mar 09 09:06:19 crc kubenswrapper[4861]: I0309 09:06:19.085895 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc"
Mar 09 09:06:19 crc kubenswrapper[4861]: I0309 09:06:19.086218 4861 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 09:06:19 crc kubenswrapper[4861]: I0309 09:06:19.087995 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 09:06:19 crc kubenswrapper[4861]: I0309 09:06:19.088069 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 09:06:19 crc kubenswrapper[4861]: I0309 09:06:19.088094 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 09:06:19 crc kubenswrapper[4861]: I0309 09:06:19.105497 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc"
Mar 09 09:06:19 crc kubenswrapper[4861]: I0309 09:06:19.111273 4861 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 09 09:06:19 crc kubenswrapper[4861]: I0309 09:06:19.111402 4861 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 09 09:06:19 crc kubenswrapper[4861]: I0309 09:06:19.580272 4861 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:06:19Z is after 2026-02-23T05:33:13Z
Mar 09 09:06:19 crc kubenswrapper[4861]: I0309 09:06:19.807047 4861 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 09:06:19 crc kubenswrapper[4861]: I0309 09:06:19.808299 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 09:06:19 crc kubenswrapper[4861]: I0309 09:06:19.808419 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 09:06:19 crc kubenswrapper[4861]: I0309 09:06:19.808453 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 09:06:20 crc kubenswrapper[4861]: I0309 09:06:20.503341 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 09 09:06:20 crc kubenswrapper[4861]: I0309 09:06:20.503595 4861 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 09:06:20 crc kubenswrapper[4861]: I0309 09:06:20.505264 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 09:06:20 crc kubenswrapper[4861]: I0309 09:06:20.505449 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 09:06:20 crc kubenswrapper[4861]: I0309 09:06:20.505472 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 09:06:20 crc kubenswrapper[4861]: I0309 09:06:20.507205 4861 scope.go:117] "RemoveContainer" containerID="6ce63f21c27e9fa2c6a001f46e36f979b50a113b885f4a79ab95dffa1eab11c6"
Mar 09 09:06:20 crc kubenswrapper[4861]: E0309 09:06:20.508026 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 09 09:06:20 crc kubenswrapper[4861]: I0309 09:06:20.579849 4861 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:06:20Z is after 2026-02-23T05:33:13Z
Mar 09 09:06:21 crc kubenswrapper[4861]: I0309 09:06:21.058090 4861 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 09 09:06:21 crc kubenswrapper[4861]: E0309 09:06:21.064004 4861 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:06:21Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 09 09:06:21 crc kubenswrapper[4861]: I0309 09:06:21.579341 4861 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:06:21Z is after 2026-02-23T05:33:13Z
Mar 09 09:06:22 crc kubenswrapper[4861]: W0309 09:06:22.126599 4861 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:06:22Z is after 2026-02-23T05:33:13Z
Mar 09 09:06:22 crc kubenswrapper[4861]: E0309 09:06:22.126736 4861 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:06:22Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 09 09:06:22 crc kubenswrapper[4861]: E0309 09:06:22.337594 4861 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:06:22Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189b20feed1bd557 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:05:57.572228439 +0000 UTC m=+0.657267840,LastTimestamp:2026-03-09 09:05:57.572228439 +0000 UTC m=+0.657267840,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 09 09:06:22 crc kubenswrapper[4861]: I0309 09:06:22.580351 4861 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:06:22Z is after 2026-02-23T05:33:13Z
Mar 09 09:06:23 crc kubenswrapper[4861]: I0309 09:06:23.580886 4861 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:06:23Z is after 2026-02-23T05:33:13Z
Mar 09 09:06:23 crc kubenswrapper[4861]: W0309 09:06:23.640589 4861 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:06:23Z is after 2026-02-23T05:33:13Z
Mar 09 09:06:23 crc kubenswrapper[4861]: E0309 09:06:23.640739 4861 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:06:23Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 09 09:06:23 crc kubenswrapper[4861]: W0309 09:06:23.904515 4861 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:06:23Z is after 2026-02-23T05:33:13Z
Mar 09 09:06:23 crc kubenswrapper[4861]: E0309 09:06:23.904642 4861 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:06:23Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 09 09:06:24 crc kubenswrapper[4861]: I0309 09:06:24.581026 4861 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:06:24Z is after 2026-02-23T05:33:13Z
Mar 09 09:06:25 crc kubenswrapper[4861]: I0309 09:06:25.580706 4861 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:06:25Z is after 2026-02-23T05:33:13Z
Mar 09 09:06:25 crc kubenswrapper[4861]: E0309 09:06:25.733033 4861 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:06:25Z is after 2026-02-23T05:33:13Z" interval="7s"
Mar 09 09:06:25 crc kubenswrapper[4861]: I0309 09:06:25.737466 4861 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 09:06:25 crc kubenswrapper[4861]: I0309 09:06:25.739040 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 09:06:25 crc kubenswrapper[4861]: I0309 09:06:25.739080 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 09:06:25 crc kubenswrapper[4861]: I0309 09:06:25.739093 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 09:06:25 crc kubenswrapper[4861]: I0309 09:06:25.739116 4861 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 09 09:06:25 crc kubenswrapper[4861]: E0309 09:06:25.744885 4861 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:06:25Z is after 2026-02-23T05:33:13Z" node="crc"
Mar 09 09:06:26 crc kubenswrapper[4861]: I0309 09:06:26.582073 4861 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:06:26Z is after 2026-02-23T05:33:13Z
Mar 09 09:06:27 crc kubenswrapper[4861]: I0309 09:06:27.579254 4861 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:06:27Z is after 2026-02-23T05:33:13Z
Mar 09 09:06:27 crc kubenswrapper[4861]: W0309 09:06:27.627931 4861 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:06:27Z is after 2026-02-23T05:33:13Z
Mar 09 09:06:27 crc kubenswrapper[4861]: E0309 09:06:27.628117 4861 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:06:27Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 09 09:06:27 crc kubenswrapper[4861]: E0309 09:06:27.756909 4861 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 09 09:06:28 crc kubenswrapper[4861]: I0309 09:06:28.580164 4861 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:06:28Z is after 2026-02-23T05:33:13Z
Mar 09 09:06:29 crc kubenswrapper[4861]: I0309 09:06:29.109886 4861 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 09 09:06:29 crc kubenswrapper[4861]: I0309 09:06:29.109997 4861 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 09 09:06:29 crc kubenswrapper[4861]: I0309 09:06:29.110137 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 09 09:06:29 crc kubenswrapper[4861]: I0309 09:06:29.110337 4861 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 09:06:29 crc kubenswrapper[4861]: I0309 09:06:29.111771 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 09:06:29 crc kubenswrapper[4861]: I0309 09:06:29.111834 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 09:06:29 crc kubenswrapper[4861]: I0309 09:06:29.111856 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 09:06:29 crc kubenswrapper[4861]: I0309 09:06:29.112558 4861 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"d4c2a2ed50a34aeccc62f360ea17dfb518792c922323e2b402c8d2614525ac6a"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted"
Mar 09 09:06:29 crc kubenswrapper[4861]: I0309 09:06:29.112800 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://d4c2a2ed50a34aeccc62f360ea17dfb518792c922323e2b402c8d2614525ac6a" gracePeriod=30
Mar 09 09:06:29 crc kubenswrapper[4861]: I0309 09:06:29.581108 4861 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:06:29Z is after 2026-02-23T05:33:13Z
Mar 09 09:06:29 crc kubenswrapper[4861]: I0309 09:06:29.839900 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log"
Mar 09 09:06:29 crc kubenswrapper[4861]: I0309 09:06:29.840638 4861 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="d4c2a2ed50a34aeccc62f360ea17dfb518792c922323e2b402c8d2614525ac6a" exitCode=255
Mar 09 09:06:29 crc kubenswrapper[4861]: I0309 09:06:29.840667 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"d4c2a2ed50a34aeccc62f360ea17dfb518792c922323e2b402c8d2614525ac6a"}
Mar 09 09:06:29 crc kubenswrapper[4861]: I0309 09:06:29.840776 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"b6a7082e0993fb9713dbe26f493b1affc4960e6cc5136891cdf2f7de317572c6"}
Mar 09 09:06:29 crc kubenswrapper[4861]: I0309 09:06:29.840942 4861 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 09:06:29 crc kubenswrapper[4861]: I0309 09:06:29.842258 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 09:06:29 crc kubenswrapper[4861]: I0309 09:06:29.842309 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 09:06:29 crc kubenswrapper[4861]: I0309 09:06:29.842328 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 09:06:30 crc kubenswrapper[4861]: I0309 09:06:30.583339 4861 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 09 09:06:31 crc kubenswrapper[4861]: I0309 09:06:31.581964 4861 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 09 09:06:32 crc kubenswrapper[4861]: E0309 09:06:32.345887 4861 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b20feed1bd557 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:05:57.572228439 +0000 UTC m=+0.657267840,LastTimestamp:2026-03-09 09:05:57.572228439 +0000 UTC m=+0.657267840,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 09 09:06:32 crc kubenswrapper[4861]: E0309 09:06:32.352981 4861 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b20fef1dab233 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:05:57.651845683 +0000 UTC m=+0.736885104,LastTimestamp:2026-03-09 09:05:57.651845683 +0000 UTC m=+0.736885104,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 09 09:06:32 crc kubenswrapper[4861]: E0309 09:06:32.357786 4861 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b20fef1db13d2 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:05:57.651870674 +0000 UTC m=+0.736910095,LastTimestamp:2026-03-09 09:05:57.651870674 +0000 UTC m=+0.736910095,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 09 09:06:32 crc kubenswrapper[4861]: E0309 09:06:32.361944 4861 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b20fef1db4dc0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:05:57.651885504 +0000 UTC m=+0.736924915,LastTimestamp:2026-03-09 09:05:57.651885504 +0000 UTC m=+0.736924915,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 09 09:06:32 crc kubenswrapper[4861]: E0309 09:06:32.367970 4861 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b20fef7ab5c36 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:05:57.749406774 +0000 UTC m=+0.834446175,LastTimestamp:2026-03-09 09:05:57.749406774 +0000 UTC m=+0.834446175,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 09 09:06:32 crc kubenswrapper[4861]: E0309 09:06:32.375014 4861 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b20fef1dab233\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b20fef1dab233 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:05:57.651845683 +0000 UTC m=+0.736885104,LastTimestamp:2026-03-09 09:05:57.759051856 +0000 UTC m=+0.844091257,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 09 09:06:32 crc kubenswrapper[4861]: E0309 09:06:32.382911 4861 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b20fef1db13d2\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b20fef1db13d2 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:05:57.651870674 +0000 UTC m=+0.736910095,LastTimestamp:2026-03-09 09:05:57.759074526 +0000 UTC m=+0.844113927,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 09 09:06:32 crc kubenswrapper[4861]: E0309 09:06:32.391093 4861 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b20fef1db4dc0\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b20fef1db4dc0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:05:57.651885504 +0000 UTC m=+0.736924915,LastTimestamp:2026-03-09 09:05:57.759084167 +0000 UTC m=+0.844123568,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 09 09:06:32 crc kubenswrapper[4861]: E0309 09:06:32.398732 4861 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b20fef1dab233\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b20fef1dab233 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:05:57.651845683 +0000 UTC m=+0.736885104,LastTimestamp:2026-03-09 09:05:57.760265067 +0000 UTC m=+0.845304508,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 09 09:06:32 crc kubenswrapper[4861]: E0309 09:06:32.405398 4861 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b20fef1db13d2\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b20fef1db13d2 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:05:57.651870674 +0000 UTC m=+0.736910095,LastTimestamp:2026-03-09 09:05:57.760303879 +0000 UTC m=+0.845343310,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:06:32 crc kubenswrapper[4861]: E0309 09:06:32.411891 4861 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b20fef1db4dc0\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b20fef1db4dc0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:05:57.651885504 +0000 UTC m=+0.736924915,LastTimestamp:2026-03-09 09:05:57.760322659 +0000 UTC m=+0.845362100,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:06:32 crc kubenswrapper[4861]: E0309 09:06:32.418940 4861 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b20fef1dab233\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b20fef1dab233 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:05:57.651845683 +0000 UTC m=+0.736885104,LastTimestamp:2026-03-09 09:05:57.761673756 +0000 UTC m=+0.846713177,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:06:32 crc kubenswrapper[4861]: E0309 09:06:32.426047 4861 
event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b20fef1db13d2\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b20fef1db13d2 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:05:57.651870674 +0000 UTC m=+0.736910095,LastTimestamp:2026-03-09 09:05:57.761703527 +0000 UTC m=+0.846742948,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:06:32 crc kubenswrapper[4861]: E0309 09:06:32.433225 4861 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b20fef1db4dc0\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b20fef1db4dc0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:05:57.651885504 +0000 UTC m=+0.736924915,LastTimestamp:2026-03-09 09:05:57.761758219 +0000 UTC m=+0.846797640,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:06:32 crc kubenswrapper[4861]: E0309 09:06:32.440600 4861 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b20fef1dab233\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in 
API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b20fef1dab233 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:05:57.651845683 +0000 UTC m=+0.736885104,LastTimestamp:2026-03-09 09:05:57.762362559 +0000 UTC m=+0.847402000,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:06:32 crc kubenswrapper[4861]: E0309 09:06:32.447337 4861 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b20fef1db13d2\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b20fef1db13d2 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:05:57.651870674 +0000 UTC m=+0.736910095,LastTimestamp:2026-03-09 09:05:57.762428061 +0000 UTC m=+0.847467492,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:06:32 crc kubenswrapper[4861]: E0309 09:06:32.453972 4861 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b20fef1db4dc0\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b20fef1db4dc0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:05:57.651885504 +0000 UTC m=+0.736924915,LastTimestamp:2026-03-09 09:05:57.762443392 +0000 UTC m=+0.847482823,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:06:32 crc kubenswrapper[4861]: E0309 09:06:32.461481 4861 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b20fef1dab233\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b20fef1dab233 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:05:57.651845683 +0000 UTC m=+0.736885104,LastTimestamp:2026-03-09 09:05:57.762530245 +0000 UTC m=+0.847569686,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:06:32 crc kubenswrapper[4861]: E0309 09:06:32.468005 4861 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b20fef1db13d2\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b20fef1db13d2 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc 
status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:05:57.651870674 +0000 UTC m=+0.736910095,LastTimestamp:2026-03-09 09:05:57.762565757 +0000 UTC m=+0.847605208,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:06:32 crc kubenswrapper[4861]: E0309 09:06:32.474974 4861 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b20fef1db4dc0\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b20fef1db4dc0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:05:57.651885504 +0000 UTC m=+0.736924915,LastTimestamp:2026-03-09 09:05:57.762592588 +0000 UTC m=+0.847632039,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:06:32 crc kubenswrapper[4861]: E0309 09:06:32.481729 4861 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b20fef1dab233\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b20fef1dab233 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:05:57.651845683 +0000 UTC 
m=+0.736885104,LastTimestamp:2026-03-09 09:05:57.764053348 +0000 UTC m=+0.849092759,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:06:32 crc kubenswrapper[4861]: E0309 09:06:32.488137 4861 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b20fef1db13d2\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b20fef1db13d2 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:05:57.651870674 +0000 UTC m=+0.736910095,LastTimestamp:2026-03-09 09:05:57.764078999 +0000 UTC m=+0.849118420,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:06:32 crc kubenswrapper[4861]: E0309 09:06:32.495629 4861 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b20fef1db4dc0\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b20fef1db4dc0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:05:57.651885504 +0000 UTC m=+0.736924915,LastTimestamp:2026-03-09 09:05:57.764094519 +0000 UTC m=+0.849133940,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:06:32 crc kubenswrapper[4861]: E0309 09:06:32.504769 4861 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b20fef1dab233\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b20fef1dab233 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:05:57.651845683 +0000 UTC m=+0.736885104,LastTimestamp:2026-03-09 09:05:57.764263445 +0000 UTC m=+0.849302876,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:06:32 crc kubenswrapper[4861]: E0309 09:06:32.511697 4861 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b20fef1db13d2\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b20fef1db13d2 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:05:57.651870674 +0000 UTC m=+0.736910095,LastTimestamp:2026-03-09 09:05:57.764288586 +0000 UTC m=+0.849328017,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:06:32 crc kubenswrapper[4861]: E0309 09:06:32.520256 4861 
event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b20ff10f85a5e openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:05:58.173882974 +0000 UTC m=+1.258922425,LastTimestamp:2026-03-09 09:05:58.173882974 +0000 UTC m=+1.258922425,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:06:32 crc kubenswrapper[4861]: E0309 09:06:32.528064 4861 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b20ff11079019 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:05:58.174879769 +0000 UTC m=+1.259919180,LastTimestamp:2026-03-09 09:05:58.174879769 +0000 UTC m=+1.259919180,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:06:32 crc kubenswrapper[4861]: E0309 09:06:32.534627 4861 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b20ff115a160f openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:05:58.180288015 +0000 UTC m=+1.265327426,LastTimestamp:2026-03-09 09:05:58.180288015 +0000 UTC m=+1.265327426,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:06:32 crc kubenswrapper[4861]: E0309 09:06:32.542434 4861 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b20ff1212c29b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:05:58.192390811 +0000 UTC m=+1.277430232,LastTimestamp:2026-03-09 09:05:58.192390811 +0000 UTC m=+1.277430232,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:06:32 crc kubenswrapper[4861]: E0309 09:06:32.550958 4861 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b20ff1293a767 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:05:58.200837991 +0000 UTC m=+1.285877402,LastTimestamp:2026-03-09 09:05:58.200837991 +0000 UTC m=+1.285877402,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:06:32 crc kubenswrapper[4861]: E0309 09:06:32.558504 4861 
event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b20ff3332b6d7 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:05:58.748133079 +0000 UTC m=+1.833172480,LastTimestamp:2026-03-09 09:05:58.748133079 +0000 UTC m=+1.833172480,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:06:32 crc kubenswrapper[4861]: E0309 09:06:32.565535 4861 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b20ff335aabf1 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:05:58.750751729 +0000 UTC m=+1.835791170,LastTimestamp:2026-03-09 09:05:58.750751729 +0000 UTC m=+1.835791170,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:06:32 crc kubenswrapper[4861]: E0309 09:06:32.569662 4861 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b20ff33f18175 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:05:58.760636789 +0000 UTC m=+1.845676220,LastTimestamp:2026-03-09 09:05:58.760636789 +0000 UTC m=+1.845676220,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:06:32 crc kubenswrapper[4861]: E0309 09:06:32.573657 4861 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b20ff34072a7e openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:05:58.762056318 +0000 UTC 
m=+1.847095749,LastTimestamp:2026-03-09 09:05:58.762056318 +0000 UTC m=+1.847095749,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:06:32 crc kubenswrapper[4861]: I0309 09:06:32.577146 4861 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 09:06:32 crc kubenswrapper[4861]: E0309 09:06:32.577570 4861 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b20ff3419ce60 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:05:58.76327792 +0000 UTC m=+1.848317321,LastTimestamp:2026-03-09 09:05:58.76327792 +0000 UTC m=+1.848317321,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:06:32 crc kubenswrapper[4861]: E0309 09:06:32.582261 4861 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b20ff34790654 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:05:58.769518164 +0000 UTC m=+1.854557575,LastTimestamp:2026-03-09 09:05:58.769518164 +0000 UTC m=+1.854557575,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:06:32 crc kubenswrapper[4861]: E0309 09:06:32.585929 4861 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b20ff34a9cbc2 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:05:58.772714434 +0000 UTC m=+1.857753835,LastTimestamp:2026-03-09 09:05:58.772714434 +0000 UTC m=+1.857753835,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:06:32 crc kubenswrapper[4861]: E0309 09:06:32.590104 4861 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource 
\"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b20ff34efc3ef openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:05:58.777299951 +0000 UTC m=+1.862339352,LastTimestamp:2026-03-09 09:05:58.777299951 +0000 UTC m=+1.862339352,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:06:32 crc kubenswrapper[4861]: E0309 09:06:32.594679 4861 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b20ff35cd8e05 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:05:58.791835141 +0000 UTC m=+1.876874582,LastTimestamp:2026-03-09 09:05:58.791835141 +0000 UTC m=+1.876874582,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:06:32 crc kubenswrapper[4861]: E0309 09:06:32.596543 4861 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" 
event="&Event{ObjectMeta:{kube-apiserver-crc.189b20ff36093927 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:05:58.795745575 +0000 UTC m=+1.880784976,LastTimestamp:2026-03-09 09:05:58.795745575 +0000 UTC m=+1.880784976,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:06:32 crc kubenswrapper[4861]: E0309 09:06:32.602091 4861 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b20ff45c50bf4 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:05:59.059713012 +0000 UTC m=+2.144752443,LastTimestamp:2026-03-09 09:05:59.059713012 +0000 UTC m=+2.144752443,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:06:32 crc kubenswrapper[4861]: E0309 09:06:32.609413 4861 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create 
resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b20ff46821b3a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:05:59.072103226 +0000 UTC m=+2.157142657,LastTimestamp:2026-03-09 09:05:59.072103226 +0000 UTC m=+2.157142657,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:06:32 crc kubenswrapper[4861]: E0309 09:06:32.615178 4861 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b20ff469a2b87 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:05:59.073680263 +0000 UTC m=+2.158719664,LastTimestamp:2026-03-09 09:05:59.073680263 +0000 UTC 
m=+2.158719664,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:06:32 crc kubenswrapper[4861]: E0309 09:06:32.621964 4861 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b20ff51c6d2d4 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:05:59.261156052 +0000 UTC m=+2.346195493,LastTimestamp:2026-03-09 09:05:59.261156052 +0000 UTC m=+2.346195493,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:06:32 crc kubenswrapper[4861]: E0309 09:06:32.627987 4861 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b20ff52c35bf8 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started container 
kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:05:59.277706232 +0000 UTC m=+2.362745633,LastTimestamp:2026-03-09 09:05:59.277706232 +0000 UTC m=+2.362745633,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:06:32 crc kubenswrapper[4861]: E0309 09:06:32.633618 4861 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b20ff52d21706 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:05:59.278671622 +0000 UTC m=+2.363711063,LastTimestamp:2026-03-09 09:05:59.278671622 +0000 UTC m=+2.363711063,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:06:32 crc kubenswrapper[4861]: E0309 09:06:32.639968 4861 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b20ff5440cb6a openshift-machine-config-operator 0 
0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:05:59.302703978 +0000 UTC m=+2.387743379,LastTimestamp:2026-03-09 09:05:59.302703978 +0000 UTC m=+2.387743379,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:06:32 crc kubenswrapper[4861]: E0309 09:06:32.645894 4861 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b20ff5e92618f openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:05:59.475822991 +0000 UTC m=+2.560862432,LastTimestamp:2026-03-09 09:05:59.475822991 +0000 UTC m=+2.560862432,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:06:32 crc kubenswrapper[4861]: E0309 09:06:32.652499 4861 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in 
the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b20ff628f2826 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:05:59.54272055 +0000 UTC m=+2.627759991,LastTimestamp:2026-03-09 09:05:59.54272055 +0000 UTC m=+2.627759991,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:06:32 crc kubenswrapper[4861]: E0309 09:06:32.659426 4861 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b20ff6ac7eb58 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:05:59.680658264 +0000 UTC m=+2.765697665,LastTimestamp:2026-03-09 09:05:59.680658264 +0000 UTC m=+2.765697665,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:06:32 crc kubenswrapper[4861]: E0309 09:06:32.666971 4861 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b20ff6b5be790 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:05:59.690356624 +0000 UTC m=+2.775396065,LastTimestamp:2026-03-09 09:05:59.690356624 +0000 UTC m=+2.775396065,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:06:32 crc kubenswrapper[4861]: E0309 09:06:32.674410 4861 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b20ff6bb8be46 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present 
on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:05:59.696440902 +0000 UTC m=+2.781480343,LastTimestamp:2026-03-09 09:05:59.696440902 +0000 UTC m=+2.781480343,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:06:32 crc kubenswrapper[4861]: E0309 09:06:32.681070 4861 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b20ff6be94bdc openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:05:59.699622876 +0000 UTC m=+2.784662277,LastTimestamp:2026-03-09 09:05:59.699622876 +0000 UTC m=+2.784662277,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:06:32 crc kubenswrapper[4861]: E0309 09:06:32.688068 4861 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b20ff78fee999 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:05:59.919143321 +0000 UTC m=+3.004182712,LastTimestamp:2026-03-09 09:05:59.919143321 +0000 UTC m=+3.004182712,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:06:32 crc kubenswrapper[4861]: E0309 09:06:32.694497 4861 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b20ff796fdb3d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:05:59.926545213 +0000 UTC m=+3.011584614,LastTimestamp:2026-03-09 09:05:59.926545213 +0000 UTC m=+3.011584614,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:06:32 crc kubenswrapper[4861]: E0309 09:06:32.701534 4861 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b20ff79a885a3 
openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:05:59.930258851 +0000 UTC m=+3.015298262,LastTimestamp:2026-03-09 09:05:59.930258851 +0000 UTC m=+3.015298262,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:06:32 crc kubenswrapper[4861]: E0309 09:06:32.708126 4861 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b20ff79cf7733 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:05:59.932811059 +0000 UTC m=+3.017850460,LastTimestamp:2026-03-09 09:05:59.932811059 +0000 UTC m=+3.017850460,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:06:32 crc kubenswrapper[4861]: E0309 09:06:32.714545 4861 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b20ff79dc807c openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:05:59.933665404 +0000 UTC m=+3.018704805,LastTimestamp:2026-03-09 09:05:59.933665404 +0000 UTC m=+3.018704805,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:06:32 crc kubenswrapper[4861]: E0309 09:06:32.721532 4861 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b20ff7a1f5d94 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:05:59.93804738 +0000 UTC m=+3.023086781,LastTimestamp:2026-03-09 09:05:59.93804738 +0000 UTC m=+3.023086781,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:06:32 crc kubenswrapper[4861]: E0309 
09:06:32.729968 4861 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b20ff7a48249d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:05:59.940719773 +0000 UTC m=+3.025759174,LastTimestamp:2026-03-09 09:05:59.940719773 +0000 UTC m=+3.025759174,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:06:32 crc kubenswrapper[4861]: E0309 09:06:32.740955 4861 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b20ff7a91011f openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:05:59.945494815 +0000 UTC m=+3.030534216,LastTimestamp:2026-03-09 09:05:59.945494815 +0000 UTC 
m=+3.030534216,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:06:32 crc kubenswrapper[4861]: E0309 09:06:32.742215 4861 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 09 09:06:32 crc kubenswrapper[4861]: I0309 09:06:32.745184 4861 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 09:06:32 crc kubenswrapper[4861]: I0309 09:06:32.748763 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:06:32 crc kubenswrapper[4861]: I0309 09:06:32.748835 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:06:32 crc kubenswrapper[4861]: I0309 09:06:32.748864 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:06:32 crc kubenswrapper[4861]: I0309 09:06:32.748928 4861 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 09 09:06:32 crc kubenswrapper[4861]: E0309 09:06:32.751992 4861 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 09 09:06:32 crc kubenswrapper[4861]: E0309 09:06:32.752468 4861 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b20ff7ac8f886 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC 
map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:05:59.94916263 +0000 UTC m=+3.034202031,LastTimestamp:2026-03-09 09:05:59.94916263 +0000 UTC m=+3.034202031,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:06:32 crc kubenswrapper[4861]: E0309 09:06:32.756141 4861 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b20ff7c18e175 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:05:59.971176821 +0000 UTC m=+3.056216222,LastTimestamp:2026-03-09 09:05:59.971176821 +0000 UTC m=+3.056216222,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:06:32 crc kubenswrapper[4861]: E0309 09:06:32.761618 4861 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b20ff84f5cbcc openshift-kube-scheduler 0 
0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:06:00.11987246 +0000 UTC m=+3.204911871,LastTimestamp:2026-03-09 09:06:00.11987246 +0000 UTC m=+3.204911871,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:06:32 crc kubenswrapper[4861]: E0309 09:06:32.769069 4861 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b20ff8561a6c5 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:06:00.126940869 +0000 UTC m=+3.211980260,LastTimestamp:2026-03-09 09:06:00.126940869 +0000 UTC m=+3.211980260,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:06:32 crc kubenswrapper[4861]: E0309 09:06:32.776359 4861 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" 
event="&Event{ObjectMeta:{kube-apiserver-crc.189b20ff861b146f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:06:00.139093103 +0000 UTC m=+3.224132504,LastTimestamp:2026-03-09 09:06:00.139093103 +0000 UTC m=+3.224132504,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:06:32 crc kubenswrapper[4861]: E0309 09:06:32.781083 4861 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b20ff863cffa6 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:06:00.141316006 +0000 UTC m=+3.226355407,LastTimestamp:2026-03-09 09:06:00.141316006 +0000 UTC m=+3.226355407,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:06:32 crc kubenswrapper[4861]: E0309 09:06:32.785985 4861 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" 
cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b20ff8647b7ad openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:06:00.142018477 +0000 UTC m=+3.227057878,LastTimestamp:2026-03-09 09:06:00.142018477 +0000 UTC m=+3.227057878,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:06:32 crc kubenswrapper[4861]: E0309 09:06:32.791232 4861 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b20ff865bd396 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:06:00.143336342 +0000 UTC 
m=+3.228375733,LastTimestamp:2026-03-09 09:06:00.143336342 +0000 UTC m=+3.228375733,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:06:32 crc kubenswrapper[4861]: E0309 09:06:32.798832 4861 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b20ff94166ee4 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:06:00.373669604 +0000 UTC m=+3.458709005,LastTimestamp:2026-03-09 09:06:00.373669604 +0000 UTC m=+3.458709005,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:06:32 crc kubenswrapper[4861]: E0309 09:06:32.804280 4861 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b20ff942a7a0b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container 
kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:06:00.374983179 +0000 UTC m=+3.460022590,LastTimestamp:2026-03-09 09:06:00.374983179 +0000 UTC m=+3.460022590,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:06:32 crc kubenswrapper[4861]: E0309 09:06:32.810667 4861 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b20ff94e7ad7e openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:06:00.387382654 +0000 UTC m=+3.472422055,LastTimestamp:2026-03-09 09:06:00.387382654 +0000 UTC m=+3.472422055,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:06:32 crc kubenswrapper[4861]: E0309 09:06:32.817240 4861 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b20ff95988ba8 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:06:00.398973864 +0000 UTC m=+3.484013265,LastTimestamp:2026-03-09 09:06:00.398973864 +0000 UTC m=+3.484013265,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:06:32 crc kubenswrapper[4861]: E0309 09:06:32.823579 4861 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b20ff95aa5661 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:06:00.400139873 +0000 UTC m=+3.485179274,LastTimestamp:2026-03-09 09:06:00.400139873 +0000 UTC m=+3.485179274,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:06:32 crc kubenswrapper[4861]: E0309 09:06:32.830186 4861 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b20ffa0023420 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:06:00.573670432 +0000 UTC m=+3.658709833,LastTimestamp:2026-03-09 09:06:00.573670432 +0000 UTC m=+3.658709833,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:06:32 crc kubenswrapper[4861]: E0309 09:06:32.834025 4861 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b20ffa0e30d8c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:06:00.588406156 +0000 UTC m=+3.673445557,LastTimestamp:2026-03-09 09:06:00.588406156 +0000 UTC m=+3.673445557,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:06:32 crc kubenswrapper[4861]: E0309 09:06:32.838020 
4861 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b20ffa0f44f46 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:06:00.589537094 +0000 UTC m=+3.674576505,LastTimestamp:2026-03-09 09:06:00.589537094 +0000 UTC m=+3.674576505,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:06:32 crc kubenswrapper[4861]: E0309 09:06:32.842740 4861 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b20ffa8c3567c openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:06:00.720545404 +0000 UTC 
m=+3.805584805,LastTimestamp:2026-03-09 09:06:00.720545404 +0000 UTC m=+3.805584805,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:06:32 crc kubenswrapper[4861]: E0309 09:06:32.847101 4861 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b20ffacdc0305 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:06:00.789271301 +0000 UTC m=+3.874310702,LastTimestamp:2026-03-09 09:06:00.789271301 +0000 UTC m=+3.874310702,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:06:32 crc kubenswrapper[4861]: E0309 09:06:32.850938 4861 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b20ffada9ff5e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container 
kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:06:00.802770782 +0000 UTC m=+3.887810183,LastTimestamp:2026-03-09 09:06:00.802770782 +0000 UTC m=+3.887810183,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:06:32 crc kubenswrapper[4861]: E0309 09:06:32.854726 4861 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b20ffb5886d72 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:06:00.934788466 +0000 UTC m=+4.019827877,LastTimestamp:2026-03-09 09:06:00.934788466 +0000 UTC m=+4.019827877,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:06:32 crc kubenswrapper[4861]: E0309 09:06:32.860288 4861 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b20ffb6b4ff33 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container 
etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:06:00.954486579 +0000 UTC m=+4.039525990,LastTimestamp:2026-03-09 09:06:00.954486579 +0000 UTC m=+4.039525990,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:06:32 crc kubenswrapper[4861]: E0309 09:06:32.864826 4861 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b20ffe4dde81e openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:06:01.728919582 +0000 UTC m=+4.813959013,LastTimestamp:2026-03-09 09:06:01.728919582 +0000 UTC m=+4.813959013,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:06:32 crc kubenswrapper[4861]: E0309 09:06:32.871214 4861 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b20fff1c5e80e openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:06:01.94545051 +0000 UTC m=+5.030489911,LastTimestamp:2026-03-09 09:06:01.94545051 +0000 UTC m=+5.030489911,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:06:32 crc kubenswrapper[4861]: E0309 09:06:32.874644 4861 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b20fff2b7f307 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:06:01.961313031 +0000 UTC m=+5.046352482,LastTimestamp:2026-03-09 09:06:01.961313031 +0000 UTC m=+5.046352482,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:06:32 crc kubenswrapper[4861]: E0309 09:06:32.878823 4861 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b20fff2d28e19 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:06:01.963056665 +0000 UTC m=+5.048096086,LastTimestamp:2026-03-09 09:06:01.963056665 +0000 UTC m=+5.048096086,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:06:32 crc kubenswrapper[4861]: E0309 09:06:32.883065 4861 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b21000af29434 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:06:02.367808564 +0000 UTC m=+5.452848005,LastTimestamp:2026-03-09 09:06:02.367808564 +0000 UTC m=+5.452848005,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:06:32 crc kubenswrapper[4861]: E0309 09:06:32.887032 4861 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b21000bd520e9 openshift-etcd 0 0001-01-01 
00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:06:02.382655721 +0000 UTC m=+5.467695172,LastTimestamp:2026-03-09 09:06:02.382655721 +0000 UTC m=+5.467695172,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:06:32 crc kubenswrapper[4861]: E0309 09:06:32.891363 4861 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b21000c069433 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:06:02.385896499 +0000 UTC m=+5.470935940,LastTimestamp:2026-03-09 09:06:02.385896499 +0000 UTC m=+5.470935940,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:06:32 crc kubenswrapper[4861]: E0309 09:06:32.896018 4861 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.189b210019444879 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:06:02.608044153 +0000 UTC m=+5.693083594,LastTimestamp:2026-03-09 09:06:02.608044153 +0000 UTC m=+5.693083594,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:06:32 crc kubenswrapper[4861]: E0309 09:06:32.900014 4861 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b21001a348f65 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:06:02.623790949 +0000 UTC m=+5.708830380,LastTimestamp:2026-03-09 09:06:02.623790949 +0000 UTC m=+5.708830380,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:06:32 crc kubenswrapper[4861]: E0309 09:06:32.906931 4861 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b21001a4f8357 openshift-etcd 0 0001-01-01 
00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:06:02.625557335 +0000 UTC m=+5.710596766,LastTimestamp:2026-03-09 09:06:02.625557335 +0000 UTC m=+5.710596766,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:06:32 crc kubenswrapper[4861]: E0309 09:06:32.912580 4861 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b210027d750c7 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:06:02.852561095 +0000 UTC m=+5.937600536,LastTimestamp:2026-03-09 09:06:02.852561095 +0000 UTC m=+5.937600536,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:06:32 crc kubenswrapper[4861]: E0309 09:06:32.920484 4861 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.189b210028bc6375 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:06:02.867573621 +0000 UTC m=+5.952613052,LastTimestamp:2026-03-09 09:06:02.867573621 +0000 UTC m=+5.952613052,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:06:32 crc kubenswrapper[4861]: E0309 09:06:32.925189 4861 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b210028d354ac openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:06:02.869077164 +0000 UTC m=+5.954116595,LastTimestamp:2026-03-09 09:06:02.869077164 +0000 UTC m=+5.954116595,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:06:32 crc kubenswrapper[4861]: E0309 09:06:32.931948 4861 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API 
group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b210036f9700b openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:06:03.106455563 +0000 UTC m=+6.191494964,LastTimestamp:2026-03-09 09:06:03.106455563 +0000 UTC m=+6.191494964,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:06:32 crc kubenswrapper[4861]: E0309 09:06:32.937267 4861 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b210037bbffc6 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:06:03.119206342 +0000 UTC m=+6.204245743,LastTimestamp:2026-03-09 09:06:03.119206342 +0000 UTC m=+6.204245743,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:06:32 crc kubenswrapper[4861]: E0309 09:06:32.946926 4861 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 09 09:06:32 crc 
kubenswrapper[4861]: &Event{ObjectMeta:{kube-controller-manager-crc.189b21019cc475b0 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers) Mar 09 09:06:32 crc kubenswrapper[4861]: body: Mar 09 09:06:32 crc kubenswrapper[4861]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:06:09.109226928 +0000 UTC m=+12.194266329,LastTimestamp:2026-03-09 09:06:09.109226928 +0000 UTC m=+12.194266329,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 09 09:06:32 crc kubenswrapper[4861]: > Mar 09 09:06:32 crc kubenswrapper[4861]: E0309 09:06:32.953233 4861 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b21019cc5757a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:06:09.10929241 +0000 UTC 
m=+12.194331811,LastTimestamp:2026-03-09 09:06:09.10929241 +0000 UTC m=+12.194331811,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:06:32 crc kubenswrapper[4861]: E0309 09:06:32.958493 4861 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 09 09:06:32 crc kubenswrapper[4861]: &Event{ObjectMeta:{kube-apiserver-crc.189b2101efeb94d3 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:ProbeError,Message:Readiness probe error: Get "https://192.168.126.11:17697/healthz": dial tcp 192.168.126.11:17697: connect: connection refused Mar 09 09:06:32 crc kubenswrapper[4861]: body: Mar 09 09:06:32 crc kubenswrapper[4861]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:06:10.504299731 +0000 UTC m=+13.589339182,LastTimestamp:2026-03-09 09:06:10.504299731 +0000 UTC m=+13.589339182,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 09 09:06:32 crc kubenswrapper[4861]: > Mar 09 09:06:32 crc kubenswrapper[4861]: E0309 09:06:32.962943 4861 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b2101efedd14d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Unhealthy,Message:Readiness probe failed: Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:06:10.504446285 +0000 UTC m=+13.589485736,LastTimestamp:2026-03-09 09:06:10.504446285 +0000 UTC m=+13.589485736,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:06:32 crc kubenswrapper[4861]: E0309 09:06:32.968074 4861 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 09 09:06:32 crc kubenswrapper[4861]: &Event{ObjectMeta:{kube-apiserver-crc.189b21025d5401ae openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 09 09:06:32 crc kubenswrapper[4861]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\": RBAC: [clusterrole.rbac.authorization.k8s.io \"system:openshift:public-info-viewer\" not found, clusterrole.rbac.authorization.k8s.io \"system:public-info-viewer\" not found]","reason":"Forbidden","details":{},"code":403} Mar 09 09:06:32 crc kubenswrapper[4861]: Mar 09 09:06:32 crc kubenswrapper[4861]: 
,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:06:12.339859886 +0000 UTC m=+15.424899297,LastTimestamp:2026-03-09 09:06:12.339859886 +0000 UTC m=+15.424899297,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 09 09:06:32 crc kubenswrapper[4861]: > Mar 09 09:06:32 crc kubenswrapper[4861]: E0309 09:06:32.972754 4861 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b21025d54a2d1 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:06:12.339901137 +0000 UTC m=+15.424940548,LastTimestamp:2026-03-09 09:06:12.339901137 +0000 UTC m=+15.424940548,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:06:32 crc kubenswrapper[4861]: E0309 09:06:32.976509 4861 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189b21025d5401ae\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 09 09:06:32 crc kubenswrapper[4861]: &Event{ObjectMeta:{kube-apiserver-crc.189b21025d5401ae openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 09 09:06:32 crc kubenswrapper[4861]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\": RBAC: [clusterrole.rbac.authorization.k8s.io \"system:openshift:public-info-viewer\" not found, clusterrole.rbac.authorization.k8s.io \"system:public-info-viewer\" not found]","reason":"Forbidden","details":{},"code":403} Mar 09 09:06:32 crc kubenswrapper[4861]: Mar 09 09:06:32 crc kubenswrapper[4861]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:06:12.339859886 +0000 UTC m=+15.424899297,LastTimestamp:2026-03-09 09:06:12.346820975 +0000 UTC m=+15.431860416,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 09 09:06:32 crc kubenswrapper[4861]: > Mar 09 09:06:32 crc kubenswrapper[4861]: E0309 09:06:32.979686 4861 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189b21025d54a2d1\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b21025d54a2d1 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 
09:06:12.339901137 +0000 UTC m=+15.424940548,LastTimestamp:2026-03-09 09:06:12.346902277 +0000 UTC m=+15.431941718,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:06:32 crc kubenswrapper[4861]: E0309 09:06:32.981153 4861 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189b20ffa0f44f46\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b20ffa0f44f46 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:06:00.589537094 +0000 UTC m=+3.674576505,LastTimestamp:2026-03-09 09:06:12.777262374 +0000 UTC m=+15.862301815,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:06:32 crc kubenswrapper[4861]: E0309 09:06:32.989072 4861 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 09 09:06:32 crc kubenswrapper[4861]: &Event{ObjectMeta:{kube-controller-manager-crc.189b2103f0f0b8c9 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 09 09:06:32 crc kubenswrapper[4861]: body: Mar 09 09:06:32 crc kubenswrapper[4861]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:06:19.111348425 +0000 UTC m=+22.196387856,LastTimestamp:2026-03-09 09:06:19.111348425 +0000 UTC m=+22.196387856,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 09 09:06:32 crc kubenswrapper[4861]: > Mar 09 09:06:32 crc kubenswrapper[4861]: E0309 09:06:32.993513 4861 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b2103f0f2263a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:06:19.111441978 +0000 UTC m=+22.196481419,LastTimestamp:2026-03-09 09:06:19.111441978 +0000 UTC 
m=+22.196481419,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:06:32 crc kubenswrapper[4861]: E0309 09:06:32.998358 4861 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b2103f0f0b8c9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 09 09:06:32 crc kubenswrapper[4861]: &Event{ObjectMeta:{kube-controller-manager-crc.189b2103f0f0b8c9 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 09 09:06:32 crc kubenswrapper[4861]: body: Mar 09 09:06:32 crc kubenswrapper[4861]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:06:19.111348425 +0000 UTC m=+22.196387856,LastTimestamp:2026-03-09 09:06:29.109969669 +0000 UTC m=+32.195009110,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 09 09:06:32 crc kubenswrapper[4861]: > Mar 09 09:06:33 crc kubenswrapper[4861]: E0309 09:06:33.003423 4861 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b2103f0f2263a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b2103f0f2263a 
openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:06:19.111441978 +0000 UTC m=+22.196481419,LastTimestamp:2026-03-09 09:06:29.110044551 +0000 UTC m=+32.195083992,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:06:33 crc kubenswrapper[4861]: E0309 09:06:33.008205 4861 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b210645125965 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:06:29.112772965 +0000 UTC m=+32.197812406,LastTimestamp:2026-03-09 09:06:29.112772965 +0000 UTC m=+32.197812406,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:06:33 crc 
kubenswrapper[4861]: E0309 09:06:33.013034 4861 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b20ff3419ce60\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b20ff3419ce60 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:05:58.76327792 +0000 UTC m=+1.848317321,LastTimestamp:2026-03-09 09:06:29.232065583 +0000 UTC m=+32.317105014,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:06:33 crc kubenswrapper[4861]: E0309 09:06:33.017505 4861 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b20ff45c50bf4\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b20ff45c50bf4 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created 
container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:05:59.059713012 +0000 UTC m=+2.144752443,LastTimestamp:2026-03-09 09:06:29.450351006 +0000 UTC m=+32.535390437,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:06:33 crc kubenswrapper[4861]: E0309 09:06:33.022861 4861 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b20ff46821b3a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b20ff46821b3a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:05:59.072103226 +0000 UTC m=+2.157142657,LastTimestamp:2026-03-09 09:06:29.462530888 +0000 UTC m=+32.547570319,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:06:33 crc kubenswrapper[4861]: I0309 09:06:33.581404 4861 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 09:06:34 crc kubenswrapper[4861]: I0309 09:06:34.583778 4861 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: 
User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 09:06:34 crc kubenswrapper[4861]: I0309 09:06:34.657247 4861 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 09:06:34 crc kubenswrapper[4861]: I0309 09:06:34.658990 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:06:34 crc kubenswrapper[4861]: I0309 09:06:34.659044 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:06:34 crc kubenswrapper[4861]: I0309 09:06:34.659061 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:06:34 crc kubenswrapper[4861]: I0309 09:06:34.660012 4861 scope.go:117] "RemoveContainer" containerID="6ce63f21c27e9fa2c6a001f46e36f979b50a113b885f4a79ab95dffa1eab11c6" Mar 09 09:06:35 crc kubenswrapper[4861]: I0309 09:06:35.583059 4861 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 09:06:35 crc kubenswrapper[4861]: I0309 09:06:35.858500 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 09 09:06:35 crc kubenswrapper[4861]: I0309 09:06:35.860344 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"a68eb728eb7ca7538eca2afaa31cb79a83d99a1bca1f64dcfe07270be7395218"} Mar 09 09:06:35 crc kubenswrapper[4861]: I0309 09:06:35.860585 4861 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 
09:06:35 crc kubenswrapper[4861]: I0309 09:06:35.861487 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:06:35 crc kubenswrapper[4861]: I0309 09:06:35.861510 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:06:35 crc kubenswrapper[4861]: I0309 09:06:35.861517 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:06:36 crc kubenswrapper[4861]: I0309 09:06:36.109956 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 09:06:36 crc kubenswrapper[4861]: I0309 09:06:36.110430 4861 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 09:06:36 crc kubenswrapper[4861]: I0309 09:06:36.111748 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:06:36 crc kubenswrapper[4861]: I0309 09:06:36.111802 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:06:36 crc kubenswrapper[4861]: I0309 09:06:36.111821 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:06:36 crc kubenswrapper[4861]: I0309 09:06:36.582528 4861 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 09:06:36 crc kubenswrapper[4861]: I0309 09:06:36.867214 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 09 09:06:36 crc kubenswrapper[4861]: I0309 
09:06:36.868067 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 09 09:06:36 crc kubenswrapper[4861]: I0309 09:06:36.871253 4861 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="a68eb728eb7ca7538eca2afaa31cb79a83d99a1bca1f64dcfe07270be7395218" exitCode=255 Mar 09 09:06:36 crc kubenswrapper[4861]: I0309 09:06:36.871332 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"a68eb728eb7ca7538eca2afaa31cb79a83d99a1bca1f64dcfe07270be7395218"} Mar 09 09:06:36 crc kubenswrapper[4861]: I0309 09:06:36.871453 4861 scope.go:117] "RemoveContainer" containerID="6ce63f21c27e9fa2c6a001f46e36f979b50a113b885f4a79ab95dffa1eab11c6" Mar 09 09:06:36 crc kubenswrapper[4861]: I0309 09:06:36.871703 4861 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 09:06:36 crc kubenswrapper[4861]: I0309 09:06:36.872972 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:06:36 crc kubenswrapper[4861]: I0309 09:06:36.873037 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:06:36 crc kubenswrapper[4861]: I0309 09:06:36.873061 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:06:36 crc kubenswrapper[4861]: I0309 09:06:36.874182 4861 scope.go:117] "RemoveContainer" containerID="a68eb728eb7ca7538eca2afaa31cb79a83d99a1bca1f64dcfe07270be7395218" Mar 09 09:06:36 crc kubenswrapper[4861]: E0309 09:06:36.874598 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with 
CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 09:06:36 crc kubenswrapper[4861]: W0309 09:06:36.879164 4861 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope Mar 09 09:06:36 crc kubenswrapper[4861]: E0309 09:06:36.879240 4861 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 09 09:06:37 crc kubenswrapper[4861]: I0309 09:06:37.112533 4861 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 09 09:06:37 crc kubenswrapper[4861]: I0309 09:06:37.134345 4861 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 09 09:06:37 crc kubenswrapper[4861]: I0309 09:06:37.251263 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 09:06:37 crc kubenswrapper[4861]: I0309 09:06:37.251531 4861 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 09:06:37 crc kubenswrapper[4861]: I0309 09:06:37.253102 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:06:37 crc kubenswrapper[4861]: I0309 09:06:37.253162 4861 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:06:37 crc kubenswrapper[4861]: I0309 09:06:37.253184 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:06:37 crc kubenswrapper[4861]: I0309 09:06:37.583092 4861 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 09:06:37 crc kubenswrapper[4861]: I0309 09:06:37.638794 4861 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 09:06:37 crc kubenswrapper[4861]: E0309 09:06:37.757097 4861 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 09 09:06:37 crc kubenswrapper[4861]: I0309 09:06:37.874973 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 09 09:06:37 crc kubenswrapper[4861]: I0309 09:06:37.876978 4861 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 09:06:37 crc kubenswrapper[4861]: I0309 09:06:37.878017 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:06:37 crc kubenswrapper[4861]: I0309 09:06:37.878063 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:06:37 crc kubenswrapper[4861]: I0309 09:06:37.878082 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:06:37 crc kubenswrapper[4861]: I0309 09:06:37.878889 4861 scope.go:117] "RemoveContainer" 
containerID="a68eb728eb7ca7538eca2afaa31cb79a83d99a1bca1f64dcfe07270be7395218" Mar 09 09:06:37 crc kubenswrapper[4861]: E0309 09:06:37.879121 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 09:06:38 crc kubenswrapper[4861]: W0309 09:06:38.184188 4861 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Mar 09 09:06:38 crc kubenswrapper[4861]: E0309 09:06:38.184261 4861 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 09 09:06:38 crc kubenswrapper[4861]: I0309 09:06:38.584512 4861 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 09:06:38 crc kubenswrapper[4861]: I0309 09:06:38.897177 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 09:06:38 crc kubenswrapper[4861]: I0309 09:06:38.897451 4861 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 09:06:38 crc kubenswrapper[4861]: I0309 09:06:38.899105 4861 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 09:06:38 crc kubenswrapper[4861]: I0309 09:06:38.899155 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 09:06:38 crc kubenswrapper[4861]: I0309 09:06:38.899173 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 09:06:39 crc kubenswrapper[4861]: I0309 09:06:39.580873 4861 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 09 09:06:39 crc kubenswrapper[4861]: E0309 09:06:39.748333 4861 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 09 09:06:39 crc kubenswrapper[4861]: I0309 09:06:39.752448 4861 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 09:06:39 crc kubenswrapper[4861]: I0309 09:06:39.754028 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 09:06:39 crc kubenswrapper[4861]: I0309 09:06:39.754075 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 09:06:39 crc kubenswrapper[4861]: I0309 09:06:39.754092 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 09:06:39 crc kubenswrapper[4861]: I0309 09:06:39.754132 4861 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 09 09:06:39 crc kubenswrapper[4861]: E0309 09:06:39.758132 4861 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Mar 09 09:06:40 crc kubenswrapper[4861]: I0309 09:06:40.503549 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 09 09:06:40 crc kubenswrapper[4861]: I0309 09:06:40.503867 4861 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 09:06:40 crc kubenswrapper[4861]: I0309 09:06:40.505499 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 09:06:40 crc kubenswrapper[4861]: I0309 09:06:40.505548 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 09:06:40 crc kubenswrapper[4861]: I0309 09:06:40.505558 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 09:06:40 crc kubenswrapper[4861]: I0309 09:06:40.506229 4861 scope.go:117] "RemoveContainer" containerID="a68eb728eb7ca7538eca2afaa31cb79a83d99a1bca1f64dcfe07270be7395218"
Mar 09 09:06:40 crc kubenswrapper[4861]: E0309 09:06:40.506462 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 09 09:06:40 crc kubenswrapper[4861]: I0309 09:06:40.585264 4861 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 09 09:06:40 crc kubenswrapper[4861]: W0309 09:06:40.732705 4861 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope
Mar 09 09:06:40 crc kubenswrapper[4861]: E0309 09:06:40.732774 4861 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError"
Mar 09 09:06:41 crc kubenswrapper[4861]: I0309 09:06:41.583048 4861 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 09 09:06:42 crc kubenswrapper[4861]: I0309 09:06:42.581410 4861 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 09 09:06:43 crc kubenswrapper[4861]: I0309 09:06:43.583128 4861 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 09 09:06:44 crc kubenswrapper[4861]: W0309 09:06:44.192983 4861 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope
Mar 09 09:06:44 crc kubenswrapper[4861]: E0309 09:06:44.193058 4861 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError"
Mar 09 09:06:44 crc kubenswrapper[4861]: I0309 09:06:44.583822 4861 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 09 09:06:45 crc kubenswrapper[4861]: I0309 09:06:45.587923 4861 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 09 09:06:45 crc kubenswrapper[4861]: I0309 09:06:45.736668 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 09 09:06:45 crc kubenswrapper[4861]: I0309 09:06:45.736882 4861 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 09:06:45 crc kubenswrapper[4861]: I0309 09:06:45.738276 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 09:06:45 crc kubenswrapper[4861]: I0309 09:06:45.738354 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 09:06:45 crc kubenswrapper[4861]: I0309 09:06:45.738392 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 09:06:46 crc kubenswrapper[4861]: I0309 09:06:46.580177 4861 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 09 09:06:46 crc kubenswrapper[4861]: E0309 09:06:46.754076 4861 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 09 09:06:46 crc kubenswrapper[4861]: I0309 09:06:46.759112 4861 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 09:06:46 crc kubenswrapper[4861]: I0309 09:06:46.760429 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 09:06:46 crc kubenswrapper[4861]: I0309 09:06:46.760495 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 09:06:46 crc kubenswrapper[4861]: I0309 09:06:46.760513 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 09:06:46 crc kubenswrapper[4861]: I0309 09:06:46.760552 4861 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 09 09:06:46 crc kubenswrapper[4861]: E0309 09:06:46.767421 4861 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Mar 09 09:06:47 crc kubenswrapper[4861]: I0309 09:06:47.254955 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 09 09:06:47 crc kubenswrapper[4861]: I0309 09:06:47.255505 4861 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 09:06:47 crc kubenswrapper[4861]: I0309 09:06:47.256852 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 09:06:47 crc kubenswrapper[4861]: I0309 09:06:47.257063 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 09:06:47 crc kubenswrapper[4861]: I0309 09:06:47.257215 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 09:06:47 crc kubenswrapper[4861]: I0309 09:06:47.582633 4861 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 09 09:06:47 crc kubenswrapper[4861]: E0309 09:06:47.757293 4861 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 09 09:06:48 crc kubenswrapper[4861]: I0309 09:06:48.582074 4861 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 09 09:06:49 crc kubenswrapper[4861]: I0309 09:06:49.577113 4861 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 09 09:06:50 crc kubenswrapper[4861]: I0309 09:06:50.582521 4861 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 09 09:06:51 crc kubenswrapper[4861]: I0309 09:06:51.580262 4861 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 09 09:06:51 crc kubenswrapper[4861]: I0309 09:06:51.657629 4861 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 09:06:51 crc kubenswrapper[4861]: I0309 09:06:51.658840 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 09:06:51 crc kubenswrapper[4861]: I0309 09:06:51.658971 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 09:06:51 crc kubenswrapper[4861]: I0309 09:06:51.659075 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 09:06:51 crc kubenswrapper[4861]: I0309 09:06:51.659735 4861 scope.go:117] "RemoveContainer" containerID="a68eb728eb7ca7538eca2afaa31cb79a83d99a1bca1f64dcfe07270be7395218"
Mar 09 09:06:51 crc kubenswrapper[4861]: E0309 09:06:51.662719 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 09 09:06:52 crc kubenswrapper[4861]: I0309 09:06:52.584838 4861 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 09 09:06:53 crc kubenswrapper[4861]: I0309 09:06:53.584225 4861 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 09 09:06:53 crc kubenswrapper[4861]: E0309 09:06:53.758911 4861 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 09 09:06:53 crc kubenswrapper[4861]: I0309 09:06:53.768037 4861 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 09:06:53 crc kubenswrapper[4861]: I0309 09:06:53.769618 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 09:06:53 crc kubenswrapper[4861]: I0309 09:06:53.769678 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 09:06:53 crc kubenswrapper[4861]: I0309 09:06:53.769718 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 09:06:53 crc kubenswrapper[4861]: I0309 09:06:53.769766 4861 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 09 09:06:53 crc kubenswrapper[4861]: E0309 09:06:53.776781 4861 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Mar 09 09:06:54 crc kubenswrapper[4861]: I0309 09:06:54.582351 4861 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 09 09:06:55 crc kubenswrapper[4861]: I0309 09:06:55.582610 4861 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 09 09:06:56 crc kubenswrapper[4861]: I0309 09:06:56.582716 4861 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 09 09:06:57 crc kubenswrapper[4861]: I0309 09:06:57.582692 4861 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 09 09:06:57 crc kubenswrapper[4861]: E0309 09:06:57.757932 4861 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 09 09:06:58 crc kubenswrapper[4861]: I0309 09:06:58.581440 4861 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 09 09:06:59 crc kubenswrapper[4861]: I0309 09:06:59.581118 4861 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 09 09:07:00 crc kubenswrapper[4861]: I0309 09:07:00.582342 4861 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 09 09:07:00 crc kubenswrapper[4861]: E0309 09:07:00.765516 4861 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 09 09:07:00 crc kubenswrapper[4861]: I0309 09:07:00.777642 4861 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 09:07:00 crc kubenswrapper[4861]: I0309 09:07:00.779599 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 09:07:00 crc kubenswrapper[4861]: I0309 09:07:00.779650 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 09:07:00 crc kubenswrapper[4861]: I0309 09:07:00.779675 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 09:07:00 crc kubenswrapper[4861]: I0309 09:07:00.779764 4861 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 09 09:07:00 crc kubenswrapper[4861]: E0309 09:07:00.784673 4861 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Mar 09 09:07:01 crc kubenswrapper[4861]: I0309 09:07:01.582641 4861 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 09 09:07:02 crc kubenswrapper[4861]: I0309 09:07:02.581706 4861 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 09 09:07:03 crc kubenswrapper[4861]: I0309 09:07:03.580274 4861 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 09 09:07:03 crc kubenswrapper[4861]: I0309 09:07:03.657271 4861 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 09:07:03 crc kubenswrapper[4861]: I0309 09:07:03.658945 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 09:07:03 crc kubenswrapper[4861]: I0309 09:07:03.659023 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 09:07:03 crc kubenswrapper[4861]: I0309 09:07:03.659055 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 09:07:03 crc kubenswrapper[4861]: I0309 09:07:03.659951 4861 scope.go:117] "RemoveContainer" containerID="a68eb728eb7ca7538eca2afaa31cb79a83d99a1bca1f64dcfe07270be7395218"
Mar 09 09:07:03 crc kubenswrapper[4861]: I0309 09:07:03.952679 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log"
Mar 09 09:07:03 crc kubenswrapper[4861]: I0309 09:07:03.955096 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"f88f2e129d4bfc17b0ac4416da5c5096bf314e097dfa40a48858a83425ca91e1"}
Mar 09 09:07:03 crc kubenswrapper[4861]: I0309 09:07:03.955224 4861 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 09:07:03 crc kubenswrapper[4861]: I0309 09:07:03.956169 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 09:07:03 crc kubenswrapper[4861]: I0309 09:07:03.956236 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 09:07:03 crc kubenswrapper[4861]: I0309 09:07:03.956262 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 09:07:04 crc kubenswrapper[4861]: I0309 09:07:04.580432 4861 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 09 09:07:04 crc kubenswrapper[4861]: I0309 09:07:04.958753 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log"
Mar 09 09:07:04 crc kubenswrapper[4861]: I0309 09:07:04.959541 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log"
Mar 09 09:07:04 crc kubenswrapper[4861]: I0309 09:07:04.961381 4861 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="f88f2e129d4bfc17b0ac4416da5c5096bf314e097dfa40a48858a83425ca91e1" exitCode=255
Mar 09 09:07:04 crc kubenswrapper[4861]: I0309 09:07:04.961436 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"f88f2e129d4bfc17b0ac4416da5c5096bf314e097dfa40a48858a83425ca91e1"}
Mar 09 09:07:04 crc kubenswrapper[4861]: I0309 09:07:04.961470 4861 scope.go:117] "RemoveContainer" containerID="a68eb728eb7ca7538eca2afaa31cb79a83d99a1bca1f64dcfe07270be7395218"
Mar 09 09:07:04 crc kubenswrapper[4861]: I0309 09:07:04.961607 4861 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 09:07:04 crc kubenswrapper[4861]: I0309 09:07:04.962270 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 09:07:04 crc kubenswrapper[4861]: I0309 09:07:04.962289 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 09:07:04 crc kubenswrapper[4861]: I0309 09:07:04.962297 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 09:07:04 crc kubenswrapper[4861]: I0309 09:07:04.962751 4861 scope.go:117] "RemoveContainer" containerID="f88f2e129d4bfc17b0ac4416da5c5096bf314e097dfa40a48858a83425ca91e1"
Mar 09 09:07:04 crc kubenswrapper[4861]: E0309 09:07:04.962910 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 09 09:07:05 crc kubenswrapper[4861]: I0309 09:07:05.580770 4861 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 09 09:07:05 crc kubenswrapper[4861]: I0309 09:07:05.964780 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log"
Mar 09 09:07:06 crc kubenswrapper[4861]: I0309 09:07:06.581999 4861 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 09 09:07:07 crc kubenswrapper[4861]: I0309 09:07:07.355745 4861 csr.go:261] certificate signing request csr-7r8c8 is approved, waiting to be issued
Mar 09 09:07:07 crc kubenswrapper[4861]: I0309 09:07:07.363149 4861 csr.go:257] certificate signing request csr-7r8c8 is issued
Mar 09 09:07:07 crc kubenswrapper[4861]: I0309 09:07:07.414363 4861 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Mar 09 09:07:07 crc kubenswrapper[4861]: I0309 09:07:07.444599 4861 reconstruct.go:205] "DevicePaths of reconstructed volumes updated"
Mar 09 09:07:07 crc kubenswrapper[4861]: I0309 09:07:07.639019 4861 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 09 09:07:07 crc kubenswrapper[4861]: I0309 09:07:07.639237 4861 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 09:07:07 crc kubenswrapper[4861]: I0309 09:07:07.640899 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 09:07:07 crc kubenswrapper[4861]: I0309 09:07:07.640938 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 09:07:07 crc kubenswrapper[4861]: I0309 09:07:07.640947 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 09:07:07 crc kubenswrapper[4861]: I0309 09:07:07.641674 4861 scope.go:117] "RemoveContainer" containerID="f88f2e129d4bfc17b0ac4416da5c5096bf314e097dfa40a48858a83425ca91e1"
Mar 09 09:07:07 crc kubenswrapper[4861]: E0309 09:07:07.641822 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 09 09:07:07 crc kubenswrapper[4861]: E0309 09:07:07.759057 4861 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 09 09:07:07 crc kubenswrapper[4861]: I0309 09:07:07.785595 4861 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 09:07:07 crc kubenswrapper[4861]: I0309 09:07:07.786815 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 09:07:07 crc kubenswrapper[4861]: I0309 09:07:07.786862 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 09:07:07 crc kubenswrapper[4861]: I0309 09:07:07.786874 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 09:07:07 crc kubenswrapper[4861]: I0309 09:07:07.786953 4861 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 09 09:07:07 crc kubenswrapper[4861]: I0309 09:07:07.795298 4861 kubelet_node_status.go:115] "Node was previously registered" node="crc"
Mar 09 09:07:07 crc kubenswrapper[4861]: I0309 09:07:07.795613 4861 kubelet_node_status.go:79] "Successfully registered node" node="crc"
Mar 09 09:07:07 crc kubenswrapper[4861]: E0309 09:07:07.795640 4861 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found"
Mar 09 09:07:07 crc kubenswrapper[4861]: I0309 09:07:07.799507 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 09:07:07 crc kubenswrapper[4861]: I0309 09:07:07.799541 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 09:07:07 crc kubenswrapper[4861]: I0309 09:07:07.799552 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 09:07:07 crc kubenswrapper[4861]: I0309
09:07:07.799566 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:07:07 crc kubenswrapper[4861]: I0309 09:07:07.799579 4861 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:07:07Z","lastTransitionTime":"2026-03-09T09:07:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 09:07:07 crc kubenswrapper[4861]: E0309 09:07:07.812076 4861 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:07:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:07:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:07:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:07:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c804f4c5-c5a1-4765-ad37-8a6185c798f1\\\",\\\"systemUUID\\\":\\\"6cdf8bd4-67ee-426a-bd44-5025c8d84b0b\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 09:07:07 crc kubenswrapper[4861]: I0309 09:07:07.820257 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:07 crc kubenswrapper[4861]: I0309 09:07:07.820322 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:07 crc kubenswrapper[4861]: I0309 09:07:07.820347 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:07 crc kubenswrapper[4861]: I0309 09:07:07.820408 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:07:07 crc kubenswrapper[4861]: I0309 09:07:07.820433 4861 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:07:07Z","lastTransitionTime":"2026-03-09T09:07:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:07:07 crc kubenswrapper[4861]: E0309 09:07:07.836472 4861 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:07:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:07:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:07:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:07:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c804f4c5-c5a1-4765-ad37-8a6185c798f1\\\",\\\"systemUUID\\\":\\\"6cdf8bd4-67ee-426a-bd44-5025c8d84b0b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 09:07:07 crc kubenswrapper[4861]: I0309 09:07:07.844295 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:07 crc kubenswrapper[4861]: I0309 09:07:07.844333 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:07 crc kubenswrapper[4861]: I0309 09:07:07.844345 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:07 crc kubenswrapper[4861]: I0309 09:07:07.844360 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:07:07 crc kubenswrapper[4861]: I0309 09:07:07.844387 4861 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:07:07Z","lastTransitionTime":"2026-03-09T09:07:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:07:07 crc kubenswrapper[4861]: E0309 09:07:07.856326 4861 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:07:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:07:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:07:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:07:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c804f4c5-c5a1-4765-ad37-8a6185c798f1\\\",\\\"systemUUID\\\":\\\"6cdf8bd4-67ee-426a-bd44-5025c8d84b0b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 09:07:07 crc kubenswrapper[4861]: I0309 09:07:07.867013 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:07 crc kubenswrapper[4861]: I0309 09:07:07.867053 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:07 crc kubenswrapper[4861]: I0309 09:07:07.867064 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:07 crc kubenswrapper[4861]: I0309 09:07:07.867080 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:07:07 crc kubenswrapper[4861]: I0309 09:07:07.867089 4861 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:07:07Z","lastTransitionTime":"2026-03-09T09:07:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:07:07 crc kubenswrapper[4861]: E0309 09:07:07.881862 4861 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:07:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:07:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:07:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:07:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c804f4c5-c5a1-4765-ad37-8a6185c798f1\\\",\\\"systemUUID\\\":\\\"6cdf8bd4-67ee-426a-bd44-5025c8d84b0b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 09:07:07 crc kubenswrapper[4861]: E0309 09:07:07.882035 4861 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 09 09:07:07 crc kubenswrapper[4861]: E0309 09:07:07.882064 4861 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:07:07 crc kubenswrapper[4861]: E0309 09:07:07.982125 4861 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:07:08 crc kubenswrapper[4861]: E0309 09:07:08.082678 4861 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:07:08 crc kubenswrapper[4861]: E0309 09:07:08.183465 4861 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:07:08 crc kubenswrapper[4861]: E0309 09:07:08.283808 4861 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:07:08 crc kubenswrapper[4861]: I0309 09:07:08.364643 4861 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2027-01-14 09:15:58.507226034 +0000 UTC Mar 09 09:07:08 crc kubenswrapper[4861]: I0309 09:07:08.364692 4861 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 7464h8m50.142537234s for next certificate rotation Mar 09 09:07:08 crc kubenswrapper[4861]: E0309 09:07:08.385042 4861 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:07:08 crc kubenswrapper[4861]: E0309 09:07:08.485628 4861 kubelet_node_status.go:503] "Error getting the 
current node from lister" err="node \"crc\" not found" Mar 09 09:07:08 crc kubenswrapper[4861]: E0309 09:07:08.586520 4861 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:07:08 crc kubenswrapper[4861]: E0309 09:07:08.687182 4861 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:07:08 crc kubenswrapper[4861]: E0309 09:07:08.788197 4861 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:07:08 crc kubenswrapper[4861]: E0309 09:07:08.889334 4861 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:07:08 crc kubenswrapper[4861]: E0309 09:07:08.990474 4861 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:07:09 crc kubenswrapper[4861]: E0309 09:07:09.090861 4861 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:07:09 crc kubenswrapper[4861]: I0309 09:07:09.111342 4861 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 09 09:07:09 crc kubenswrapper[4861]: E0309 09:07:09.191069 4861 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:07:09 crc kubenswrapper[4861]: E0309 09:07:09.291915 4861 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:07:09 crc kubenswrapper[4861]: E0309 09:07:09.393064 4861 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:07:09 crc kubenswrapper[4861]: E0309 09:07:09.493709 4861 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:07:09 crc kubenswrapper[4861]: E0309 09:07:09.594740 4861 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:07:09 crc kubenswrapper[4861]: E0309 09:07:09.695678 4861 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:07:09 crc kubenswrapper[4861]: E0309 09:07:09.796464 4861 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:07:09 crc kubenswrapper[4861]: E0309 09:07:09.897331 4861 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:07:09 crc kubenswrapper[4861]: E0309 09:07:09.998225 4861 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:07:10 crc kubenswrapper[4861]: E0309 09:07:10.098718 4861 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:07:10 crc kubenswrapper[4861]: E0309 09:07:10.199031 4861 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:07:10 crc kubenswrapper[4861]: E0309 09:07:10.300206 4861 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:07:10 crc kubenswrapper[4861]: E0309 09:07:10.401236 4861 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:07:10 crc kubenswrapper[4861]: E0309 09:07:10.501359 4861 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:07:10 crc kubenswrapper[4861]: I0309 09:07:10.503645 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 09:07:10 crc kubenswrapper[4861]: I0309 09:07:10.503858 4861 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 09:07:10 crc 
kubenswrapper[4861]: I0309 09:07:10.505384 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:10 crc kubenswrapper[4861]: I0309 09:07:10.505417 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:10 crc kubenswrapper[4861]: I0309 09:07:10.505429 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:10 crc kubenswrapper[4861]: I0309 09:07:10.506101 4861 scope.go:117] "RemoveContainer" containerID="f88f2e129d4bfc17b0ac4416da5c5096bf314e097dfa40a48858a83425ca91e1" Mar 09 09:07:10 crc kubenswrapper[4861]: E0309 09:07:10.506294 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 09:07:10 crc kubenswrapper[4861]: I0309 09:07:10.565074 4861 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 09 09:07:10 crc kubenswrapper[4861]: E0309 09:07:10.601519 4861 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:07:10 crc kubenswrapper[4861]: E0309 09:07:10.702312 4861 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:07:10 crc kubenswrapper[4861]: E0309 09:07:10.802876 4861 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:07:10 crc kubenswrapper[4861]: E0309 09:07:10.903270 4861 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 
09:07:11 crc kubenswrapper[4861]: E0309 09:07:11.003939 4861 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:07:11 crc kubenswrapper[4861]: E0309 09:07:11.105268 4861 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:07:11 crc kubenswrapper[4861]: E0309 09:07:11.206285 4861 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:07:11 crc kubenswrapper[4861]: E0309 09:07:11.306974 4861 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:07:11 crc kubenswrapper[4861]: E0309 09:07:11.407997 4861 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:07:11 crc kubenswrapper[4861]: E0309 09:07:11.508592 4861 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:07:11 crc kubenswrapper[4861]: E0309 09:07:11.608991 4861 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:07:11 crc kubenswrapper[4861]: E0309 09:07:11.709618 4861 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:07:11 crc kubenswrapper[4861]: E0309 09:07:11.810398 4861 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:07:11 crc kubenswrapper[4861]: E0309 09:07:11.911451 4861 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:07:12 crc kubenswrapper[4861]: E0309 09:07:12.011933 4861 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:07:12 crc kubenswrapper[4861]: E0309 09:07:12.112097 4861 kubelet_node_status.go:503] "Error getting the current node from 
lister" err="node \"crc\" not found" Mar 09 09:07:12 crc kubenswrapper[4861]: E0309 09:07:12.213223 4861 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:07:12 crc kubenswrapper[4861]: E0309 09:07:12.313485 4861 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:07:12 crc kubenswrapper[4861]: E0309 09:07:12.414346 4861 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:07:12 crc kubenswrapper[4861]: E0309 09:07:12.515275 4861 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:07:12 crc kubenswrapper[4861]: E0309 09:07:12.616277 4861 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:07:12 crc kubenswrapper[4861]: E0309 09:07:12.717122 4861 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:07:12 crc kubenswrapper[4861]: E0309 09:07:12.817646 4861 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:07:12 crc kubenswrapper[4861]: E0309 09:07:12.918633 4861 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:07:13 crc kubenswrapper[4861]: E0309 09:07:13.019331 4861 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:07:13 crc kubenswrapper[4861]: E0309 09:07:13.120616 4861 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:07:13 crc kubenswrapper[4861]: E0309 09:07:13.221719 4861 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:07:13 crc kubenswrapper[4861]: E0309 09:07:13.322617 4861 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:07:13 crc kubenswrapper[4861]: E0309 09:07:13.423559 4861 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:07:13 crc kubenswrapper[4861]: E0309 09:07:13.523997 4861 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:07:13 crc kubenswrapper[4861]: E0309 09:07:13.624175 4861 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:07:13 crc kubenswrapper[4861]: E0309 09:07:13.724746 4861 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:07:13 crc kubenswrapper[4861]: E0309 09:07:13.825259 4861 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:07:13 crc kubenswrapper[4861]: E0309 09:07:13.925704 4861 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:07:14 crc kubenswrapper[4861]: E0309 09:07:14.026457 4861 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:07:14 crc kubenswrapper[4861]: E0309 09:07:14.126639 4861 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:07:14 crc kubenswrapper[4861]: E0309 09:07:14.227657 4861 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:07:14 crc kubenswrapper[4861]: E0309 09:07:14.328442 4861 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:07:14 crc kubenswrapper[4861]: E0309 09:07:14.429465 4861 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:07:14 crc 
kubenswrapper[4861]: E0309 09:07:14.529889 4861 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:07:14 crc kubenswrapper[4861]: E0309 09:07:14.630049 4861 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:07:14 crc kubenswrapper[4861]: E0309 09:07:14.730566 4861 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:07:14 crc kubenswrapper[4861]: E0309 09:07:14.831029 4861 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:07:14 crc kubenswrapper[4861]: E0309 09:07:14.931939 4861 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:07:15 crc kubenswrapper[4861]: E0309 09:07:15.032462 4861 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:07:15 crc kubenswrapper[4861]: E0309 09:07:15.133001 4861 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:07:15 crc kubenswrapper[4861]: E0309 09:07:15.233422 4861 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:07:15 crc kubenswrapper[4861]: E0309 09:07:15.334048 4861 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:07:15 crc kubenswrapper[4861]: E0309 09:07:15.434853 4861 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:07:15 crc kubenswrapper[4861]: E0309 09:07:15.536013 4861 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:07:15 crc kubenswrapper[4861]: E0309 09:07:15.637116 4861 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 09 09:07:15 crc kubenswrapper[4861]: E0309 09:07:15.738096 4861 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:07:15 crc kubenswrapper[4861]: E0309 09:07:15.839732 4861 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:07:15 crc kubenswrapper[4861]: E0309 09:07:15.940401 4861 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:07:16 crc kubenswrapper[4861]: E0309 09:07:16.041033 4861 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:07:16 crc kubenswrapper[4861]: E0309 09:07:16.141182 4861 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:07:16 crc kubenswrapper[4861]: E0309 09:07:16.242255 4861 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:07:16 crc kubenswrapper[4861]: E0309 09:07:16.343443 4861 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:07:16 crc kubenswrapper[4861]: E0309 09:07:16.444640 4861 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:07:16 crc kubenswrapper[4861]: E0309 09:07:16.545568 4861 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:07:16 crc kubenswrapper[4861]: E0309 09:07:16.646122 4861 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:07:16 crc kubenswrapper[4861]: E0309 09:07:16.746642 4861 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:07:16 crc kubenswrapper[4861]: E0309 09:07:16.848446 4861 kubelet_node_status.go:503] 
"Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:07:16 crc kubenswrapper[4861]: E0309 09:07:16.949259 4861 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:07:17 crc kubenswrapper[4861]: E0309 09:07:17.050426 4861 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:07:17 crc kubenswrapper[4861]: E0309 09:07:17.150780 4861 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:07:17 crc kubenswrapper[4861]: E0309 09:07:17.251009 4861 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:07:17 crc kubenswrapper[4861]: E0309 09:07:17.351186 4861 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:07:17 crc kubenswrapper[4861]: E0309 09:07:17.452422 4861 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:07:17 crc kubenswrapper[4861]: E0309 09:07:17.552867 4861 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:07:17 crc kubenswrapper[4861]: E0309 09:07:17.653762 4861 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:07:17 crc kubenswrapper[4861]: E0309 09:07:17.754323 4861 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:07:17 crc kubenswrapper[4861]: E0309 09:07:17.759614 4861 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 09 09:07:17 crc kubenswrapper[4861]: E0309 09:07:17.855426 4861 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:07:17 crc 
kubenswrapper[4861]: E0309 09:07:17.956261 4861 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 09:07:18 crc kubenswrapper[4861]: E0309 09:07:18.023466 4861 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.028984 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.029029 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.029069 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.029156 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.029176 4861 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:07:18Z","lastTransitionTime":"2026-03-09T09:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.030045 4861 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 09 09:07:18 crc kubenswrapper[4861]: E0309 09:07:18.045429 4861 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c804f4c5-c5a1-4765-ad37-8a6185c798f1\\\",\\\"systemUUID\\\":\\\"6cdf8bd4-67ee-426a-bd44-5025c8d84b0b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.050141 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.050183 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.050195 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.050211 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.050223 4861 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:07:18Z","lastTransitionTime":"2026-03-09T09:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:07:18 crc kubenswrapper[4861]: E0309 09:07:18.061879 4861 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c804f4c5-c5a1-4765-ad37-8a6185c798f1\\\",\\\"systemUUID\\\":\\\"6cdf8bd4-67ee-426a-bd44-5025c8d84b0b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.065833 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.065892 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.065909 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.065932 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.065947 4861 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:07:18Z","lastTransitionTime":"2026-03-09T09:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:07:18 crc kubenswrapper[4861]: E0309 09:07:18.076719 4861 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c804f4c5-c5a1-4765-ad37-8a6185c798f1\\\",\\\"systemUUID\\\":\\\"6cdf8bd4-67ee-426a-bd44-5025c8d84b0b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.080430 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.080465 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.080478 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.080493 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.080506 4861 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:07:18Z","lastTransitionTime":"2026-03-09T09:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:07:18 crc kubenswrapper[4861]: E0309 09:07:18.092520 4861 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"c804f4c5-c5a1-4765-ad37-8a6185c798f1\\\",\\\"systemUUID\\\":\\\"6cdf8bd4-67ee-426a-bd44-5025c8d84b0b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 09:07:18 crc kubenswrapper[4861]: E0309 09:07:18.092746 4861 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.094545 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.094591 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.094600 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.094613 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.094635 4861 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:07:18Z","lastTransitionTime":"2026-03-09T09:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.197338 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.197439 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.197466 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.197496 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.197520 4861 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:07:18Z","lastTransitionTime":"2026-03-09T09:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.301081 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.301144 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.301161 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.301179 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.301201 4861 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:07:18Z","lastTransitionTime":"2026-03-09T09:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.404336 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.404493 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.404521 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.404544 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.404563 4861 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:07:18Z","lastTransitionTime":"2026-03-09T09:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.506610 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.506680 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.506699 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.506723 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.506740 4861 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:07:18Z","lastTransitionTime":"2026-03-09T09:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.598409 4861 apiserver.go:52] "Watching apiserver" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.603600 4861 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.603905 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf"] Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.604436 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.604501 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 09:07:18 crc kubenswrapper[4861]: E0309 09:07:18.605343 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.604543 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.604571 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.604611 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.604510 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 09:07:18 crc kubenswrapper[4861]: E0309 09:07:18.605601 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 09:07:18 crc kubenswrapper[4861]: E0309 09:07:18.605661 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.608950 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.609025 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.609049 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.609080 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.609105 4861 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:07:18Z","lastTransitionTime":"2026-03-09T09:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.616184 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.616568 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.616600 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.616574 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.616812 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.617423 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.619449 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.619881 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.620334 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.645178 4861 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.658352 4861 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.670623 4861 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.671869 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.680608 4861 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.682599 4861 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.694472 4861 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.709944 4861 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.712341 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.712490 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.712526 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.712563 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.712588 4861 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:07:18Z","lastTransitionTime":"2026-03-09T09:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.716905 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.716968 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.717006 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.717039 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.717082 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.717129 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.717177 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.717227 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.717278 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.717329 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.717416 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 
09:07:18.717475 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.717537 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.717589 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.717647 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.717703 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.717751 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.717813 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.717874 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.717964 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.718025 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.718078 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.718138 4861 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.718191 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.718242 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.718292 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.718339 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.718490 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.718542 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.718592 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.718646 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.718698 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.718753 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 09 
09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.718808 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.718963 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.719066 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.719119 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.719180 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.719230 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: 
\"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.719294 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.719347 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.719441 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.717639 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.719503 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.719563 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.719682 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.719808 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.719876 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.719931 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.720005 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.720050 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.720087 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.720123 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.720156 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 
09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.720444 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.720489 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.720524 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.717752 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.717763 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.718011 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.718119 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.718133 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.718169 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.718234 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.718274 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.721500 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.718391 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.718485 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.718618 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.718749 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.718883 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.719082 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.719330 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.719352 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.719452 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.719536 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.719591 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.719811 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.720051 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.720194 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.720817 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.721303 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.721361 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.721517 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.721508 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.721997 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.722080 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.722074 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.722107 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.722186 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.722227 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.722335 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 
09:07:18.722404 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.722441 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.722454 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.722475 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.722510 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.722539 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.722545 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.722641 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.722858 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.722884 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.722887 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.722898 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.722976 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.723001 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.723019 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " 
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.723040 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.723057 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.723072 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.723090 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.723106 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.723122 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.723139 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.723156 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.723052 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.723230 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.723248 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.723267 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.723303 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.723344 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 09:07:18 crc kubenswrapper[4861]: E0309 09:07:18.723433 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:07:19.223413412 +0000 UTC m=+82.308452803 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.723451 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") "
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.723471 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") "
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.723491 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") "
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.723507 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.723522 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.723545 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.723561 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") "
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.723576 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") "
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.723592 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.723607 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.723622 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.723670 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.723688 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.723705 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") "
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.723721 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.723737 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") "
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.723752 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") "
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.723767 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.723783 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.723798 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") "
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.723814 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.723829 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") "
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.723845 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") "
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.723860 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.723876 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.723893 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") "
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.723910 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.723931 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") "
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.723948 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.723967 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") "
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.723982 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.723997 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") "
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.724012 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.724029 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.724046 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.724064 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") "
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.724079 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.724098 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.724115 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.724134 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.724153 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.724170 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.724187 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.724205 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.724222 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.724241 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.724258 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") "
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.724275 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.724292 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.724307 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.724323 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.724351 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.724389 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.724420 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.724446 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.724462 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") "
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.724478 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.724495 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.724512 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.724529 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.724547 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.724566 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") "
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.724584 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") "
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.724603 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.724622 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.724639 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.724656 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.724672 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.724689 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.724709 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.724727 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.724742 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.724759 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.724774 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.724790 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.724848 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.724866 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.724883 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.724901 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.724918 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") "
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.724937 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.724957 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") "
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.724974 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.724994 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.725009 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.725028 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.725050 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") "
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.725066 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") "
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.725086 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.725102 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") "
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.725119 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.725136 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.725157 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.725172 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.725189 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.725207 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.725230 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") "
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.725251 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.725272 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.725290 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.725307 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") "
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.725326 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.725343 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.725359 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.725402 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.725423 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.725441 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.725461 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") "
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.725478 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.725494 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.725541 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.725568 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.725591 4861 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.725610 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.725631 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.725650 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.725667 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 09 09:07:18 crc 
kubenswrapper[4861]: I0309 09:07:18.725690 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.725708 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.725726 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.725747 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.725769 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: 
\"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.725786 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.725807 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.725867 4861 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.725881 4861 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.725893 4861 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.725903 4861 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on 
node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.725913 4861 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.725949 4861 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.725959 4861 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.725968 4861 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.725978 4861 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.725987 4861 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.725997 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.726013 4861 reconciler_common.go:293] "Volume 
detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.726023 4861 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.726033 4861 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.726044 4861 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.726056 4861 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.726065 4861 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.726075 4861 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.726084 4861 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on 
node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.726094 4861 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.726104 4861 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.726113 4861 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.726123 4861 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.726133 4861 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.726181 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.726192 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath 
\"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.726202 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.726211 4861 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.726222 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.726254 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.726267 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.726277 4861 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.726287 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 
09:07:18.726296 4861 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.726306 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.728955 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.729717 4861 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.732497 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.736264 4861 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.723466 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.723859 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.724116 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.735669 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.724534 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.724899 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.725171 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.725463 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.725580 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.725742 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.726654 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.726658 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.726698 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.726794 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:07:18 crc kubenswrapper[4861]: E0309 09:07:18.727718 4861 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.728711 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.729327 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.729423 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.730200 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.730696 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.730996 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.731039 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.731894 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.731986 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.732007 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.732139 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.732180 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.732198 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.732234 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.732319 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:07:18 crc kubenswrapper[4861]: E0309 09:07:18.732531 4861 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.732792 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.733063 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.733501 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.733530 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.733642 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.734071 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.734102 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.734322 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.734365 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.734449 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.734570 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.734812 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.734871 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.734939 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.735216 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.735208 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.735257 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.735288 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.735399 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.735760 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.735972 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.736200 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.736293 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.736292 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.736453 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.736586 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.736826 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.736876 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.737232 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:07:18 crc kubenswrapper[4861]: E0309 09:07:18.738202 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 09:07:19.238117477 +0000 UTC m=+82.323156928 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Mar 09 09:07:18 crc kubenswrapper[4861]: E0309 09:07:18.738350 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 09:07:19.238336933 +0000 UTC m=+82.323376444 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.736713 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.738700 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.738846 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.738993 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.738875 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.739629 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.739105 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.739848 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.739725 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.740306 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.740561 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.742151 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.742815 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.743431 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.743778 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.744019 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.744067 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.744320 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.744427 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.744427 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.744598 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.744623 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.744742 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.744753 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.744918 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.744969 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.745281 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.745423 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.745502 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.745840 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.745912 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.745985 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.746650 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.746659 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.746784 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.747776 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.748777 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.748864 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.749707 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config".
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.754468 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.754716 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.755586 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.755622 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.755685 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.756043 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.756919 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.757355 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.757643 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:07:18 crc kubenswrapper[4861]: E0309 09:07:18.758043 4861 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 09:07:18 crc kubenswrapper[4861]: E0309 09:07:18.758098 4861 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 09:07:18 crc kubenswrapper[4861]: E0309 09:07:18.758113 4861 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 09:07:18 crc kubenswrapper[4861]: E0309 09:07:18.758183 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr 
podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-09 09:07:19.258162636 +0000 UTC m=+82.343202047 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.760352 4861 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 09:07:18 crc kubenswrapper[4861]: E0309 09:07:18.762994 4861 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 09:07:18 crc kubenswrapper[4861]: E0309 09:07:18.763035 4861 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 09:07:18 crc kubenswrapper[4861]: E0309 09:07:18.763052 4861 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.763067 4861 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:07:18 crc kubenswrapper[4861]: E0309 09:07:18.763121 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-09 09:07:19.2631024 +0000 UTC m=+82.348141901 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.763628 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.765131 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.765730 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.767319 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.767488 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.767587 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.767796 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.767696 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.767992 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.767956 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.768136 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.768165 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.768304 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.769118 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.769512 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.769692 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.770246 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.770350 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.770635 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.770700 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.770973 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.772048 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.772074 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.772295 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.772212 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.772277 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.772537 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.772547 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.773128 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.773418 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.773572 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.773975 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.774053 4861 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.774131 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.774408 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.774829 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.774807 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.774904 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.775188 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.775237 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.775301 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.775467 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.775618 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.775780 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.775837 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.776135 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.778499 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.778843 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.779588 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.779755 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.780229 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.780400 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.780504 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.780886 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.797491 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.798102 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.802670 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.812536 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.816485 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.816526 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.816574 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.816597 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.816613 4861 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:07:18Z","lastTransitionTime":"2026-03-09T09:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.827309 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.827414 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.827532 4861 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.827534 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.827559 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.827553 4861 reconciler_common.go:293] "Volume detached 
for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.827626 4861 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.827645 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.827661 4861 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.827677 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.827694 4861 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.827876 4861 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.827892 4861 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.827907 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.827924 4861 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.827968 4861 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.827988 4861 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.828004 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.828019 4861 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.828034 4861 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.828049 4861 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.828064 4861 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.828080 4861 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.828096 4861 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.828115 4861 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.828133 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.828148 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.828164 4861 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.828180 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.828196 4861 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.828212 4861 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.828227 4861 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.828244 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.828258 4861 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" 
DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.828273 4861 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.828289 4861 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.828303 4861 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.828319 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.828333 4861 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.828348 4861 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.828364 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.828411 4861 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.828426 4861 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.828441 4861 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.828456 4861 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.828471 4861 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.828487 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.828502 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.828519 4861 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.828534 4861 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.828549 4861 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.828564 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.828580 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.828596 4861 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.828611 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.828651 4861 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.828673 4861 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.828689 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.828705 4861 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.828720 4861 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.828736 4861 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.828751 4861 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.828766 4861 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc 
kubenswrapper[4861]: I0309 09:07:18.828781 4861 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.828797 4861 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.828811 4861 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.828826 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.828841 4861 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.828858 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.828873 4861 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.828887 4861 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.828902 4861 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.828916 4861 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.828931 4861 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.828948 4861 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.828962 4861 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.828977 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.828992 4861 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" 
Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.829009 4861 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.829026 4861 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.829040 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.829059 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.829074 4861 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.829090 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.829106 4861 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 
09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.829121 4861 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.829137 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.829153 4861 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.829170 4861 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.829186 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.829201 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.829217 4861 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.829233 4861 
reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.829249 4861 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.829266 4861 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.829284 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.829301 4861 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.829317 4861 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.829333 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.829349 4861 reconciler_common.go:293] "Volume detached for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.829365 4861 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.829404 4861 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.829420 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.829436 4861 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.829455 4861 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.829474 4861 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.829490 4861 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node 
\"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.829504 4861 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.829520 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.829539 4861 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.829555 4861 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.829571 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.829586 4861 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.829601 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.829617 4861 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.829632 4861 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.829646 4861 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.829683 4861 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.829701 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.829716 4861 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.829732 4861 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.829749 4861 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.829764 4861 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.829781 4861 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.829797 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.829812 4861 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.829827 4861 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.829843 4861 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.829857 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Mar 09 
09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.829872 4861 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.829889 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.829904 4861 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.829919 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.829933 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.829985 4861 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.830000 4861 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.830016 4861 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.830032 4861 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.830048 4861 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.830064 4861 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.830080 4861 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.830096 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.830112 4861 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.830126 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: 
\"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.830187 4861 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.830204 4861 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.830219 4861 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.830234 4861 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.830249 4861 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.830264 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.830282 4861 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" 
DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.830298 4861 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.830314 4861 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.830331 4861 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.830347 4861 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.830365 4861 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.830402 4861 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.830417 4861 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.830434 4861 reconciler_common.go:293] "Volume detached for 
volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.830449 4861 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.830465 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.830521 4861 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.830539 4861 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.919450 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.919504 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.919521 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.919546 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.919563 4861 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:07:18Z","lastTransitionTime":"2026-03-09T09:07:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.930806 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.941015 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 09 09:07:18 crc kubenswrapper[4861]: I0309 09:07:18.945470 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 09 09:07:18 crc kubenswrapper[4861]: W0309 09:07:18.952070 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-d1c254f76f779ea41ceeb3d6772fbcb251c477b7610e237212cce011306165c9 WatchSource:0}: Error finding container d1c254f76f779ea41ceeb3d6772fbcb251c477b7610e237212cce011306165c9: Status 404 returned error can't find the container with id d1c254f76f779ea41ceeb3d6772fbcb251c477b7610e237212cce011306165c9 Mar 09 09:07:18 crc kubenswrapper[4861]: W0309 09:07:18.973956 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-c8bbae2a0939eda570b8934902b4bb4fbfb6f5b9b5c465633b9b90bed8cef09f WatchSource:0}: Error finding container c8bbae2a0939eda570b8934902b4bb4fbfb6f5b9b5c465633b9b90bed8cef09f: Status 404 returned error can't find the container with id c8bbae2a0939eda570b8934902b4bb4fbfb6f5b9b5c465633b9b90bed8cef09f Mar 09 09:07:19 crc kubenswrapper[4861]: I0309 09:07:19.003299 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"c8bbae2a0939eda570b8934902b4bb4fbfb6f5b9b5c465633b9b90bed8cef09f"} Mar 09 09:07:19 crc kubenswrapper[4861]: I0309 09:07:19.004642 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"d1c254f76f779ea41ceeb3d6772fbcb251c477b7610e237212cce011306165c9"} Mar 09 09:07:19 crc kubenswrapper[4861]: I0309 09:07:19.006334 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"3d404017e001e50a22ef67a166a4cb91a41a01e330ba957153868bcbfbd53ece"} Mar 09 09:07:19 crc kubenswrapper[4861]: I0309 09:07:19.021702 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:19 crc kubenswrapper[4861]: I0309 09:07:19.021762 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:19 crc kubenswrapper[4861]: I0309 09:07:19.021780 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:19 crc kubenswrapper[4861]: I0309 09:07:19.021804 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:07:19 crc kubenswrapper[4861]: I0309 09:07:19.021823 4861 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:07:19Z","lastTransitionTime":"2026-03-09T09:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:07:19 crc kubenswrapper[4861]: I0309 09:07:19.124032 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:19 crc kubenswrapper[4861]: I0309 09:07:19.124090 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:19 crc kubenswrapper[4861]: I0309 09:07:19.124108 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:19 crc kubenswrapper[4861]: I0309 09:07:19.124130 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:07:19 crc kubenswrapper[4861]: I0309 09:07:19.124146 4861 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:07:19Z","lastTransitionTime":"2026-03-09T09:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:07:19 crc kubenswrapper[4861]: I0309 09:07:19.226614 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:19 crc kubenswrapper[4861]: I0309 09:07:19.226648 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:19 crc kubenswrapper[4861]: I0309 09:07:19.226657 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:19 crc kubenswrapper[4861]: I0309 09:07:19.226673 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:07:19 crc kubenswrapper[4861]: I0309 09:07:19.226682 4861 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:07:19Z","lastTransitionTime":"2026-03-09T09:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 09:07:19 crc kubenswrapper[4861]: I0309 09:07:19.233071 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:07:19 crc kubenswrapper[4861]: E0309 09:07:19.233210 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-09 09:07:20.233193646 +0000 UTC m=+83.318233047 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:07:19 crc kubenswrapper[4861]: I0309 09:07:19.329469 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:19 crc kubenswrapper[4861]: I0309 09:07:19.329504 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:19 crc kubenswrapper[4861]: I0309 09:07:19.329513 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:19 crc kubenswrapper[4861]: I0309 09:07:19.329528 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:07:19 crc kubenswrapper[4861]: I0309 09:07:19.329537 4861 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:07:19Z","lastTransitionTime":"2026-03-09T09:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:07:19 crc kubenswrapper[4861]: I0309 09:07:19.334361 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 09:07:19 crc kubenswrapper[4861]: I0309 09:07:19.334492 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 09:07:19 crc kubenswrapper[4861]: I0309 09:07:19.334549 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 09:07:19 crc kubenswrapper[4861]: I0309 09:07:19.334614 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 09:07:19 crc kubenswrapper[4861]: E0309 09:07:19.334688 4861 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object 
"openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 09:07:19 crc kubenswrapper[4861]: E0309 09:07:19.334708 4861 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 09:07:19 crc kubenswrapper[4861]: E0309 09:07:19.334720 4861 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 09 09:07:19 crc kubenswrapper[4861]: E0309 09:07:19.334746 4861 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 09:07:19 crc kubenswrapper[4861]: E0309 09:07:19.334759 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 09:07:20.334740538 +0000 UTC m=+83.419780039 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 09:07:19 crc kubenswrapper[4861]: E0309 09:07:19.334766 4861 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 09:07:19 crc kubenswrapper[4861]: E0309 09:07:19.334803 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 09:07:20.334781669 +0000 UTC m=+83.419821100 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 09 09:07:19 crc kubenswrapper[4861]: E0309 09:07:19.334832 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-09 09:07:20.334814159 +0000 UTC m=+83.419853600 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 09:07:19 crc kubenswrapper[4861]: E0309 09:07:19.334906 4861 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 09:07:19 crc kubenswrapper[4861]: E0309 09:07:19.334960 4861 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 09:07:19 crc kubenswrapper[4861]: E0309 09:07:19.334986 4861 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 09:07:19 crc kubenswrapper[4861]: E0309 09:07:19.335101 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-09 09:07:20.335069456 +0000 UTC m=+83.420108897 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 09:07:19 crc kubenswrapper[4861]: I0309 09:07:19.432643 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:19 crc kubenswrapper[4861]: I0309 09:07:19.432683 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:19 crc kubenswrapper[4861]: I0309 09:07:19.432692 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:19 crc kubenswrapper[4861]: I0309 09:07:19.432709 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:07:19 crc kubenswrapper[4861]: I0309 09:07:19.432719 4861 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:07:19Z","lastTransitionTime":"2026-03-09T09:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:07:19 crc kubenswrapper[4861]: I0309 09:07:19.535735 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:19 crc kubenswrapper[4861]: I0309 09:07:19.535783 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:19 crc kubenswrapper[4861]: I0309 09:07:19.535798 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:19 crc kubenswrapper[4861]: I0309 09:07:19.535818 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:07:19 crc kubenswrapper[4861]: I0309 09:07:19.535832 4861 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:07:19Z","lastTransitionTime":"2026-03-09T09:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:07:19 crc kubenswrapper[4861]: I0309 09:07:19.639932 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:19 crc kubenswrapper[4861]: I0309 09:07:19.640013 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:19 crc kubenswrapper[4861]: I0309 09:07:19.640036 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:19 crc kubenswrapper[4861]: I0309 09:07:19.640075 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:07:19 crc kubenswrapper[4861]: I0309 09:07:19.640100 4861 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:07:19Z","lastTransitionTime":"2026-03-09T09:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:07:19 crc kubenswrapper[4861]: I0309 09:07:19.666039 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Mar 09 09:07:19 crc kubenswrapper[4861]: I0309 09:07:19.667168 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Mar 09 09:07:19 crc kubenswrapper[4861]: I0309 09:07:19.669527 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Mar 09 09:07:19 crc kubenswrapper[4861]: I0309 09:07:19.672022 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Mar 09 09:07:19 crc kubenswrapper[4861]: I0309 09:07:19.674107 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Mar 09 09:07:19 crc kubenswrapper[4861]: I0309 09:07:19.675579 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Mar 09 09:07:19 crc kubenswrapper[4861]: I0309 09:07:19.677076 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Mar 09 09:07:19 crc kubenswrapper[4861]: I0309 09:07:19.679137 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" 
path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Mar 09 09:07:19 crc kubenswrapper[4861]: I0309 09:07:19.680412 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Mar 09 09:07:19 crc kubenswrapper[4861]: I0309 09:07:19.682172 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Mar 09 09:07:19 crc kubenswrapper[4861]: I0309 09:07:19.683073 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Mar 09 09:07:19 crc kubenswrapper[4861]: I0309 09:07:19.684803 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Mar 09 09:07:19 crc kubenswrapper[4861]: I0309 09:07:19.685784 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Mar 09 09:07:19 crc kubenswrapper[4861]: I0309 09:07:19.686527 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Mar 09 09:07:19 crc kubenswrapper[4861]: I0309 09:07:19.687696 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Mar 09 09:07:19 crc kubenswrapper[4861]: I0309 09:07:19.688359 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" 
path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Mar 09 09:07:19 crc kubenswrapper[4861]: I0309 09:07:19.689594 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Mar 09 09:07:19 crc kubenswrapper[4861]: I0309 09:07:19.690058 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Mar 09 09:07:19 crc kubenswrapper[4861]: I0309 09:07:19.690701 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Mar 09 09:07:19 crc kubenswrapper[4861]: I0309 09:07:19.691851 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Mar 09 09:07:19 crc kubenswrapper[4861]: I0309 09:07:19.692389 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Mar 09 09:07:19 crc kubenswrapper[4861]: I0309 09:07:19.693444 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Mar 09 09:07:19 crc kubenswrapper[4861]: I0309 09:07:19.693933 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Mar 09 09:07:19 crc kubenswrapper[4861]: I0309 09:07:19.695144 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" 
path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Mar 09 09:07:19 crc kubenswrapper[4861]: I0309 09:07:19.695676 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Mar 09 09:07:19 crc kubenswrapper[4861]: I0309 09:07:19.696527 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Mar 09 09:07:19 crc kubenswrapper[4861]: I0309 09:07:19.697933 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Mar 09 09:07:19 crc kubenswrapper[4861]: I0309 09:07:19.698631 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Mar 09 09:07:19 crc kubenswrapper[4861]: I0309 09:07:19.700674 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Mar 09 09:07:19 crc kubenswrapper[4861]: I0309 09:07:19.702449 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Mar 09 09:07:19 crc kubenswrapper[4861]: I0309 09:07:19.704187 4861 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Mar 09 09:07:19 crc kubenswrapper[4861]: I0309 09:07:19.704454 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Mar 09 09:07:19 crc kubenswrapper[4861]: I0309 09:07:19.707663 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Mar 09 09:07:19 crc kubenswrapper[4861]: I0309 09:07:19.708895 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Mar 09 09:07:19 crc kubenswrapper[4861]: I0309 09:07:19.709918 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Mar 09 09:07:19 crc kubenswrapper[4861]: I0309 09:07:19.712705 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Mar 09 09:07:19 crc kubenswrapper[4861]: I0309 09:07:19.713914 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Mar 09 09:07:19 crc kubenswrapper[4861]: I0309 09:07:19.715286 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Mar 09 09:07:19 crc kubenswrapper[4861]: I0309 09:07:19.716186 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Mar 09 09:07:19 crc kubenswrapper[4861]: I0309 09:07:19.717698 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Mar 09 09:07:19 crc kubenswrapper[4861]: I0309 09:07:19.719196 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Mar 09 09:07:19 crc kubenswrapper[4861]: I0309 09:07:19.721653 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Mar 09 09:07:19 crc kubenswrapper[4861]: I0309 09:07:19.723255 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Mar 09 09:07:19 crc kubenswrapper[4861]: I0309 09:07:19.725598 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Mar 09 09:07:19 crc kubenswrapper[4861]: I0309 09:07:19.727076 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Mar 09 09:07:19 crc kubenswrapper[4861]: I0309 09:07:19.728868 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Mar 09 09:07:19 crc kubenswrapper[4861]: I0309 09:07:19.731165 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Mar 09 09:07:19 crc kubenswrapper[4861]: I0309 09:07:19.732818 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Mar 09 09:07:19 crc kubenswrapper[4861]: I0309 09:07:19.733815 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Mar 09 09:07:19 crc kubenswrapper[4861]: I0309 09:07:19.735990 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Mar 09 09:07:19 crc kubenswrapper[4861]: I0309 09:07:19.738088 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Mar 09 09:07:19 crc kubenswrapper[4861]: I0309 09:07:19.740925 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Mar 09 09:07:19 crc kubenswrapper[4861]: I0309 09:07:19.742484 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Mar 09 09:07:19 crc kubenswrapper[4861]: I0309 09:07:19.744061 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:19 crc kubenswrapper[4861]: I0309 09:07:19.744119 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:19 crc kubenswrapper[4861]: I0309 09:07:19.744137 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:19 crc kubenswrapper[4861]: I0309 09:07:19.744162 4861 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeNotReady" Mar 09 09:07:19 crc kubenswrapper[4861]: I0309 09:07:19.744179 4861 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:07:19Z","lastTransitionTime":"2026-03-09T09:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 09:07:19 crc kubenswrapper[4861]: I0309 09:07:19.744756 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Mar 09 09:07:19 crc kubenswrapper[4861]: I0309 09:07:19.848070 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:19 crc kubenswrapper[4861]: I0309 09:07:19.848170 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:19 crc kubenswrapper[4861]: I0309 09:07:19.848197 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:19 crc kubenswrapper[4861]: I0309 09:07:19.848228 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:07:19 crc kubenswrapper[4861]: I0309 09:07:19.848251 4861 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:07:19Z","lastTransitionTime":"2026-03-09T09:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:07:19 crc kubenswrapper[4861]: I0309 09:07:19.951459 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:19 crc kubenswrapper[4861]: I0309 09:07:19.951559 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:19 crc kubenswrapper[4861]: I0309 09:07:19.951610 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:19 crc kubenswrapper[4861]: I0309 09:07:19.951638 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:07:19 crc kubenswrapper[4861]: I0309 09:07:19.951686 4861 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:07:19Z","lastTransitionTime":"2026-03-09T09:07:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:07:20 crc kubenswrapper[4861]: I0309 09:07:20.012517 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"6feef2c388d1996afdc05df20c33bfa3a31b698acff5c0fbd5540a89232bd77d"} Mar 09 09:07:20 crc kubenswrapper[4861]: I0309 09:07:20.012634 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"083d0704addd86d959d059e1aa35de6dc12a77441fe075c63e07683d71f8e45c"} Mar 09 09:07:20 crc kubenswrapper[4861]: I0309 09:07:20.015603 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"5dfc8681e1432c42ca87803558410f2206409c7340927892c3dc3424948ce58d"} Mar 09 09:07:20 crc kubenswrapper[4861]: I0309 09:07:20.033154 4861 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"message\\\":\\\"containers with unready 
status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:07:20Z is after 2025-08-24T17:21:41Z" Mar 09 09:07:20 crc kubenswrapper[4861]: I0309 09:07:20.049531 4861 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:07:20Z is after 2025-08-24T17:21:41Z" Mar 09 09:07:20 crc kubenswrapper[4861]: I0309 09:07:20.054275 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:20 crc kubenswrapper[4861]: I0309 09:07:20.054312 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:20 crc kubenswrapper[4861]: I0309 09:07:20.054323 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:20 crc kubenswrapper[4861]: I0309 09:07:20.054348 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:07:20 crc kubenswrapper[4861]: I0309 09:07:20.054363 4861 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:07:20Z","lastTransitionTime":"2026-03-09T09:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 09:07:20 crc kubenswrapper[4861]: I0309 09:07:20.063979 4861 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:07:20Z is after 2025-08-24T17:21:41Z" Mar 09 09:07:20 crc kubenswrapper[4861]: I0309 09:07:20.077967 4861 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6feef2c388d1996afdc05df20c33bfa3a31b698acff5c0fbd5540a89232bd77d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://083d0704addd86d959d059e1aa35de6dc12a77441fe075c63e07683d71f8e45c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:07:20Z is after 2025-08-24T17:21:41Z" Mar 09 09:07:20 crc kubenswrapper[4861]: I0309 09:07:20.094230 4861 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c3c32db-aaa0-410d-90f7-2056906878f4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:05:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:05:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:06:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:06:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a138696f23970dc61461f70321b50c12dd25a9695ec721ccc37f3ae03b2faa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3890d561e9788da50117d5ae37e9bc3decefea7ca10d396f77a05a3b2874c0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3890d561e9788da50117d5ae37e9bc3decefea7ca10d396f77a05a3b2874c0e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:05:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:05:57Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:07:20Z is after 2025-08-24T17:21:41Z" Mar 09 09:07:20 crc kubenswrapper[4861]: I0309 09:07:20.112359 4861 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:07:20Z is after 2025-08-24T17:21:41Z" Mar 09 09:07:20 crc kubenswrapper[4861]: I0309 09:07:20.136739 4861 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:07:20Z is after 2025-08-24T17:21:41Z" Mar 09 09:07:20 crc kubenswrapper[4861]: I0309 09:07:20.156346 4861 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:07:20Z is after 2025-08-24T17:21:41Z" Mar 09 09:07:20 crc kubenswrapper[4861]: I0309 09:07:20.156911 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:20 crc kubenswrapper[4861]: I0309 09:07:20.156952 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:20 crc 
kubenswrapper[4861]: I0309 09:07:20.156968 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:20 crc kubenswrapper[4861]: I0309 09:07:20.156990 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:07:20 crc kubenswrapper[4861]: I0309 09:07:20.157006 4861 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:07:20Z","lastTransitionTime":"2026-03-09T09:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 09:07:20 crc kubenswrapper[4861]: I0309 09:07:20.169591 4861 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:07:20Z is after 2025-08-24T17:21:41Z" Mar 09 09:07:20 crc kubenswrapper[4861]: I0309 09:07:20.183301 4861 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:07:20Z is after 2025-08-24T17:21:41Z" Mar 09 09:07:20 crc kubenswrapper[4861]: I0309 09:07:20.199123 4861 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6feef2c388d1996afdc05df20c33bfa3a31b698acff5c0fbd5540a89232bd77d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://083d0704addd86d959d059e1aa35de6dc12a77441fe075c63e07683d71f8e45c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:07:20Z is after 2025-08-24T17:21:41Z" Mar 09 09:07:20 crc kubenswrapper[4861]: I0309 09:07:20.215228 4861 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6c3c32db-aaa0-410d-90f7-2056906878f4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:05:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:05:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:06:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:06:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T09:05:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a138696f23970dc61461f70321b50c12dd25a9695ec721ccc37f3ae03b2faa3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:05:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b3890d561e9788da50117d5ae37e9bc3decefea7ca10d396f77a05a3b2874c0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3890d561e9788da50117d5ae37e9bc3decefea7ca10d396f77a05a3b2874c0e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T09:05:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T09:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T09:05:57Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:07:20Z is after 2025-08-24T17:21:41Z" Mar 09 09:07:20 crc kubenswrapper[4861]: I0309 09:07:20.231998 4861 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:18Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T09:07:20Z is after 2025-08-24T17:21:41Z" Mar 09 09:07:20 crc kubenswrapper[4861]: I0309 09:07:20.243047 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:07:20 crc kubenswrapper[4861]: E0309 09:07:20.243336 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:07:22.24329237 +0000 UTC m=+85.328331811 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:07:20 crc kubenswrapper[4861]: I0309 09:07:20.248288 4861 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:20Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T09:07:20Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5dfc8681e1432c42ca87803558410f2206409c7340927892c3dc3424948ce58d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T09:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-09T09:07:20Z is after 2025-08-24T17:21:41Z" Mar 09 09:07:20 crc kubenswrapper[4861]: I0309 09:07:20.259336 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:20 crc kubenswrapper[4861]: I0309 09:07:20.259447 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:20 crc kubenswrapper[4861]: I0309 09:07:20.259459 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:20 crc kubenswrapper[4861]: I0309 09:07:20.259476 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:07:20 crc kubenswrapper[4861]: I0309 09:07:20.259487 4861 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:07:20Z","lastTransitionTime":"2026-03-09T09:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:07:20 crc kubenswrapper[4861]: I0309 09:07:20.343966 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 09:07:20 crc kubenswrapper[4861]: I0309 09:07:20.344052 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 09:07:20 crc kubenswrapper[4861]: I0309 09:07:20.344122 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 09:07:20 crc kubenswrapper[4861]: I0309 09:07:20.344175 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 09:07:20 crc kubenswrapper[4861]: E0309 09:07:20.344253 4861 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object 
"openshift-network-console"/"networking-console-plugin" not registered Mar 09 09:07:20 crc kubenswrapper[4861]: E0309 09:07:20.344335 4861 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 09:07:20 crc kubenswrapper[4861]: E0309 09:07:20.344399 4861 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 09:07:20 crc kubenswrapper[4861]: E0309 09:07:20.344408 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 09:07:22.344344768 +0000 UTC m=+85.429384209 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 09 09:07:20 crc kubenswrapper[4861]: E0309 09:07:20.344413 4861 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 09:07:20 crc kubenswrapper[4861]: E0309 09:07:20.344345 4861 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 09:07:20 crc kubenswrapper[4861]: E0309 09:07:20.344528 4861 projected.go:288] Couldn't get configMap 
openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 09:07:20 crc kubenswrapper[4861]: E0309 09:07:20.344546 4861 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 09:07:20 crc kubenswrapper[4861]: E0309 09:07:20.344333 4861 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 09:07:20 crc kubenswrapper[4861]: E0309 09:07:20.344493 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-09 09:07:22.344472791 +0000 UTC m=+85.429512232 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 09:07:20 crc kubenswrapper[4861]: E0309 09:07:20.344705 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-09 09:07:22.344674338 +0000 UTC m=+85.429713829 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 09:07:20 crc kubenswrapper[4861]: E0309 09:07:20.344749 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 09:07:22.344716609 +0000 UTC m=+85.429756160 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 09:07:20 crc kubenswrapper[4861]: I0309 09:07:20.362704 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:20 crc kubenswrapper[4861]: I0309 09:07:20.362750 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:20 crc kubenswrapper[4861]: I0309 09:07:20.362759 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:20 crc kubenswrapper[4861]: I0309 09:07:20.362773 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:07:20 crc kubenswrapper[4861]: I0309 09:07:20.362782 4861 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:07:20Z","lastTransitionTime":"2026-03-09T09:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 09:07:20 crc kubenswrapper[4861]: I0309 09:07:20.465003 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:20 crc kubenswrapper[4861]: I0309 09:07:20.465062 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:20 crc kubenswrapper[4861]: I0309 09:07:20.465088 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:20 crc kubenswrapper[4861]: I0309 09:07:20.465102 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:07:20 crc kubenswrapper[4861]: I0309 09:07:20.465113 4861 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:07:20Z","lastTransitionTime":"2026-03-09T09:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:07:20 crc kubenswrapper[4861]: I0309 09:07:20.568450 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:20 crc kubenswrapper[4861]: I0309 09:07:20.568506 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:20 crc kubenswrapper[4861]: I0309 09:07:20.568516 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:20 crc kubenswrapper[4861]: I0309 09:07:20.568534 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:07:20 crc kubenswrapper[4861]: I0309 09:07:20.568545 4861 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:07:20Z","lastTransitionTime":"2026-03-09T09:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 09:07:20 crc kubenswrapper[4861]: I0309 09:07:20.657272 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 09:07:20 crc kubenswrapper[4861]: I0309 09:07:20.657363 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 09:07:20 crc kubenswrapper[4861]: I0309 09:07:20.657303 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 09:07:20 crc kubenswrapper[4861]: E0309 09:07:20.657500 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 09:07:20 crc kubenswrapper[4861]: E0309 09:07:20.657597 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 09:07:20 crc kubenswrapper[4861]: E0309 09:07:20.657705 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 09:07:20 crc kubenswrapper[4861]: I0309 09:07:20.671438 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:20 crc kubenswrapper[4861]: I0309 09:07:20.671490 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:20 crc kubenswrapper[4861]: I0309 09:07:20.671498 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:20 crc kubenswrapper[4861]: I0309 09:07:20.671513 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:07:20 crc kubenswrapper[4861]: I0309 09:07:20.671523 4861 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:07:20Z","lastTransitionTime":"2026-03-09T09:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:07:20 crc kubenswrapper[4861]: I0309 09:07:20.773960 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:20 crc kubenswrapper[4861]: I0309 09:07:20.774007 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:20 crc kubenswrapper[4861]: I0309 09:07:20.774017 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:20 crc kubenswrapper[4861]: I0309 09:07:20.774034 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:07:20 crc kubenswrapper[4861]: I0309 09:07:20.774045 4861 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:07:20Z","lastTransitionTime":"2026-03-09T09:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:07:20 crc kubenswrapper[4861]: I0309 09:07:20.876915 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:20 crc kubenswrapper[4861]: I0309 09:07:20.876990 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:20 crc kubenswrapper[4861]: I0309 09:07:20.877008 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:20 crc kubenswrapper[4861]: I0309 09:07:20.877035 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:07:20 crc kubenswrapper[4861]: I0309 09:07:20.877055 4861 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:07:20Z","lastTransitionTime":"2026-03-09T09:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:07:20 crc kubenswrapper[4861]: I0309 09:07:20.980013 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:20 crc kubenswrapper[4861]: I0309 09:07:20.980077 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:20 crc kubenswrapper[4861]: I0309 09:07:20.980095 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:20 crc kubenswrapper[4861]: I0309 09:07:20.980120 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:07:20 crc kubenswrapper[4861]: I0309 09:07:20.980144 4861 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:07:20Z","lastTransitionTime":"2026-03-09T09:07:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:07:21 crc kubenswrapper[4861]: I0309 09:07:21.083196 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:21 crc kubenswrapper[4861]: I0309 09:07:21.083238 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:21 crc kubenswrapper[4861]: I0309 09:07:21.083249 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:21 crc kubenswrapper[4861]: I0309 09:07:21.083265 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:07:21 crc kubenswrapper[4861]: I0309 09:07:21.083276 4861 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:07:21Z","lastTransitionTime":"2026-03-09T09:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:07:21 crc kubenswrapper[4861]: I0309 09:07:21.186428 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:21 crc kubenswrapper[4861]: I0309 09:07:21.186503 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:21 crc kubenswrapper[4861]: I0309 09:07:21.186515 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:21 crc kubenswrapper[4861]: I0309 09:07:21.186554 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:07:21 crc kubenswrapper[4861]: I0309 09:07:21.186568 4861 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:07:21Z","lastTransitionTime":"2026-03-09T09:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:07:21 crc kubenswrapper[4861]: I0309 09:07:21.289428 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:21 crc kubenswrapper[4861]: I0309 09:07:21.289502 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:21 crc kubenswrapper[4861]: I0309 09:07:21.289522 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:21 crc kubenswrapper[4861]: I0309 09:07:21.289545 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:07:21 crc kubenswrapper[4861]: I0309 09:07:21.289561 4861 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:07:21Z","lastTransitionTime":"2026-03-09T09:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:07:21 crc kubenswrapper[4861]: I0309 09:07:21.391493 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:21 crc kubenswrapper[4861]: I0309 09:07:21.391547 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:21 crc kubenswrapper[4861]: I0309 09:07:21.391560 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:21 crc kubenswrapper[4861]: I0309 09:07:21.391578 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:07:21 crc kubenswrapper[4861]: I0309 09:07:21.391589 4861 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:07:21Z","lastTransitionTime":"2026-03-09T09:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:07:21 crc kubenswrapper[4861]: I0309 09:07:21.493784 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:21 crc kubenswrapper[4861]: I0309 09:07:21.493852 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:21 crc kubenswrapper[4861]: I0309 09:07:21.493867 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:21 crc kubenswrapper[4861]: I0309 09:07:21.493888 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:07:21 crc kubenswrapper[4861]: I0309 09:07:21.493902 4861 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:07:21Z","lastTransitionTime":"2026-03-09T09:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:07:21 crc kubenswrapper[4861]: I0309 09:07:21.595877 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:21 crc kubenswrapper[4861]: I0309 09:07:21.595946 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:21 crc kubenswrapper[4861]: I0309 09:07:21.595964 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:21 crc kubenswrapper[4861]: I0309 09:07:21.595989 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:07:21 crc kubenswrapper[4861]: I0309 09:07:21.596006 4861 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:07:21Z","lastTransitionTime":"2026-03-09T09:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:07:21 crc kubenswrapper[4861]: I0309 09:07:21.698322 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:21 crc kubenswrapper[4861]: I0309 09:07:21.698394 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:21 crc kubenswrapper[4861]: I0309 09:07:21.698409 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:21 crc kubenswrapper[4861]: I0309 09:07:21.698425 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:07:21 crc kubenswrapper[4861]: I0309 09:07:21.698436 4861 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:07:21Z","lastTransitionTime":"2026-03-09T09:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:07:21 crc kubenswrapper[4861]: I0309 09:07:21.801120 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:21 crc kubenswrapper[4861]: I0309 09:07:21.801186 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:21 crc kubenswrapper[4861]: I0309 09:07:21.801203 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:21 crc kubenswrapper[4861]: I0309 09:07:21.801229 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:07:21 crc kubenswrapper[4861]: I0309 09:07:21.801247 4861 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:07:21Z","lastTransitionTime":"2026-03-09T09:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:07:21 crc kubenswrapper[4861]: I0309 09:07:21.904059 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:21 crc kubenswrapper[4861]: I0309 09:07:21.904103 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:21 crc kubenswrapper[4861]: I0309 09:07:21.904119 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:21 crc kubenswrapper[4861]: I0309 09:07:21.904141 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:07:21 crc kubenswrapper[4861]: I0309 09:07:21.904159 4861 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:07:21Z","lastTransitionTime":"2026-03-09T09:07:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:07:22 crc kubenswrapper[4861]: I0309 09:07:22.006435 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:22 crc kubenswrapper[4861]: I0309 09:07:22.006491 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:22 crc kubenswrapper[4861]: I0309 09:07:22.006508 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:22 crc kubenswrapper[4861]: I0309 09:07:22.006531 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:07:22 crc kubenswrapper[4861]: I0309 09:07:22.006548 4861 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:07:22Z","lastTransitionTime":"2026-03-09T09:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:07:22 crc kubenswrapper[4861]: I0309 09:07:22.022844 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"250347225fe5b95612658e017cb044083f164adde2841db5fba96c189142f3e9"} Mar 09 09:07:22 crc kubenswrapper[4861]: I0309 09:07:22.110455 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:22 crc kubenswrapper[4861]: I0309 09:07:22.110522 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:22 crc kubenswrapper[4861]: I0309 09:07:22.110542 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:22 crc kubenswrapper[4861]: I0309 09:07:22.110568 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:07:22 crc kubenswrapper[4861]: I0309 09:07:22.110589 4861 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:07:22Z","lastTransitionTime":"2026-03-09T09:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:07:22 crc kubenswrapper[4861]: I0309 09:07:22.170345 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=4.170314211 podStartE2EDuration="4.170314211s" podCreationTimestamp="2026-03-09 09:07:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:07:22.169850579 +0000 UTC m=+85.254890030" watchObservedRunningTime="2026-03-09 09:07:22.170314211 +0000 UTC m=+85.255353662" Mar 09 09:07:22 crc kubenswrapper[4861]: I0309 09:07:22.213761 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:22 crc kubenswrapper[4861]: I0309 09:07:22.213815 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:22 crc kubenswrapper[4861]: I0309 09:07:22.213833 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:22 crc kubenswrapper[4861]: I0309 09:07:22.213858 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:07:22 crc kubenswrapper[4861]: I0309 09:07:22.213876 4861 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:07:22Z","lastTransitionTime":"2026-03-09T09:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:07:22 crc kubenswrapper[4861]: I0309 09:07:22.265750 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:07:22 crc kubenswrapper[4861]: E0309 09:07:22.265892 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:07:26.265862542 +0000 UTC m=+89.350901973 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:07:22 crc kubenswrapper[4861]: I0309 09:07:22.316751 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:22 crc kubenswrapper[4861]: I0309 09:07:22.316801 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:22 crc kubenswrapper[4861]: I0309 09:07:22.316818 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:22 crc kubenswrapper[4861]: I0309 09:07:22.316842 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:07:22 crc kubenswrapper[4861]: 
I0309 09:07:22.316859 4861 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:07:22Z","lastTransitionTime":"2026-03-09T09:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 09:07:22 crc kubenswrapper[4861]: I0309 09:07:22.366740 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 09:07:22 crc kubenswrapper[4861]: I0309 09:07:22.366894 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 09:07:22 crc kubenswrapper[4861]: I0309 09:07:22.366952 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 09:07:22 crc kubenswrapper[4861]: I0309 09:07:22.367007 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 09:07:22 crc kubenswrapper[4861]: E0309 09:07:22.367127 4861 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 09 09:07:22 crc kubenswrapper[4861]: E0309 09:07:22.367155 4861 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 09:07:22 crc kubenswrapper[4861]: E0309 09:07:22.367236 4861 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 09:07:22 crc kubenswrapper[4861]: E0309 09:07:22.367289 4861 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 09:07:22 crc kubenswrapper[4861]: E0309 09:07:22.367312 4861 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 09:07:22 crc kubenswrapper[4861]: E0309 09:07:22.367247 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 09:07:26.367215948 +0000 UTC m=+89.452255399 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 09 09:07:22 crc kubenswrapper[4861]: E0309 09:07:22.367168 4861 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 09:07:22 crc kubenswrapper[4861]: E0309 09:07:22.367443 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 09:07:26.367409923 +0000 UTC m=+89.452449364 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 09:07:22 crc kubenswrapper[4861]: E0309 09:07:22.367454 4861 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 09:07:22 crc kubenswrapper[4861]: E0309 09:07:22.367476 4861 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 09:07:22 crc kubenswrapper[4861]: E0309 09:07:22.367481 4861 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-09 09:07:26.367467835 +0000 UTC m=+89.452507266 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 09:07:22 crc kubenswrapper[4861]: E0309 09:07:22.367568 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-09 09:07:26.367534887 +0000 UTC m=+89.452574328 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 09:07:22 crc kubenswrapper[4861]: I0309 09:07:22.419225 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:22 crc kubenswrapper[4861]: I0309 09:07:22.419282 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:22 crc kubenswrapper[4861]: I0309 09:07:22.419296 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:22 crc kubenswrapper[4861]: I0309 09:07:22.419314 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:07:22 crc kubenswrapper[4861]: I0309 09:07:22.419330 4861 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:07:22Z","lastTransitionTime":"2026-03-09T09:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:07:22 crc kubenswrapper[4861]: I0309 09:07:22.522808 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:22 crc kubenswrapper[4861]: I0309 09:07:22.522877 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:22 crc kubenswrapper[4861]: I0309 09:07:22.522935 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:22 crc kubenswrapper[4861]: I0309 09:07:22.522963 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:07:22 crc kubenswrapper[4861]: I0309 09:07:22.522980 4861 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:07:22Z","lastTransitionTime":"2026-03-09T09:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:07:22 crc kubenswrapper[4861]: I0309 09:07:22.626434 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:22 crc kubenswrapper[4861]: I0309 09:07:22.626523 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:22 crc kubenswrapper[4861]: I0309 09:07:22.626546 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:22 crc kubenswrapper[4861]: I0309 09:07:22.626579 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:07:22 crc kubenswrapper[4861]: I0309 09:07:22.626598 4861 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:07:22Z","lastTransitionTime":"2026-03-09T09:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 09:07:22 crc kubenswrapper[4861]: I0309 09:07:22.657777 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 09:07:22 crc kubenswrapper[4861]: I0309 09:07:22.657853 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 09:07:22 crc kubenswrapper[4861]: I0309 09:07:22.657780 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 09:07:22 crc kubenswrapper[4861]: E0309 09:07:22.658025 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 09:07:22 crc kubenswrapper[4861]: E0309 09:07:22.658126 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 09:07:22 crc kubenswrapper[4861]: E0309 09:07:22.658559 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 09:07:22 crc kubenswrapper[4861]: I0309 09:07:22.673856 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 09 09:07:22 crc kubenswrapper[4861]: I0309 09:07:22.674061 4861 scope.go:117] "RemoveContainer" containerID="f88f2e129d4bfc17b0ac4416da5c5096bf314e097dfa40a48858a83425ca91e1" Mar 09 09:07:22 crc kubenswrapper[4861]: E0309 09:07:22.674478 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 09:07:22 crc kubenswrapper[4861]: I0309 09:07:22.729203 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:22 crc kubenswrapper[4861]: I0309 09:07:22.729254 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:22 crc kubenswrapper[4861]: I0309 09:07:22.729270 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:22 crc kubenswrapper[4861]: I0309 09:07:22.729293 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:07:22 crc kubenswrapper[4861]: I0309 09:07:22.729310 4861 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:07:22Z","lastTransitionTime":"2026-03-09T09:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network 
plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 09:07:22 crc kubenswrapper[4861]: I0309 09:07:22.832446 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:22 crc kubenswrapper[4861]: I0309 09:07:22.832487 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:22 crc kubenswrapper[4861]: I0309 09:07:22.832495 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:22 crc kubenswrapper[4861]: I0309 09:07:22.832509 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:07:22 crc kubenswrapper[4861]: I0309 09:07:22.832518 4861 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:07:22Z","lastTransitionTime":"2026-03-09T09:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:07:22 crc kubenswrapper[4861]: I0309 09:07:22.936064 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:22 crc kubenswrapper[4861]: I0309 09:07:22.936145 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:22 crc kubenswrapper[4861]: I0309 09:07:22.936169 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:22 crc kubenswrapper[4861]: I0309 09:07:22.936201 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:07:22 crc kubenswrapper[4861]: I0309 09:07:22.936226 4861 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:07:22Z","lastTransitionTime":"2026-03-09T09:07:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:07:23 crc kubenswrapper[4861]: I0309 09:07:23.027183 4861 scope.go:117] "RemoveContainer" containerID="f88f2e129d4bfc17b0ac4416da5c5096bf314e097dfa40a48858a83425ca91e1" Mar 09 09:07:23 crc kubenswrapper[4861]: E0309 09:07:23.027458 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 09:07:23 crc kubenswrapper[4861]: I0309 09:07:23.038344 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:23 crc kubenswrapper[4861]: I0309 09:07:23.038446 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:23 crc kubenswrapper[4861]: I0309 09:07:23.038479 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:23 crc kubenswrapper[4861]: I0309 09:07:23.038506 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:07:23 crc kubenswrapper[4861]: I0309 09:07:23.038523 4861 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:07:23Z","lastTransitionTime":"2026-03-09T09:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:07:23 crc kubenswrapper[4861]: I0309 09:07:23.142191 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:23 crc kubenswrapper[4861]: I0309 09:07:23.142266 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:23 crc kubenswrapper[4861]: I0309 09:07:23.142289 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:23 crc kubenswrapper[4861]: I0309 09:07:23.142317 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:07:23 crc kubenswrapper[4861]: I0309 09:07:23.142340 4861 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:07:23Z","lastTransitionTime":"2026-03-09T09:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:07:23 crc kubenswrapper[4861]: I0309 09:07:23.245360 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:23 crc kubenswrapper[4861]: I0309 09:07:23.245469 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:23 crc kubenswrapper[4861]: I0309 09:07:23.245491 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:23 crc kubenswrapper[4861]: I0309 09:07:23.245514 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:07:23 crc kubenswrapper[4861]: I0309 09:07:23.245531 4861 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:07:23Z","lastTransitionTime":"2026-03-09T09:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:07:23 crc kubenswrapper[4861]: I0309 09:07:23.349448 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:23 crc kubenswrapper[4861]: I0309 09:07:23.349529 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:23 crc kubenswrapper[4861]: I0309 09:07:23.349555 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:23 crc kubenswrapper[4861]: I0309 09:07:23.349591 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:07:23 crc kubenswrapper[4861]: I0309 09:07:23.349612 4861 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:07:23Z","lastTransitionTime":"2026-03-09T09:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:07:23 crc kubenswrapper[4861]: I0309 09:07:23.452730 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:23 crc kubenswrapper[4861]: I0309 09:07:23.452800 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:23 crc kubenswrapper[4861]: I0309 09:07:23.452823 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:23 crc kubenswrapper[4861]: I0309 09:07:23.452851 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:07:23 crc kubenswrapper[4861]: I0309 09:07:23.452873 4861 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:07:23Z","lastTransitionTime":"2026-03-09T09:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:07:23 crc kubenswrapper[4861]: I0309 09:07:23.556035 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:23 crc kubenswrapper[4861]: I0309 09:07:23.556085 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:23 crc kubenswrapper[4861]: I0309 09:07:23.556097 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:23 crc kubenswrapper[4861]: I0309 09:07:23.556115 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:07:23 crc kubenswrapper[4861]: I0309 09:07:23.556150 4861 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:07:23Z","lastTransitionTime":"2026-03-09T09:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:07:23 crc kubenswrapper[4861]: I0309 09:07:23.658574 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:23 crc kubenswrapper[4861]: I0309 09:07:23.658608 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:23 crc kubenswrapper[4861]: I0309 09:07:23.658615 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:23 crc kubenswrapper[4861]: I0309 09:07:23.658629 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:07:23 crc kubenswrapper[4861]: I0309 09:07:23.658638 4861 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:07:23Z","lastTransitionTime":"2026-03-09T09:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:07:23 crc kubenswrapper[4861]: I0309 09:07:23.761290 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:23 crc kubenswrapper[4861]: I0309 09:07:23.761403 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:23 crc kubenswrapper[4861]: I0309 09:07:23.761427 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:23 crc kubenswrapper[4861]: I0309 09:07:23.761459 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:07:23 crc kubenswrapper[4861]: I0309 09:07:23.761480 4861 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:07:23Z","lastTransitionTime":"2026-03-09T09:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:07:23 crc kubenswrapper[4861]: I0309 09:07:23.863952 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:23 crc kubenswrapper[4861]: I0309 09:07:23.863989 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:23 crc kubenswrapper[4861]: I0309 09:07:23.864000 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:23 crc kubenswrapper[4861]: I0309 09:07:23.864013 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:07:23 crc kubenswrapper[4861]: I0309 09:07:23.864020 4861 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:07:23Z","lastTransitionTime":"2026-03-09T09:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:07:23 crc kubenswrapper[4861]: I0309 09:07:23.966041 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:23 crc kubenswrapper[4861]: I0309 09:07:23.966082 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:23 crc kubenswrapper[4861]: I0309 09:07:23.966094 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:23 crc kubenswrapper[4861]: I0309 09:07:23.966111 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:07:23 crc kubenswrapper[4861]: I0309 09:07:23.966127 4861 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:07:23Z","lastTransitionTime":"2026-03-09T09:07:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.068945 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.069007 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.069033 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.069062 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.069084 4861 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:07:24Z","lastTransitionTime":"2026-03-09T09:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.172708 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.172764 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.172775 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.172793 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.172805 4861 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:07:24Z","lastTransitionTime":"2026-03-09T09:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.253748 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-lccwl"] Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.254181 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-lccwl" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.257366 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.257985 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.258077 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.269064 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-5g7gc"] Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.269659 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.272345 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-7fs7j"] Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.274196 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-7fs7j" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.274304 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.275174 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.275743 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.276048 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.277886 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.277966 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.277996 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.278029 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.278065 4861 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:07:24Z","lastTransitionTime":"2026-03-09T09:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.280213 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.281303 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.281351 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.281879 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-dnjcp"] Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.281445 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.282316 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.281883 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.283188 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-dnjcp" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.287858 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.287902 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.304850 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-kmjsq"] Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.306503 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-kmjsq" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.310052 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.310268 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.310358 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.310388 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.314722 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.314941 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.315287 4861 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.380689 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.380729 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.380738 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.380753 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.380765 4861 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:07:24Z","lastTransitionTime":"2026-03-09T09:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.385520 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/2a7b6abe-370e-4514-9777-7483bb64e1f0-multus-daemon-config\") pod \"multus-dnjcp\" (UID: \"2a7b6abe-370e-4514-9777-7483bb64e1f0\") " pod="openshift-multus/multus-dnjcp" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.385580 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/752be2d4-f338-4c5e-b51e-452fd8391c73-run-openvswitch\") pod \"ovnkube-node-kmjsq\" (UID: \"752be2d4-f338-4c5e-b51e-452fd8391c73\") " pod="openshift-ovn-kubernetes/ovnkube-node-kmjsq" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.385611 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/752be2d4-f338-4c5e-b51e-452fd8391c73-log-socket\") pod \"ovnkube-node-kmjsq\" (UID: \"752be2d4-f338-4c5e-b51e-452fd8391c73\") " pod="openshift-ovn-kubernetes/ovnkube-node-kmjsq" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.385641 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3a306fd7-dca6-4973-b8fa-4bd07840a104-cni-binary-copy\") pod \"multus-additional-cni-plugins-7fs7j\" (UID: \"3a306fd7-dca6-4973-b8fa-4bd07840a104\") " pod="openshift-multus/multus-additional-cni-plugins-7fs7j" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.385671 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/2a7b6abe-370e-4514-9777-7483bb64e1f0-hostroot\") pod \"multus-dnjcp\" (UID: \"2a7b6abe-370e-4514-9777-7483bb64e1f0\") " 
pod="openshift-multus/multus-dnjcp" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.385702 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/752be2d4-f338-4c5e-b51e-452fd8391c73-host-run-netns\") pod \"ovnkube-node-kmjsq\" (UID: \"752be2d4-f338-4c5e-b51e-452fd8391c73\") " pod="openshift-ovn-kubernetes/ovnkube-node-kmjsq" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.385731 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/752be2d4-f338-4c5e-b51e-452fd8391c73-ovn-node-metrics-cert\") pod \"ovnkube-node-kmjsq\" (UID: \"752be2d4-f338-4c5e-b51e-452fd8391c73\") " pod="openshift-ovn-kubernetes/ovnkube-node-kmjsq" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.385759 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/752be2d4-f338-4c5e-b51e-452fd8391c73-ovnkube-script-lib\") pod \"ovnkube-node-kmjsq\" (UID: \"752be2d4-f338-4c5e-b51e-452fd8391c73\") " pod="openshift-ovn-kubernetes/ovnkube-node-kmjsq" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.385787 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3a306fd7-dca6-4973-b8fa-4bd07840a104-os-release\") pod \"multus-additional-cni-plugins-7fs7j\" (UID: \"3a306fd7-dca6-4973-b8fa-4bd07840a104\") " pod="openshift-multus/multus-additional-cni-plugins-7fs7j" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.385851 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/752be2d4-f338-4c5e-b51e-452fd8391c73-host-cni-bin\") pod \"ovnkube-node-kmjsq\" (UID: 
\"752be2d4-f338-4c5e-b51e-452fd8391c73\") " pod="openshift-ovn-kubernetes/ovnkube-node-kmjsq" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.385888 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2a7b6abe-370e-4514-9777-7483bb64e1f0-multus-cni-dir\") pod \"multus-dnjcp\" (UID: \"2a7b6abe-370e-4514-9777-7483bb64e1f0\") " pod="openshift-multus/multus-dnjcp" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.385920 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2a7b6abe-370e-4514-9777-7483bb64e1f0-host-var-lib-kubelet\") pod \"multus-dnjcp\" (UID: \"2a7b6abe-370e-4514-9777-7483bb64e1f0\") " pod="openshift-multus/multus-dnjcp" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.385949 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2a7b6abe-370e-4514-9777-7483bb64e1f0-os-release\") pod \"multus-dnjcp\" (UID: \"2a7b6abe-370e-4514-9777-7483bb64e1f0\") " pod="openshift-multus/multus-dnjcp" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.385977 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/2a7b6abe-370e-4514-9777-7483bb64e1f0-host-var-lib-cni-multus\") pod \"multus-dnjcp\" (UID: \"2a7b6abe-370e-4514-9777-7483bb64e1f0\") " pod="openshift-multus/multus-dnjcp" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.386008 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3a306fd7-dca6-4973-b8fa-4bd07840a104-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-7fs7j\" (UID: 
\"3a306fd7-dca6-4973-b8fa-4bd07840a104\") " pod="openshift-multus/multus-additional-cni-plugins-7fs7j" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.386036 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ssrjl\" (UniqueName: \"kubernetes.io/projected/3a306fd7-dca6-4973-b8fa-4bd07840a104-kube-api-access-ssrjl\") pod \"multus-additional-cni-plugins-7fs7j\" (UID: \"3a306fd7-dca6-4973-b8fa-4bd07840a104\") " pod="openshift-multus/multus-additional-cni-plugins-7fs7j" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.386065 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxs46\" (UniqueName: \"kubernetes.io/projected/752be2d4-f338-4c5e-b51e-452fd8391c73-kube-api-access-fxs46\") pod \"ovnkube-node-kmjsq\" (UID: \"752be2d4-f338-4c5e-b51e-452fd8391c73\") " pod="openshift-ovn-kubernetes/ovnkube-node-kmjsq" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.386116 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/2a7b6abe-370e-4514-9777-7483bb64e1f0-host-run-k8s-cni-cncf-io\") pod \"multus-dnjcp\" (UID: \"2a7b6abe-370e-4514-9777-7483bb64e1f0\") " pod="openshift-multus/multus-dnjcp" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.386150 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/752be2d4-f338-4c5e-b51e-452fd8391c73-systemd-units\") pod \"ovnkube-node-kmjsq\" (UID: \"752be2d4-f338-4c5e-b51e-452fd8391c73\") " pod="openshift-ovn-kubernetes/ovnkube-node-kmjsq" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.386180 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/752be2d4-f338-4c5e-b51e-452fd8391c73-ovnkube-config\") pod \"ovnkube-node-kmjsq\" (UID: \"752be2d4-f338-4c5e-b51e-452fd8391c73\") " pod="openshift-ovn-kubernetes/ovnkube-node-kmjsq" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.386208 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2a7b6abe-370e-4514-9777-7483bb64e1f0-host-run-netns\") pod \"multus-dnjcp\" (UID: \"2a7b6abe-370e-4514-9777-7483bb64e1f0\") " pod="openshift-multus/multus-dnjcp" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.386248 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2a7b6abe-370e-4514-9777-7483bb64e1f0-etc-kubernetes\") pod \"multus-dnjcp\" (UID: \"2a7b6abe-370e-4514-9777-7483bb64e1f0\") " pod="openshift-multus/multus-dnjcp" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.386280 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/752be2d4-f338-4c5e-b51e-452fd8391c73-env-overrides\") pod \"ovnkube-node-kmjsq\" (UID: \"752be2d4-f338-4c5e-b51e-452fd8391c73\") " pod="openshift-ovn-kubernetes/ovnkube-node-kmjsq" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.386309 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/6f7875e3-174f-4c67-8675-d878de74aa4f-rootfs\") pod \"machine-config-daemon-5g7gc\" (UID: \"6f7875e3-174f-4c67-8675-d878de74aa4f\") " pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.386339 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkqlf\" (UniqueName: 
\"kubernetes.io/projected/6f7875e3-174f-4c67-8675-d878de74aa4f-kube-api-access-zkqlf\") pod \"machine-config-daemon-5g7gc\" (UID: \"6f7875e3-174f-4c67-8675-d878de74aa4f\") " pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.386401 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3a306fd7-dca6-4973-b8fa-4bd07840a104-system-cni-dir\") pod \"multus-additional-cni-plugins-7fs7j\" (UID: \"3a306fd7-dca6-4973-b8fa-4bd07840a104\") " pod="openshift-multus/multus-additional-cni-plugins-7fs7j" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.386452 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/752be2d4-f338-4c5e-b51e-452fd8391c73-run-systemd\") pod \"ovnkube-node-kmjsq\" (UID: \"752be2d4-f338-4c5e-b51e-452fd8391c73\") " pod="openshift-ovn-kubernetes/ovnkube-node-kmjsq" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.386475 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/752be2d4-f338-4c5e-b51e-452fd8391c73-run-ovn\") pod \"ovnkube-node-kmjsq\" (UID: \"752be2d4-f338-4c5e-b51e-452fd8391c73\") " pod="openshift-ovn-kubernetes/ovnkube-node-kmjsq" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.386490 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/752be2d4-f338-4c5e-b51e-452fd8391c73-host-run-ovn-kubernetes\") pod \"ovnkube-node-kmjsq\" (UID: \"752be2d4-f338-4c5e-b51e-452fd8391c73\") " pod="openshift-ovn-kubernetes/ovnkube-node-kmjsq" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.386518 4861 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2a7b6abe-370e-4514-9777-7483bb64e1f0-cnibin\") pod \"multus-dnjcp\" (UID: \"2a7b6abe-370e-4514-9777-7483bb64e1f0\") " pod="openshift-multus/multus-dnjcp" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.386571 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/752be2d4-f338-4c5e-b51e-452fd8391c73-host-slash\") pod \"ovnkube-node-kmjsq\" (UID: \"752be2d4-f338-4c5e-b51e-452fd8391c73\") " pod="openshift-ovn-kubernetes/ovnkube-node-kmjsq" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.386616 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/752be2d4-f338-4c5e-b51e-452fd8391c73-etc-openvswitch\") pod \"ovnkube-node-kmjsq\" (UID: \"752be2d4-f338-4c5e-b51e-452fd8391c73\") " pod="openshift-ovn-kubernetes/ovnkube-node-kmjsq" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.386646 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3a306fd7-dca6-4973-b8fa-4bd07840a104-cnibin\") pod \"multus-additional-cni-plugins-7fs7j\" (UID: \"3a306fd7-dca6-4973-b8fa-4bd07840a104\") " pod="openshift-multus/multus-additional-cni-plugins-7fs7j" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.386675 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/752be2d4-f338-4c5e-b51e-452fd8391c73-node-log\") pod \"ovnkube-node-kmjsq\" (UID: \"752be2d4-f338-4c5e-b51e-452fd8391c73\") " pod="openshift-ovn-kubernetes/ovnkube-node-kmjsq" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.386703 4861 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6f7875e3-174f-4c67-8675-d878de74aa4f-mcd-auth-proxy-config\") pod \"machine-config-daemon-5g7gc\" (UID: \"6f7875e3-174f-4c67-8675-d878de74aa4f\") " pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.386749 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2a7b6abe-370e-4514-9777-7483bb64e1f0-cni-binary-copy\") pod \"multus-dnjcp\" (UID: \"2a7b6abe-370e-4514-9777-7483bb64e1f0\") " pod="openshift-multus/multus-dnjcp" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.386795 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/752be2d4-f338-4c5e-b51e-452fd8391c73-var-lib-openvswitch\") pod \"ovnkube-node-kmjsq\" (UID: \"752be2d4-f338-4c5e-b51e-452fd8391c73\") " pod="openshift-ovn-kubernetes/ovnkube-node-kmjsq" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.386824 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6f7875e3-174f-4c67-8675-d878de74aa4f-proxy-tls\") pod \"machine-config-daemon-5g7gc\" (UID: \"6f7875e3-174f-4c67-8675-d878de74aa4f\") " pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.386853 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2a7b6abe-370e-4514-9777-7483bb64e1f0-host-var-lib-cni-bin\") pod \"multus-dnjcp\" (UID: \"2a7b6abe-370e-4514-9777-7483bb64e1f0\") " pod="openshift-multus/multus-dnjcp" Mar 09 09:07:24 crc 
kubenswrapper[4861]: I0309 09:07:24.386881 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/78a93e94-047c-45c2-81b3-161408c41b0a-hosts-file\") pod \"node-resolver-lccwl\" (UID: \"78a93e94-047c-45c2-81b3-161408c41b0a\") " pod="openshift-dns/node-resolver-lccwl"
Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.386912 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4mg7\" (UniqueName: \"kubernetes.io/projected/78a93e94-047c-45c2-81b3-161408c41b0a-kube-api-access-z4mg7\") pod \"node-resolver-lccwl\" (UID: \"78a93e94-047c-45c2-81b3-161408c41b0a\") " pod="openshift-dns/node-resolver-lccwl"
Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.386941 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/752be2d4-f338-4c5e-b51e-452fd8391c73-host-cni-netd\") pod \"ovnkube-node-kmjsq\" (UID: \"752be2d4-f338-4c5e-b51e-452fd8391c73\") " pod="openshift-ovn-kubernetes/ovnkube-node-kmjsq"
Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.386968 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2a7b6abe-370e-4514-9777-7483bb64e1f0-system-cni-dir\") pod \"multus-dnjcp\" (UID: \"2a7b6abe-370e-4514-9777-7483bb64e1f0\") " pod="openshift-multus/multus-dnjcp"
Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.387009 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/2a7b6abe-370e-4514-9777-7483bb64e1f0-host-run-multus-certs\") pod \"multus-dnjcp\" (UID: \"2a7b6abe-370e-4514-9777-7483bb64e1f0\") " pod="openshift-multus/multus-dnjcp"
Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.387037 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3a306fd7-dca6-4973-b8fa-4bd07840a104-tuning-conf-dir\") pod \"multus-additional-cni-plugins-7fs7j\" (UID: \"3a306fd7-dca6-4973-b8fa-4bd07840a104\") " pod="openshift-multus/multus-additional-cni-plugins-7fs7j"
Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.387065 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vt5b6\" (UniqueName: \"kubernetes.io/projected/2a7b6abe-370e-4514-9777-7483bb64e1f0-kube-api-access-vt5b6\") pod \"multus-dnjcp\" (UID: \"2a7b6abe-370e-4514-9777-7483bb64e1f0\") " pod="openshift-multus/multus-dnjcp"
Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.387131 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2a7b6abe-370e-4514-9777-7483bb64e1f0-multus-conf-dir\") pod \"multus-dnjcp\" (UID: \"2a7b6abe-370e-4514-9777-7483bb64e1f0\") " pod="openshift-multus/multus-dnjcp"
Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.387171 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/752be2d4-f338-4c5e-b51e-452fd8391c73-host-kubelet\") pod \"ovnkube-node-kmjsq\" (UID: \"752be2d4-f338-4c5e-b51e-452fd8391c73\") " pod="openshift-ovn-kubernetes/ovnkube-node-kmjsq"
Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.387196 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/752be2d4-f338-4c5e-b51e-452fd8391c73-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-kmjsq\" (UID: \"752be2d4-f338-4c5e-b51e-452fd8391c73\") " pod="openshift-ovn-kubernetes/ovnkube-node-kmjsq"
Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.387240 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/2a7b6abe-370e-4514-9777-7483bb64e1f0-multus-socket-dir-parent\") pod \"multus-dnjcp\" (UID: \"2a7b6abe-370e-4514-9777-7483bb64e1f0\") " pod="openshift-multus/multus-dnjcp"
Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.452763 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-tvzgr"]
Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.453139 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-tvzgr"
Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.455195 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p"
Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.455246 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates"
Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.455362 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.455868 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.482389 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.482421 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.482431 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.482445 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.482453 4861 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:07:24Z","lastTransitionTime":"2026-03-09T09:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.488122 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/752be2d4-f338-4c5e-b51e-452fd8391c73-host-run-netns\") pod \"ovnkube-node-kmjsq\" (UID: \"752be2d4-f338-4c5e-b51e-452fd8391c73\") " pod="openshift-ovn-kubernetes/ovnkube-node-kmjsq"
Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.488183 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/752be2d4-f338-4c5e-b51e-452fd8391c73-ovn-node-metrics-cert\") pod \"ovnkube-node-kmjsq\" (UID: \"752be2d4-f338-4c5e-b51e-452fd8391c73\") " pod="openshift-ovn-kubernetes/ovnkube-node-kmjsq"
Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.488225 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/752be2d4-f338-4c5e-b51e-452fd8391c73-ovnkube-script-lib\") pod \"ovnkube-node-kmjsq\" (UID: \"752be2d4-f338-4c5e-b51e-452fd8391c73\") " pod="openshift-ovn-kubernetes/ovnkube-node-kmjsq"
Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.488258 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3a306fd7-dca6-4973-b8fa-4bd07840a104-os-release\") pod \"multus-additional-cni-plugins-7fs7j\" (UID: \"3a306fd7-dca6-4973-b8fa-4bd07840a104\") " pod="openshift-multus/multus-additional-cni-plugins-7fs7j"
Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.488265 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/752be2d4-f338-4c5e-b51e-452fd8391c73-host-run-netns\") pod \"ovnkube-node-kmjsq\" (UID: \"752be2d4-f338-4c5e-b51e-452fd8391c73\") " pod="openshift-ovn-kubernetes/ovnkube-node-kmjsq"
Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.488289 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/2a7b6abe-370e-4514-9777-7483bb64e1f0-hostroot\") pod \"multus-dnjcp\" (UID: \"2a7b6abe-370e-4514-9777-7483bb64e1f0\") " pod="openshift-multus/multus-dnjcp"
Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.488340 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/2a7b6abe-370e-4514-9777-7483bb64e1f0-hostroot\") pod \"multus-dnjcp\" (UID: \"2a7b6abe-370e-4514-9777-7483bb64e1f0\") " pod="openshift-multus/multus-dnjcp"
Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.488435 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3a306fd7-dca6-4973-b8fa-4bd07840a104-os-release\") pod \"multus-additional-cni-plugins-7fs7j\" (UID: \"3a306fd7-dca6-4973-b8fa-4bd07840a104\") " pod="openshift-multus/multus-additional-cni-plugins-7fs7j"
Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.488464 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/752be2d4-f338-4c5e-b51e-452fd8391c73-host-cni-bin\") pod \"ovnkube-node-kmjsq\" (UID: \"752be2d4-f338-4c5e-b51e-452fd8391c73\") " pod="openshift-ovn-kubernetes/ovnkube-node-kmjsq"
Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.488565 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/752be2d4-f338-4c5e-b51e-452fd8391c73-host-cni-bin\") pod \"ovnkube-node-kmjsq\" (UID: \"752be2d4-f338-4c5e-b51e-452fd8391c73\") " pod="openshift-ovn-kubernetes/ovnkube-node-kmjsq"
Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.488665 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2a7b6abe-370e-4514-9777-7483bb64e1f0-multus-cni-dir\") pod \"multus-dnjcp\" (UID: \"2a7b6abe-370e-4514-9777-7483bb64e1f0\") " pod="openshift-multus/multus-dnjcp"
Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.488711 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2a7b6abe-370e-4514-9777-7483bb64e1f0-host-var-lib-kubelet\") pod \"multus-dnjcp\" (UID: \"2a7b6abe-370e-4514-9777-7483bb64e1f0\") " pod="openshift-multus/multus-dnjcp"
Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.488779 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/2a7b6abe-370e-4514-9777-7483bb64e1f0-host-var-lib-cni-multus\") pod \"multus-dnjcp\" (UID: \"2a7b6abe-370e-4514-9777-7483bb64e1f0\") " pod="openshift-multus/multus-dnjcp"
Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.488874 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2a7b6abe-370e-4514-9777-7483bb64e1f0-host-var-lib-kubelet\") pod \"multus-dnjcp\" (UID: \"2a7b6abe-370e-4514-9777-7483bb64e1f0\") " pod="openshift-multus/multus-dnjcp"
Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.488894 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2a7b6abe-370e-4514-9777-7483bb64e1f0-multus-cni-dir\") pod \"multus-dnjcp\" (UID: \"2a7b6abe-370e-4514-9777-7483bb64e1f0\") " pod="openshift-multus/multus-dnjcp"
Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.488954 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/2a7b6abe-370e-4514-9777-7483bb64e1f0-host-var-lib-cni-multus\") pod \"multus-dnjcp\" (UID: \"2a7b6abe-370e-4514-9777-7483bb64e1f0\") " pod="openshift-multus/multus-dnjcp"
Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.488981 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2a7b6abe-370e-4514-9777-7483bb64e1f0-os-release\") pod \"multus-dnjcp\" (UID: \"2a7b6abe-370e-4514-9777-7483bb64e1f0\") " pod="openshift-multus/multus-dnjcp"
Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.489022 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3a306fd7-dca6-4973-b8fa-4bd07840a104-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-7fs7j\" (UID: \"3a306fd7-dca6-4973-b8fa-4bd07840a104\") " pod="openshift-multus/multus-additional-cni-plugins-7fs7j"
Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.489055 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ssrjl\" (UniqueName: \"kubernetes.io/projected/3a306fd7-dca6-4973-b8fa-4bd07840a104-kube-api-access-ssrjl\") pod \"multus-additional-cni-plugins-7fs7j\" (UID: \"3a306fd7-dca6-4973-b8fa-4bd07840a104\") " pod="openshift-multus/multus-additional-cni-plugins-7fs7j"
Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.489086 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2a7b6abe-370e-4514-9777-7483bb64e1f0-os-release\") pod \"multus-dnjcp\" (UID: \"2a7b6abe-370e-4514-9777-7483bb64e1f0\") " pod="openshift-multus/multus-dnjcp"
Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.489087 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/752be2d4-f338-4c5e-b51e-452fd8391c73-systemd-units\") pod \"ovnkube-node-kmjsq\" (UID: \"752be2d4-f338-4c5e-b51e-452fd8391c73\") " pod="openshift-ovn-kubernetes/ovnkube-node-kmjsq"
Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.489180 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/752be2d4-f338-4c5e-b51e-452fd8391c73-systemd-units\") pod \"ovnkube-node-kmjsq\" (UID: \"752be2d4-f338-4c5e-b51e-452fd8391c73\") " pod="openshift-ovn-kubernetes/ovnkube-node-kmjsq"
Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.489223 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/752be2d4-f338-4c5e-b51e-452fd8391c73-ovnkube-config\") pod \"ovnkube-node-kmjsq\" (UID: \"752be2d4-f338-4c5e-b51e-452fd8391c73\") " pod="openshift-ovn-kubernetes/ovnkube-node-kmjsq"
Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.489239 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/752be2d4-f338-4c5e-b51e-452fd8391c73-ovnkube-script-lib\") pod \"ovnkube-node-kmjsq\" (UID: \"752be2d4-f338-4c5e-b51e-452fd8391c73\") " pod="openshift-ovn-kubernetes/ovnkube-node-kmjsq"
Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.489295 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxs46\" (UniqueName: \"kubernetes.io/projected/752be2d4-f338-4c5e-b51e-452fd8391c73-kube-api-access-fxs46\") pod \"ovnkube-node-kmjsq\" (UID: \"752be2d4-f338-4c5e-b51e-452fd8391c73\") " pod="openshift-ovn-kubernetes/ovnkube-node-kmjsq"
Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.489406 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/2a7b6abe-370e-4514-9777-7483bb64e1f0-host-run-k8s-cni-cncf-io\") pod \"multus-dnjcp\" (UID: \"2a7b6abe-370e-4514-9777-7483bb64e1f0\") " pod="openshift-multus/multus-dnjcp"
Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.489425 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/2a7b6abe-370e-4514-9777-7483bb64e1f0-host-run-k8s-cni-cncf-io\") pod \"multus-dnjcp\" (UID: \"2a7b6abe-370e-4514-9777-7483bb64e1f0\") " pod="openshift-multus/multus-dnjcp"
Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.489453 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2a7b6abe-370e-4514-9777-7483bb64e1f0-host-run-netns\") pod \"multus-dnjcp\" (UID: \"2a7b6abe-370e-4514-9777-7483bb64e1f0\") " pod="openshift-multus/multus-dnjcp"
Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.489486 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2a7b6abe-370e-4514-9777-7483bb64e1f0-etc-kubernetes\") pod \"multus-dnjcp\" (UID: \"2a7b6abe-370e-4514-9777-7483bb64e1f0\") " pod="openshift-multus/multus-dnjcp"
Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.489522 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zkqlf\" (UniqueName: \"kubernetes.io/projected/6f7875e3-174f-4c67-8675-d878de74aa4f-kube-api-access-zkqlf\") pod \"machine-config-daemon-5g7gc\" (UID: \"6f7875e3-174f-4c67-8675-d878de74aa4f\") " pod="openshift-machine-config-operator/machine-config-daemon-5g7gc"
Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.489488 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2a7b6abe-370e-4514-9777-7483bb64e1f0-host-run-netns\") pod \"multus-dnjcp\" (UID: \"2a7b6abe-370e-4514-9777-7483bb64e1f0\") " pod="openshift-multus/multus-dnjcp"
Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.489553 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/752be2d4-f338-4c5e-b51e-452fd8391c73-env-overrides\") pod \"ovnkube-node-kmjsq\" (UID: \"752be2d4-f338-4c5e-b51e-452fd8391c73\") " pod="openshift-ovn-kubernetes/ovnkube-node-kmjsq"
Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.489551 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2a7b6abe-370e-4514-9777-7483bb64e1f0-etc-kubernetes\") pod \"multus-dnjcp\" (UID: \"2a7b6abe-370e-4514-9777-7483bb64e1f0\") " pod="openshift-multus/multus-dnjcp"
Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.489598 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/6f7875e3-174f-4c67-8675-d878de74aa4f-rootfs\") pod \"machine-config-daemon-5g7gc\" (UID: \"6f7875e3-174f-4c67-8675-d878de74aa4f\") " pod="openshift-machine-config-operator/machine-config-daemon-5g7gc"
Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.489622 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3a306fd7-dca6-4973-b8fa-4bd07840a104-system-cni-dir\") pod \"multus-additional-cni-plugins-7fs7j\" (UID: \"3a306fd7-dca6-4973-b8fa-4bd07840a104\") " pod="openshift-multus/multus-additional-cni-plugins-7fs7j"
Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.489641 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/752be2d4-f338-4c5e-b51e-452fd8391c73-run-ovn\") pod \"ovnkube-node-kmjsq\" (UID: \"752be2d4-f338-4c5e-b51e-452fd8391c73\") " pod="openshift-ovn-kubernetes/ovnkube-node-kmjsq"
Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.489657 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/752be2d4-f338-4c5e-b51e-452fd8391c73-host-run-ovn-kubernetes\") pod \"ovnkube-node-kmjsq\" (UID: \"752be2d4-f338-4c5e-b51e-452fd8391c73\") " pod="openshift-ovn-kubernetes/ovnkube-node-kmjsq"
Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.489673 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2a7b6abe-370e-4514-9777-7483bb64e1f0-cnibin\") pod \"multus-dnjcp\" (UID: \"2a7b6abe-370e-4514-9777-7483bb64e1f0\") " pod="openshift-multus/multus-dnjcp"
Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.489697 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/752be2d4-f338-4c5e-b51e-452fd8391c73-run-systemd\") pod \"ovnkube-node-kmjsq\" (UID: \"752be2d4-f338-4c5e-b51e-452fd8391c73\") " pod="openshift-ovn-kubernetes/ovnkube-node-kmjsq"
Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.489700 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3a306fd7-dca6-4973-b8fa-4bd07840a104-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-7fs7j\" (UID: \"3a306fd7-dca6-4973-b8fa-4bd07840a104\") " pod="openshift-multus/multus-additional-cni-plugins-7fs7j"
Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.489712 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3a306fd7-dca6-4973-b8fa-4bd07840a104-cnibin\") pod \"multus-additional-cni-plugins-7fs7j\" (UID: \"3a306fd7-dca6-4973-b8fa-4bd07840a104\") " pod="openshift-multus/multus-additional-cni-plugins-7fs7j"
Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.489714 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3a306fd7-dca6-4973-b8fa-4bd07840a104-system-cni-dir\") pod \"multus-additional-cni-plugins-7fs7j\" (UID: \"3a306fd7-dca6-4973-b8fa-4bd07840a104\") " pod="openshift-multus/multus-additional-cni-plugins-7fs7j"
Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.489738 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/752be2d4-f338-4c5e-b51e-452fd8391c73-host-run-ovn-kubernetes\") pod \"ovnkube-node-kmjsq\" (UID: \"752be2d4-f338-4c5e-b51e-452fd8391c73\") " pod="openshift-ovn-kubernetes/ovnkube-node-kmjsq"
Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.489780 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/6f7875e3-174f-4c67-8675-d878de74aa4f-rootfs\") pod \"machine-config-daemon-5g7gc\" (UID: \"6f7875e3-174f-4c67-8675-d878de74aa4f\") " pod="openshift-machine-config-operator/machine-config-daemon-5g7gc"
Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.489792 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2a7b6abe-370e-4514-9777-7483bb64e1f0-cnibin\") pod \"multus-dnjcp\" (UID: \"2a7b6abe-370e-4514-9777-7483bb64e1f0\") " pod="openshift-multus/multus-dnjcp"
Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.489791 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/752be2d4-f338-4c5e-b51e-452fd8391c73-run-ovn\") pod \"ovnkube-node-kmjsq\" (UID: \"752be2d4-f338-4c5e-b51e-452fd8391c73\") " pod="openshift-ovn-kubernetes/ovnkube-node-kmjsq"
Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.489809 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/752be2d4-f338-4c5e-b51e-452fd8391c73-host-slash\") pod \"ovnkube-node-kmjsq\" (UID: \"752be2d4-f338-4c5e-b51e-452fd8391c73\") " pod="openshift-ovn-kubernetes/ovnkube-node-kmjsq"
Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.489813 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/752be2d4-f338-4c5e-b51e-452fd8391c73-run-systemd\") pod \"ovnkube-node-kmjsq\" (UID: \"752be2d4-f338-4c5e-b51e-452fd8391c73\") " pod="openshift-ovn-kubernetes/ovnkube-node-kmjsq"
Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.489820 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/752be2d4-f338-4c5e-b51e-452fd8391c73-ovnkube-config\") pod \"ovnkube-node-kmjsq\" (UID: \"752be2d4-f338-4c5e-b51e-452fd8391c73\") " pod="openshift-ovn-kubernetes/ovnkube-node-kmjsq"
Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.489826 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/752be2d4-f338-4c5e-b51e-452fd8391c73-host-slash\") pod \"ovnkube-node-kmjsq\" (UID: \"752be2d4-f338-4c5e-b51e-452fd8391c73\") " pod="openshift-ovn-kubernetes/ovnkube-node-kmjsq"
Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.489858 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/752be2d4-f338-4c5e-b51e-452fd8391c73-etc-openvswitch\") pod \"ovnkube-node-kmjsq\" (UID: \"752be2d4-f338-4c5e-b51e-452fd8391c73\") " pod="openshift-ovn-kubernetes/ovnkube-node-kmjsq"
Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.489844 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/752be2d4-f338-4c5e-b51e-452fd8391c73-etc-openvswitch\") pod \"ovnkube-node-kmjsq\" (UID: \"752be2d4-f338-4c5e-b51e-452fd8391c73\") " pod="openshift-ovn-kubernetes/ovnkube-node-kmjsq"
Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.489843 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3a306fd7-dca6-4973-b8fa-4bd07840a104-cnibin\") pod \"multus-additional-cni-plugins-7fs7j\" (UID: \"3a306fd7-dca6-4973-b8fa-4bd07840a104\") " pod="openshift-multus/multus-additional-cni-plugins-7fs7j"
Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.489900 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6f7875e3-174f-4c67-8675-d878de74aa4f-mcd-auth-proxy-config\") pod \"machine-config-daemon-5g7gc\" (UID: \"6f7875e3-174f-4c67-8675-d878de74aa4f\") " pod="openshift-machine-config-operator/machine-config-daemon-5g7gc"
Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.489928 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2a7b6abe-370e-4514-9777-7483bb64e1f0-cni-binary-copy\") pod \"multus-dnjcp\" (UID: \"2a7b6abe-370e-4514-9777-7483bb64e1f0\") " pod="openshift-multus/multus-dnjcp"
Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.489951 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/752be2d4-f338-4c5e-b51e-452fd8391c73-node-log\") pod \"ovnkube-node-kmjsq\" (UID: \"752be2d4-f338-4c5e-b51e-452fd8391c73\") " pod="openshift-ovn-kubernetes/ovnkube-node-kmjsq"
Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.489972 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/752be2d4-f338-4c5e-b51e-452fd8391c73-var-lib-openvswitch\") pod \"ovnkube-node-kmjsq\" (UID: \"752be2d4-f338-4c5e-b51e-452fd8391c73\") " pod="openshift-ovn-kubernetes/ovnkube-node-kmjsq"
Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.489993 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6f7875e3-174f-4c67-8675-d878de74aa4f-proxy-tls\") pod \"machine-config-daemon-5g7gc\" (UID: \"6f7875e3-174f-4c67-8675-d878de74aa4f\") " pod="openshift-machine-config-operator/machine-config-daemon-5g7gc"
Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.490014 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2a7b6abe-370e-4514-9777-7483bb64e1f0-host-var-lib-cni-bin\") pod \"multus-dnjcp\" (UID: \"2a7b6abe-370e-4514-9777-7483bb64e1f0\") " pod="openshift-multus/multus-dnjcp"
Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.490031 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/752be2d4-f338-4c5e-b51e-452fd8391c73-var-lib-openvswitch\") pod \"ovnkube-node-kmjsq\" (UID: \"752be2d4-f338-4c5e-b51e-452fd8391c73\") " pod="openshift-ovn-kubernetes/ovnkube-node-kmjsq"
Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.490029 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/752be2d4-f338-4c5e-b51e-452fd8391c73-node-log\") pod \"ovnkube-node-kmjsq\" (UID: \"752be2d4-f338-4c5e-b51e-452fd8391c73\") " pod="openshift-ovn-kubernetes/ovnkube-node-kmjsq"
Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.490054 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/752be2d4-f338-4c5e-b51e-452fd8391c73-host-cni-netd\") pod \"ovnkube-node-kmjsq\" (UID: \"752be2d4-f338-4c5e-b51e-452fd8391c73\") " pod="openshift-ovn-kubernetes/ovnkube-node-kmjsq"
Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.490113 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/78a93e94-047c-45c2-81b3-161408c41b0a-hosts-file\") pod \"node-resolver-lccwl\" (UID: \"78a93e94-047c-45c2-81b3-161408c41b0a\") " pod="openshift-dns/node-resolver-lccwl"
Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.490129 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2a7b6abe-370e-4514-9777-7483bb64e1f0-host-var-lib-cni-bin\") pod \"multus-dnjcp\" (UID: \"2a7b6abe-370e-4514-9777-7483bb64e1f0\") " pod="openshift-multus/multus-dnjcp"
Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.490134 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4mg7\" (UniqueName: \"kubernetes.io/projected/78a93e94-047c-45c2-81b3-161408c41b0a-kube-api-access-z4mg7\") pod \"node-resolver-lccwl\" (UID: \"78a93e94-047c-45c2-81b3-161408c41b0a\") " pod="openshift-dns/node-resolver-lccwl"
Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.490178 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/78a93e94-047c-45c2-81b3-161408c41b0a-hosts-file\") pod \"node-resolver-lccwl\" (UID: \"78a93e94-047c-45c2-81b3-161408c41b0a\") " pod="openshift-dns/node-resolver-lccwl"
Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.490217 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2a7b6abe-370e-4514-9777-7483bb64e1f0-system-cni-dir\") pod \"multus-dnjcp\" (UID: \"2a7b6abe-370e-4514-9777-7483bb64e1f0\") " pod="openshift-multus/multus-dnjcp"
Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.490252 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2a7b6abe-370e-4514-9777-7483bb64e1f0-system-cni-dir\") pod \"multus-dnjcp\" (UID: \"2a7b6abe-370e-4514-9777-7483bb64e1f0\") " pod="openshift-multus/multus-dnjcp"
Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.490269 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/2a7b6abe-370e-4514-9777-7483bb64e1f0-host-run-multus-certs\") pod \"multus-dnjcp\" (UID: \"2a7b6abe-370e-4514-9777-7483bb64e1f0\") " pod="openshift-multus/multus-dnjcp"
Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.490297 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/2a7b6abe-370e-4514-9777-7483bb64e1f0-host-run-multus-certs\") pod \"multus-dnjcp\" (UID: \"2a7b6abe-370e-4514-9777-7483bb64e1f0\") " pod="openshift-multus/multus-dnjcp"
Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.490269 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/752be2d4-f338-4c5e-b51e-452fd8391c73-host-cni-netd\") pod \"ovnkube-node-kmjsq\" (UID: \"752be2d4-f338-4c5e-b51e-452fd8391c73\") " pod="openshift-ovn-kubernetes/ovnkube-node-kmjsq"
Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.490309 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vt5b6\" (UniqueName: \"kubernetes.io/projected/2a7b6abe-370e-4514-9777-7483bb64e1f0-kube-api-access-vt5b6\") pod \"multus-dnjcp\" (UID: \"2a7b6abe-370e-4514-9777-7483bb64e1f0\") " pod="openshift-multus/multus-dnjcp"
Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.490337 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/752be2d4-f338-4c5e-b51e-452fd8391c73-env-overrides\") pod \"ovnkube-node-kmjsq\" (UID: \"752be2d4-f338-4c5e-b51e-452fd8391c73\") " pod="openshift-ovn-kubernetes/ovnkube-node-kmjsq"
Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.490357 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3a306fd7-dca6-4973-b8fa-4bd07840a104-tuning-conf-dir\") pod \"multus-additional-cni-plugins-7fs7j\" (UID: \"3a306fd7-dca6-4973-b8fa-4bd07840a104\") " pod="openshift-multus/multus-additional-cni-plugins-7fs7j"
Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.490409 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2a7b6abe-370e-4514-9777-7483bb64e1f0-multus-conf-dir\") pod \"multus-dnjcp\" (UID: \"2a7b6abe-370e-4514-9777-7483bb64e1f0\") " pod="openshift-multus/multus-dnjcp"
Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.490462 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2a7b6abe-370e-4514-9777-7483bb64e1f0-multus-conf-dir\") pod \"multus-dnjcp\" (UID: \"2a7b6abe-370e-4514-9777-7483bb64e1f0\") " pod="openshift-multus/multus-dnjcp"
Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.490476 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6f7875e3-174f-4c67-8675-d878de74aa4f-mcd-auth-proxy-config\") pod \"machine-config-daemon-5g7gc\" (UID: \"6f7875e3-174f-4c67-8675-d878de74aa4f\") " pod="openshift-machine-config-operator/machine-config-daemon-5g7gc"
Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.490485 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/752be2d4-f338-4c5e-b51e-452fd8391c73-host-kubelet\") pod \"ovnkube-node-kmjsq\" (UID: \"752be2d4-f338-4c5e-b51e-452fd8391c73\") " pod="openshift-ovn-kubernetes/ovnkube-node-kmjsq"
Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.490506 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/752be2d4-f338-4c5e-b51e-452fd8391c73-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-kmjsq\" (UID: \"752be2d4-f338-4c5e-b51e-452fd8391c73\") " pod="openshift-ovn-kubernetes/ovnkube-node-kmjsq"
Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.490529 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/752be2d4-f338-4c5e-b51e-452fd8391c73-host-kubelet\") pod \"ovnkube-node-kmjsq\" (UID: \"752be2d4-f338-4c5e-b51e-452fd8391c73\") " pod="openshift-ovn-kubernetes/ovnkube-node-kmjsq"
Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.490526 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/2a7b6abe-370e-4514-9777-7483bb64e1f0-multus-socket-dir-parent\") pod \"multus-dnjcp\" (UID: \"2a7b6abe-370e-4514-9777-7483bb64e1f0\") " pod="openshift-multus/multus-dnjcp"
Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.490556 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/752be2d4-f338-4c5e-b51e-452fd8391c73-run-openvswitch\") pod \"ovnkube-node-kmjsq\" (UID: \"752be2d4-f338-4c5e-b51e-452fd8391c73\") " pod="openshift-ovn-kubernetes/ovnkube-node-kmjsq"
Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.490574 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/752be2d4-f338-4c5e-b51e-452fd8391c73-log-socket\") pod \"ovnkube-node-kmjsq\" (UID: \"752be2d4-f338-4c5e-b51e-452fd8391c73\") " pod="openshift-ovn-kubernetes/ovnkube-node-kmjsq"
Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.490592 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3a306fd7-dca6-4973-b8fa-4bd07840a104-cni-binary-copy\") pod \"multus-additional-cni-plugins-7fs7j\" (UID: \"3a306fd7-dca6-4973-b8fa-4bd07840a104\") " pod="openshift-multus/multus-additional-cni-plugins-7fs7j"
Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.490615 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/2a7b6abe-370e-4514-9777-7483bb64e1f0-multus-daemon-config\") pod \"multus-dnjcp\" (UID: \"2a7b6abe-370e-4514-9777-7483bb64e1f0\") " pod="openshift-multus/multus-dnjcp"
Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.490658 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/752be2d4-f338-4c5e-b51e-452fd8391c73-log-socket\") pod \"ovnkube-node-kmjsq\" (UID: \"752be2d4-f338-4c5e-b51e-452fd8391c73\") " pod="openshift-ovn-kubernetes/ovnkube-node-kmjsq"
Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.490647 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/752be2d4-f338-4c5e-b51e-452fd8391c73-run-openvswitch\") pod \"ovnkube-node-kmjsq\" (UID: \"752be2d4-f338-4c5e-b51e-452fd8391c73\") " pod="openshift-ovn-kubernetes/ovnkube-node-kmjsq"
Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.490717 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/2a7b6abe-370e-4514-9777-7483bb64e1f0-multus-socket-dir-parent\") pod \"multus-dnjcp\" (UID: \"2a7b6abe-370e-4514-9777-7483bb64e1f0\") " pod="openshift-multus/multus-dnjcp"
Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.491016 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3a306fd7-dca6-4973-b8fa-4bd07840a104-tuning-conf-dir\") pod \"multus-additional-cni-plugins-7fs7j\" (UID: \"3a306fd7-dca6-4973-b8fa-4bd07840a104\") " pod="openshift-multus/multus-additional-cni-plugins-7fs7j"
Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.491189 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/2a7b6abe-370e-4514-9777-7483bb64e1f0-multus-daemon-config\") pod \"multus-dnjcp\" (UID: \"2a7b6abe-370e-4514-9777-7483bb64e1f0\") " pod="openshift-multus/multus-dnjcp"
Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.490585 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/752be2d4-f338-4c5e-b51e-452fd8391c73-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-kmjsq\" (UID: \"752be2d4-f338-4c5e-b51e-452fd8391c73\") " pod="openshift-ovn-kubernetes/ovnkube-node-kmjsq"
Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.491416 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2a7b6abe-370e-4514-9777-7483bb64e1f0-cni-binary-copy\") pod \"multus-dnjcp\" (UID:
\"2a7b6abe-370e-4514-9777-7483bb64e1f0\") " pod="openshift-multus/multus-dnjcp" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.491689 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3a306fd7-dca6-4973-b8fa-4bd07840a104-cni-binary-copy\") pod \"multus-additional-cni-plugins-7fs7j\" (UID: \"3a306fd7-dca6-4973-b8fa-4bd07840a104\") " pod="openshift-multus/multus-additional-cni-plugins-7fs7j" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.493764 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/752be2d4-f338-4c5e-b51e-452fd8391c73-ovn-node-metrics-cert\") pod \"ovnkube-node-kmjsq\" (UID: \"752be2d4-f338-4c5e-b51e-452fd8391c73\") " pod="openshift-ovn-kubernetes/ovnkube-node-kmjsq" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.497624 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6f7875e3-174f-4c67-8675-d878de74aa4f-proxy-tls\") pod \"machine-config-daemon-5g7gc\" (UID: \"6f7875e3-174f-4c67-8675-d878de74aa4f\") " pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.506683 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxs46\" (UniqueName: \"kubernetes.io/projected/752be2d4-f338-4c5e-b51e-452fd8391c73-kube-api-access-fxs46\") pod \"ovnkube-node-kmjsq\" (UID: \"752be2d4-f338-4c5e-b51e-452fd8391c73\") " pod="openshift-ovn-kubernetes/ovnkube-node-kmjsq" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.507828 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vt5b6\" (UniqueName: \"kubernetes.io/projected/2a7b6abe-370e-4514-9777-7483bb64e1f0-kube-api-access-vt5b6\") pod \"multus-dnjcp\" (UID: \"2a7b6abe-370e-4514-9777-7483bb64e1f0\") " 
pod="openshift-multus/multus-dnjcp" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.508556 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4mg7\" (UniqueName: \"kubernetes.io/projected/78a93e94-047c-45c2-81b3-161408c41b0a-kube-api-access-z4mg7\") pod \"node-resolver-lccwl\" (UID: \"78a93e94-047c-45c2-81b3-161408c41b0a\") " pod="openshift-dns/node-resolver-lccwl" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.509425 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ssrjl\" (UniqueName: \"kubernetes.io/projected/3a306fd7-dca6-4973-b8fa-4bd07840a104-kube-api-access-ssrjl\") pod \"multus-additional-cni-plugins-7fs7j\" (UID: \"3a306fd7-dca6-4973-b8fa-4bd07840a104\") " pod="openshift-multus/multus-additional-cni-plugins-7fs7j" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.511057 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkqlf\" (UniqueName: \"kubernetes.io/projected/6f7875e3-174f-4c67-8675-d878de74aa4f-kube-api-access-zkqlf\") pod \"machine-config-daemon-5g7gc\" (UID: \"6f7875e3-174f-4c67-8675-d878de74aa4f\") " pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.585765 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.585807 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.585820 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.585836 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.585848 4861 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:07:24Z","lastTransitionTime":"2026-03-09T09:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.587163 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-lccwl" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.591393 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/0264daf8-9c63-49d8-b8d0-20778e5dbee2-serviceca\") pod \"node-ca-tvzgr\" (UID: \"0264daf8-9c63-49d8-b8d0-20778e5dbee2\") " pod="openshift-image-registry/node-ca-tvzgr" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.591451 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0264daf8-9c63-49d8-b8d0-20778e5dbee2-host\") pod \"node-ca-tvzgr\" (UID: \"0264daf8-9c63-49d8-b8d0-20778e5dbee2\") " pod="openshift-image-registry/node-ca-tvzgr" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.591515 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqt4t\" (UniqueName: \"kubernetes.io/projected/0264daf8-9c63-49d8-b8d0-20778e5dbee2-kube-api-access-dqt4t\") pod \"node-ca-tvzgr\" (UID: \"0264daf8-9c63-49d8-b8d0-20778e5dbee2\") " pod="openshift-image-registry/node-ca-tvzgr" Mar 09 09:07:24 crc kubenswrapper[4861]: W0309 09:07:24.596962 4861 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod78a93e94_047c_45c2_81b3_161408c41b0a.slice/crio-2e1fa5b675b487663862b961a81c7fa0e1a70b28671b6d50c63e74915305796b WatchSource:0}: Error finding container 2e1fa5b675b487663862b961a81c7fa0e1a70b28671b6d50c63e74915305796b: Status 404 returned error can't find the container with id 2e1fa5b675b487663862b961a81c7fa0e1a70b28671b6d50c63e74915305796b Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.605233 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.608575 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hdtxw"] Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.608992 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hdtxw" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.611108 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.611180 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.617565 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-7fs7j" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.628166 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-dnjcp" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.635299 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-kmjsq" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.635272 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-pp5xh"] Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.635614 4861 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.636297 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pp5xh" Mar 09 09:07:24 crc kubenswrapper[4861]: E0309 09:07:24.636641 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pp5xh" podUID="1ce13338-8caa-4be7-80e7-791207626053" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.657055 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.657074 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.657129 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 09:07:24 crc kubenswrapper[4861]: E0309 09:07:24.657182 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 09:07:24 crc kubenswrapper[4861]: E0309 09:07:24.657331 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 09:07:24 crc kubenswrapper[4861]: E0309 09:07:24.657451 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 09:07:24 crc kubenswrapper[4861]: W0309 09:07:24.660713 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2a7b6abe_370e_4514_9777_7483bb64e1f0.slice/crio-f98cb3d6a5f1d636fbc13baf399e54d76d790b1bf2c7c37dd04afeba42344eb6 WatchSource:0}: Error finding container f98cb3d6a5f1d636fbc13baf399e54d76d790b1bf2c7c37dd04afeba42344eb6: Status 404 returned error can't find the container with id f98cb3d6a5f1d636fbc13baf399e54d76d790b1bf2c7c37dd04afeba42344eb6 Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.688850 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.689415 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.689432 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.689457 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.689468 4861 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:07:24Z","lastTransitionTime":"2026-03-09T09:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.692418 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmv75\" (UniqueName: \"kubernetes.io/projected/839a9f20-0dd8-4b96-acff-85aab3ce9a7a-kube-api-access-hmv75\") pod \"ovnkube-control-plane-749d76644c-hdtxw\" (UID: \"839a9f20-0dd8-4b96-acff-85aab3ce9a7a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hdtxw" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.692462 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/839a9f20-0dd8-4b96-acff-85aab3ce9a7a-env-overrides\") pod \"ovnkube-control-plane-749d76644c-hdtxw\" (UID: \"839a9f20-0dd8-4b96-acff-85aab3ce9a7a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hdtxw" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.692508 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0264daf8-9c63-49d8-b8d0-20778e5dbee2-host\") pod \"node-ca-tvzgr\" (UID: \"0264daf8-9c63-49d8-b8d0-20778e5dbee2\") " pod="openshift-image-registry/node-ca-tvzgr" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.692555 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0264daf8-9c63-49d8-b8d0-20778e5dbee2-host\") pod \"node-ca-tvzgr\" (UID: \"0264daf8-9c63-49d8-b8d0-20778e5dbee2\") " pod="openshift-image-registry/node-ca-tvzgr" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.692656 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/839a9f20-0dd8-4b96-acff-85aab3ce9a7a-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-hdtxw\" (UID: 
\"839a9f20-0dd8-4b96-acff-85aab3ce9a7a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hdtxw" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.692713 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dqt4t\" (UniqueName: \"kubernetes.io/projected/0264daf8-9c63-49d8-b8d0-20778e5dbee2-kube-api-access-dqt4t\") pod \"node-ca-tvzgr\" (UID: \"0264daf8-9c63-49d8-b8d0-20778e5dbee2\") " pod="openshift-image-registry/node-ca-tvzgr" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.692750 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/0264daf8-9c63-49d8-b8d0-20778e5dbee2-serviceca\") pod \"node-ca-tvzgr\" (UID: \"0264daf8-9c63-49d8-b8d0-20778e5dbee2\") " pod="openshift-image-registry/node-ca-tvzgr" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.692777 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/839a9f20-0dd8-4b96-acff-85aab3ce9a7a-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-hdtxw\" (UID: \"839a9f20-0dd8-4b96-acff-85aab3ce9a7a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hdtxw" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.693816 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/0264daf8-9c63-49d8-b8d0-20778e5dbee2-serviceca\") pod \"node-ca-tvzgr\" (UID: \"0264daf8-9c63-49d8-b8d0-20778e5dbee2\") " pod="openshift-image-registry/node-ca-tvzgr" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.713792 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqt4t\" (UniqueName: \"kubernetes.io/projected/0264daf8-9c63-49d8-b8d0-20778e5dbee2-kube-api-access-dqt4t\") pod \"node-ca-tvzgr\" (UID: \"0264daf8-9c63-49d8-b8d0-20778e5dbee2\") " 
pod="openshift-image-registry/node-ca-tvzgr" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.767635 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-tvzgr" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.793008 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.793039 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.793050 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.793065 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.793076 4861 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:07:24Z","lastTransitionTime":"2026-03-09T09:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.793335 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/839a9f20-0dd8-4b96-acff-85aab3ce9a7a-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-hdtxw\" (UID: \"839a9f20-0dd8-4b96-acff-85aab3ce9a7a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hdtxw" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.794036 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/839a9f20-0dd8-4b96-acff-85aab3ce9a7a-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-hdtxw\" (UID: \"839a9f20-0dd8-4b96-acff-85aab3ce9a7a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hdtxw" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.794083 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmv75\" (UniqueName: \"kubernetes.io/projected/839a9f20-0dd8-4b96-acff-85aab3ce9a7a-kube-api-access-hmv75\") pod \"ovnkube-control-plane-749d76644c-hdtxw\" (UID: \"839a9f20-0dd8-4b96-acff-85aab3ce9a7a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hdtxw" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.794124 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/839a9f20-0dd8-4b96-acff-85aab3ce9a7a-env-overrides\") pod \"ovnkube-control-plane-749d76644c-hdtxw\" (UID: \"839a9f20-0dd8-4b96-acff-85aab3ce9a7a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hdtxw" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.794555 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/839a9f20-0dd8-4b96-acff-85aab3ce9a7a-env-overrides\") pod 
\"ovnkube-control-plane-749d76644c-hdtxw\" (UID: \"839a9f20-0dd8-4b96-acff-85aab3ce9a7a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hdtxw" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.794595 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1ce13338-8caa-4be7-80e7-791207626053-metrics-certs\") pod \"network-metrics-daemon-pp5xh\" (UID: \"1ce13338-8caa-4be7-80e7-791207626053\") " pod="openshift-multus/network-metrics-daemon-pp5xh" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.794655 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvlmd\" (UniqueName: \"kubernetes.io/projected/1ce13338-8caa-4be7-80e7-791207626053-kube-api-access-lvlmd\") pod \"network-metrics-daemon-pp5xh\" (UID: \"1ce13338-8caa-4be7-80e7-791207626053\") " pod="openshift-multus/network-metrics-daemon-pp5xh" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.794756 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/839a9f20-0dd8-4b96-acff-85aab3ce9a7a-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-hdtxw\" (UID: \"839a9f20-0dd8-4b96-acff-85aab3ce9a7a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hdtxw" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.798168 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/839a9f20-0dd8-4b96-acff-85aab3ce9a7a-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-hdtxw\" (UID: \"839a9f20-0dd8-4b96-acff-85aab3ce9a7a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hdtxw" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.813124 4861 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-hmv75\" (UniqueName: \"kubernetes.io/projected/839a9f20-0dd8-4b96-acff-85aab3ce9a7a-kube-api-access-hmv75\") pod \"ovnkube-control-plane-749d76644c-hdtxw\" (UID: \"839a9f20-0dd8-4b96-acff-85aab3ce9a7a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hdtxw" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.895669 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1ce13338-8caa-4be7-80e7-791207626053-metrics-certs\") pod \"network-metrics-daemon-pp5xh\" (UID: \"1ce13338-8caa-4be7-80e7-791207626053\") " pod="openshift-multus/network-metrics-daemon-pp5xh" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.895715 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvlmd\" (UniqueName: \"kubernetes.io/projected/1ce13338-8caa-4be7-80e7-791207626053-kube-api-access-lvlmd\") pod \"network-metrics-daemon-pp5xh\" (UID: \"1ce13338-8caa-4be7-80e7-791207626053\") " pod="openshift-multus/network-metrics-daemon-pp5xh" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.895908 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.895937 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.895946 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.895959 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:07:24 crc kubenswrapper[4861]: E0309 09:07:24.896034 4861 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object 
"openshift-multus"/"metrics-daemon-secret" not registered Mar 09 09:07:24 crc kubenswrapper[4861]: E0309 09:07:24.896071 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1ce13338-8caa-4be7-80e7-791207626053-metrics-certs podName:1ce13338-8caa-4be7-80e7-791207626053 nodeName:}" failed. No retries permitted until 2026-03-09 09:07:25.39605893 +0000 UTC m=+88.481098331 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1ce13338-8caa-4be7-80e7-791207626053-metrics-certs") pod "network-metrics-daemon-pp5xh" (UID: "1ce13338-8caa-4be7-80e7-791207626053") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.898571 4861 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:07:24Z","lastTransitionTime":"2026-03-09T09:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.912365 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvlmd\" (UniqueName: \"kubernetes.io/projected/1ce13338-8caa-4be7-80e7-791207626053-kube-api-access-lvlmd\") pod \"network-metrics-daemon-pp5xh\" (UID: \"1ce13338-8caa-4be7-80e7-791207626053\") " pod="openshift-multus/network-metrics-daemon-pp5xh" Mar 09 09:07:24 crc kubenswrapper[4861]: I0309 09:07:24.921789 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hdtxw"
Mar 09 09:07:24 crc kubenswrapper[4861]: W0309 09:07:24.947346 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod839a9f20_0dd8_4b96_acff_85aab3ce9a7a.slice/crio-866de98bdc6a2499874308aa5bd606f89e28eddbd11a464f1c77a645d124586c WatchSource:0}: Error finding container 866de98bdc6a2499874308aa5bd606f89e28eddbd11a464f1c77a645d124586c: Status 404 returned error can't find the container with id 866de98bdc6a2499874308aa5bd606f89e28eddbd11a464f1c77a645d124586c
Mar 09 09:07:25 crc kubenswrapper[4861]: I0309 09:07:25.001997 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 09:07:25 crc kubenswrapper[4861]: I0309 09:07:25.002035 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 09:07:25 crc kubenswrapper[4861]: I0309 09:07:25.002046 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 09:07:25 crc kubenswrapper[4861]: I0309 09:07:25.002062 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 09:07:25 crc kubenswrapper[4861]: I0309 09:07:25.002075 4861 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:07:25Z","lastTransitionTime":"2026-03-09T09:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 09:07:25 crc kubenswrapper[4861]: I0309 09:07:25.032476 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-tvzgr" event={"ID":"0264daf8-9c63-49d8-b8d0-20778e5dbee2","Type":"ContainerStarted","Data":"e289a1fc7931934d6b1c1b671e25af15cdc4a3cd6c0a7370c95629c7c8cdde07"}
Mar 09 09:07:25 crc kubenswrapper[4861]: I0309 09:07:25.034063 4861 generic.go:334] "Generic (PLEG): container finished" podID="752be2d4-f338-4c5e-b51e-452fd8391c73" containerID="5d78e61177fed79e734e99406acb1d44f730e81d601827c7be96d7dec219dd93" exitCode=0
Mar 09 09:07:25 crc kubenswrapper[4861]: I0309 09:07:25.034130 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kmjsq" event={"ID":"752be2d4-f338-4c5e-b51e-452fd8391c73","Type":"ContainerDied","Data":"5d78e61177fed79e734e99406acb1d44f730e81d601827c7be96d7dec219dd93"}
Mar 09 09:07:25 crc kubenswrapper[4861]: I0309 09:07:25.034178 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kmjsq" event={"ID":"752be2d4-f338-4c5e-b51e-452fd8391c73","Type":"ContainerStarted","Data":"69d927fb2c5fc26050c07a81d44ae2ec32cc0bcac8f5d9c11fb0c415816890a9"}
Mar 09 09:07:25 crc kubenswrapper[4861]: I0309 09:07:25.036003 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-lccwl" event={"ID":"78a93e94-047c-45c2-81b3-161408c41b0a","Type":"ContainerStarted","Data":"b6a320c4e98d803360abda136a4329f301d26c3b7183b9788d9e46b941bdd2aa"}
Mar 09 09:07:25 crc kubenswrapper[4861]: I0309 09:07:25.036033 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-lccwl" event={"ID":"78a93e94-047c-45c2-81b3-161408c41b0a","Type":"ContainerStarted","Data":"2e1fa5b675b487663862b961a81c7fa0e1a70b28671b6d50c63e74915305796b"}
Mar 09 09:07:25 crc kubenswrapper[4861]: I0309 09:07:25.040463 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hdtxw" event={"ID":"839a9f20-0dd8-4b96-acff-85aab3ce9a7a","Type":"ContainerStarted","Data":"866de98bdc6a2499874308aa5bd606f89e28eddbd11a464f1c77a645d124586c"}
Mar 09 09:07:25 crc kubenswrapper[4861]: I0309 09:07:25.047167 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-dnjcp" event={"ID":"2a7b6abe-370e-4514-9777-7483bb64e1f0","Type":"ContainerStarted","Data":"8c38cd133d2ed36cdd55c33e0dbf2336c41cb7343be3c65e5b854b019ee99b4b"}
Mar 09 09:07:25 crc kubenswrapper[4861]: I0309 09:07:25.047194 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-dnjcp" event={"ID":"2a7b6abe-370e-4514-9777-7483bb64e1f0","Type":"ContainerStarted","Data":"f98cb3d6a5f1d636fbc13baf399e54d76d790b1bf2c7c37dd04afeba42344eb6"}
Mar 09 09:07:25 crc kubenswrapper[4861]: I0309 09:07:25.048662 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7fs7j" event={"ID":"3a306fd7-dca6-4973-b8fa-4bd07840a104","Type":"ContainerStarted","Data":"41b9b01858c79781e00026bffb5cbb6465b1e0aa16798a701cc0b9df89c2e16c"}
Mar 09 09:07:25 crc kubenswrapper[4861]: I0309 09:07:25.048683 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7fs7j" event={"ID":"3a306fd7-dca6-4973-b8fa-4bd07840a104","Type":"ContainerStarted","Data":"37ce986e622f3e4275509e6299f8e1b5c272749ce7d0b65a2c2b77353de2466a"}
Mar 09 09:07:25 crc kubenswrapper[4861]: I0309 09:07:25.050883 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" event={"ID":"6f7875e3-174f-4c67-8675-d878de74aa4f","Type":"ContainerStarted","Data":"950cf6af64277b6188157067101e62c173f5d1e24a04e76a96e5d286e5818c94"}
Mar 09 09:07:25 crc kubenswrapper[4861]: I0309 09:07:25.050901 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" event={"ID":"6f7875e3-174f-4c67-8675-d878de74aa4f","Type":"ContainerStarted","Data":"c970ace96d4c918f6e61a749abffe084d175df04b5393bf6029d502cdda837af"}
Mar 09 09:07:25 crc kubenswrapper[4861]: I0309 09:07:25.050911 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" event={"ID":"6f7875e3-174f-4c67-8675-d878de74aa4f","Type":"ContainerStarted","Data":"9acf51fa8faf61361fe8988ce3048af89bb6ef2a4c305e6e2d6776f3a705ad31"}
Mar 09 09:07:25 crc kubenswrapper[4861]: I0309 09:07:25.072181 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podStartSLOduration=44.072164558 podStartE2EDuration="44.072164558s" podCreationTimestamp="2026-03-09 09:06:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:07:25.071999074 +0000 UTC m=+88.157038475" watchObservedRunningTime="2026-03-09 09:07:25.072164558 +0000 UTC m=+88.157203959"
Mar 09 09:07:25 crc kubenswrapper[4861]: I0309 09:07:25.108599 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 09:07:25 crc kubenswrapper[4861]: I0309 09:07:25.112046 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 09:07:25 crc kubenswrapper[4861]: I0309 09:07:25.112176 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 09:07:25 crc kubenswrapper[4861]: I0309 09:07:25.112200 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 09:07:25 crc kubenswrapper[4861]: I0309 09:07:25.112213 4861 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:07:25Z","lastTransitionTime":"2026-03-09T09:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 09:07:25 crc kubenswrapper[4861]: I0309 09:07:25.118762 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-dnjcp" podStartSLOduration=44.118738151 podStartE2EDuration="44.118738151s" podCreationTimestamp="2026-03-09 09:06:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:07:25.104767385 +0000 UTC m=+88.189806786" watchObservedRunningTime="2026-03-09 09:07:25.118738151 +0000 UTC m=+88.203777552"
Mar 09 09:07:25 crc kubenswrapper[4861]: I0309 09:07:25.121222 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-lccwl" podStartSLOduration=44.121208677 podStartE2EDuration="44.121208677s" podCreationTimestamp="2026-03-09 09:06:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:07:25.117063756 +0000 UTC m=+88.202103177" watchObservedRunningTime="2026-03-09 09:07:25.121208677 +0000 UTC m=+88.206248078"
Mar 09 09:07:25 crc kubenswrapper[4861]: I0309 09:07:25.214706 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 09:07:25 crc kubenswrapper[4861]: I0309 09:07:25.214747 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 09:07:25 crc kubenswrapper[4861]: I0309 09:07:25.214757 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 09:07:25 crc kubenswrapper[4861]: I0309 09:07:25.214775 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 09:07:25 crc kubenswrapper[4861]: I0309 09:07:25.214786 4861 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:07:25Z","lastTransitionTime":"2026-03-09T09:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 09:07:25 crc kubenswrapper[4861]: I0309 09:07:25.317084 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 09:07:25 crc kubenswrapper[4861]: I0309 09:07:25.317128 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 09:07:25 crc kubenswrapper[4861]: I0309 09:07:25.317142 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 09:07:25 crc kubenswrapper[4861]: I0309 09:07:25.317163 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 09:07:25 crc kubenswrapper[4861]: I0309 09:07:25.317178 4861 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:07:25Z","lastTransitionTime":"2026-03-09T09:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 09:07:25 crc kubenswrapper[4861]: I0309 09:07:25.404632 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1ce13338-8caa-4be7-80e7-791207626053-metrics-certs\") pod \"network-metrics-daemon-pp5xh\" (UID: \"1ce13338-8caa-4be7-80e7-791207626053\") " pod="openshift-multus/network-metrics-daemon-pp5xh"
Mar 09 09:07:25 crc kubenswrapper[4861]: E0309 09:07:25.404797 4861 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 09 09:07:25 crc kubenswrapper[4861]: E0309 09:07:25.404900 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1ce13338-8caa-4be7-80e7-791207626053-metrics-certs podName:1ce13338-8caa-4be7-80e7-791207626053 nodeName:}" failed. No retries permitted until 2026-03-09 09:07:26.404880209 +0000 UTC m=+89.489919610 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1ce13338-8caa-4be7-80e7-791207626053-metrics-certs") pod "network-metrics-daemon-pp5xh" (UID: "1ce13338-8caa-4be7-80e7-791207626053") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 09 09:07:25 crc kubenswrapper[4861]: I0309 09:07:25.419262 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 09:07:25 crc kubenswrapper[4861]: I0309 09:07:25.419300 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 09:07:25 crc kubenswrapper[4861]: I0309 09:07:25.419313 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 09:07:25 crc kubenswrapper[4861]: I0309 09:07:25.419331 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 09:07:25 crc kubenswrapper[4861]: I0309 09:07:25.419344 4861 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:07:25Z","lastTransitionTime":"2026-03-09T09:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 09:07:25 crc kubenswrapper[4861]: I0309 09:07:25.522873 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 09:07:25 crc kubenswrapper[4861]: I0309 09:07:25.522912 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 09:07:25 crc kubenswrapper[4861]: I0309 09:07:25.522969 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 09:07:25 crc kubenswrapper[4861]: I0309 09:07:25.522985 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 09:07:25 crc kubenswrapper[4861]: I0309 09:07:25.523300 4861 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:07:25Z","lastTransitionTime":"2026-03-09T09:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 09:07:25 crc kubenswrapper[4861]: I0309 09:07:25.625869 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 09:07:25 crc kubenswrapper[4861]: I0309 09:07:25.625908 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 09:07:25 crc kubenswrapper[4861]: I0309 09:07:25.626140 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 09:07:25 crc kubenswrapper[4861]: I0309 09:07:25.626162 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 09:07:25 crc kubenswrapper[4861]: I0309 09:07:25.626177 4861 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:07:25Z","lastTransitionTime":"2026-03-09T09:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 09:07:25 crc kubenswrapper[4861]: I0309 09:07:25.728871 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 09:07:25 crc kubenswrapper[4861]: I0309 09:07:25.728902 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 09:07:25 crc kubenswrapper[4861]: I0309 09:07:25.728938 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 09:07:25 crc kubenswrapper[4861]: I0309 09:07:25.728953 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 09:07:25 crc kubenswrapper[4861]: I0309 09:07:25.728962 4861 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:07:25Z","lastTransitionTime":"2026-03-09T09:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 09:07:25 crc kubenswrapper[4861]: I0309 09:07:25.831477 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 09:07:25 crc kubenswrapper[4861]: I0309 09:07:25.831730 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 09:07:25 crc kubenswrapper[4861]: I0309 09:07:25.831743 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 09:07:25 crc kubenswrapper[4861]: I0309 09:07:25.831758 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 09:07:25 crc kubenswrapper[4861]: I0309 09:07:25.831767 4861 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:07:25Z","lastTransitionTime":"2026-03-09T09:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 09:07:25 crc kubenswrapper[4861]: I0309 09:07:25.934295 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 09:07:25 crc kubenswrapper[4861]: I0309 09:07:25.934327 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 09:07:25 crc kubenswrapper[4861]: I0309 09:07:25.934337 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 09:07:25 crc kubenswrapper[4861]: I0309 09:07:25.934352 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 09:07:25 crc kubenswrapper[4861]: I0309 09:07:25.934361 4861 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:07:25Z","lastTransitionTime":"2026-03-09T09:07:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 09:07:26 crc kubenswrapper[4861]: I0309 09:07:26.036395 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 09:07:26 crc kubenswrapper[4861]: I0309 09:07:26.036446 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 09:07:26 crc kubenswrapper[4861]: I0309 09:07:26.036460 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 09:07:26 crc kubenswrapper[4861]: I0309 09:07:26.036484 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 09:07:26 crc kubenswrapper[4861]: I0309 09:07:26.036497 4861 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:07:26Z","lastTransitionTime":"2026-03-09T09:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 09:07:26 crc kubenswrapper[4861]: I0309 09:07:26.058106 4861 generic.go:334] "Generic (PLEG): container finished" podID="3a306fd7-dca6-4973-b8fa-4bd07840a104" containerID="41b9b01858c79781e00026bffb5cbb6465b1e0aa16798a701cc0b9df89c2e16c" exitCode=0
Mar 09 09:07:26 crc kubenswrapper[4861]: I0309 09:07:26.058223 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7fs7j" event={"ID":"3a306fd7-dca6-4973-b8fa-4bd07840a104","Type":"ContainerDied","Data":"41b9b01858c79781e00026bffb5cbb6465b1e0aa16798a701cc0b9df89c2e16c"}
Mar 09 09:07:26 crc kubenswrapper[4861]: I0309 09:07:26.060187 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-tvzgr" event={"ID":"0264daf8-9c63-49d8-b8d0-20778e5dbee2","Type":"ContainerStarted","Data":"f6dfd5d602f1555ef39eb0150dc4b829de93c0f72ed413857c3b320a0ab6a364"}
Mar 09 09:07:26 crc kubenswrapper[4861]: I0309 09:07:26.074499 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kmjsq" event={"ID":"752be2d4-f338-4c5e-b51e-452fd8391c73","Type":"ContainerStarted","Data":"c141660a01917112b004037f5bfc373a39b90e14dcb37a156d0b0dc60af56e0a"}
Mar 09 09:07:26 crc kubenswrapper[4861]: I0309 09:07:26.074550 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kmjsq" event={"ID":"752be2d4-f338-4c5e-b51e-452fd8391c73","Type":"ContainerStarted","Data":"77865966595bc249ebb7de4adca17e523e85c0dfab3b83bcc5ef662c7c31b0e0"}
Mar 09 09:07:26 crc kubenswrapper[4861]: I0309 09:07:26.074563 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kmjsq" event={"ID":"752be2d4-f338-4c5e-b51e-452fd8391c73","Type":"ContainerStarted","Data":"3404d4ea05e503128dcf25a1314f47abdcd31cf9014df5b10889374c469707e8"}
Mar 09 09:07:26 crc kubenswrapper[4861]: I0309 09:07:26.074576 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kmjsq" event={"ID":"752be2d4-f338-4c5e-b51e-452fd8391c73","Type":"ContainerStarted","Data":"35204d90abde22c56dbe4579e11f1da589271202a7e059926b799e4ddfb69f63"}
Mar 09 09:07:26 crc kubenswrapper[4861]: I0309 09:07:26.074586 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kmjsq" event={"ID":"752be2d4-f338-4c5e-b51e-452fd8391c73","Type":"ContainerStarted","Data":"04d44b4ad8719d9acaed799be8575081e6d326f27f3f02958ad30441df94d80a"}
Mar 09 09:07:26 crc kubenswrapper[4861]: I0309 09:07:26.074595 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kmjsq" event={"ID":"752be2d4-f338-4c5e-b51e-452fd8391c73","Type":"ContainerStarted","Data":"c73566e23363dcd42915bf2068aa8e97fe90f6c7afa409450cee62c683cbfa90"}
Mar 09 09:07:26 crc kubenswrapper[4861]: I0309 09:07:26.076973 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hdtxw" event={"ID":"839a9f20-0dd8-4b96-acff-85aab3ce9a7a","Type":"ContainerStarted","Data":"a3164f65c993085f0183772e809e4f99ae44a5113594d8fcc62a45fa9343aff3"}
Mar 09 09:07:26 crc kubenswrapper[4861]: I0309 09:07:26.077000 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hdtxw" event={"ID":"839a9f20-0dd8-4b96-acff-85aab3ce9a7a","Type":"ContainerStarted","Data":"2d263af5ad269b17bed81af7f0361ff5380d1cf447882a489759e86e24b80e5e"}
Mar 09 09:07:26 crc kubenswrapper[4861]: I0309 09:07:26.132512 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hdtxw" podStartSLOduration=45.132480893 podStartE2EDuration="45.132480893s" podCreationTimestamp="2026-03-09 09:06:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:07:26.131642341 +0000 UTC m=+89.216681792" watchObservedRunningTime="2026-03-09 09:07:26.132480893 +0000 UTC m=+89.217520294"
Mar 09 09:07:26 crc kubenswrapper[4861]: I0309 09:07:26.145053 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 09:07:26 crc kubenswrapper[4861]: I0309 09:07:26.145108 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 09:07:26 crc kubenswrapper[4861]: I0309 09:07:26.145119 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 09:07:26 crc kubenswrapper[4861]: I0309 09:07:26.145139 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 09:07:26 crc kubenswrapper[4861]: I0309 09:07:26.145154 4861 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:07:26Z","lastTransitionTime":"2026-03-09T09:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 09:07:26 crc kubenswrapper[4861]: I0309 09:07:26.147909 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-tvzgr" podStartSLOduration=46.147863576 podStartE2EDuration="46.147863576s" podCreationTimestamp="2026-03-09 09:06:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:07:26.147150378 +0000 UTC m=+89.232189789" watchObservedRunningTime="2026-03-09 09:07:26.147863576 +0000 UTC m=+89.232902977"
Mar 09 09:07:26 crc kubenswrapper[4861]: I0309 09:07:26.249006 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 09:07:26 crc kubenswrapper[4861]: I0309 09:07:26.249088 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 09:07:26 crc kubenswrapper[4861]: I0309 09:07:26.249100 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 09:07:26 crc kubenswrapper[4861]: I0309 09:07:26.249128 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 09:07:26 crc kubenswrapper[4861]: I0309 09:07:26.249141 4861 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:07:26Z","lastTransitionTime":"2026-03-09T09:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 09:07:26 crc kubenswrapper[4861]: I0309 09:07:26.314729 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 09:07:26 crc kubenswrapper[4861]: E0309 09:07:26.315102 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:07:34.315073735 +0000 UTC m=+97.400113176 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 09:07:26 crc kubenswrapper[4861]: I0309 09:07:26.352481 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 09:07:26 crc kubenswrapper[4861]: I0309 09:07:26.352843 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 09:07:26 crc kubenswrapper[4861]: I0309 09:07:26.352855 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 09:07:26 crc kubenswrapper[4861]: I0309 09:07:26.352871 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 09:07:26 crc kubenswrapper[4861]: I0309 09:07:26.352883 4861 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:07:26Z","lastTransitionTime":"2026-03-09T09:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 09:07:26 crc kubenswrapper[4861]: I0309 09:07:26.416660 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 09 09:07:26 crc kubenswrapper[4861]: I0309 09:07:26.417069 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 09 09:07:26 crc kubenswrapper[4861]: I0309 09:07:26.417331 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1ce13338-8caa-4be7-80e7-791207626053-metrics-certs\") pod \"network-metrics-daemon-pp5xh\" (UID: \"1ce13338-8caa-4be7-80e7-791207626053\") " pod="openshift-multus/network-metrics-daemon-pp5xh"
Mar 09 09:07:26 crc kubenswrapper[4861]: I0309 09:07:26.417617 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 09 09:07:26 crc kubenswrapper[4861]: I0309 09:07:26.417910 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 09 09:07:26 crc kubenswrapper[4861]: E0309 09:07:26.416908 4861 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 09 09:07:26 crc kubenswrapper[4861]: E0309 09:07:26.418303 4861 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 09 09:07:26 crc kubenswrapper[4861]: E0309 09:07:26.417141 4861 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Mar 09 09:07:26 crc kubenswrapper[4861]: E0309 09:07:26.417470 4861 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 09 09:07:26 crc kubenswrapper[4861]: E0309 09:07:26.417805 4861 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 09 09:07:26 crc kubenswrapper[4861]: E0309 09:07:26.418888 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1ce13338-8caa-4be7-80e7-791207626053-metrics-certs podName:1ce13338-8caa-4be7-80e7-791207626053 nodeName:}" failed. No retries permitted until 2026-03-09 09:07:28.418850037 +0000 UTC m=+91.503889488 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1ce13338-8caa-4be7-80e7-791207626053-metrics-certs") pod "network-metrics-daemon-pp5xh" (UID: "1ce13338-8caa-4be7-80e7-791207626053") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 09 09:07:26 crc kubenswrapper[4861]: E0309 09:07:26.418897 4861 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 09 09:07:26 crc kubenswrapper[4861]: E0309 09:07:26.419045 4861 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 09 09:07:26 crc kubenswrapper[4861]: E0309 09:07:26.418550 4861 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 09 09:07:26 crc kubenswrapper[4861]: E0309 09:07:26.418049 4861 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 09 09:07:26 crc kubenswrapper[4861]: E0309 09:07:26.418965 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 09:07:34.418942809 +0000 UTC m=+97.503982450 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Mar 09 09:07:26 crc kubenswrapper[4861]: E0309 09:07:26.419283 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-09 09:07:34.419258758 +0000 UTC m=+97.504298279 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 09 09:07:26 crc kubenswrapper[4861]: E0309 09:07:26.419320 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-09 09:07:34.419303769 +0000 UTC m=+97.504343310 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 09 09:07:26 crc kubenswrapper[4861]: E0309 09:07:26.419358 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 09:07:34.41934263 +0000 UTC m=+97.504382141 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 09 09:07:26 crc kubenswrapper[4861]: I0309 09:07:26.455724 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 09:07:26 crc kubenswrapper[4861]: I0309 09:07:26.455787 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 09:07:26 crc kubenswrapper[4861]: I0309 09:07:26.455800 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 09:07:26 crc kubenswrapper[4861]: I0309 09:07:26.455830 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 09:07:26 crc kubenswrapper[4861]: I0309 09:07:26.455848 4861 setters.go:603] "Node became not ready" node="crc"
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:07:26Z","lastTransitionTime":"2026-03-09T09:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 09:07:26 crc kubenswrapper[4861]: I0309 09:07:26.559333 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:26 crc kubenswrapper[4861]: I0309 09:07:26.559410 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:26 crc kubenswrapper[4861]: I0309 09:07:26.559429 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:26 crc kubenswrapper[4861]: I0309 09:07:26.559449 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:07:26 crc kubenswrapper[4861]: I0309 09:07:26.559464 4861 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:07:26Z","lastTransitionTime":"2026-03-09T09:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 09:07:26 crc kubenswrapper[4861]: I0309 09:07:26.657447 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 09:07:26 crc kubenswrapper[4861]: I0309 09:07:26.657548 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 09:07:26 crc kubenswrapper[4861]: I0309 09:07:26.657589 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 09:07:26 crc kubenswrapper[4861]: E0309 09:07:26.657627 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 09:07:26 crc kubenswrapper[4861]: I0309 09:07:26.657563 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pp5xh" Mar 09 09:07:26 crc kubenswrapper[4861]: E0309 09:07:26.657734 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 09:07:26 crc kubenswrapper[4861]: E0309 09:07:26.657869 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pp5xh" podUID="1ce13338-8caa-4be7-80e7-791207626053" Mar 09 09:07:26 crc kubenswrapper[4861]: E0309 09:07:26.657993 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 09:07:26 crc kubenswrapper[4861]: I0309 09:07:26.661962 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:26 crc kubenswrapper[4861]: I0309 09:07:26.661991 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:26 crc kubenswrapper[4861]: I0309 09:07:26.661999 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:26 crc kubenswrapper[4861]: I0309 09:07:26.662012 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:07:26 crc kubenswrapper[4861]: I0309 09:07:26.662021 4861 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:07:26Z","lastTransitionTime":"2026-03-09T09:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:07:26 crc kubenswrapper[4861]: I0309 09:07:26.764581 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:26 crc kubenswrapper[4861]: I0309 09:07:26.764640 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:26 crc kubenswrapper[4861]: I0309 09:07:26.764655 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:26 crc kubenswrapper[4861]: I0309 09:07:26.764675 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:07:26 crc kubenswrapper[4861]: I0309 09:07:26.764690 4861 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:07:26Z","lastTransitionTime":"2026-03-09T09:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:07:26 crc kubenswrapper[4861]: I0309 09:07:26.867602 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:26 crc kubenswrapper[4861]: I0309 09:07:26.867659 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:26 crc kubenswrapper[4861]: I0309 09:07:26.867674 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:26 crc kubenswrapper[4861]: I0309 09:07:26.867697 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:07:26 crc kubenswrapper[4861]: I0309 09:07:26.867712 4861 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:07:26Z","lastTransitionTime":"2026-03-09T09:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:07:26 crc kubenswrapper[4861]: I0309 09:07:26.971283 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:26 crc kubenswrapper[4861]: I0309 09:07:26.971348 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:26 crc kubenswrapper[4861]: I0309 09:07:26.971408 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:26 crc kubenswrapper[4861]: I0309 09:07:26.971461 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:07:26 crc kubenswrapper[4861]: I0309 09:07:26.971483 4861 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:07:26Z","lastTransitionTime":"2026-03-09T09:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:07:27 crc kubenswrapper[4861]: I0309 09:07:27.074312 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:27 crc kubenswrapper[4861]: I0309 09:07:27.074406 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:27 crc kubenswrapper[4861]: I0309 09:07:27.074427 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:27 crc kubenswrapper[4861]: I0309 09:07:27.074451 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:07:27 crc kubenswrapper[4861]: I0309 09:07:27.074469 4861 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:07:27Z","lastTransitionTime":"2026-03-09T09:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:07:27 crc kubenswrapper[4861]: I0309 09:07:27.084446 4861 generic.go:334] "Generic (PLEG): container finished" podID="3a306fd7-dca6-4973-b8fa-4bd07840a104" containerID="1971810ea9b68bd395a0f751a6a1e961e3f1c8885e1de8e8dfebea548cb60985" exitCode=0 Mar 09 09:07:27 crc kubenswrapper[4861]: I0309 09:07:27.084539 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7fs7j" event={"ID":"3a306fd7-dca6-4973-b8fa-4bd07840a104","Type":"ContainerDied","Data":"1971810ea9b68bd395a0f751a6a1e961e3f1c8885e1de8e8dfebea548cb60985"} Mar 09 09:07:27 crc kubenswrapper[4861]: I0309 09:07:27.177045 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:27 crc kubenswrapper[4861]: I0309 09:07:27.177112 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:27 crc kubenswrapper[4861]: I0309 09:07:27.177135 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:27 crc kubenswrapper[4861]: I0309 09:07:27.177164 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:07:27 crc kubenswrapper[4861]: I0309 09:07:27.177189 4861 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:07:27Z","lastTransitionTime":"2026-03-09T09:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:07:27 crc kubenswrapper[4861]: I0309 09:07:27.280259 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:27 crc kubenswrapper[4861]: I0309 09:07:27.280306 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:27 crc kubenswrapper[4861]: I0309 09:07:27.280317 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:27 crc kubenswrapper[4861]: I0309 09:07:27.280333 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:07:27 crc kubenswrapper[4861]: I0309 09:07:27.280346 4861 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:07:27Z","lastTransitionTime":"2026-03-09T09:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:07:27 crc kubenswrapper[4861]: I0309 09:07:27.382703 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:27 crc kubenswrapper[4861]: I0309 09:07:27.382749 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:27 crc kubenswrapper[4861]: I0309 09:07:27.382761 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:27 crc kubenswrapper[4861]: I0309 09:07:27.382778 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:07:27 crc kubenswrapper[4861]: I0309 09:07:27.382790 4861 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:07:27Z","lastTransitionTime":"2026-03-09T09:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:07:27 crc kubenswrapper[4861]: I0309 09:07:27.485862 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:27 crc kubenswrapper[4861]: I0309 09:07:27.485905 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:27 crc kubenswrapper[4861]: I0309 09:07:27.485920 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:27 crc kubenswrapper[4861]: I0309 09:07:27.485941 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:07:27 crc kubenswrapper[4861]: I0309 09:07:27.485956 4861 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:07:27Z","lastTransitionTime":"2026-03-09T09:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:07:27 crc kubenswrapper[4861]: I0309 09:07:27.589149 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:27 crc kubenswrapper[4861]: I0309 09:07:27.589216 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:27 crc kubenswrapper[4861]: I0309 09:07:27.589235 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:27 crc kubenswrapper[4861]: I0309 09:07:27.589268 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:07:27 crc kubenswrapper[4861]: I0309 09:07:27.589288 4861 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:07:27Z","lastTransitionTime":"2026-03-09T09:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:07:27 crc kubenswrapper[4861]: I0309 09:07:27.692180 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:27 crc kubenswrapper[4861]: I0309 09:07:27.692689 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:27 crc kubenswrapper[4861]: I0309 09:07:27.692707 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:27 crc kubenswrapper[4861]: I0309 09:07:27.692735 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:07:27 crc kubenswrapper[4861]: I0309 09:07:27.692752 4861 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:07:27Z","lastTransitionTime":"2026-03-09T09:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:07:27 crc kubenswrapper[4861]: I0309 09:07:27.796509 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:27 crc kubenswrapper[4861]: I0309 09:07:27.796550 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:27 crc kubenswrapper[4861]: I0309 09:07:27.796564 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:27 crc kubenswrapper[4861]: I0309 09:07:27.796591 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:07:27 crc kubenswrapper[4861]: I0309 09:07:27.796631 4861 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:07:27Z","lastTransitionTime":"2026-03-09T09:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:07:27 crc kubenswrapper[4861]: I0309 09:07:27.900765 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:27 crc kubenswrapper[4861]: I0309 09:07:27.900827 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:27 crc kubenswrapper[4861]: I0309 09:07:27.900845 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:27 crc kubenswrapper[4861]: I0309 09:07:27.900870 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:07:27 crc kubenswrapper[4861]: I0309 09:07:27.900890 4861 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:07:27Z","lastTransitionTime":"2026-03-09T09:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:07:28 crc kubenswrapper[4861]: I0309 09:07:28.003509 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:28 crc kubenswrapper[4861]: I0309 09:07:28.003609 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:28 crc kubenswrapper[4861]: I0309 09:07:28.003628 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:28 crc kubenswrapper[4861]: I0309 09:07:28.003654 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:07:28 crc kubenswrapper[4861]: I0309 09:07:28.003672 4861 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:07:28Z","lastTransitionTime":"2026-03-09T09:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:07:28 crc kubenswrapper[4861]: I0309 09:07:28.093837 4861 generic.go:334] "Generic (PLEG): container finished" podID="3a306fd7-dca6-4973-b8fa-4bd07840a104" containerID="7171510ceeac060a30a79555132752a4861496a825d75cd57b34b3908fccc39e" exitCode=0 Mar 09 09:07:28 crc kubenswrapper[4861]: I0309 09:07:28.093910 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7fs7j" event={"ID":"3a306fd7-dca6-4973-b8fa-4bd07840a104","Type":"ContainerDied","Data":"7171510ceeac060a30a79555132752a4861496a825d75cd57b34b3908fccc39e"} Mar 09 09:07:28 crc kubenswrapper[4861]: I0309 09:07:28.101803 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kmjsq" event={"ID":"752be2d4-f338-4c5e-b51e-452fd8391c73","Type":"ContainerStarted","Data":"431c1bbb7e00b5b4c1615bff59cac3ec1f16e7a4bb52478bd3dccb0b441c24a2"} Mar 09 09:07:28 crc kubenswrapper[4861]: I0309 09:07:28.107280 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:28 crc kubenswrapper[4861]: I0309 09:07:28.107339 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:28 crc kubenswrapper[4861]: I0309 09:07:28.107363 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:28 crc kubenswrapper[4861]: I0309 09:07:28.107438 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:07:28 crc kubenswrapper[4861]: I0309 09:07:28.107463 4861 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:07:28Z","lastTransitionTime":"2026-03-09T09:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 09:07:28 crc kubenswrapper[4861]: I0309 09:07:28.211858 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:28 crc kubenswrapper[4861]: I0309 09:07:28.211906 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:28 crc kubenswrapper[4861]: I0309 09:07:28.211919 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:28 crc kubenswrapper[4861]: I0309 09:07:28.211934 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:07:28 crc kubenswrapper[4861]: I0309 09:07:28.211946 4861 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:07:28Z","lastTransitionTime":"2026-03-09T09:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:07:28 crc kubenswrapper[4861]: I0309 09:07:28.283215 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 09:07:28 crc kubenswrapper[4861]: I0309 09:07:28.283858 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 09:07:28 crc kubenswrapper[4861]: I0309 09:07:28.283885 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 09:07:28 crc kubenswrapper[4861]: I0309 09:07:28.283911 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 09:07:28 crc kubenswrapper[4861]: I0309 09:07:28.283935 4861 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T09:07:28Z","lastTransitionTime":"2026-03-09T09:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 09:07:28 crc kubenswrapper[4861]: I0309 09:07:28.438505 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1ce13338-8caa-4be7-80e7-791207626053-metrics-certs\") pod \"network-metrics-daemon-pp5xh\" (UID: \"1ce13338-8caa-4be7-80e7-791207626053\") " pod="openshift-multus/network-metrics-daemon-pp5xh" Mar 09 09:07:28 crc kubenswrapper[4861]: E0309 09:07:28.438923 4861 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 09:07:28 crc kubenswrapper[4861]: E0309 09:07:28.439066 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1ce13338-8caa-4be7-80e7-791207626053-metrics-certs podName:1ce13338-8caa-4be7-80e7-791207626053 nodeName:}" failed. No retries permitted until 2026-03-09 09:07:32.439031264 +0000 UTC m=+95.524070695 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1ce13338-8caa-4be7-80e7-791207626053-metrics-certs") pod "network-metrics-daemon-pp5xh" (UID: "1ce13338-8caa-4be7-80e7-791207626053") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 09:07:28 crc kubenswrapper[4861]: I0309 09:07:28.504143 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-m78dp"] Mar 09 09:07:28 crc kubenswrapper[4861]: I0309 09:07:28.505397 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m78dp" Mar 09 09:07:28 crc kubenswrapper[4861]: I0309 09:07:28.507728 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 09 09:07:28 crc kubenswrapper[4861]: I0309 09:07:28.508448 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 09 09:07:28 crc kubenswrapper[4861]: I0309 09:07:28.510482 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 09 09:07:28 crc kubenswrapper[4861]: I0309 09:07:28.510863 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 09 09:07:28 crc kubenswrapper[4861]: I0309 09:07:28.627168 4861 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Mar 09 09:07:28 crc kubenswrapper[4861]: I0309 09:07:28.635085 4861 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 09 09:07:28 crc kubenswrapper[4861]: I0309 09:07:28.641358 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3b746fb1-9a5a-422a-a221-47d9dd5e996c-service-ca\") pod \"cluster-version-operator-5c965bbfc6-m78dp\" (UID: \"3b746fb1-9a5a-422a-a221-47d9dd5e996c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m78dp" Mar 09 09:07:28 crc kubenswrapper[4861]: I0309 09:07:28.641524 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/3b746fb1-9a5a-422a-a221-47d9dd5e996c-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-m78dp\" (UID: 
\"3b746fb1-9a5a-422a-a221-47d9dd5e996c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m78dp" Mar 09 09:07:28 crc kubenswrapper[4861]: I0309 09:07:28.641574 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3b746fb1-9a5a-422a-a221-47d9dd5e996c-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-m78dp\" (UID: \"3b746fb1-9a5a-422a-a221-47d9dd5e996c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m78dp" Mar 09 09:07:28 crc kubenswrapper[4861]: I0309 09:07:28.641607 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/3b746fb1-9a5a-422a-a221-47d9dd5e996c-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-m78dp\" (UID: \"3b746fb1-9a5a-422a-a221-47d9dd5e996c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m78dp" Mar 09 09:07:28 crc kubenswrapper[4861]: I0309 09:07:28.641714 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3b746fb1-9a5a-422a-a221-47d9dd5e996c-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-m78dp\" (UID: \"3b746fb1-9a5a-422a-a221-47d9dd5e996c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m78dp" Mar 09 09:07:28 crc kubenswrapper[4861]: I0309 09:07:28.657168 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 09:07:28 crc kubenswrapper[4861]: I0309 09:07:28.657255 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 09:07:28 crc kubenswrapper[4861]: I0309 09:07:28.657193 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 09:07:28 crc kubenswrapper[4861]: E0309 09:07:28.657301 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 09:07:28 crc kubenswrapper[4861]: I0309 09:07:28.657362 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pp5xh" Mar 09 09:07:28 crc kubenswrapper[4861]: E0309 09:07:28.657455 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 09:07:28 crc kubenswrapper[4861]: E0309 09:07:28.657572 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pp5xh" podUID="1ce13338-8caa-4be7-80e7-791207626053" Mar 09 09:07:28 crc kubenswrapper[4861]: E0309 09:07:28.657661 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 09:07:28 crc kubenswrapper[4861]: I0309 09:07:28.742533 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3b746fb1-9a5a-422a-a221-47d9dd5e996c-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-m78dp\" (UID: \"3b746fb1-9a5a-422a-a221-47d9dd5e996c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m78dp" Mar 09 09:07:28 crc kubenswrapper[4861]: I0309 09:07:28.742632 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3b746fb1-9a5a-422a-a221-47d9dd5e996c-service-ca\") pod \"cluster-version-operator-5c965bbfc6-m78dp\" (UID: \"3b746fb1-9a5a-422a-a221-47d9dd5e996c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m78dp" Mar 09 09:07:28 crc kubenswrapper[4861]: I0309 09:07:28.742708 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/3b746fb1-9a5a-422a-a221-47d9dd5e996c-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-m78dp\" (UID: \"3b746fb1-9a5a-422a-a221-47d9dd5e996c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m78dp" Mar 09 09:07:28 crc kubenswrapper[4861]: I0309 09:07:28.742744 4861 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3b746fb1-9a5a-422a-a221-47d9dd5e996c-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-m78dp\" (UID: \"3b746fb1-9a5a-422a-a221-47d9dd5e996c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m78dp" Mar 09 09:07:28 crc kubenswrapper[4861]: I0309 09:07:28.742777 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/3b746fb1-9a5a-422a-a221-47d9dd5e996c-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-m78dp\" (UID: \"3b746fb1-9a5a-422a-a221-47d9dd5e996c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m78dp" Mar 09 09:07:28 crc kubenswrapper[4861]: I0309 09:07:28.742875 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/3b746fb1-9a5a-422a-a221-47d9dd5e996c-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-m78dp\" (UID: \"3b746fb1-9a5a-422a-a221-47d9dd5e996c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m78dp" Mar 09 09:07:28 crc kubenswrapper[4861]: I0309 09:07:28.743844 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/3b746fb1-9a5a-422a-a221-47d9dd5e996c-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-m78dp\" (UID: \"3b746fb1-9a5a-422a-a221-47d9dd5e996c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m78dp" Mar 09 09:07:28 crc kubenswrapper[4861]: I0309 09:07:28.744254 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3b746fb1-9a5a-422a-a221-47d9dd5e996c-service-ca\") pod \"cluster-version-operator-5c965bbfc6-m78dp\" (UID: \"3b746fb1-9a5a-422a-a221-47d9dd5e996c\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m78dp" Mar 09 09:07:28 crc kubenswrapper[4861]: I0309 09:07:28.758722 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3b746fb1-9a5a-422a-a221-47d9dd5e996c-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-m78dp\" (UID: \"3b746fb1-9a5a-422a-a221-47d9dd5e996c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m78dp" Mar 09 09:07:28 crc kubenswrapper[4861]: I0309 09:07:28.771734 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3b746fb1-9a5a-422a-a221-47d9dd5e996c-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-m78dp\" (UID: \"3b746fb1-9a5a-422a-a221-47d9dd5e996c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m78dp" Mar 09 09:07:28 crc kubenswrapper[4861]: I0309 09:07:28.885331 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m78dp" Mar 09 09:07:28 crc kubenswrapper[4861]: W0309 09:07:28.913520 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b746fb1_9a5a_422a_a221_47d9dd5e996c.slice/crio-69bfd266798438ab3d47341d686c471bd7afeae4175d76d0e994d964f4c66f32 WatchSource:0}: Error finding container 69bfd266798438ab3d47341d686c471bd7afeae4175d76d0e994d964f4c66f32: Status 404 returned error can't find the container with id 69bfd266798438ab3d47341d686c471bd7afeae4175d76d0e994d964f4c66f32 Mar 09 09:07:29 crc kubenswrapper[4861]: I0309 09:07:29.107885 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m78dp" event={"ID":"3b746fb1-9a5a-422a-a221-47d9dd5e996c","Type":"ContainerStarted","Data":"07ce19ae374ca1a53a4b864a908ff624a65fb1d776de0003bb42bedb5d43a689"} Mar 09 09:07:29 crc kubenswrapper[4861]: I0309 09:07:29.107964 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m78dp" event={"ID":"3b746fb1-9a5a-422a-a221-47d9dd5e996c","Type":"ContainerStarted","Data":"69bfd266798438ab3d47341d686c471bd7afeae4175d76d0e994d964f4c66f32"} Mar 09 09:07:29 crc kubenswrapper[4861]: I0309 09:07:29.113805 4861 generic.go:334] "Generic (PLEG): container finished" podID="3a306fd7-dca6-4973-b8fa-4bd07840a104" containerID="bf122adf5566e4292f63bf5ff49bbc86e303adfc5304d4b4af05f58a50c412cd" exitCode=0 Mar 09 09:07:29 crc kubenswrapper[4861]: I0309 09:07:29.113860 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7fs7j" event={"ID":"3a306fd7-dca6-4973-b8fa-4bd07840a104","Type":"ContainerDied","Data":"bf122adf5566e4292f63bf5ff49bbc86e303adfc5304d4b4af05f58a50c412cd"} Mar 09 09:07:29 crc kubenswrapper[4861]: I0309 09:07:29.124709 4861 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-m78dp" podStartSLOduration=48.124685389 podStartE2EDuration="48.124685389s" podCreationTimestamp="2026-03-09 09:06:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:07:29.124185656 +0000 UTC m=+92.209225067" watchObservedRunningTime="2026-03-09 09:07:29.124685389 +0000 UTC m=+92.209724830" Mar 09 09:07:29 crc kubenswrapper[4861]: I0309 09:07:29.425047 4861 csr.go:261] certificate signing request csr-sz8xl is approved, waiting to be issued Mar 09 09:07:29 crc kubenswrapper[4861]: I0309 09:07:29.433114 4861 csr.go:257] certificate signing request csr-sz8xl is issued Mar 09 09:07:30 crc kubenswrapper[4861]: I0309 09:07:30.121017 4861 generic.go:334] "Generic (PLEG): container finished" podID="3a306fd7-dca6-4973-b8fa-4bd07840a104" containerID="15b6acc4ff504218134ee503fce5b1a33f9fbaf9487c8cf74d09b0c4e1be49e0" exitCode=0 Mar 09 09:07:30 crc kubenswrapper[4861]: I0309 09:07:30.121113 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7fs7j" event={"ID":"3a306fd7-dca6-4973-b8fa-4bd07840a104","Type":"ContainerDied","Data":"15b6acc4ff504218134ee503fce5b1a33f9fbaf9487c8cf74d09b0c4e1be49e0"} Mar 09 09:07:30 crc kubenswrapper[4861]: I0309 09:07:30.434876 4861 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-28 22:44:01.83008036 +0000 UTC Mar 09 09:07:30 crc kubenswrapper[4861]: I0309 09:07:30.435132 4861 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 7069h36m31.394951277s for next certificate rotation Mar 09 09:07:30 crc kubenswrapper[4861]: I0309 09:07:30.657350 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 09:07:30 crc kubenswrapper[4861]: I0309 09:07:30.657396 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pp5xh" Mar 09 09:07:30 crc kubenswrapper[4861]: I0309 09:07:30.657401 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 09:07:30 crc kubenswrapper[4861]: I0309 09:07:30.657350 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 09:07:30 crc kubenswrapper[4861]: E0309 09:07:30.657501 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 09:07:30 crc kubenswrapper[4861]: E0309 09:07:30.657620 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 09:07:30 crc kubenswrapper[4861]: E0309 09:07:30.657707 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pp5xh" podUID="1ce13338-8caa-4be7-80e7-791207626053" Mar 09 09:07:30 crc kubenswrapper[4861]: E0309 09:07:30.657796 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 09:07:31 crc kubenswrapper[4861]: I0309 09:07:31.129436 4861 generic.go:334] "Generic (PLEG): container finished" podID="3a306fd7-dca6-4973-b8fa-4bd07840a104" containerID="8165bdce65241d136a2ab8c45f6c520f5f12ab13daa8b4476cbed58be58bdf8d" exitCode=0 Mar 09 09:07:31 crc kubenswrapper[4861]: I0309 09:07:31.129532 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7fs7j" event={"ID":"3a306fd7-dca6-4973-b8fa-4bd07840a104","Type":"ContainerDied","Data":"8165bdce65241d136a2ab8c45f6c520f5f12ab13daa8b4476cbed58be58bdf8d"} Mar 09 09:07:31 crc kubenswrapper[4861]: I0309 09:07:31.137232 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kmjsq" event={"ID":"752be2d4-f338-4c5e-b51e-452fd8391c73","Type":"ContainerStarted","Data":"65c9df23fe9acb6770fb9f21b9a56a1b6551aa9c7787d2a41350815ce00dd460"} Mar 09 09:07:31 crc kubenswrapper[4861]: I0309 09:07:31.137582 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-kmjsq" Mar 09 09:07:31 crc kubenswrapper[4861]: I0309 09:07:31.137599 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-kmjsq" Mar 09 09:07:31 crc kubenswrapper[4861]: I0309 09:07:31.137607 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-ovn-kubernetes/ovnkube-node-kmjsq" Mar 09 09:07:31 crc kubenswrapper[4861]: I0309 09:07:31.166206 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-kmjsq" Mar 09 09:07:31 crc kubenswrapper[4861]: I0309 09:07:31.167175 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-kmjsq" Mar 09 09:07:31 crc kubenswrapper[4861]: I0309 09:07:31.185592 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-kmjsq" podStartSLOduration=50.185573332 podStartE2EDuration="50.185573332s" podCreationTimestamp="2026-03-09 09:06:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:07:31.185534901 +0000 UTC m=+94.270574332" watchObservedRunningTime="2026-03-09 09:07:31.185573332 +0000 UTC m=+94.270612743" Mar 09 09:07:31 crc kubenswrapper[4861]: I0309 09:07:31.436329 4861 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-11-20 08:23:18.457783612 +0000 UTC Mar 09 09:07:31 crc kubenswrapper[4861]: I0309 09:07:31.436988 4861 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6143h15m47.020801516s for next certificate rotation Mar 09 09:07:32 crc kubenswrapper[4861]: I0309 09:07:32.143219 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7fs7j" event={"ID":"3a306fd7-dca6-4973-b8fa-4bd07840a104","Type":"ContainerStarted","Data":"7deb1e9f1deff77e66066799fd1a1ef7385e07a5ce42d832ca12c8e99844ea1e"} Mar 09 09:07:32 crc kubenswrapper[4861]: I0309 09:07:32.163785 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-7fs7j" podStartSLOduration=51.163765818 
podStartE2EDuration="51.163765818s" podCreationTimestamp="2026-03-09 09:06:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:07:32.162489943 +0000 UTC m=+95.247529354" watchObservedRunningTime="2026-03-09 09:07:32.163765818 +0000 UTC m=+95.248805229" Mar 09 09:07:32 crc kubenswrapper[4861]: I0309 09:07:32.488271 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1ce13338-8caa-4be7-80e7-791207626053-metrics-certs\") pod \"network-metrics-daemon-pp5xh\" (UID: \"1ce13338-8caa-4be7-80e7-791207626053\") " pod="openshift-multus/network-metrics-daemon-pp5xh" Mar 09 09:07:32 crc kubenswrapper[4861]: E0309 09:07:32.488445 4861 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 09:07:32 crc kubenswrapper[4861]: E0309 09:07:32.488514 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1ce13338-8caa-4be7-80e7-791207626053-metrics-certs podName:1ce13338-8caa-4be7-80e7-791207626053 nodeName:}" failed. No retries permitted until 2026-03-09 09:07:40.488499414 +0000 UTC m=+103.573538815 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1ce13338-8caa-4be7-80e7-791207626053-metrics-certs") pod "network-metrics-daemon-pp5xh" (UID: "1ce13338-8caa-4be7-80e7-791207626053") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 09:07:32 crc kubenswrapper[4861]: I0309 09:07:32.614891 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-pp5xh"] Mar 09 09:07:32 crc kubenswrapper[4861]: I0309 09:07:32.615081 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-pp5xh" Mar 09 09:07:32 crc kubenswrapper[4861]: E0309 09:07:32.615217 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pp5xh" podUID="1ce13338-8caa-4be7-80e7-791207626053" Mar 09 09:07:32 crc kubenswrapper[4861]: I0309 09:07:32.657875 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 09:07:32 crc kubenswrapper[4861]: I0309 09:07:32.657938 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 09:07:32 crc kubenswrapper[4861]: E0309 09:07:32.658046 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 09:07:32 crc kubenswrapper[4861]: I0309 09:07:32.657884 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 09:07:32 crc kubenswrapper[4861]: E0309 09:07:32.658165 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 09:07:32 crc kubenswrapper[4861]: E0309 09:07:32.658431 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 09:07:33 crc kubenswrapper[4861]: I0309 09:07:33.657678 4861 scope.go:117] "RemoveContainer" containerID="f88f2e129d4bfc17b0ac4416da5c5096bf314e097dfa40a48858a83425ca91e1" Mar 09 09:07:33 crc kubenswrapper[4861]: E0309 09:07:33.657836 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.404764 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:07:34 crc kubenswrapper[4861]: E0309 09:07:34.405047 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-09 09:07:50.404979012 +0000 UTC m=+113.490018413 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.506190 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.506291 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.506352 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 09:07:34 crc kubenswrapper[4861]: E0309 09:07:34.506407 4861 configmap.go:193] Couldn't get configMap 
openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.506439 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 09:07:34 crc kubenswrapper[4861]: E0309 09:07:34.506487 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 09:07:50.506469512 +0000 UTC m=+113.591508913 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 09 09:07:34 crc kubenswrapper[4861]: E0309 09:07:34.506676 4861 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 09:07:34 crc kubenswrapper[4861]: E0309 09:07:34.506718 4861 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 09:07:34 crc kubenswrapper[4861]: E0309 09:07:34.506777 4861 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 09:07:34 crc 
kubenswrapper[4861]: E0309 09:07:34.506689 4861 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 09:07:34 crc kubenswrapper[4861]: E0309 09:07:34.506805 4861 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 09:07:34 crc kubenswrapper[4861]: E0309 09:07:34.506865 4861 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 09:07:34 crc kubenswrapper[4861]: E0309 09:07:34.506902 4861 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 09:07:34 crc kubenswrapper[4861]: E0309 09:07:34.506809 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 09:07:50.506772829 +0000 UTC m=+113.591812390 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 09 09:07:34 crc kubenswrapper[4861]: E0309 09:07:34.506934 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-09 09:07:50.506923883 +0000 UTC m=+113.591963284 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 09 09:07:34 crc kubenswrapper[4861]: E0309 09:07:34.506983 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-09 09:07:50.506954844 +0000 UTC m=+113.591994415 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.619465 4861 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.619636 4861 kubelet_node_status.go:538] "Fast updating node status as it just became ready"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.679651 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.679686 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pp5xh"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.679693 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.679662 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.682422 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.682689 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.683094 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.683402 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.683544 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.683679 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.704333 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-lz22f"]
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.705216 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lz22f"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.707781 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4e3ac241-df4c-4438-83a5-9cc923b0f82f-auth-proxy-config\") pod \"machine-approver-56656f9798-lz22f\" (UID: \"4e3ac241-df4c-4438-83a5-9cc923b0f82f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lz22f"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.707868 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/4e3ac241-df4c-4438-83a5-9cc923b0f82f-machine-approver-tls\") pod \"machine-approver-56656f9798-lz22f\" (UID: \"4e3ac241-df4c-4438-83a5-9cc923b0f82f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lz22f"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.707929 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qclcn\" (UniqueName: \"kubernetes.io/projected/4e3ac241-df4c-4438-83a5-9cc923b0f82f-kube-api-access-qclcn\") pod \"machine-approver-56656f9798-lz22f\" (UID: \"4e3ac241-df4c-4438-83a5-9cc923b0f82f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lz22f"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.707975 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e3ac241-df4c-4438-83a5-9cc923b0f82f-config\") pod \"machine-approver-56656f9798-lz22f\" (UID: \"4e3ac241-df4c-4438-83a5-9cc923b0f82f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lz22f"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.709673 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.719108 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-h6j45"]
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.730950 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-gkrbr"]
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.731505 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-h6j45"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.711702 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.731878 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-5m9c6"]
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.732018 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-gkrbr"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.716012 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.726791 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.727299 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.730200 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.755118 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jzt4r"]
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.755410 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lj7m8"]
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.755586 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-h84v9"]
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.755901 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-h84v9"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.756205 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-5m9c6"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.756352 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lj7m8"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.756665 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jzt4r"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.756857 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-gqnxr"]
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.757575 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-gqnxr"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.758321 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-jtnvg"]
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.758740 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-24x2s"]
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.759297 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-24x2s"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.759427 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-l2jkp"]
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.759588 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.759712 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-jtnvg"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.759755 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.759871 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.759957 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-l2jkp"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.760005 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.760296 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.760356 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.760458 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.762418 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-hn7r4"]
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.760917 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.766104 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.770470 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ldhj8"]
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.770831 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ldhj8"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.770944 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hn7r4"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.777250 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.777795 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.777905 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.778262 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.778342 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.779058 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.779175 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.779431 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.780058 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.780604 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.780693 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.780754 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.780868 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.780868 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.781002 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.781084 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.781152 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.781366 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.781528 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.781647 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.781756 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.782452 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.782596 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.782757 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.782881 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.782999 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.783168 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.783312 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.783467 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.783576 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.783675 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.783783 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.783892 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.783989 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.784058 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.784128 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.784163 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.783995 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.784299 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.784425 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-dwjwj"]
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.784480 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.784302 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.784640 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.784744 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.784843 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.784943 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.785016 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-97dkg"]
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.785048 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-dwjwj"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.785363 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-97dkg"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.785516 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.785672 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.786482 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-75sbr"]
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.786923 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-75sbr"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.787465 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.799935 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.802560 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-qh9sg"]
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.804472 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-qh9sg"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.805382 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.806061 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.811391 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.811435 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2xgl\" (UniqueName: \"kubernetes.io/projected/f300ad16-f055-45cf-ac02-960f49b5d426-kube-api-access-b2xgl\") pod \"console-operator-58897d9998-jtnvg\" (UID: \"f300ad16-f055-45cf-ac02-960f49b5d426\") " pod="openshift-console-operator/console-operator-58897d9998-jtnvg"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.811882 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8415c9be-22de-49bb-8b53-6f06a923ef33-config\") pod \"openshift-apiserver-operator-796bbdcf4f-lj7m8\" (UID: \"8415c9be-22de-49bb-8b53-6f06a923ef33\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lj7m8"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.811968 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/da4004a6-c6fd-41d6-a651-b4aaec2d6454-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-gqnxr\" (UID: \"da4004a6-c6fd-41d6-a651-b4aaec2d6454\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-gqnxr"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.812052 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e3056772-5de8-4fed-9796-440422743470-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-24x2s\" (UID: \"e3056772-5de8-4fed-9796-440422743470\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-24x2s"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.812135 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b820aa76-e1f5-440b-b942-2a4468dc4d51-config\") pod \"authentication-operator-69f744f599-5m9c6\" (UID: \"b820aa76-e1f5-440b-b942-2a4468dc4d51\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5m9c6"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.812211 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8e2a968f-c9a7-415e-895a-3d1f8a8a3e0b-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-jzt4r\" (UID: \"8e2a968f-c9a7-415e-895a-3d1f8a8a3e0b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jzt4r"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.812329 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/85a3bbcb-e663-4a97-980c-606c979409d7-console-oauth-config\") pod \"console-f9d7485db-qh9sg\" (UID: \"85a3bbcb-e663-4a97-980c-606c979409d7\") " pod="openshift-console/console-f9d7485db-qh9sg"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.832227 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.833653 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.833682 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.833992 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.834069 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.834236 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.834523 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8pkvv\" (UniqueName: \"kubernetes.io/projected/85a3bbcb-e663-4a97-980c-606c979409d7-kube-api-access-8pkvv\") pod \"console-f9d7485db-qh9sg\" (UID: \"85a3bbcb-e663-4a97-980c-606c979409d7\") " pod="openshift-console/console-f9d7485db-qh9sg"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.834564 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/eaee5667-e42c-4ef1-8c6b-279ee6fc171a-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-97dkg\" (UID: \"eaee5667-e42c-4ef1-8c6b-279ee6fc171a\") " pod="openshift-authentication/oauth-openshift-558db77b4-97dkg"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.834589 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfzss\" (UniqueName: \"kubernetes.io/projected/8415c9be-22de-49bb-8b53-6f06a923ef33-kube-api-access-hfzss\") pod \"openshift-apiserver-operator-796bbdcf4f-lj7m8\" (UID: \"8415c9be-22de-49bb-8b53-6f06a923ef33\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lj7m8"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.834625 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qclcn\" (UniqueName: \"kubernetes.io/projected/4e3ac241-df4c-4438-83a5-9cc923b0f82f-kube-api-access-qclcn\") pod \"machine-approver-56656f9798-lz22f\" (UID: \"4e3ac241-df4c-4438-83a5-9cc923b0f82f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lz22f"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.834650 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/2afc7adc-5b22-4203-9265-2ea4293f132f-node-pullsecrets\") pod \"apiserver-76f77b778f-l2jkp\" (UID: \"2afc7adc-5b22-4203-9265-2ea4293f132f\") " pod="openshift-apiserver/apiserver-76f77b778f-l2jkp"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.834675 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/85a3bbcb-e663-4a97-980c-606c979409d7-trusted-ca-bundle\") pod \"console-f9d7485db-qh9sg\" (UID: \"85a3bbcb-e663-4a97-980c-606c979409d7\") " pod="openshift-console/console-f9d7485db-qh9sg"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.834698 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8415c9be-22de-49bb-8b53-6f06a923ef33-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-lj7m8\" (UID: \"8415c9be-22de-49bb-8b53-6f06a923ef33\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lj7m8"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.834731 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e3ac241-df4c-4438-83a5-9cc923b0f82f-config\") pod \"machine-approver-56656f9798-lz22f\" (UID: \"4e3ac241-df4c-4438-83a5-9cc923b0f82f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lz22f"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.834752 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e2a968f-c9a7-415e-895a-3d1f8a8a3e0b-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-jzt4r\" (UID: \"8e2a968f-c9a7-415e-895a-3d1f8a8a3e0b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jzt4r"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.834757 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.834771 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ba6c5b9f-7812-4b6d-998b-6f368a6edf83-client-ca\") pod \"route-controller-manager-6576b87f9c-h84v9\" (UID: \"ba6c5b9f-7812-4b6d-998b-6f368a6edf83\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-h84v9"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.834791 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/2afc7adc-5b22-4203-9265-2ea4293f132f-etcd-client\") pod \"apiserver-76f77b778f-l2jkp\" (UID: \"2afc7adc-5b22-4203-9265-2ea4293f132f\") " pod="openshift-apiserver/apiserver-76f77b778f-l2jkp"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.834810 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2afc7adc-5b22-4203-9265-2ea4293f132f-trusted-ca-bundle\") pod \"apiserver-76f77b778f-l2jkp\" (UID: \"2afc7adc-5b22-4203-9265-2ea4293f132f\") " pod="openshift-apiserver/apiserver-76f77b778f-l2jkp"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.834830 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7kmsc\" (UniqueName: \"kubernetes.io/projected/271a4e3c-1aa2-4bf5-bdfe-0495910f75d6-kube-api-access-7kmsc\") pod \"cluster-image-registry-operator-dc59b4c8b-ldhj8\" (UID: \"271a4e3c-1aa2-4bf5-bdfe-0495910f75d6\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ldhj8"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.834874 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kb6dq\" (UniqueName: \"kubernetes.io/projected/6273411b-70f9-4fdf-bf82-d156b10a5824-kube-api-access-kb6dq\") pod \"apiserver-7bbb656c7d-hn7r4\" (UID: \"6273411b-70f9-4fdf-bf82-d156b10a5824\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hn7r4"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.834897 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2afc7adc-5b22-4203-9265-2ea4293f132f-serving-cert\") pod \"apiserver-76f77b778f-l2jkp\" (UID: \"2afc7adc-5b22-4203-9265-2ea4293f132f\") " pod="openshift-apiserver/apiserver-76f77b778f-l2jkp"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.834919 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da4004a6-c6fd-41d6-a651-b4aaec2d6454-config\") pod \"machine-api-operator-5694c8668f-gqnxr\" (UID: \"da4004a6-c6fd-41d6-a651-b4aaec2d6454\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-gqnxr"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.834941 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/2afc7adc-5b22-4203-9265-2ea4293f132f-image-import-ca\") pod \"apiserver-76f77b778f-l2jkp\" (UID: \"2afc7adc-5b22-4203-9265-2ea4293f132f\") " pod="openshift-apiserver/apiserver-76f77b778f-l2jkp"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.834968 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/eaee5667-e42c-4ef1-8c6b-279ee6fc171a-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-97dkg\" (UID: \"eaee5667-e42c-4ef1-8c6b-279ee6fc171a\") " pod="openshift-authentication/oauth-openshift-558db77b4-97dkg"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.835004 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/eaee5667-e42c-4ef1-8c6b-279ee6fc171a-audit-policies\") pod \"oauth-openshift-558db77b4-97dkg\" (UID: \"eaee5667-e42c-4ef1-8c6b-279ee6fc171a\") " pod="openshift-authentication/oauth-openshift-558db77b4-97dkg"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.835029 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/eaee5667-e42c-4ef1-8c6b-279ee6fc171a-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-97dkg\" (UID: \"eaee5667-e42c-4ef1-8c6b-279ee6fc171a\") " pod="openshift-authentication/oauth-openshift-558db77b4-97dkg"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.835064 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2a29855a-bbc0-458f-a9a0-0ddfd8763f2d-serving-cert\") pod \"openshift-config-operator-7777fb866f-dwjwj\" (UID: \"2a29855a-bbc0-458f-a9a0-0ddfd8763f2d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-dwjwj"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.835086 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b820aa76-e1f5-440b-b942-2a4468dc4d51-serving-cert\") pod \"authentication-operator-69f744f599-5m9c6\" (UID: \"b820aa76-e1f5-440b-b942-2a4468dc4d51\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5m9c6"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.835108 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/eaee5667-e42c-4ef1-8c6b-279ee6fc171a-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-97dkg\" (UID: \"eaee5667-e42c-4ef1-8c6b-279ee6fc171a\") " pod="openshift-authentication/oauth-openshift-558db77b4-97dkg"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.835131 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9x2n\" (UniqueName: \"kubernetes.io/projected/eaee5667-e42c-4ef1-8c6b-279ee6fc171a-kube-api-access-p9x2n\") pod \"oauth-openshift-558db77b4-97dkg\" (UID: \"eaee5667-e42c-4ef1-8c6b-279ee6fc171a\") " pod="openshift-authentication/oauth-openshift-558db77b4-97dkg"
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.835151 4861 reconciler_common.go:245]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/20b8695b-8129-4f02-824d-5ca2a451d899-serving-cert\") pod \"controller-manager-879f6c89f-gkrbr\" (UID: \"20b8695b-8129-4f02-824d-5ca2a451d899\") " pod="openshift-controller-manager/controller-manager-879f6c89f-gkrbr" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.835173 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhdr9\" (UniqueName: \"kubernetes.io/projected/20b8695b-8129-4f02-824d-5ca2a451d899-kube-api-access-dhdr9\") pod \"controller-manager-879f6c89f-gkrbr\" (UID: \"20b8695b-8129-4f02-824d-5ca2a451d899\") " pod="openshift-controller-manager/controller-manager-879f6c89f-gkrbr" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.835210 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6273411b-70f9-4fdf-bf82-d156b10a5824-audit-dir\") pod \"apiserver-7bbb656c7d-hn7r4\" (UID: \"6273411b-70f9-4fdf-bf82-d156b10a5824\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hn7r4" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.835233 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/2afc7adc-5b22-4203-9265-2ea4293f132f-etcd-serving-ca\") pod \"apiserver-76f77b778f-l2jkp\" (UID: \"2afc7adc-5b22-4203-9265-2ea4293f132f\") " pod="openshift-apiserver/apiserver-76f77b778f-l2jkp" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.835253 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/eaee5667-e42c-4ef1-8c6b-279ee6fc171a-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-97dkg\" (UID: 
\"eaee5667-e42c-4ef1-8c6b-279ee6fc171a\") " pod="openshift-authentication/oauth-openshift-558db77b4-97dkg" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.835278 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20b8695b-8129-4f02-824d-5ca2a451d899-config\") pod \"controller-manager-879f6c89f-gkrbr\" (UID: \"20b8695b-8129-4f02-824d-5ca2a451d899\") " pod="openshift-controller-manager/controller-manager-879f6c89f-gkrbr" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.835301 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mj42q\" (UniqueName: \"kubernetes.io/projected/ba6c5b9f-7812-4b6d-998b-6f368a6edf83-kube-api-access-mj42q\") pod \"route-controller-manager-6576b87f9c-h84v9\" (UID: \"ba6c5b9f-7812-4b6d-998b-6f368a6edf83\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-h84v9" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.834000 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-cjkj6"] Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.835457 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.835549 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.835744 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.835754 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-fsckq"] Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.835762 4861 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.835862 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.834244 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.834000 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.836032 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.836083 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-fsckq" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.836239 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e3ac241-df4c-4438-83a5-9cc923b0f82f-config\") pod \"machine-approver-56656f9798-lz22f\" (UID: \"4e3ac241-df4c-4438-83a5-9cc923b0f82f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lz22f" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.835325 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/85a3bbcb-e663-4a97-980c-606c979409d7-console-serving-cert\") pod \"console-f9d7485db-qh9sg\" (UID: \"85a3bbcb-e663-4a97-980c-606c979409d7\") " pod="openshift-console/console-f9d7485db-qh9sg" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.836384 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-cjkj6" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.836389 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/da4004a6-c6fd-41d6-a651-b4aaec2d6454-images\") pod \"machine-api-operator-5694c8668f-gqnxr\" (UID: \"da4004a6-c6fd-41d6-a651-b4aaec2d6454\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-gqnxr" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.836614 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/20b8695b-8129-4f02-824d-5ca2a451d899-client-ca\") pod \"controller-manager-879f6c89f-gkrbr\" (UID: \"20b8695b-8129-4f02-824d-5ca2a451d899\") " pod="openshift-controller-manager/controller-manager-879f6c89f-gkrbr" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.836637 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eaee5667-e42c-4ef1-8c6b-279ee6fc171a-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-97dkg\" (UID: \"eaee5667-e42c-4ef1-8c6b-279ee6fc171a\") " pod="openshift-authentication/oauth-openshift-558db77b4-97dkg" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.836672 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/6273411b-70f9-4fdf-bf82-d156b10a5824-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-hn7r4\" (UID: \"6273411b-70f9-4fdf-bf82-d156b10a5824\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hn7r4" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.836692 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"metrics-tls\" (UniqueName: \"kubernetes.io/secret/54ee5cc6-676d-412c-b2e8-66308fcfc3d6-metrics-tls\") pod \"dns-operator-744455d44c-75sbr\" (UID: \"54ee5cc6-676d-412c-b2e8-66308fcfc3d6\") " pod="openshift-dns-operator/dns-operator-744455d44c-75sbr" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.836703 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-q8fz7"] Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.836723 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/2afc7adc-5b22-4203-9265-2ea4293f132f-encryption-config\") pod \"apiserver-76f77b778f-l2jkp\" (UID: \"2afc7adc-5b22-4203-9265-2ea4293f132f\") " pod="openshift-apiserver/apiserver-76f77b778f-l2jkp" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.836741 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/eaee5667-e42c-4ef1-8c6b-279ee6fc171a-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-97dkg\" (UID: \"eaee5667-e42c-4ef1-8c6b-279ee6fc171a\") " pod="openshift-authentication/oauth-openshift-558db77b4-97dkg" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.836767 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fggwc\" (UniqueName: \"kubernetes.io/projected/b820aa76-e1f5-440b-b942-2a4468dc4d51-kube-api-access-fggwc\") pod \"authentication-operator-69f744f599-5m9c6\" (UID: \"b820aa76-e1f5-440b-b942-2a4468dc4d51\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5m9c6" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.836819 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/4e3ac241-df4c-4438-83a5-9cc923b0f82f-auth-proxy-config\") pod \"machine-approver-56656f9798-lz22f\" (UID: \"4e3ac241-df4c-4438-83a5-9cc923b0f82f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lz22f" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.836847 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b820aa76-e1f5-440b-b942-2a4468dc4d51-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-5m9c6\" (UID: \"b820aa76-e1f5-440b-b942-2a4468dc4d51\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5m9c6" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.836865 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6273411b-70f9-4fdf-bf82-d156b10a5824-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-hn7r4\" (UID: \"6273411b-70f9-4fdf-bf82-d156b10a5824\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hn7r4" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.836885 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/6273411b-70f9-4fdf-bf82-d156b10a5824-encryption-config\") pod \"apiserver-7bbb656c7d-hn7r4\" (UID: \"6273411b-70f9-4fdf-bf82-d156b10a5824\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hn7r4" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.837223 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-q8fz7" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.837561 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f300ad16-f055-45cf-ac02-960f49b5d426-config\") pod \"console-operator-58897d9998-jtnvg\" (UID: \"f300ad16-f055-45cf-ac02-960f49b5d426\") " pod="openshift-console-operator/console-operator-58897d9998-jtnvg" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.837616 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f300ad16-f055-45cf-ac02-960f49b5d426-trusted-ca\") pod \"console-operator-58897d9998-jtnvg\" (UID: \"f300ad16-f055-45cf-ac02-960f49b5d426\") " pod="openshift-console-operator/console-operator-58897d9998-jtnvg" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.837653 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lxkv\" (UniqueName: \"kubernetes.io/projected/e3056772-5de8-4fed-9796-440422743470-kube-api-access-6lxkv\") pod \"cluster-samples-operator-665b6dd947-24x2s\" (UID: \"e3056772-5de8-4fed-9796-440422743470\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-24x2s" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.837705 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2afc7adc-5b22-4203-9265-2ea4293f132f-audit-dir\") pod \"apiserver-76f77b778f-l2jkp\" (UID: \"2afc7adc-5b22-4203-9265-2ea4293f132f\") " pod="openshift-apiserver/apiserver-76f77b778f-l2jkp" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.837734 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"console-config\" (UniqueName: \"kubernetes.io/configmap/85a3bbcb-e663-4a97-980c-606c979409d7-console-config\") pod \"console-f9d7485db-qh9sg\" (UID: \"85a3bbcb-e663-4a97-980c-606c979409d7\") " pod="openshift-console/console-f9d7485db-qh9sg" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.837769 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/85a3bbcb-e663-4a97-980c-606c979409d7-service-ca\") pod \"console-f9d7485db-qh9sg\" (UID: \"85a3bbcb-e663-4a97-980c-606c979409d7\") " pod="openshift-console/console-f9d7485db-qh9sg" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.837820 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/2afc7adc-5b22-4203-9265-2ea4293f132f-audit\") pod \"apiserver-76f77b778f-l2jkp\" (UID: \"2afc7adc-5b22-4203-9265-2ea4293f132f\") " pod="openshift-apiserver/apiserver-76f77b778f-l2jkp" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.837842 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ba6c5b9f-7812-4b6d-998b-6f368a6edf83-serving-cert\") pod \"route-controller-manager-6576b87f9c-h84v9\" (UID: \"ba6c5b9f-7812-4b6d-998b-6f368a6edf83\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-h84v9" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.837862 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/eaee5667-e42c-4ef1-8c6b-279ee6fc171a-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-97dkg\" (UID: \"eaee5667-e42c-4ef1-8c6b-279ee6fc171a\") " pod="openshift-authentication/oauth-openshift-558db77b4-97dkg" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 
09:07:34.837991 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drh7b\" (UniqueName: \"kubernetes.io/projected/54ee5cc6-676d-412c-b2e8-66308fcfc3d6-kube-api-access-drh7b\") pod \"dns-operator-744455d44c-75sbr\" (UID: \"54ee5cc6-676d-412c-b2e8-66308fcfc3d6\") " pod="openshift-dns-operator/dns-operator-744455d44c-75sbr" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.838025 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/4e3ac241-df4c-4438-83a5-9cc923b0f82f-machine-approver-tls\") pod \"machine-approver-56656f9798-lz22f\" (UID: \"4e3ac241-df4c-4438-83a5-9cc923b0f82f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lz22f" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.838155 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/eaee5667-e42c-4ef1-8c6b-279ee6fc171a-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-97dkg\" (UID: \"eaee5667-e42c-4ef1-8c6b-279ee6fc171a\") " pod="openshift-authentication/oauth-openshift-558db77b4-97dkg" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.838268 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.838268 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwsc2\" (UniqueName: \"kubernetes.io/projected/da4004a6-c6fd-41d6-a651-b4aaec2d6454-kube-api-access-dwsc2\") pod \"machine-api-operator-5694c8668f-gqnxr\" (UID: \"da4004a6-c6fd-41d6-a651-b4aaec2d6454\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-gqnxr" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.838426 4861 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6273411b-70f9-4fdf-bf82-d156b10a5824-etcd-client\") pod \"apiserver-7bbb656c7d-hn7r4\" (UID: \"6273411b-70f9-4fdf-bf82-d156b10a5824\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hn7r4" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.838436 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.838478 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6273411b-70f9-4fdf-bf82-d156b10a5824-serving-cert\") pod \"apiserver-7bbb656c7d-hn7r4\" (UID: \"6273411b-70f9-4fdf-bf82-d156b10a5824\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hn7r4" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.838517 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2afc7adc-5b22-4203-9265-2ea4293f132f-config\") pod \"apiserver-76f77b778f-l2jkp\" (UID: \"2afc7adc-5b22-4203-9265-2ea4293f132f\") " pod="openshift-apiserver/apiserver-76f77b778f-l2jkp" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.838564 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48s8c\" (UniqueName: \"kubernetes.io/projected/2afc7adc-5b22-4203-9265-2ea4293f132f-kube-api-access-48s8c\") pod \"apiserver-76f77b778f-l2jkp\" (UID: \"2afc7adc-5b22-4203-9265-2ea4293f132f\") " pod="openshift-apiserver/apiserver-76f77b778f-l2jkp" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.838588 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/271a4e3c-1aa2-4bf5-bdfe-0495910f75d6-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-ldhj8\" (UID: \"271a4e3c-1aa2-4bf5-bdfe-0495910f75d6\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ldhj8" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.838634 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b820aa76-e1f5-440b-b942-2a4468dc4d51-service-ca-bundle\") pod \"authentication-operator-69f744f599-5m9c6\" (UID: \"b820aa76-e1f5-440b-b942-2a4468dc4d51\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5m9c6" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.838663 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/eaee5667-e42c-4ef1-8c6b-279ee6fc171a-audit-dir\") pod \"oauth-openshift-558db77b4-97dkg\" (UID: \"eaee5667-e42c-4ef1-8c6b-279ee6fc171a\") " pod="openshift-authentication/oauth-openshift-558db77b4-97dkg" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.838689 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/85a3bbcb-e663-4a97-980c-606c979409d7-oauth-serving-cert\") pod \"console-f9d7485db-qh9sg\" (UID: \"85a3bbcb-e663-4a97-980c-606c979409d7\") " pod="openshift-console/console-f9d7485db-qh9sg" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.838744 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba6c5b9f-7812-4b6d-998b-6f368a6edf83-config\") pod \"route-controller-manager-6576b87f9c-h84v9\" (UID: \"ba6c5b9f-7812-4b6d-998b-6f368a6edf83\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-h84v9" 
Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.838769 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/271a4e3c-1aa2-4bf5-bdfe-0495910f75d6-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-ldhj8\" (UID: \"271a4e3c-1aa2-4bf5-bdfe-0495910f75d6\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ldhj8" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.838825 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8w4c\" (UniqueName: \"kubernetes.io/projected/566d8448-e794-44ee-9d17-e92493adcd87-kube-api-access-v8w4c\") pod \"downloads-7954f5f757-h6j45\" (UID: \"566d8448-e794-44ee-9d17-e92493adcd87\") " pod="openshift-console/downloads-7954f5f757-h6j45" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.838848 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmdsk\" (UniqueName: \"kubernetes.io/projected/8e2a968f-c9a7-415e-895a-3d1f8a8a3e0b-kube-api-access-wmdsk\") pod \"openshift-controller-manager-operator-756b6f6bc6-jzt4r\" (UID: \"8e2a968f-c9a7-415e-895a-3d1f8a8a3e0b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jzt4r" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.838893 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/271a4e3c-1aa2-4bf5-bdfe-0495910f75d6-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-ldhj8\" (UID: \"271a4e3c-1aa2-4bf5-bdfe-0495910f75d6\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ldhj8" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.838921 4861 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6273411b-70f9-4fdf-bf82-d156b10a5824-audit-policies\") pod \"apiserver-7bbb656c7d-hn7r4\" (UID: \"6273411b-70f9-4fdf-bf82-d156b10a5824\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hn7r4" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.838989 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/eaee5667-e42c-4ef1-8c6b-279ee6fc171a-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-97dkg\" (UID: \"eaee5667-e42c-4ef1-8c6b-279ee6fc171a\") " pod="openshift-authentication/oauth-openshift-558db77b4-97dkg" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.839013 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f300ad16-f055-45cf-ac02-960f49b5d426-serving-cert\") pod \"console-operator-58897d9998-jtnvg\" (UID: \"f300ad16-f055-45cf-ac02-960f49b5d426\") " pod="openshift-console-operator/console-operator-58897d9998-jtnvg" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.839065 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2m5p\" (UniqueName: \"kubernetes.io/projected/2a29855a-bbc0-458f-a9a0-0ddfd8763f2d-kube-api-access-z2m5p\") pod \"openshift-config-operator-7777fb866f-dwjwj\" (UID: \"2a29855a-bbc0-458f-a9a0-0ddfd8763f2d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-dwjwj" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.839127 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/eaee5667-e42c-4ef1-8c6b-279ee6fc171a-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-97dkg\" (UID: \"eaee5667-e42c-4ef1-8c6b-279ee6fc171a\") " pod="openshift-authentication/oauth-openshift-558db77b4-97dkg" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.839169 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/2a29855a-bbc0-458f-a9a0-0ddfd8763f2d-available-featuregates\") pod \"openshift-config-operator-7777fb866f-dwjwj\" (UID: \"2a29855a-bbc0-458f-a9a0-0ddfd8763f2d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-dwjwj" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.839201 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/20b8695b-8129-4f02-824d-5ca2a451d899-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-gkrbr\" (UID: \"20b8695b-8129-4f02-824d-5ca2a451d899\") " pod="openshift-controller-manager/controller-manager-879f6c89f-gkrbr" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.839645 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.839922 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.840057 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.840162 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.841121 4861 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4e3ac241-df4c-4438-83a5-9cc923b0f82f-auth-proxy-config\") pod \"machine-approver-56656f9798-lz22f\" (UID: \"4e3ac241-df4c-4438-83a5-9cc923b0f82f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lz22f" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.842053 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-s96f9"] Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.842606 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-s96f9" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.852206 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/4e3ac241-df4c-4438-83a5-9cc923b0f82f-machine-approver-tls\") pod \"machine-approver-56656f9798-lz22f\" (UID: \"4e3ac241-df4c-4438-83a5-9cc923b0f82f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lz22f" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.852705 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.853065 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.855499 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-tm5tk"] Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.856540 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tm5tk" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.861079 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-h6j45"] Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.866266 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-c6sj6"] Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.866880 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zc95n"] Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.863490 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.865354 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.867406 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-c57s5"] Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.867060 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.867552 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-c6sj6" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.867739 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2hgfm"] Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.867914 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zc95n" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.868205 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-c57s5" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.868292 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.870053 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-96ksj"] Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.870619 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-96ksj" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.870847 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2hgfm" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.872475 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.911713 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.914696 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-trcr9"] Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.915273 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.915612 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-68457"] Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.915867 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.916712 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-68457" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.917002 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-trcr9" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.917272 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-dd22q"] Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.917611 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.917974 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-dd22q" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.920017 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qmmgv"] Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.921039 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qmmgv" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.922482 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.923734 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-wr7xx"] Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.924822 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-wr7xx" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.925406 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.926082 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-hj4tg"] Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.938543 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fhw9h"] Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.939329 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fhw9h" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.939658 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hj4tg" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.940782 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8e2a968f-c9a7-415e-895a-3d1f8a8a3e0b-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-jzt4r\" (UID: \"8e2a968f-c9a7-415e-895a-3d1f8a8a3e0b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jzt4r" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.940812 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/85a3bbcb-e663-4a97-980c-606c979409d7-console-oauth-config\") pod \"console-f9d7485db-qh9sg\" (UID: \"85a3bbcb-e663-4a97-980c-606c979409d7\") " pod="openshift-console/console-f9d7485db-qh9sg" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.940837 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8pkvv\" (UniqueName: \"kubernetes.io/projected/85a3bbcb-e663-4a97-980c-606c979409d7-kube-api-access-8pkvv\") pod \"console-f9d7485db-qh9sg\" (UID: \"85a3bbcb-e663-4a97-980c-606c979409d7\") " pod="openshift-console/console-f9d7485db-qh9sg" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.940865 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/71340af0-2591-4b41-ae96-e7f5fada7318-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-q8fz7\" (UID: \"71340af0-2591-4b41-ae96-e7f5fada7318\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-q8fz7" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.940898 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" 
(UniqueName: \"kubernetes.io/host-path/2afc7adc-5b22-4203-9265-2ea4293f132f-node-pullsecrets\") pod \"apiserver-76f77b778f-l2jkp\" (UID: \"2afc7adc-5b22-4203-9265-2ea4293f132f\") " pod="openshift-apiserver/apiserver-76f77b778f-l2jkp" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.940922 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/eaee5667-e42c-4ef1-8c6b-279ee6fc171a-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-97dkg\" (UID: \"eaee5667-e42c-4ef1-8c6b-279ee6fc171a\") " pod="openshift-authentication/oauth-openshift-558db77b4-97dkg" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.940945 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hfzss\" (UniqueName: \"kubernetes.io/projected/8415c9be-22de-49bb-8b53-6f06a923ef33-kube-api-access-hfzss\") pod \"openshift-apiserver-operator-796bbdcf4f-lj7m8\" (UID: \"8415c9be-22de-49bb-8b53-6f06a923ef33\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lj7m8" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.940970 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/85a3bbcb-e663-4a97-980c-606c979409d7-trusted-ca-bundle\") pod \"console-f9d7485db-qh9sg\" (UID: \"85a3bbcb-e663-4a97-980c-606c979409d7\") " pod="openshift-console/console-f9d7485db-qh9sg" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.940989 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e2a968f-c9a7-415e-895a-3d1f8a8a3e0b-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-jzt4r\" (UID: \"8e2a968f-c9a7-415e-895a-3d1f8a8a3e0b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jzt4r" Mar 09 09:07:34 crc 
kubenswrapper[4861]: I0309 09:07:34.941010 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ba6c5b9f-7812-4b6d-998b-6f368a6edf83-client-ca\") pod \"route-controller-manager-6576b87f9c-h84v9\" (UID: \"ba6c5b9f-7812-4b6d-998b-6f368a6edf83\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-h84v9" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.941031 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8415c9be-22de-49bb-8b53-6f06a923ef33-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-lj7m8\" (UID: \"8415c9be-22de-49bb-8b53-6f06a923ef33\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lj7m8" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.941054 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/2afc7adc-5b22-4203-9265-2ea4293f132f-etcd-client\") pod \"apiserver-76f77b778f-l2jkp\" (UID: \"2afc7adc-5b22-4203-9265-2ea4293f132f\") " pod="openshift-apiserver/apiserver-76f77b778f-l2jkp" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.941076 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2afc7adc-5b22-4203-9265-2ea4293f132f-trusted-ca-bundle\") pod \"apiserver-76f77b778f-l2jkp\" (UID: \"2afc7adc-5b22-4203-9265-2ea4293f132f\") " pod="openshift-apiserver/apiserver-76f77b778f-l2jkp" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.941096 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7kmsc\" (UniqueName: \"kubernetes.io/projected/271a4e3c-1aa2-4bf5-bdfe-0495910f75d6-kube-api-access-7kmsc\") pod \"cluster-image-registry-operator-dc59b4c8b-ldhj8\" (UID: 
\"271a4e3c-1aa2-4bf5-bdfe-0495910f75d6\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ldhj8" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.941127 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kb6dq\" (UniqueName: \"kubernetes.io/projected/6273411b-70f9-4fdf-bf82-d156b10a5824-kube-api-access-kb6dq\") pod \"apiserver-7bbb656c7d-hn7r4\" (UID: \"6273411b-70f9-4fdf-bf82-d156b10a5824\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hn7r4" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.941148 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2afc7adc-5b22-4203-9265-2ea4293f132f-serving-cert\") pod \"apiserver-76f77b778f-l2jkp\" (UID: \"2afc7adc-5b22-4203-9265-2ea4293f132f\") " pod="openshift-apiserver/apiserver-76f77b778f-l2jkp" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.941175 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4rgw\" (UniqueName: \"kubernetes.io/projected/8ff37c1b-1688-42ce-8b0c-952d297ae4a0-kube-api-access-l4rgw\") pod \"router-default-5444994796-c6sj6\" (UID: \"8ff37c1b-1688-42ce-8b0c-952d297ae4a0\") " pod="openshift-ingress/router-default-5444994796-c6sj6" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.941202 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da4004a6-c6fd-41d6-a651-b4aaec2d6454-config\") pod \"machine-api-operator-5694c8668f-gqnxr\" (UID: \"da4004a6-c6fd-41d6-a651-b4aaec2d6454\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-gqnxr" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.941227 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: 
\"kubernetes.io/configmap/2afc7adc-5b22-4203-9265-2ea4293f132f-image-import-ca\") pod \"apiserver-76f77b778f-l2jkp\" (UID: \"2afc7adc-5b22-4203-9265-2ea4293f132f\") " pod="openshift-apiserver/apiserver-76f77b778f-l2jkp" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.941248 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/eaee5667-e42c-4ef1-8c6b-279ee6fc171a-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-97dkg\" (UID: \"eaee5667-e42c-4ef1-8c6b-279ee6fc171a\") " pod="openshift-authentication/oauth-openshift-558db77b4-97dkg" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.941279 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/eaee5667-e42c-4ef1-8c6b-279ee6fc171a-audit-policies\") pod \"oauth-openshift-558db77b4-97dkg\" (UID: \"eaee5667-e42c-4ef1-8c6b-279ee6fc171a\") " pod="openshift-authentication/oauth-openshift-558db77b4-97dkg" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.941315 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/eaee5667-e42c-4ef1-8c6b-279ee6fc171a-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-97dkg\" (UID: \"eaee5667-e42c-4ef1-8c6b-279ee6fc171a\") " pod="openshift-authentication/oauth-openshift-558db77b4-97dkg" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.941333 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.941435 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/eaee5667-e42c-4ef1-8c6b-279ee6fc171a-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-97dkg\" (UID: \"eaee5667-e42c-4ef1-8c6b-279ee6fc171a\") " pod="openshift-authentication/oauth-openshift-558db77b4-97dkg" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.941469 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p9x2n\" (UniqueName: \"kubernetes.io/projected/eaee5667-e42c-4ef1-8c6b-279ee6fc171a-kube-api-access-p9x2n\") pod \"oauth-openshift-558db77b4-97dkg\" (UID: \"eaee5667-e42c-4ef1-8c6b-279ee6fc171a\") " pod="openshift-authentication/oauth-openshift-558db77b4-97dkg" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.941495 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkvw8\" (UniqueName: \"kubernetes.io/projected/71340af0-2591-4b41-ae96-e7f5fada7318-kube-api-access-nkvw8\") pod \"machine-config-controller-84d6567774-q8fz7\" (UID: \"71340af0-2591-4b41-ae96-e7f5fada7318\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-q8fz7" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.941515 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-k2z5n"] Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.941522 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8ff37c1b-1688-42ce-8b0c-952d297ae4a0-service-ca-bundle\") pod \"router-default-5444994796-c6sj6\" (UID: \"8ff37c1b-1688-42ce-8b0c-952d297ae4a0\") " pod="openshift-ingress/router-default-5444994796-c6sj6" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.941558 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/2a29855a-bbc0-458f-a9a0-0ddfd8763f2d-serving-cert\") pod \"openshift-config-operator-7777fb866f-dwjwj\" (UID: \"2a29855a-bbc0-458f-a9a0-0ddfd8763f2d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-dwjwj" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.941581 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b820aa76-e1f5-440b-b942-2a4468dc4d51-serving-cert\") pod \"authentication-operator-69f744f599-5m9c6\" (UID: \"b820aa76-e1f5-440b-b942-2a4468dc4d51\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5m9c6" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.941610 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6273411b-70f9-4fdf-bf82-d156b10a5824-audit-dir\") pod \"apiserver-7bbb656c7d-hn7r4\" (UID: \"6273411b-70f9-4fdf-bf82-d156b10a5824\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hn7r4" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.941632 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/2afc7adc-5b22-4203-9265-2ea4293f132f-etcd-serving-ca\") pod \"apiserver-76f77b778f-l2jkp\" (UID: \"2afc7adc-5b22-4203-9265-2ea4293f132f\") " pod="openshift-apiserver/apiserver-76f77b778f-l2jkp" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.941656 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/20b8695b-8129-4f02-824d-5ca2a451d899-serving-cert\") pod \"controller-manager-879f6c89f-gkrbr\" (UID: \"20b8695b-8129-4f02-824d-5ca2a451d899\") " pod="openshift-controller-manager/controller-manager-879f6c89f-gkrbr" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.941813 4861 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-dhdr9\" (UniqueName: \"kubernetes.io/projected/20b8695b-8129-4f02-824d-5ca2a451d899-kube-api-access-dhdr9\") pod \"controller-manager-879f6c89f-gkrbr\" (UID: \"20b8695b-8129-4f02-824d-5ca2a451d899\") " pod="openshift-controller-manager/controller-manager-879f6c89f-gkrbr" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.941873 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/eaee5667-e42c-4ef1-8c6b-279ee6fc171a-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-97dkg\" (UID: \"eaee5667-e42c-4ef1-8c6b-279ee6fc171a\") " pod="openshift-authentication/oauth-openshift-558db77b4-97dkg" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.941900 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/85a3bbcb-e663-4a97-980c-606c979409d7-console-serving-cert\") pod \"console-f9d7485db-qh9sg\" (UID: \"85a3bbcb-e663-4a97-980c-606c979409d7\") " pod="openshift-console/console-f9d7485db-qh9sg" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.941923 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20b8695b-8129-4f02-824d-5ca2a451d899-config\") pod \"controller-manager-879f6c89f-gkrbr\" (UID: \"20b8695b-8129-4f02-824d-5ca2a451d899\") " pod="openshift-controller-manager/controller-manager-879f6c89f-gkrbr" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.941945 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mj42q\" (UniqueName: \"kubernetes.io/projected/ba6c5b9f-7812-4b6d-998b-6f368a6edf83-kube-api-access-mj42q\") pod \"route-controller-manager-6576b87f9c-h84v9\" (UID: \"ba6c5b9f-7812-4b6d-998b-6f368a6edf83\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-h84v9" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.941969 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/da4004a6-c6fd-41d6-a651-b4aaec2d6454-images\") pod \"machine-api-operator-5694c8668f-gqnxr\" (UID: \"da4004a6-c6fd-41d6-a651-b4aaec2d6454\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-gqnxr" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.941994 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/20b8695b-8129-4f02-824d-5ca2a451d899-client-ca\") pod \"controller-manager-879f6c89f-gkrbr\" (UID: \"20b8695b-8129-4f02-824d-5ca2a451d899\") " pod="openshift-controller-manager/controller-manager-879f6c89f-gkrbr" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.942036 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/6273411b-70f9-4fdf-bf82-d156b10a5824-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-hn7r4\" (UID: \"6273411b-70f9-4fdf-bf82-d156b10a5824\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hn7r4" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.942061 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eaee5667-e42c-4ef1-8c6b-279ee6fc171a-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-97dkg\" (UID: \"eaee5667-e42c-4ef1-8c6b-279ee6fc171a\") " pod="openshift-authentication/oauth-openshift-558db77b4-97dkg" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.942125 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/2afc7adc-5b22-4203-9265-2ea4293f132f-encryption-config\") pod \"apiserver-76f77b778f-l2jkp\" (UID: \"2afc7adc-5b22-4203-9265-2ea4293f132f\") " pod="openshift-apiserver/apiserver-76f77b778f-l2jkp" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.942151 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/eaee5667-e42c-4ef1-8c6b-279ee6fc171a-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-97dkg\" (UID: \"eaee5667-e42c-4ef1-8c6b-279ee6fc171a\") " pod="openshift-authentication/oauth-openshift-558db77b4-97dkg" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.942176 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/54ee5cc6-676d-412c-b2e8-66308fcfc3d6-metrics-tls\") pod \"dns-operator-744455d44c-75sbr\" (UID: \"54ee5cc6-676d-412c-b2e8-66308fcfc3d6\") " pod="openshift-dns-operator/dns-operator-744455d44c-75sbr" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.942199 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b820aa76-e1f5-440b-b942-2a4468dc4d51-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-5m9c6\" (UID: \"b820aa76-e1f5-440b-b942-2a4468dc4d51\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5m9c6" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.942229 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fggwc\" (UniqueName: \"kubernetes.io/projected/b820aa76-e1f5-440b-b942-2a4468dc4d51-kube-api-access-fggwc\") pod \"authentication-operator-69f744f599-5m9c6\" (UID: \"b820aa76-e1f5-440b-b942-2a4468dc4d51\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5m9c6" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 
09:07:34.942254 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7mk6w"] Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.942509 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-k2z5n" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.942773 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7mk6w" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.942254 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f300ad16-f055-45cf-ac02-960f49b5d426-config\") pod \"console-operator-58897d9998-jtnvg\" (UID: \"f300ad16-f055-45cf-ac02-960f49b5d426\") " pod="openshift-console-operator/console-operator-58897d9998-jtnvg" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.942875 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f300ad16-f055-45cf-ac02-960f49b5d426-trusted-ca\") pod \"console-operator-58897d9998-jtnvg\" (UID: \"f300ad16-f055-45cf-ac02-960f49b5d426\") " pod="openshift-console-operator/console-operator-58897d9998-jtnvg" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.942895 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6273411b-70f9-4fdf-bf82-d156b10a5824-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-hn7r4\" (UID: \"6273411b-70f9-4fdf-bf82-d156b10a5824\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hn7r4" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.942936 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/6273411b-70f9-4fdf-bf82-d156b10a5824-encryption-config\") pod \"apiserver-7bbb656c7d-hn7r4\" (UID: \"6273411b-70f9-4fdf-bf82-d156b10a5824\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hn7r4" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.942954 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6lxkv\" (UniqueName: \"kubernetes.io/projected/e3056772-5de8-4fed-9796-440422743470-kube-api-access-6lxkv\") pod \"cluster-samples-operator-665b6dd947-24x2s\" (UID: \"e3056772-5de8-4fed-9796-440422743470\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-24x2s" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.942992 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2afc7adc-5b22-4203-9265-2ea4293f132f-audit-dir\") pod \"apiserver-76f77b778f-l2jkp\" (UID: \"2afc7adc-5b22-4203-9265-2ea4293f132f\") " pod="openshift-apiserver/apiserver-76f77b778f-l2jkp" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.943009 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/eaee5667-e42c-4ef1-8c6b-279ee6fc171a-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-97dkg\" (UID: \"eaee5667-e42c-4ef1-8c6b-279ee6fc171a\") " pod="openshift-authentication/oauth-openshift-558db77b4-97dkg" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.943089 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/eaee5667-e42c-4ef1-8c6b-279ee6fc171a-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-97dkg\" (UID: \"eaee5667-e42c-4ef1-8c6b-279ee6fc171a\") " pod="openshift-authentication/oauth-openshift-558db77b4-97dkg" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.943183 4861 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drh7b\" (UniqueName: \"kubernetes.io/projected/54ee5cc6-676d-412c-b2e8-66308fcfc3d6-kube-api-access-drh7b\") pod \"dns-operator-744455d44c-75sbr\" (UID: \"54ee5cc6-676d-412c-b2e8-66308fcfc3d6\") " pod="openshift-dns-operator/dns-operator-744455d44c-75sbr" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.943210 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/85a3bbcb-e663-4a97-980c-606c979409d7-console-config\") pod \"console-f9d7485db-qh9sg\" (UID: \"85a3bbcb-e663-4a97-980c-606c979409d7\") " pod="openshift-console/console-f9d7485db-qh9sg" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.943255 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/85a3bbcb-e663-4a97-980c-606c979409d7-service-ca\") pod \"console-f9d7485db-qh9sg\" (UID: \"85a3bbcb-e663-4a97-980c-606c979409d7\") " pod="openshift-console/console-f9d7485db-qh9sg" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.943307 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/2afc7adc-5b22-4203-9265-2ea4293f132f-audit\") pod \"apiserver-76f77b778f-l2jkp\" (UID: \"2afc7adc-5b22-4203-9265-2ea4293f132f\") " pod="openshift-apiserver/apiserver-76f77b778f-l2jkp" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.943324 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ba6c5b9f-7812-4b6d-998b-6f368a6edf83-serving-cert\") pod \"route-controller-manager-6576b87f9c-h84v9\" (UID: \"ba6c5b9f-7812-4b6d-998b-6f368a6edf83\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-h84v9" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.943343 4861 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8ff37c1b-1688-42ce-8b0c-952d297ae4a0-metrics-certs\") pod \"router-default-5444994796-c6sj6\" (UID: \"8ff37c1b-1688-42ce-8b0c-952d297ae4a0\") " pod="openshift-ingress/router-default-5444994796-c6sj6" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.943548 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-zrbdj"] Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.943617 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/2afc7adc-5b22-4203-9265-2ea4293f132f-node-pullsecrets\") pod \"apiserver-76f77b778f-l2jkp\" (UID: \"2afc7adc-5b22-4203-9265-2ea4293f132f\") " pod="openshift-apiserver/apiserver-76f77b778f-l2jkp" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.944059 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/eaee5667-e42c-4ef1-8c6b-279ee6fc171a-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-97dkg\" (UID: \"eaee5667-e42c-4ef1-8c6b-279ee6fc171a\") " pod="openshift-authentication/oauth-openshift-558db77b4-97dkg" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.944093 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5239e6e7-4afa-4b37-9582-e159b201453a-config\") pod \"kube-apiserver-operator-766d6c64bb-c57s5\" (UID: \"5239e6e7-4afa-4b37-9582-e159b201453a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-c57s5" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.944121 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwsc2\" (UniqueName: 
\"kubernetes.io/projected/da4004a6-c6fd-41d6-a651-b4aaec2d6454-kube-api-access-dwsc2\") pod \"machine-api-operator-5694c8668f-gqnxr\" (UID: \"da4004a6-c6fd-41d6-a651-b4aaec2d6454\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-gqnxr" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.944144 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6273411b-70f9-4fdf-bf82-d156b10a5824-etcd-client\") pod \"apiserver-7bbb656c7d-hn7r4\" (UID: \"6273411b-70f9-4fdf-bf82-d156b10a5824\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hn7r4" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.944164 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6273411b-70f9-4fdf-bf82-d156b10a5824-serving-cert\") pod \"apiserver-7bbb656c7d-hn7r4\" (UID: \"6273411b-70f9-4fdf-bf82-d156b10a5824\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hn7r4" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.944185 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2afc7adc-5b22-4203-9265-2ea4293f132f-config\") pod \"apiserver-76f77b778f-l2jkp\" (UID: \"2afc7adc-5b22-4203-9265-2ea4293f132f\") " pod="openshift-apiserver/apiserver-76f77b778f-l2jkp" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.944205 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48s8c\" (UniqueName: \"kubernetes.io/projected/2afc7adc-5b22-4203-9265-2ea4293f132f-kube-api-access-48s8c\") pod \"apiserver-76f77b778f-l2jkp\" (UID: \"2afc7adc-5b22-4203-9265-2ea4293f132f\") " pod="openshift-apiserver/apiserver-76f77b778f-l2jkp" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.944224 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/271a4e3c-1aa2-4bf5-bdfe-0495910f75d6-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-ldhj8\" (UID: \"271a4e3c-1aa2-4bf5-bdfe-0495910f75d6\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ldhj8" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.944242 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/85a3bbcb-e663-4a97-980c-606c979409d7-oauth-serving-cert\") pod \"console-f9d7485db-qh9sg\" (UID: \"85a3bbcb-e663-4a97-980c-606c979409d7\") " pod="openshift-console/console-f9d7485db-qh9sg" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.944266 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b820aa76-e1f5-440b-b942-2a4468dc4d51-service-ca-bundle\") pod \"authentication-operator-69f744f599-5m9c6\" (UID: \"b820aa76-e1f5-440b-b942-2a4468dc4d51\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5m9c6" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.944284 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/eaee5667-e42c-4ef1-8c6b-279ee6fc171a-audit-dir\") pod \"oauth-openshift-558db77b4-97dkg\" (UID: \"eaee5667-e42c-4ef1-8c6b-279ee6fc171a\") " pod="openshift-authentication/oauth-openshift-558db77b4-97dkg" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.944510 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba6c5b9f-7812-4b6d-998b-6f368a6edf83-config\") pod \"route-controller-manager-6576b87f9c-h84v9\" (UID: \"ba6c5b9f-7812-4b6d-998b-6f368a6edf83\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-h84v9" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.944534 4861 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/271a4e3c-1aa2-4bf5-bdfe-0495910f75d6-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-ldhj8\" (UID: \"271a4e3c-1aa2-4bf5-bdfe-0495910f75d6\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ldhj8" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.944558 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8w4c\" (UniqueName: \"kubernetes.io/projected/566d8448-e794-44ee-9d17-e92493adcd87-kube-api-access-v8w4c\") pod \"downloads-7954f5f757-h6j45\" (UID: \"566d8448-e794-44ee-9d17-e92493adcd87\") " pod="openshift-console/downloads-7954f5f757-h6j45" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.944575 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmdsk\" (UniqueName: \"kubernetes.io/projected/8e2a968f-c9a7-415e-895a-3d1f8a8a3e0b-kube-api-access-wmdsk\") pod \"openshift-controller-manager-operator-756b6f6bc6-jzt4r\" (UID: \"8e2a968f-c9a7-415e-895a-3d1f8a8a3e0b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jzt4r" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.944596 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6273411b-70f9-4fdf-bf82-d156b10a5824-audit-policies\") pod \"apiserver-7bbb656c7d-hn7r4\" (UID: \"6273411b-70f9-4fdf-bf82-d156b10a5824\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hn7r4" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.944613 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/271a4e3c-1aa2-4bf5-bdfe-0495910f75d6-bound-sa-token\") pod 
\"cluster-image-registry-operator-dc59b4c8b-ldhj8\" (UID: \"271a4e3c-1aa2-4bf5-bdfe-0495910f75d6\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ldhj8" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.944633 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/ad46ee5f-9c08-438c-8284-aa488f48e522-profile-collector-cert\") pod \"olm-operator-6b444d44fb-s96f9\" (UID: \"ad46ee5f-9c08-438c-8284-aa488f48e522\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-s96f9" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.944660 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/eaee5667-e42c-4ef1-8c6b-279ee6fc171a-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-97dkg\" (UID: \"eaee5667-e42c-4ef1-8c6b-279ee6fc171a\") " pod="openshift-authentication/oauth-openshift-558db77b4-97dkg" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.944684 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f300ad16-f055-45cf-ac02-960f49b5d426-serving-cert\") pod \"console-operator-58897d9998-jtnvg\" (UID: \"f300ad16-f055-45cf-ac02-960f49b5d426\") " pod="openshift-console-operator/console-operator-58897d9998-jtnvg" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.944855 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-zrbdj" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.945009 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-gkrbr"] Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.945610 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f300ad16-f055-45cf-ac02-960f49b5d426-config\") pod \"console-operator-58897d9998-jtnvg\" (UID: \"f300ad16-f055-45cf-ac02-960f49b5d426\") " pod="openshift-console-operator/console-operator-58897d9998-jtnvg" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.941440 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.945953 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z2m5p\" (UniqueName: \"kubernetes.io/projected/2a29855a-bbc0-458f-a9a0-0ddfd8763f2d-kube-api-access-z2m5p\") pod \"openshift-config-operator-7777fb866f-dwjwj\" (UID: \"2a29855a-bbc0-458f-a9a0-0ddfd8763f2d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-dwjwj" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.945988 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/8ff37c1b-1688-42ce-8b0c-952d297ae4a0-stats-auth\") pod \"router-default-5444994796-c6sj6\" (UID: \"8ff37c1b-1688-42ce-8b0c-952d297ae4a0\") " pod="openshift-ingress/router-default-5444994796-c6sj6" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.946012 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/ad46ee5f-9c08-438c-8284-aa488f48e522-srv-cert\") pod \"olm-operator-6b444d44fb-s96f9\" (UID: 
\"ad46ee5f-9c08-438c-8284-aa488f48e522\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-s96f9" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.946152 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/eaee5667-e42c-4ef1-8c6b-279ee6fc171a-audit-policies\") pod \"oauth-openshift-558db77b4-97dkg\" (UID: \"eaee5667-e42c-4ef1-8c6b-279ee6fc171a\") " pod="openshift-authentication/oauth-openshift-558db77b4-97dkg" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.946426 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-kpns7"] Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.946481 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7q4f\" (UniqueName: \"kubernetes.io/projected/ad46ee5f-9c08-438c-8284-aa488f48e522-kube-api-access-k7q4f\") pod \"olm-operator-6b444d44fb-s96f9\" (UID: \"ad46ee5f-9c08-438c-8284-aa488f48e522\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-s96f9" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.946510 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/eaee5667-e42c-4ef1-8c6b-279ee6fc171a-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-97dkg\" (UID: \"eaee5667-e42c-4ef1-8c6b-279ee6fc171a\") " pod="openshift-authentication/oauth-openshift-558db77b4-97dkg" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.946629 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/2a29855a-bbc0-458f-a9a0-0ddfd8763f2d-available-featuregates\") pod \"openshift-config-operator-7777fb866f-dwjwj\" (UID: \"2a29855a-bbc0-458f-a9a0-0ddfd8763f2d\") " 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-dwjwj" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.946657 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/20b8695b-8129-4f02-824d-5ca2a451d899-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-gkrbr\" (UID: \"20b8695b-8129-4f02-824d-5ca2a451d899\") " pod="openshift-controller-manager/controller-manager-879f6c89f-gkrbr" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.946705 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2xgl\" (UniqueName: \"kubernetes.io/projected/f300ad16-f055-45cf-ac02-960f49b5d426-kube-api-access-b2xgl\") pod \"console-operator-58897d9998-jtnvg\" (UID: \"f300ad16-f055-45cf-ac02-960f49b5d426\") " pod="openshift-console-operator/console-operator-58897d9998-jtnvg" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.946770 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8415c9be-22de-49bb-8b53-6f06a923ef33-config\") pod \"openshift-apiserver-operator-796bbdcf4f-lj7m8\" (UID: \"8415c9be-22de-49bb-8b53-6f06a923ef33\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lj7m8" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.946815 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5239e6e7-4afa-4b37-9582-e159b201453a-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-c57s5\" (UID: \"5239e6e7-4afa-4b37-9582-e159b201453a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-c57s5" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.946834 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/5239e6e7-4afa-4b37-9582-e159b201453a-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-c57s5\" (UID: \"5239e6e7-4afa-4b37-9582-e159b201453a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-c57s5" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.946857 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e3056772-5de8-4fed-9796-440422743470-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-24x2s\" (UID: \"e3056772-5de8-4fed-9796-440422743470\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-24x2s" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.946875 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/71340af0-2591-4b41-ae96-e7f5fada7318-proxy-tls\") pod \"machine-config-controller-84d6567774-q8fz7\" (UID: \"71340af0-2591-4b41-ae96-e7f5fada7318\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-q8fz7" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.946880 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eaee5667-e42c-4ef1-8c6b-279ee6fc171a-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-97dkg\" (UID: \"eaee5667-e42c-4ef1-8c6b-279ee6fc171a\") " pod="openshift-authentication/oauth-openshift-558db77b4-97dkg" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.946897 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/8ff37c1b-1688-42ce-8b0c-952d297ae4a0-default-certificate\") pod \"router-default-5444994796-c6sj6\" (UID: \"8ff37c1b-1688-42ce-8b0c-952d297ae4a0\") " 
pod="openshift-ingress/router-default-5444994796-c6sj6" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.946965 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6273411b-70f9-4fdf-bf82-d156b10a5824-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-hn7r4\" (UID: \"6273411b-70f9-4fdf-bf82-d156b10a5824\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hn7r4" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.946967 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/da4004a6-c6fd-41d6-a651-b4aaec2d6454-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-gqnxr\" (UID: \"da4004a6-c6fd-41d6-a651-b4aaec2d6454\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-gqnxr" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.947030 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b820aa76-e1f5-440b-b942-2a4468dc4d51-config\") pod \"authentication-operator-69f744f599-5m9c6\" (UID: \"b820aa76-e1f5-440b-b942-2a4468dc4d51\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5m9c6" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.947491 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-kpns7" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.947881 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8e2a968f-c9a7-415e-895a-3d1f8a8a3e0b-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-jzt4r\" (UID: \"8e2a968f-c9a7-415e-895a-3d1f8a8a3e0b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jzt4r" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.947902 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b820aa76-e1f5-440b-b942-2a4468dc4d51-config\") pod \"authentication-operator-69f744f599-5m9c6\" (UID: \"b820aa76-e1f5-440b-b942-2a4468dc4d51\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5m9c6" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.948516 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f300ad16-f055-45cf-ac02-960f49b5d426-trusted-ca\") pod \"console-operator-58897d9998-jtnvg\" (UID: \"f300ad16-f055-45cf-ac02-960f49b5d426\") " pod="openshift-console-operator/console-operator-58897d9998-jtnvg" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.948584 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2afc7adc-5b22-4203-9265-2ea4293f132f-audit-dir\") pod \"apiserver-76f77b778f-l2jkp\" (UID: \"2afc7adc-5b22-4203-9265-2ea4293f132f\") " pod="openshift-apiserver/apiserver-76f77b778f-l2jkp" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.948648 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/eaee5667-e42c-4ef1-8c6b-279ee6fc171a-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-97dkg\" (UID: \"eaee5667-e42c-4ef1-8c6b-279ee6fc171a\") " pod="openshift-authentication/oauth-openshift-558db77b4-97dkg" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.948960 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/da4004a6-c6fd-41d6-a651-b4aaec2d6454-images\") pod \"machine-api-operator-5694c8668f-gqnxr\" (UID: \"da4004a6-c6fd-41d6-a651-b4aaec2d6454\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-gqnxr" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.949879 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da4004a6-c6fd-41d6-a651-b4aaec2d6454-config\") pod \"machine-api-operator-5694c8668f-gqnxr\" (UID: \"da4004a6-c6fd-41d6-a651-b4aaec2d6454\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-gqnxr" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.950087 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20b8695b-8129-4f02-824d-5ca2a451d899-config\") pod \"controller-manager-879f6c89f-gkrbr\" (UID: \"20b8695b-8129-4f02-824d-5ca2a451d899\") " pod="openshift-controller-manager/controller-manager-879f6c89f-gkrbr" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.951649 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/da4004a6-c6fd-41d6-a651-b4aaec2d6454-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-gqnxr\" (UID: \"da4004a6-c6fd-41d6-a651-b4aaec2d6454\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-gqnxr" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.951932 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/eaee5667-e42c-4ef1-8c6b-279ee6fc171a-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-97dkg\" (UID: \"eaee5667-e42c-4ef1-8c6b-279ee6fc171a\") " pod="openshift-authentication/oauth-openshift-558db77b4-97dkg" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.952099 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6273411b-70f9-4fdf-bf82-d156b10a5824-audit-dir\") pod \"apiserver-7bbb656c7d-hn7r4\" (UID: \"6273411b-70f9-4fdf-bf82-d156b10a5824\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hn7r4" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.952484 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/20b8695b-8129-4f02-824d-5ca2a451d899-client-ca\") pod \"controller-manager-879f6c89f-gkrbr\" (UID: \"20b8695b-8129-4f02-824d-5ca2a451d899\") " pod="openshift-controller-manager/controller-manager-879f6c89f-gkrbr" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.952681 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/6273411b-70f9-4fdf-bf82-d156b10a5824-encryption-config\") pod \"apiserver-7bbb656c7d-hn7r4\" (UID: \"6273411b-70f9-4fdf-bf82-d156b10a5824\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hn7r4" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.952773 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550780-wpnp9"] Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.953028 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/6273411b-70f9-4fdf-bf82-d156b10a5824-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-hn7r4\" (UID: \"6273411b-70f9-4fdf-bf82-d156b10a5824\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hn7r4" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.953162 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-5m9c6"] Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.953183 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-h84v9"] Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.953236 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550780-wpnp9" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.953913 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2a29855a-bbc0-458f-a9a0-0ddfd8763f2d-serving-cert\") pod \"openshift-config-operator-7777fb866f-dwjwj\" (UID: \"2a29855a-bbc0-458f-a9a0-0ddfd8763f2d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-dwjwj" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.954137 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/eaee5667-e42c-4ef1-8c6b-279ee6fc171a-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-97dkg\" (UID: \"eaee5667-e42c-4ef1-8c6b-279ee6fc171a\") " pod="openshift-authentication/oauth-openshift-558db77b4-97dkg" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.954302 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b820aa76-e1f5-440b-b942-2a4468dc4d51-serving-cert\") pod \"authentication-operator-69f744f599-5m9c6\" (UID: \"b820aa76-e1f5-440b-b942-2a4468dc4d51\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5m9c6" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 
09:07:34.955050 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/54ee5cc6-676d-412c-b2e8-66308fcfc3d6-metrics-tls\") pod \"dns-operator-744455d44c-75sbr\" (UID: \"54ee5cc6-676d-412c-b2e8-66308fcfc3d6\") " pod="openshift-dns-operator/dns-operator-744455d44c-75sbr" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.957448 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/eaee5667-e42c-4ef1-8c6b-279ee6fc171a-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-97dkg\" (UID: \"eaee5667-e42c-4ef1-8c6b-279ee6fc171a\") " pod="openshift-authentication/oauth-openshift-558db77b4-97dkg" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.957958 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/20b8695b-8129-4f02-824d-5ca2a451d899-serving-cert\") pod \"controller-manager-879f6c89f-gkrbr\" (UID: \"20b8695b-8129-4f02-824d-5ca2a451d899\") " pod="openshift-controller-manager/controller-manager-879f6c89f-gkrbr" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.958895 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b820aa76-e1f5-440b-b942-2a4468dc4d51-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-5m9c6\" (UID: \"b820aa76-e1f5-440b-b942-2a4468dc4d51\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5m9c6" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.959283 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6273411b-70f9-4fdf-bf82-d156b10a5824-audit-policies\") pod \"apiserver-7bbb656c7d-hn7r4\" (UID: \"6273411b-70f9-4fdf-bf82-d156b10a5824\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hn7r4" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.959332 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e2a968f-c9a7-415e-895a-3d1f8a8a3e0b-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-jzt4r\" (UID: \"8e2a968f-c9a7-415e-895a-3d1f8a8a3e0b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jzt4r" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.959363 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-jtnvg"] Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.959433 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-24x2s"] Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.959620 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jzt4r"] Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.960199 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/eaee5667-e42c-4ef1-8c6b-279ee6fc171a-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-97dkg\" (UID: \"eaee5667-e42c-4ef1-8c6b-279ee6fc171a\") " pod="openshift-authentication/oauth-openshift-558db77b4-97dkg" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.960970 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8415c9be-22de-49bb-8b53-6f06a923ef33-config\") pod \"openshift-apiserver-operator-796bbdcf4f-lj7m8\" (UID: \"8415c9be-22de-49bb-8b53-6f06a923ef33\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lj7m8" Mar 09 09:07:34 crc 
kubenswrapper[4861]: I0309 09:07:34.961501 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8415c9be-22de-49bb-8b53-6f06a923ef33-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-lj7m8\" (UID: \"8415c9be-22de-49bb-8b53-6f06a923ef33\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lj7m8" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.962246 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ba6c5b9f-7812-4b6d-998b-6f368a6edf83-client-ca\") pod \"route-controller-manager-6576b87f9c-h84v9\" (UID: \"ba6c5b9f-7812-4b6d-998b-6f368a6edf83\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-h84v9" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.962975 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/eaee5667-e42c-4ef1-8c6b-279ee6fc171a-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-97dkg\" (UID: \"eaee5667-e42c-4ef1-8c6b-279ee6fc171a\") " pod="openshift-authentication/oauth-openshift-558db77b4-97dkg" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.963316 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/eaee5667-e42c-4ef1-8c6b-279ee6fc171a-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-97dkg\" (UID: \"eaee5667-e42c-4ef1-8c6b-279ee6fc171a\") " pod="openshift-authentication/oauth-openshift-558db77b4-97dkg" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.963489 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-gqnxr"] Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.963602 4861 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ba6c5b9f-7812-4b6d-998b-6f368a6edf83-serving-cert\") pod \"route-controller-manager-6576b87f9c-h84v9\" (UID: \"ba6c5b9f-7812-4b6d-998b-6f368a6edf83\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-h84v9" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.963772 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lj7m8"] Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.963871 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/20b8695b-8129-4f02-824d-5ca2a451d899-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-gkrbr\" (UID: \"20b8695b-8129-4f02-824d-5ca2a451d899\") " pod="openshift-controller-manager/controller-manager-879f6c89f-gkrbr" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.964179 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/2a29855a-bbc0-458f-a9a0-0ddfd8763f2d-available-featuregates\") pod \"openshift-config-operator-7777fb866f-dwjwj\" (UID: \"2a29855a-bbc0-458f-a9a0-0ddfd8763f2d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-dwjwj" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.965000 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/271a4e3c-1aa2-4bf5-bdfe-0495910f75d6-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-ldhj8\" (UID: \"271a4e3c-1aa2-4bf5-bdfe-0495910f75d6\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ldhj8" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.965743 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/ba6c5b9f-7812-4b6d-998b-6f368a6edf83-config\") pod \"route-controller-manager-6576b87f9c-h84v9\" (UID: \"ba6c5b9f-7812-4b6d-998b-6f368a6edf83\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-h84v9" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.966982 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-jtqjq"] Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.967494 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6273411b-70f9-4fdf-bf82-d156b10a5824-etcd-client\") pod \"apiserver-7bbb656c7d-hn7r4\" (UID: \"6273411b-70f9-4fdf-bf82-d156b10a5824\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hn7r4" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.967512 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/85a3bbcb-e663-4a97-980c-606c979409d7-oauth-serving-cert\") pod \"console-f9d7485db-qh9sg\" (UID: \"85a3bbcb-e663-4a97-980c-606c979409d7\") " pod="openshift-console/console-f9d7485db-qh9sg" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.967664 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-qh9sg"] Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.967682 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-c57s5"] Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.967754 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-jtqjq" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.967902 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b820aa76-e1f5-440b-b942-2a4468dc4d51-service-ca-bundle\") pod \"authentication-operator-69f744f599-5m9c6\" (UID: \"b820aa76-e1f5-440b-b942-2a4468dc4d51\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5m9c6" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.967961 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/eaee5667-e42c-4ef1-8c6b-279ee6fc171a-audit-dir\") pod \"oauth-openshift-558db77b4-97dkg\" (UID: \"eaee5667-e42c-4ef1-8c6b-279ee6fc171a\") " pod="openshift-authentication/oauth-openshift-558db77b4-97dkg" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.968045 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.968481 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/271a4e3c-1aa2-4bf5-bdfe-0495910f75d6-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-ldhj8\" (UID: \"271a4e3c-1aa2-4bf5-bdfe-0495910f75d6\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ldhj8" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.968814 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f300ad16-f055-45cf-ac02-960f49b5d426-serving-cert\") pod \"console-operator-58897d9998-jtnvg\" (UID: \"f300ad16-f055-45cf-ac02-960f49b5d426\") " pod="openshift-console-operator/console-operator-58897d9998-jtnvg" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 
09:07:34.969390 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-s96f9"] Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.970716 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6273411b-70f9-4fdf-bf82-d156b10a5824-serving-cert\") pod \"apiserver-7bbb656c7d-hn7r4\" (UID: \"6273411b-70f9-4fdf-bf82-d156b10a5824\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hn7r4" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.972484 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e3056772-5de8-4fed-9796-440422743470-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-24x2s\" (UID: \"e3056772-5de8-4fed-9796-440422743470\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-24x2s" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.972543 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zc95n"] Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.973442 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-75sbr"] Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.974224 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2afc7adc-5b22-4203-9265-2ea4293f132f-trusted-ca-bundle\") pod \"apiserver-76f77b778f-l2jkp\" (UID: \"2afc7adc-5b22-4203-9265-2ea4293f132f\") " pod="openshift-apiserver/apiserver-76f77b778f-l2jkp" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.974753 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/2afc7adc-5b22-4203-9265-2ea4293f132f-etcd-serving-ca\") pod 
\"apiserver-76f77b778f-l2jkp\" (UID: \"2afc7adc-5b22-4203-9265-2ea4293f132f\") " pod="openshift-apiserver/apiserver-76f77b778f-l2jkp" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.975221 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/2afc7adc-5b22-4203-9265-2ea4293f132f-etcd-client\") pod \"apiserver-76f77b778f-l2jkp\" (UID: \"2afc7adc-5b22-4203-9265-2ea4293f132f\") " pod="openshift-apiserver/apiserver-76f77b778f-l2jkp" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.975249 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2afc7adc-5b22-4203-9265-2ea4293f132f-config\") pod \"apiserver-76f77b778f-l2jkp\" (UID: \"2afc7adc-5b22-4203-9265-2ea4293f132f\") " pod="openshift-apiserver/apiserver-76f77b778f-l2jkp" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.975672 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-tm5tk"] Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.975861 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/2afc7adc-5b22-4203-9265-2ea4293f132f-image-import-ca\") pod \"apiserver-76f77b778f-l2jkp\" (UID: \"2afc7adc-5b22-4203-9265-2ea4293f132f\") " pod="openshift-apiserver/apiserver-76f77b778f-l2jkp" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.975883 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/2afc7adc-5b22-4203-9265-2ea4293f132f-audit\") pod \"apiserver-76f77b778f-l2jkp\" (UID: \"2afc7adc-5b22-4203-9265-2ea4293f132f\") " pod="openshift-apiserver/apiserver-76f77b778f-l2jkp" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.977447 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-etcd-operator/etcd-operator-b45778765-fsckq"] Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.977569 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-trcr9"] Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.978524 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/2afc7adc-5b22-4203-9265-2ea4293f132f-encryption-config\") pod \"apiserver-76f77b778f-l2jkp\" (UID: \"2afc7adc-5b22-4203-9265-2ea4293f132f\") " pod="openshift-apiserver/apiserver-76f77b778f-l2jkp" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.978614 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-5t4mn"] Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.979036 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/eaee5667-e42c-4ef1-8c6b-279ee6fc171a-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-97dkg\" (UID: \"eaee5667-e42c-4ef1-8c6b-279ee6fc171a\") " pod="openshift-authentication/oauth-openshift-558db77b4-97dkg" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.979860 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-5t4mn" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.979970 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/85a3bbcb-e663-4a97-980c-606c979409d7-console-serving-cert\") pod \"console-f9d7485db-qh9sg\" (UID: \"85a3bbcb-e663-4a97-980c-606c979409d7\") " pod="openshift-console/console-f9d7485db-qh9sg" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.980564 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-wdsnl"] Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.980854 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2afc7adc-5b22-4203-9265-2ea4293f132f-serving-cert\") pod \"apiserver-76f77b778f-l2jkp\" (UID: \"2afc7adc-5b22-4203-9265-2ea4293f132f\") " pod="openshift-apiserver/apiserver-76f77b778f-l2jkp" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.981474 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.982188 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-q8fz7"] Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.982211 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-wdsnl" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.983120 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/eaee5667-e42c-4ef1-8c6b-279ee6fc171a-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-97dkg\" (UID: \"eaee5667-e42c-4ef1-8c6b-279ee6fc171a\") " pod="openshift-authentication/oauth-openshift-558db77b4-97dkg" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.983490 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-zrbdj"] Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.984607 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-hn7r4"] Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.985895 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-dwjwj"] Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.986819 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-k2z5n"] Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.988047 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-97dkg"] Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.988818 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-96ksj"] Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.989714 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2hgfm"] Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.991701 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-kpns7"] Mar 09 
09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.992800 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ldhj8"] Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.992838 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/85a3bbcb-e663-4a97-980c-606c979409d7-console-oauth-config\") pod \"console-f9d7485db-qh9sg\" (UID: \"85a3bbcb-e663-4a97-980c-606c979409d7\") " pod="openshift-console/console-f9d7485db-qh9sg" Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.993736 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-cjkj6"] Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.994865 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-l2jkp"] Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.995769 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fhw9h"] Mar 09 09:07:34 crc kubenswrapper[4861]: I0309 09:07:34.998479 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-68457"] Mar 09 09:07:35 crc kubenswrapper[4861]: I0309 09:07:35.002255 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 09 09:07:35 crc kubenswrapper[4861]: I0309 09:07:35.004163 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550780-wpnp9"] Mar 09 09:07:35 crc kubenswrapper[4861]: I0309 09:07:35.005324 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/85a3bbcb-e663-4a97-980c-606c979409d7-console-config\") pod \"console-f9d7485db-qh9sg\" (UID: 
\"85a3bbcb-e663-4a97-980c-606c979409d7\") " pod="openshift-console/console-f9d7485db-qh9sg" Mar 09 09:07:35 crc kubenswrapper[4861]: I0309 09:07:35.006900 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-5t4mn"] Mar 09 09:07:35 crc kubenswrapper[4861]: I0309 09:07:35.012622 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-dd22q"] Mar 09 09:07:35 crc kubenswrapper[4861]: I0309 09:07:35.014184 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7mk6w"] Mar 09 09:07:35 crc kubenswrapper[4861]: I0309 09:07:35.015687 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qmmgv"] Mar 09 09:07:35 crc kubenswrapper[4861]: I0309 09:07:35.017922 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-wr7xx"] Mar 09 09:07:35 crc kubenswrapper[4861]: I0309 09:07:35.019174 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-hj4tg"] Mar 09 09:07:35 crc kubenswrapper[4861]: I0309 09:07:35.020616 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-z6v85"] Mar 09 09:07:35 crc kubenswrapper[4861]: I0309 09:07:35.021526 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 09 09:07:35 crc kubenswrapper[4861]: I0309 09:07:35.023071 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-z6v85"] Mar 09 09:07:35 crc kubenswrapper[4861]: I0309 09:07:35.023247 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-z6v85" Mar 09 09:07:35 crc kubenswrapper[4861]: I0309 09:07:35.028583 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/85a3bbcb-e663-4a97-980c-606c979409d7-service-ca\") pod \"console-f9d7485db-qh9sg\" (UID: \"85a3bbcb-e663-4a97-980c-606c979409d7\") " pod="openshift-console/console-f9d7485db-qh9sg" Mar 09 09:07:35 crc kubenswrapper[4861]: I0309 09:07:35.047857 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/ad46ee5f-9c08-438c-8284-aa488f48e522-profile-collector-cert\") pod \"olm-operator-6b444d44fb-s96f9\" (UID: \"ad46ee5f-9c08-438c-8284-aa488f48e522\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-s96f9" Mar 09 09:07:35 crc kubenswrapper[4861]: I0309 09:07:35.047899 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/8ff37c1b-1688-42ce-8b0c-952d297ae4a0-stats-auth\") pod \"router-default-5444994796-c6sj6\" (UID: \"8ff37c1b-1688-42ce-8b0c-952d297ae4a0\") " pod="openshift-ingress/router-default-5444994796-c6sj6" Mar 09 09:07:35 crc kubenswrapper[4861]: I0309 09:07:35.047915 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/ad46ee5f-9c08-438c-8284-aa488f48e522-srv-cert\") pod \"olm-operator-6b444d44fb-s96f9\" (UID: \"ad46ee5f-9c08-438c-8284-aa488f48e522\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-s96f9" Mar 09 09:07:35 crc kubenswrapper[4861]: I0309 09:07:35.047931 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7q4f\" (UniqueName: \"kubernetes.io/projected/ad46ee5f-9c08-438c-8284-aa488f48e522-kube-api-access-k7q4f\") pod \"olm-operator-6b444d44fb-s96f9\" (UID: 
\"ad46ee5f-9c08-438c-8284-aa488f48e522\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-s96f9" Mar 09 09:07:35 crc kubenswrapper[4861]: I0309 09:07:35.047954 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5239e6e7-4afa-4b37-9582-e159b201453a-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-c57s5\" (UID: \"5239e6e7-4afa-4b37-9582-e159b201453a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-c57s5" Mar 09 09:07:35 crc kubenswrapper[4861]: I0309 09:07:35.047967 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 09 09:07:35 crc kubenswrapper[4861]: I0309 09:07:35.047969 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5239e6e7-4afa-4b37-9582-e159b201453a-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-c57s5\" (UID: \"5239e6e7-4afa-4b37-9582-e159b201453a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-c57s5" Mar 09 09:07:35 crc kubenswrapper[4861]: I0309 09:07:35.048169 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/71340af0-2591-4b41-ae96-e7f5fada7318-proxy-tls\") pod \"machine-config-controller-84d6567774-q8fz7\" (UID: \"71340af0-2591-4b41-ae96-e7f5fada7318\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-q8fz7" Mar 09 09:07:35 crc kubenswrapper[4861]: I0309 09:07:35.048197 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/8ff37c1b-1688-42ce-8b0c-952d297ae4a0-default-certificate\") pod \"router-default-5444994796-c6sj6\" (UID: \"8ff37c1b-1688-42ce-8b0c-952d297ae4a0\") " pod="openshift-ingress/router-default-5444994796-c6sj6" 
Mar 09 09:07:35 crc kubenswrapper[4861]: I0309 09:07:35.048242 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/71340af0-2591-4b41-ae96-e7f5fada7318-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-q8fz7\" (UID: \"71340af0-2591-4b41-ae96-e7f5fada7318\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-q8fz7" Mar 09 09:07:35 crc kubenswrapper[4861]: I0309 09:07:35.048313 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4rgw\" (UniqueName: \"kubernetes.io/projected/8ff37c1b-1688-42ce-8b0c-952d297ae4a0-kube-api-access-l4rgw\") pod \"router-default-5444994796-c6sj6\" (UID: \"8ff37c1b-1688-42ce-8b0c-952d297ae4a0\") " pod="openshift-ingress/router-default-5444994796-c6sj6" Mar 09 09:07:35 crc kubenswrapper[4861]: I0309 09:07:35.048352 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nkvw8\" (UniqueName: \"kubernetes.io/projected/71340af0-2591-4b41-ae96-e7f5fada7318-kube-api-access-nkvw8\") pod \"machine-config-controller-84d6567774-q8fz7\" (UID: \"71340af0-2591-4b41-ae96-e7f5fada7318\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-q8fz7" Mar 09 09:07:35 crc kubenswrapper[4861]: I0309 09:07:35.048386 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8ff37c1b-1688-42ce-8b0c-952d297ae4a0-service-ca-bundle\") pod \"router-default-5444994796-c6sj6\" (UID: \"8ff37c1b-1688-42ce-8b0c-952d297ae4a0\") " pod="openshift-ingress/router-default-5444994796-c6sj6" Mar 09 09:07:35 crc kubenswrapper[4861]: I0309 09:07:35.048486 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8ff37c1b-1688-42ce-8b0c-952d297ae4a0-metrics-certs\") pod 
\"router-default-5444994796-c6sj6\" (UID: \"8ff37c1b-1688-42ce-8b0c-952d297ae4a0\") " pod="openshift-ingress/router-default-5444994796-c6sj6" Mar 09 09:07:35 crc kubenswrapper[4861]: I0309 09:07:35.048511 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5239e6e7-4afa-4b37-9582-e159b201453a-config\") pod \"kube-apiserver-operator-766d6c64bb-c57s5\" (UID: \"5239e6e7-4afa-4b37-9582-e159b201453a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-c57s5" Mar 09 09:07:35 crc kubenswrapper[4861]: I0309 09:07:35.049052 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/71340af0-2591-4b41-ae96-e7f5fada7318-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-q8fz7\" (UID: \"71340af0-2591-4b41-ae96-e7f5fada7318\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-q8fz7" Mar 09 09:07:35 crc kubenswrapper[4861]: I0309 09:07:35.056029 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/85a3bbcb-e663-4a97-980c-606c979409d7-trusted-ca-bundle\") pod \"console-f9d7485db-qh9sg\" (UID: \"85a3bbcb-e663-4a97-980c-606c979409d7\") " pod="openshift-console/console-f9d7485db-qh9sg" Mar 09 09:07:35 crc kubenswrapper[4861]: I0309 09:07:35.074004 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qclcn\" (UniqueName: \"kubernetes.io/projected/4e3ac241-df4c-4438-83a5-9cc923b0f82f-kube-api-access-qclcn\") pod \"machine-approver-56656f9798-lz22f\" (UID: \"4e3ac241-df4c-4438-83a5-9cc923b0f82f\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lz22f" Mar 09 09:07:35 crc kubenswrapper[4861]: I0309 09:07:35.081571 4861 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-etcd-operator"/"etcd-operator-config" Mar 09 09:07:35 crc kubenswrapper[4861]: I0309 09:07:35.101325 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 09 09:07:35 crc kubenswrapper[4861]: I0309 09:07:35.121415 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 09 09:07:35 crc kubenswrapper[4861]: I0309 09:07:35.127605 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lz22f" Mar 09 09:07:35 crc kubenswrapper[4861]: W0309 09:07:35.138415 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e3ac241_df4c_4438_83a5_9cc923b0f82f.slice/crio-e770987c03bced3a451eb5731323be999503ab6f9486e7a399074d92157d38ef WatchSource:0}: Error finding container e770987c03bced3a451eb5731323be999503ab6f9486e7a399074d92157d38ef: Status 404 returned error can't find the container with id e770987c03bced3a451eb5731323be999503ab6f9486e7a399074d92157d38ef Mar 09 09:07:35 crc kubenswrapper[4861]: I0309 09:07:35.141130 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 09 09:07:35 crc kubenswrapper[4861]: I0309 09:07:35.154393 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lz22f" event={"ID":"4e3ac241-df4c-4438-83a5-9cc923b0f82f","Type":"ContainerStarted","Data":"e770987c03bced3a451eb5731323be999503ab6f9486e7a399074d92157d38ef"} Mar 09 09:07:35 crc kubenswrapper[4861]: I0309 09:07:35.161815 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 09 09:07:35 crc kubenswrapper[4861]: I0309 09:07:35.182824 4861 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Mar 09 09:07:35 crc kubenswrapper[4861]: I0309 09:07:35.202016 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Mar 09 09:07:35 crc kubenswrapper[4861]: I0309 09:07:35.222339 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Mar 09 09:07:35 crc kubenswrapper[4861]: I0309 09:07:35.241781 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Mar 09 09:07:35 crc kubenswrapper[4861]: I0309 09:07:35.262549 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Mar 09 09:07:35 crc kubenswrapper[4861]: I0309 09:07:35.282469 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Mar 09 09:07:35 crc kubenswrapper[4861]: I0309 09:07:35.302035 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Mar 09 09:07:35 crc kubenswrapper[4861]: I0309 09:07:35.313050 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/71340af0-2591-4b41-ae96-e7f5fada7318-proxy-tls\") pod \"machine-config-controller-84d6567774-q8fz7\" (UID: \"71340af0-2591-4b41-ae96-e7f5fada7318\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-q8fz7"
Mar 09 09:07:35 crc kubenswrapper[4861]: I0309 09:07:35.321612 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Mar 09 09:07:35 crc kubenswrapper[4861]: I0309 09:07:35.369879 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Mar 09 09:07:35 crc kubenswrapper[4861]: I0309 09:07:35.381441 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Mar 09 09:07:35 crc kubenswrapper[4861]: I0309 09:07:35.382208 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/ad46ee5f-9c08-438c-8284-aa488f48e522-profile-collector-cert\") pod \"olm-operator-6b444d44fb-s96f9\" (UID: \"ad46ee5f-9c08-438c-8284-aa488f48e522\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-s96f9"
Mar 09 09:07:35 crc kubenswrapper[4861]: I0309 09:07:35.402311 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Mar 09 09:07:35 crc kubenswrapper[4861]: I0309 09:07:35.413678 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/ad46ee5f-9c08-438c-8284-aa488f48e522-srv-cert\") pod \"olm-operator-6b444d44fb-s96f9\" (UID: \"ad46ee5f-9c08-438c-8284-aa488f48e522\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-s96f9"
Mar 09 09:07:35 crc kubenswrapper[4861]: I0309 09:07:35.422405 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Mar 09 09:07:35 crc kubenswrapper[4861]: I0309 09:07:35.442038 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Mar 09 09:07:35 crc kubenswrapper[4861]: I0309 09:07:35.461751 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Mar 09 09:07:35 crc kubenswrapper[4861]: I0309 09:07:35.481092 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Mar 09 09:07:35 crc kubenswrapper[4861]: I0309 09:07:35.501127 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Mar 09 09:07:35 crc kubenswrapper[4861]: I0309 09:07:35.522384 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Mar 09 09:07:35 crc kubenswrapper[4861]: I0309 09:07:35.541562 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Mar 09 09:07:35 crc kubenswrapper[4861]: I0309 09:07:35.552913 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8ff37c1b-1688-42ce-8b0c-952d297ae4a0-metrics-certs\") pod \"router-default-5444994796-c6sj6\" (UID: \"8ff37c1b-1688-42ce-8b0c-952d297ae4a0\") " pod="openshift-ingress/router-default-5444994796-c6sj6"
Mar 09 09:07:35 crc kubenswrapper[4861]: I0309 09:07:35.562142 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Mar 09 09:07:35 crc kubenswrapper[4861]: I0309 09:07:35.581001 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Mar 09 09:07:35 crc kubenswrapper[4861]: I0309 09:07:35.593137 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/8ff37c1b-1688-42ce-8b0c-952d297ae4a0-default-certificate\") pod \"router-default-5444994796-c6sj6\" (UID: \"8ff37c1b-1688-42ce-8b0c-952d297ae4a0\") " pod="openshift-ingress/router-default-5444994796-c6sj6"
Mar 09 09:07:35 crc kubenswrapper[4861]: I0309 09:07:35.601871 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Mar 09 09:07:35 crc kubenswrapper[4861]: I0309 09:07:35.611306 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/8ff37c1b-1688-42ce-8b0c-952d297ae4a0-stats-auth\") pod \"router-default-5444994796-c6sj6\" (UID: \"8ff37c1b-1688-42ce-8b0c-952d297ae4a0\") " pod="openshift-ingress/router-default-5444994796-c6sj6"
Mar 09 09:07:35 crc kubenswrapper[4861]: I0309 09:07:35.621883 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Mar 09 09:07:35 crc kubenswrapper[4861]: I0309 09:07:35.629752 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8ff37c1b-1688-42ce-8b0c-952d297ae4a0-service-ca-bundle\") pod \"router-default-5444994796-c6sj6\" (UID: \"8ff37c1b-1688-42ce-8b0c-952d297ae4a0\") " pod="openshift-ingress/router-default-5444994796-c6sj6"
Mar 09 09:07:35 crc kubenswrapper[4861]: I0309 09:07:35.642524 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Mar 09 09:07:35 crc kubenswrapper[4861]: I0309 09:07:35.661960 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Mar 09 09:07:35 crc kubenswrapper[4861]: I0309 09:07:35.681384 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Mar 09 09:07:35 crc kubenswrapper[4861]: I0309 09:07:35.692503 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5239e6e7-4afa-4b37-9582-e159b201453a-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-c57s5\" (UID: \"5239e6e7-4afa-4b37-9582-e159b201453a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-c57s5"
Mar 09 09:07:35 crc kubenswrapper[4861]: I0309 09:07:35.701667 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Mar 09 09:07:35 crc kubenswrapper[4861]: I0309 09:07:35.721576 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Mar 09 09:07:35 crc kubenswrapper[4861]: I0309 09:07:35.729813 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5239e6e7-4afa-4b37-9582-e159b201453a-config\") pod \"kube-apiserver-operator-766d6c64bb-c57s5\" (UID: \"5239e6e7-4afa-4b37-9582-e159b201453a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-c57s5"
Mar 09 09:07:35 crc kubenswrapper[4861]: I0309 09:07:35.741983 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Mar 09 09:07:35 crc kubenswrapper[4861]: I0309 09:07:35.761949 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Mar 09 09:07:35 crc kubenswrapper[4861]: I0309 09:07:35.781904 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Mar 09 09:07:35 crc kubenswrapper[4861]: I0309 09:07:35.801569 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Mar 09 09:07:35 crc kubenswrapper[4861]: I0309 09:07:35.822225 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Mar 09 09:07:35 crc kubenswrapper[4861]: I0309 09:07:35.842166 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Mar 09 09:07:35 crc kubenswrapper[4861]: I0309 09:07:35.862864 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Mar 09 09:07:35 crc kubenswrapper[4861]: I0309 09:07:35.880294 4861 request.go:700] Waited for 1.008758057s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager-operator/configmaps?fieldSelector=metadata.name%3Dkube-controller-manager-operator-config&limit=500&resourceVersion=0
Mar 09 09:07:35 crc kubenswrapper[4861]: I0309 09:07:35.881829 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Mar 09 09:07:35 crc kubenswrapper[4861]: I0309 09:07:35.901567 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Mar 09 09:07:35 crc kubenswrapper[4861]: I0309 09:07:35.934801 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Mar 09 09:07:35 crc kubenswrapper[4861]: I0309 09:07:35.942003 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Mar 09 09:07:35 crc kubenswrapper[4861]: I0309 09:07:35.962720 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Mar 09 09:07:35 crc kubenswrapper[4861]: I0309 09:07:35.983705 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Mar 09 09:07:36 crc kubenswrapper[4861]: I0309 09:07:36.001542 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Mar 09 09:07:36 crc kubenswrapper[4861]: I0309 09:07:36.022139 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Mar 09 09:07:36 crc kubenswrapper[4861]: I0309 09:07:36.042167 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Mar 09 09:07:36 crc kubenswrapper[4861]: I0309 09:07:36.068515 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Mar 09 09:07:36 crc kubenswrapper[4861]: I0309 09:07:36.084161 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Mar 09 09:07:36 crc kubenswrapper[4861]: I0309 09:07:36.122243 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Mar 09 09:07:36 crc kubenswrapper[4861]: I0309 09:07:36.142106 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Mar 09 09:07:36 crc kubenswrapper[4861]: I0309 09:07:36.164296 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lz22f" event={"ID":"4e3ac241-df4c-4438-83a5-9cc923b0f82f","Type":"ContainerStarted","Data":"2035347b8d2feebd510c16788c42ef79100a2565f1f1b20caace6f5de7ff612e"}
Mar 09 09:07:36 crc kubenswrapper[4861]: I0309 09:07:36.164354 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lz22f" event={"ID":"4e3ac241-df4c-4438-83a5-9cc923b0f82f","Type":"ContainerStarted","Data":"d5739f2ba4e221ef65e259f0af890500e5d2f6379404e17edffa5a11e29ec44f"}
Mar 09 09:07:36 crc kubenswrapper[4861]: I0309 09:07:36.164351 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Mar 09 09:07:36 crc kubenswrapper[4861]: I0309 09:07:36.181570 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Mar 09 09:07:36 crc kubenswrapper[4861]: I0309 09:07:36.201680 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Mar 09 09:07:36 crc kubenswrapper[4861]: I0309 09:07:36.221564 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Mar 09 09:07:36 crc kubenswrapper[4861]: I0309 09:07:36.242737 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Mar 09 09:07:36 crc kubenswrapper[4861]: I0309 09:07:36.262153 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Mar 09 09:07:36 crc kubenswrapper[4861]: I0309 09:07:36.282251 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Mar 09 09:07:36 crc kubenswrapper[4861]: I0309 09:07:36.302450 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Mar 09 09:07:36 crc kubenswrapper[4861]: I0309 09:07:36.322651 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Mar 09 09:07:36 crc kubenswrapper[4861]: I0309 09:07:36.342650 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Mar 09 09:07:36 crc kubenswrapper[4861]: I0309 09:07:36.362281 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Mar 09 09:07:36 crc kubenswrapper[4861]: I0309 09:07:36.389462 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Mar 09 09:07:36 crc kubenswrapper[4861]: I0309 09:07:36.403446 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Mar 09 09:07:36 crc kubenswrapper[4861]: I0309 09:07:36.422477 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Mar 09 09:07:36 crc kubenswrapper[4861]: I0309 09:07:36.441803 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Mar 09 09:07:36 crc kubenswrapper[4861]: I0309 09:07:36.473848 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Mar 09 09:07:36 crc kubenswrapper[4861]: I0309 09:07:36.482565 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Mar 09 09:07:36 crc kubenswrapper[4861]: I0309 09:07:36.506926 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Mar 09 09:07:36 crc kubenswrapper[4861]: I0309 09:07:36.522337 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Mar 09 09:07:36 crc kubenswrapper[4861]: I0309 09:07:36.541670 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Mar 09 09:07:36 crc kubenswrapper[4861]: I0309 09:07:36.562271 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Mar 09 09:07:36 crc kubenswrapper[4861]: I0309 09:07:36.602200 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9x2n\" (UniqueName: \"kubernetes.io/projected/eaee5667-e42c-4ef1-8c6b-279ee6fc171a-kube-api-access-p9x2n\") pod \"oauth-openshift-558db77b4-97dkg\" (UID: \"eaee5667-e42c-4ef1-8c6b-279ee6fc171a\") " pod="openshift-authentication/oauth-openshift-558db77b4-97dkg"
Mar 09 09:07:36 crc kubenswrapper[4861]: I0309 09:07:36.618236 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lxkv\" (UniqueName: \"kubernetes.io/projected/e3056772-5de8-4fed-9796-440422743470-kube-api-access-6lxkv\") pod \"cluster-samples-operator-665b6dd947-24x2s\" (UID: \"e3056772-5de8-4fed-9796-440422743470\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-24x2s"
Mar 09 09:07:36 crc kubenswrapper[4861]: I0309 09:07:36.622460 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Mar 09 09:07:36 crc kubenswrapper[4861]: I0309 09:07:36.642715 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Mar 09 09:07:36 crc kubenswrapper[4861]: I0309 09:07:36.663053 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Mar 09 09:07:36 crc kubenswrapper[4861]: I0309 09:07:36.682965 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Mar 09 09:07:36 crc kubenswrapper[4861]: I0309 09:07:36.722125 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Mar 09 09:07:36 crc kubenswrapper[4861]: I0309 09:07:36.728902 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8pkvv\" (UniqueName: \"kubernetes.io/projected/85a3bbcb-e663-4a97-980c-606c979409d7-kube-api-access-8pkvv\") pod \"console-f9d7485db-qh9sg\" (UID: \"85a3bbcb-e663-4a97-980c-606c979409d7\") " pod="openshift-console/console-f9d7485db-qh9sg"
Mar 09 09:07:36 crc kubenswrapper[4861]: I0309 09:07:36.757669 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kb6dq\" (UniqueName: \"kubernetes.io/projected/6273411b-70f9-4fdf-bf82-d156b10a5824-kube-api-access-kb6dq\") pod \"apiserver-7bbb656c7d-hn7r4\" (UID: \"6273411b-70f9-4fdf-bf82-d156b10a5824\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hn7r4"
Mar 09 09:07:36 crc kubenswrapper[4861]: I0309 09:07:36.780643 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7kmsc\" (UniqueName: \"kubernetes.io/projected/271a4e3c-1aa2-4bf5-bdfe-0495910f75d6-kube-api-access-7kmsc\") pod \"cluster-image-registry-operator-dc59b4c8b-ldhj8\" (UID: \"271a4e3c-1aa2-4bf5-bdfe-0495910f75d6\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ldhj8"
Mar 09 09:07:36 crc kubenswrapper[4861]: I0309 09:07:36.797363 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mj42q\" (UniqueName: \"kubernetes.io/projected/ba6c5b9f-7812-4b6d-998b-6f368a6edf83-kube-api-access-mj42q\") pod \"route-controller-manager-6576b87f9c-h84v9\" (UID: \"ba6c5b9f-7812-4b6d-998b-6f368a6edf83\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-h84v9"
Mar 09 09:07:36 crc kubenswrapper[4861]: I0309 09:07:36.815474 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drh7b\" (UniqueName: \"kubernetes.io/projected/54ee5cc6-676d-412c-b2e8-66308fcfc3d6-kube-api-access-drh7b\") pod \"dns-operator-744455d44c-75sbr\" (UID: \"54ee5cc6-676d-412c-b2e8-66308fcfc3d6\") " pod="openshift-dns-operator/dns-operator-744455d44c-75sbr"
Mar 09 09:07:36 crc kubenswrapper[4861]: I0309 09:07:36.819253 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-24x2s"
Mar 09 09:07:36 crc kubenswrapper[4861]: I0309 09:07:36.826102 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 09 09:07:36 crc kubenswrapper[4861]: I0309 09:07:36.842320 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hn7r4"
Mar 09 09:07:36 crc kubenswrapper[4861]: I0309 09:07:36.846325 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 09 09:07:36 crc kubenswrapper[4861]: I0309 09:07:36.864712 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-97dkg"
Mar 09 09:07:36 crc kubenswrapper[4861]: I0309 09:07:36.872646 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-75sbr"
Mar 09 09:07:36 crc kubenswrapper[4861]: I0309 09:07:36.879584 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-qh9sg"
Mar 09 09:07:36 crc kubenswrapper[4861]: I0309 09:07:36.880427 4861 request.go:700] Waited for 1.920840136s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/serviceaccounts/cluster-image-registry-operator/token
Mar 09 09:07:36 crc kubenswrapper[4861]: I0309 09:07:36.881297 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhdr9\" (UniqueName: \"kubernetes.io/projected/20b8695b-8129-4f02-824d-5ca2a451d899-kube-api-access-dhdr9\") pod \"controller-manager-879f6c89f-gkrbr\" (UID: \"20b8695b-8129-4f02-824d-5ca2a451d899\") " pod="openshift-controller-manager/controller-manager-879f6c89f-gkrbr"
Mar 09 09:07:36 crc kubenswrapper[4861]: I0309 09:07:36.912847 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/271a4e3c-1aa2-4bf5-bdfe-0495910f75d6-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-ldhj8\" (UID: \"271a4e3c-1aa2-4bf5-bdfe-0495910f75d6\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ldhj8"
Mar 09 09:07:36 crc kubenswrapper[4861]: I0309 09:07:36.922503 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hfzss\" (UniqueName: \"kubernetes.io/projected/8415c9be-22de-49bb-8b53-6f06a923ef33-kube-api-access-hfzss\") pod \"openshift-apiserver-operator-796bbdcf4f-lj7m8\" (UID: \"8415c9be-22de-49bb-8b53-6f06a923ef33\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lj7m8"
Mar 09 09:07:36 crc kubenswrapper[4861]: I0309 09:07:36.936478 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fggwc\" (UniqueName: \"kubernetes.io/projected/b820aa76-e1f5-440b-b942-2a4468dc4d51-kube-api-access-fggwc\") pod \"authentication-operator-69f744f599-5m9c6\" (UID: \"b820aa76-e1f5-440b-b942-2a4468dc4d51\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-5m9c6"
Mar 09 09:07:36 crc kubenswrapper[4861]: I0309 09:07:36.968107 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2m5p\" (UniqueName: \"kubernetes.io/projected/2a29855a-bbc0-458f-a9a0-0ddfd8763f2d-kube-api-access-z2m5p\") pod \"openshift-config-operator-7777fb866f-dwjwj\" (UID: \"2a29855a-bbc0-458f-a9a0-0ddfd8763f2d\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-dwjwj"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:36.995993 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwsc2\" (UniqueName: \"kubernetes.io/projected/da4004a6-c6fd-41d6-a651-b4aaec2d6454-kube-api-access-dwsc2\") pod \"machine-api-operator-5694c8668f-gqnxr\" (UID: \"da4004a6-c6fd-41d6-a651-b4aaec2d6454\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-gqnxr"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.001620 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.005487 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2xgl\" (UniqueName: \"kubernetes.io/projected/f300ad16-f055-45cf-ac02-960f49b5d426-kube-api-access-b2xgl\") pod \"console-operator-58897d9998-jtnvg\" (UID: \"f300ad16-f055-45cf-ac02-960f49b5d426\") " pod="openshift-console-operator/console-operator-58897d9998-jtnvg"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.016473 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-gkrbr"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.021929 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.031575 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-h84v9"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.042149 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lj7m8"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.042295 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.084200 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48s8c\" (UniqueName: \"kubernetes.io/projected/2afc7adc-5b22-4203-9265-2ea4293f132f-kube-api-access-48s8c\") pod \"apiserver-76f77b778f-l2jkp\" (UID: \"2afc7adc-5b22-4203-9265-2ea4293f132f\") " pod="openshift-apiserver/apiserver-76f77b778f-l2jkp"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.096438 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8w4c\" (UniqueName: \"kubernetes.io/projected/566d8448-e794-44ee-9d17-e92493adcd87-kube-api-access-v8w4c\") pod \"downloads-7954f5f757-h6j45\" (UID: \"566d8448-e794-44ee-9d17-e92493adcd87\") " pod="openshift-console/downloads-7954f5f757-h6j45"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.100170 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-5m9c6"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.113883 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-gqnxr"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.116354 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmdsk\" (UniqueName: \"kubernetes.io/projected/8e2a968f-c9a7-415e-895a-3d1f8a8a3e0b-kube-api-access-wmdsk\") pod \"openshift-controller-manager-operator-756b6f6bc6-jzt4r\" (UID: \"8e2a968f-c9a7-415e-895a-3d1f8a8a3e0b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jzt4r"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.122852 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.127116 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-jtnvg"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.134683 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-l2jkp"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.143421 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.149820 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ldhj8"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.157765 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-dwjwj"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.174959 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.184287 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.209627 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-sysctl-allowlist"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.224163 4861 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.244473 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.262054 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-h6j45"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.263692 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.271192 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-hn7r4"]
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.296938 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5239e6e7-4afa-4b37-9582-e159b201453a-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-c57s5\" (UID: \"5239e6e7-4afa-4b37-9582-e159b201453a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-c57s5"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.318219 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7q4f\" (UniqueName: \"kubernetes.io/projected/ad46ee5f-9c08-438c-8284-aa488f48e522-kube-api-access-k7q4f\") pod \"olm-operator-6b444d44fb-s96f9\" (UID: \"ad46ee5f-9c08-438c-8284-aa488f48e522\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-s96f9"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.325976 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-24x2s"]
Mar 09 09:07:37 crc kubenswrapper[4861]: W0309 09:07:37.326104 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6273411b_70f9_4fdf_bf82_d156b10a5824.slice/crio-c412164bd0809a2f41df29c6344ff28fd6762a46a60a5a0228dc741c4818c6a8 WatchSource:0}: Error finding container c412164bd0809a2f41df29c6344ff28fd6762a46a60a5a0228dc741c4818c6a8: Status 404 returned error can't find the container with id c412164bd0809a2f41df29c6344ff28fd6762a46a60a5a0228dc741c4818c6a8
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.341473 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4rgw\" (UniqueName: \"kubernetes.io/projected/8ff37c1b-1688-42ce-8b0c-952d297ae4a0-kube-api-access-l4rgw\") pod \"router-default-5444994796-c6sj6\" (UID: \"8ff37c1b-1688-42ce-8b0c-952d297ae4a0\") " pod="openshift-ingress/router-default-5444994796-c6sj6"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.363756 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkvw8\" (UniqueName: \"kubernetes.io/projected/71340af0-2591-4b41-ae96-e7f5fada7318-kube-api-access-nkvw8\") pod \"machine-config-controller-84d6567774-q8fz7\" (UID: \"71340af0-2591-4b41-ae96-e7f5fada7318\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-q8fz7"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.404494 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jzt4r"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.408630 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/de2fac75-67e1-47c9-9507-b8b5e5857c32-bound-sa-token\") pod \"image-registry-697d97f7c8-cjkj6\" (UID: \"de2fac75-67e1-47c9-9507-b8b5e5857c32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cjkj6"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.408764 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3bffc0ff-7dbe-425b-b81d-bad8f9a42e12-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-zc95n\" (UID: \"3bffc0ff-7dbe-425b-b81d-bad8f9a42e12\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zc95n"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.408849 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6t4js\" (UniqueName: \"kubernetes.io/projected/dea29153-0067-4ab9-b93c-54ad9fff1590-kube-api-access-6t4js\") pod \"etcd-operator-b45778765-fsckq\" (UID: \"dea29153-0067-4ab9-b93c-54ad9fff1590\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fsckq"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.408948 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/d8afafe8-bf56-46a7-bab9-c5a1c221a740-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-trcr9\" (UID: \"d8afafe8-bf56-46a7-bab9-c5a1c221a740\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-trcr9"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.408999 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/41fbb42d-a8a5-4d95-ba2c-ceeac7e96bff-auth-proxy-config\") pod \"machine-config-operator-74547568cd-tm5tk\" (UID: \"41fbb42d-a8a5-4d95-ba2c-ceeac7e96bff\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tm5tk"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.409088 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cjkj6\" (UID: \"de2fac75-67e1-47c9-9507-b8b5e5857c32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cjkj6"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.409157 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/de2fac75-67e1-47c9-9507-b8b5e5857c32-registry-certificates\") pod \"image-registry-697d97f7c8-cjkj6\" (UID: \"de2fac75-67e1-47c9-9507-b8b5e5857c32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cjkj6"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.409176 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbkxb\" (UniqueName: \"kubernetes.io/projected/d8afafe8-bf56-46a7-bab9-c5a1c221a740-kube-api-access-kbkxb\") pod \"control-plane-machine-set-operator-78cbb6b69f-trcr9\" (UID: \"d8afafe8-bf56-46a7-bab9-c5a1c221a740\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-trcr9"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.409210 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume
\"kube-api-access-4rqsq\" (UniqueName: \"kubernetes.io/projected/3bffc0ff-7dbe-425b-b81d-bad8f9a42e12-kube-api-access-4rqsq\") pod \"package-server-manager-789f6589d5-zc95n\" (UID: \"3bffc0ff-7dbe-425b-b81d-bad8f9a42e12\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zc95n" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.409231 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/de2fac75-67e1-47c9-9507-b8b5e5857c32-ca-trust-extracted\") pod \"image-registry-697d97f7c8-cjkj6\" (UID: \"de2fac75-67e1-47c9-9507-b8b5e5857c32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cjkj6" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.409259 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dea29153-0067-4ab9-b93c-54ad9fff1590-serving-cert\") pod \"etcd-operator-b45778765-fsckq\" (UID: \"dea29153-0067-4ab9-b93c-54ad9fff1590\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fsckq" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.409296 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed7e49ef-0cce-4e76-b000-e3f6b9542246-config\") pod \"kube-controller-manager-operator-78b949d7b-2hgfm\" (UID: \"ed7e49ef-0cce-4e76-b000-e3f6b9542246\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2hgfm" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.409346 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/dea29153-0067-4ab9-b93c-54ad9fff1590-etcd-client\") pod \"etcd-operator-b45778765-fsckq\" (UID: \"dea29153-0067-4ab9-b93c-54ad9fff1590\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-fsckq" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.409848 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dea29153-0067-4ab9-b93c-54ad9fff1590-config\") pod \"etcd-operator-b45778765-fsckq\" (UID: \"dea29153-0067-4ab9-b93c-54ad9fff1590\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fsckq" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.409877 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/41fbb42d-a8a5-4d95-ba2c-ceeac7e96bff-images\") pod \"machine-config-operator-74547568cd-tm5tk\" (UID: \"41fbb42d-a8a5-4d95-ba2c-ceeac7e96bff\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tm5tk" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.409915 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jw22\" (UniqueName: \"kubernetes.io/projected/de2fac75-67e1-47c9-9507-b8b5e5857c32-kube-api-access-6jw22\") pod \"image-registry-697d97f7c8-cjkj6\" (UID: \"de2fac75-67e1-47c9-9507-b8b5e5857c32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cjkj6" Mar 09 09:07:37 crc kubenswrapper[4861]: E0309 09:07:37.410229 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:07:37.910213019 +0000 UTC m=+100.995252420 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cjkj6" (UID: "de2fac75-67e1-47c9-9507-b8b5e5857c32") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.410990 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/41fbb42d-a8a5-4d95-ba2c-ceeac7e96bff-proxy-tls\") pod \"machine-config-operator-74547568cd-tm5tk\" (UID: \"41fbb42d-a8a5-4d95-ba2c-ceeac7e96bff\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tm5tk" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.411306 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vznn2\" (UniqueName: \"kubernetes.io/projected/5d9e8554-0dac-404d-8be9-6bb656818cd2-kube-api-access-vznn2\") pod \"migrator-59844c95c7-96ksj\" (UID: \"5d9e8554-0dac-404d-8be9-6bb656818cd2\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-96ksj" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.411357 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ed7e49ef-0cce-4e76-b000-e3f6b9542246-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-2hgfm\" (UID: \"ed7e49ef-0cce-4e76-b000-e3f6b9542246\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2hgfm" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.411409 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/de2fac75-67e1-47c9-9507-b8b5e5857c32-installation-pull-secrets\") pod \"image-registry-697d97f7c8-cjkj6\" (UID: \"de2fac75-67e1-47c9-9507-b8b5e5857c32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cjkj6" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.411475 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/de2fac75-67e1-47c9-9507-b8b5e5857c32-trusted-ca\") pod \"image-registry-697d97f7c8-cjkj6\" (UID: \"de2fac75-67e1-47c9-9507-b8b5e5857c32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cjkj6" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.411493 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ed7e49ef-0cce-4e76-b000-e3f6b9542246-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-2hgfm\" (UID: \"ed7e49ef-0cce-4e76-b000-e3f6b9542246\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2hgfm" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.411513 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/dea29153-0067-4ab9-b93c-54ad9fff1590-etcd-service-ca\") pod \"etcd-operator-b45778765-fsckq\" (UID: \"dea29153-0067-4ab9-b93c-54ad9fff1590\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fsckq" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.411588 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7m5cn\" (UniqueName: \"kubernetes.io/projected/41fbb42d-a8a5-4d95-ba2c-ceeac7e96bff-kube-api-access-7m5cn\") pod \"machine-config-operator-74547568cd-tm5tk\" (UID: 
\"41fbb42d-a8a5-4d95-ba2c-ceeac7e96bff\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tm5tk" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.412013 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/dea29153-0067-4ab9-b93c-54ad9fff1590-etcd-ca\") pod \"etcd-operator-b45778765-fsckq\" (UID: \"dea29153-0067-4ab9-b93c-54ad9fff1590\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fsckq" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.412042 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/de2fac75-67e1-47c9-9507-b8b5e5857c32-registry-tls\") pod \"image-registry-697d97f7c8-cjkj6\" (UID: \"de2fac75-67e1-47c9-9507-b8b5e5857c32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cjkj6" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.497172 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-q8fz7" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.506844 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-75sbr"] Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.508041 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-s96f9" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.508777 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-97dkg"] Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.508961 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-qh9sg"] Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.513258 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.513583 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/7672aac2-6c30-4cab-82aa-285ef39ea67d-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-68457\" (UID: \"7672aac2-6c30-4cab-82aa-285ef39ea67d\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-68457" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.513652 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpbm2\" (UniqueName: \"kubernetes.io/projected/3d2a0407-66d1-4d05-9623-fe968aa3b516-kube-api-access-dpbm2\") pod \"marketplace-operator-79b997595-k2z5n\" (UID: \"3d2a0407-66d1-4d05-9623-fe968aa3b516\") " pod="openshift-marketplace/marketplace-operator-79b997595-k2z5n" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.513672 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggs6q\" (UniqueName: 
\"kubernetes.io/projected/7672aac2-6c30-4cab-82aa-285ef39ea67d-kube-api-access-ggs6q\") pod \"multus-admission-controller-857f4d67dd-68457\" (UID: \"7672aac2-6c30-4cab-82aa-285ef39ea67d\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-68457" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.513693 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bf346b6e-1bfb-4753-b113-aa981202e7e7-metrics-tls\") pod \"dns-default-zrbdj\" (UID: \"bf346b6e-1bfb-4753-b113-aa981202e7e7\") " pod="openshift-dns/dns-default-zrbdj" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.513749 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djxts\" (UniqueName: \"kubernetes.io/projected/4eed3eac-42f8-4683-9c1f-3733965e6af7-kube-api-access-djxts\") pod \"cni-sysctl-allowlist-ds-wdsnl\" (UID: \"4eed3eac-42f8-4683-9c1f-3733965e6af7\") " pod="openshift-multus/cni-sysctl-allowlist-ds-wdsnl" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.513769 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/de2fac75-67e1-47c9-9507-b8b5e5857c32-registry-certificates\") pod \"image-registry-697d97f7c8-cjkj6\" (UID: \"de2fac75-67e1-47c9-9507-b8b5e5857c32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cjkj6" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.513803 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nfpw\" (UniqueName: \"kubernetes.io/projected/c553be85-38c4-45a7-9406-257b039d7734-kube-api-access-4nfpw\") pod \"service-ca-operator-777779d784-dd22q\" (UID: \"c553be85-38c4-45a7-9406-257b039d7734\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-dd22q" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 
09:07:37.513825 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/7044838f-643f-4f07-9f45-0468786d3798-certs\") pod \"machine-config-server-jtqjq\" (UID: \"7044838f-643f-4f07-9f45-0468786d3798\") " pod="openshift-machine-config-operator/machine-config-server-jtqjq" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.513845 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kbkxb\" (UniqueName: \"kubernetes.io/projected/d8afafe8-bf56-46a7-bab9-c5a1c221a740-kube-api-access-kbkxb\") pod \"control-plane-machine-set-operator-78cbb6b69f-trcr9\" (UID: \"d8afafe8-bf56-46a7-bab9-c5a1c221a740\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-trcr9" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.513872 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rqsq\" (UniqueName: \"kubernetes.io/projected/3bffc0ff-7dbe-425b-b81d-bad8f9a42e12-kube-api-access-4rqsq\") pod \"package-server-manager-789f6589d5-zc95n\" (UID: \"3bffc0ff-7dbe-425b-b81d-bad8f9a42e12\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zc95n" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.513940 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/de2fac75-67e1-47c9-9507-b8b5e5857c32-ca-trust-extracted\") pod \"image-registry-697d97f7c8-cjkj6\" (UID: \"de2fac75-67e1-47c9-9507-b8b5e5857c32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cjkj6" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.513964 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/db5f0bfe-4d3e-4b72-9dba-437fc3e94b93-kube-api-access\") pod 
\"openshift-kube-scheduler-operator-5fdd9b5758-fhw9h\" (UID: \"db5f0bfe-4d3e-4b72-9dba-437fc3e94b93\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fhw9h" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.513981 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bf346b6e-1bfb-4753-b113-aa981202e7e7-config-volume\") pod \"dns-default-zrbdj\" (UID: \"bf346b6e-1bfb-4753-b113-aa981202e7e7\") " pod="openshift-dns/dns-default-zrbdj" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.514034 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dea29153-0067-4ab9-b93c-54ad9fff1590-serving-cert\") pod \"etcd-operator-b45778765-fsckq\" (UID: \"dea29153-0067-4ab9-b93c-54ad9fff1590\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fsckq" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.514054 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed7e49ef-0cce-4e76-b000-e3f6b9542246-config\") pod \"kube-controller-manager-operator-78b949d7b-2hgfm\" (UID: \"ed7e49ef-0cce-4e76-b000-e3f6b9542246\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2hgfm" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.514074 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vxzx\" (UniqueName: \"kubernetes.io/projected/cf30c324-b218-45df-8462-1b76cc2825c2-kube-api-access-6vxzx\") pod \"csi-hostpathplugin-z6v85\" (UID: \"cf30c324-b218-45df-8462-1b76cc2825c2\") " pod="hostpath-provisioner/csi-hostpathplugin-z6v85" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.514095 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/17d9ce44-f425-4262-ab22-5edef1fad72e-trusted-ca\") pod \"ingress-operator-5b745b69d9-hj4tg\" (UID: \"17d9ce44-f425-4262-ab22-5edef1fad72e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hj4tg" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.514114 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/cf30c324-b218-45df-8462-1b76cc2825c2-plugins-dir\") pod \"csi-hostpathplugin-z6v85\" (UID: \"cf30c324-b218-45df-8462-1b76cc2825c2\") " pod="hostpath-provisioner/csi-hostpathplugin-z6v85" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.514162 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/dea29153-0067-4ab9-b93c-54ad9fff1590-etcd-client\") pod \"etcd-operator-b45778765-fsckq\" (UID: \"dea29153-0067-4ab9-b93c-54ad9fff1590\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fsckq" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.514241 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4eed3eac-42f8-4683-9c1f-3733965e6af7-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-wdsnl\" (UID: \"4eed3eac-42f8-4683-9c1f-3733965e6af7\") " pod="openshift-multus/cni-sysctl-allowlist-ds-wdsnl" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.514285 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dea29153-0067-4ab9-b93c-54ad9fff1590-config\") pod \"etcd-operator-b45778765-fsckq\" (UID: \"dea29153-0067-4ab9-b93c-54ad9fff1590\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fsckq" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.514303 4861 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/41fbb42d-a8a5-4d95-ba2c-ceeac7e96bff-images\") pod \"machine-config-operator-74547568cd-tm5tk\" (UID: \"41fbb42d-a8a5-4d95-ba2c-ceeac7e96bff\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tm5tk" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.514330 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jw22\" (UniqueName: \"kubernetes.io/projected/de2fac75-67e1-47c9-9507-b8b5e5857c32-kube-api-access-6jw22\") pod \"image-registry-697d97f7c8-cjkj6\" (UID: \"de2fac75-67e1-47c9-9507-b8b5e5857c32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cjkj6" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.514350 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/608b7b11-f38a-4c4b-9e61-dab4f84c34c1-config-volume\") pod \"collect-profiles-29550780-wpnp9\" (UID: \"608b7b11-f38a-4c4b-9e61-dab4f84c34c1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550780-wpnp9" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.514398 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c7bed8f1-d158-4e04-aa35-99ff2c7cd59e-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-wr7xx\" (UID: \"c7bed8f1-d158-4e04-aa35-99ff2c7cd59e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-wr7xx" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.514418 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/3d2a0407-66d1-4d05-9623-fe968aa3b516-marketplace-operator-metrics\") pod 
\"marketplace-operator-79b997595-k2z5n\" (UID: \"3d2a0407-66d1-4d05-9623-fe968aa3b516\") " pod="openshift-marketplace/marketplace-operator-79b997595-k2z5n" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.514517 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3b714fda-1eb1-4637-9808-a2d0d47b4a91-cert\") pod \"ingress-canary-5t4mn\" (UID: \"3b714fda-1eb1-4637-9808-a2d0d47b4a91\") " pod="openshift-ingress-canary/ingress-canary-5t4mn" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.514538 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5mtx\" (UniqueName: \"kubernetes.io/projected/c7bed8f1-d158-4e04-aa35-99ff2c7cd59e-kube-api-access-n5mtx\") pod \"kube-storage-version-migrator-operator-b67b599dd-wr7xx\" (UID: \"c7bed8f1-d158-4e04-aa35-99ff2c7cd59e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-wr7xx" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.514556 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/17d9ce44-f425-4262-ab22-5edef1fad72e-bound-sa-token\") pod \"ingress-operator-5b745b69d9-hj4tg\" (UID: \"17d9ce44-f425-4262-ab22-5edef1fad72e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hj4tg" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.514574 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/608b7b11-f38a-4c4b-9e61-dab4f84c34c1-secret-volume\") pod \"collect-profiles-29550780-wpnp9\" (UID: \"608b7b11-f38a-4c4b-9e61-dab4f84c34c1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550780-wpnp9" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.514602 4861 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/036eac53-a3f2-4efc-bd3e-4616e47d8901-signing-cabundle\") pod \"service-ca-9c57cc56f-kpns7\" (UID: \"036eac53-a3f2-4efc-bd3e-4616e47d8901\") " pod="openshift-service-ca/service-ca-9c57cc56f-kpns7" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.515232 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/41fbb42d-a8a5-4d95-ba2c-ceeac7e96bff-proxy-tls\") pod \"machine-config-operator-74547568cd-tm5tk\" (UID: \"41fbb42d-a8a5-4d95-ba2c-ceeac7e96bff\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tm5tk" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.515287 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvsqs\" (UniqueName: \"kubernetes.io/projected/036eac53-a3f2-4efc-bd3e-4616e47d8901-kube-api-access-jvsqs\") pod \"service-ca-9c57cc56f-kpns7\" (UID: \"036eac53-a3f2-4efc-bd3e-4616e47d8901\") " pod="openshift-service-ca/service-ca-9c57cc56f-kpns7" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.515315 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xst5\" (UniqueName: \"kubernetes.io/projected/7044838f-643f-4f07-9f45-0468786d3798-kube-api-access-6xst5\") pod \"machine-config-server-jtqjq\" (UID: \"7044838f-643f-4f07-9f45-0468786d3798\") " pod="openshift-machine-config-operator/machine-config-server-jtqjq" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.515336 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9mdr\" (UniqueName: \"kubernetes.io/projected/17d9ce44-f425-4262-ab22-5edef1fad72e-kube-api-access-k9mdr\") pod \"ingress-operator-5b745b69d9-hj4tg\" (UID: 
\"17d9ce44-f425-4262-ab22-5edef1fad72e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hj4tg" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.516204 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vznn2\" (UniqueName: \"kubernetes.io/projected/5d9e8554-0dac-404d-8be9-6bb656818cd2-kube-api-access-vznn2\") pod \"migrator-59844c95c7-96ksj\" (UID: \"5d9e8554-0dac-404d-8be9-6bb656818cd2\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-96ksj" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.516423 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ed7e49ef-0cce-4e76-b000-e3f6b9542246-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-2hgfm\" (UID: \"ed7e49ef-0cce-4e76-b000-e3f6b9542246\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2hgfm" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.516654 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c553be85-38c4-45a7-9406-257b039d7734-serving-cert\") pod \"service-ca-operator-777779d784-dd22q\" (UID: \"c553be85-38c4-45a7-9406-257b039d7734\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-dd22q" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.516713 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/cf30c324-b218-45df-8462-1b76cc2825c2-socket-dir\") pod \"csi-hostpathplugin-z6v85\" (UID: \"cf30c324-b218-45df-8462-1b76cc2825c2\") " pod="hostpath-provisioner/csi-hostpathplugin-z6v85" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.517228 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/de2fac75-67e1-47c9-9507-b8b5e5857c32-installation-pull-secrets\") pod \"image-registry-697d97f7c8-cjkj6\" (UID: \"de2fac75-67e1-47c9-9507-b8b5e5857c32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cjkj6" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.517308 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/cf30c324-b218-45df-8462-1b76cc2825c2-mountpoint-dir\") pod \"csi-hostpathplugin-z6v85\" (UID: \"cf30c324-b218-45df-8462-1b76cc2825c2\") " pod="hostpath-provisioner/csi-hostpathplugin-z6v85" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.517506 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/17d9ce44-f425-4262-ab22-5edef1fad72e-metrics-tls\") pod \"ingress-operator-5b745b69d9-hj4tg\" (UID: \"17d9ce44-f425-4262-ab22-5edef1fad72e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hj4tg" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.521714 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-c6sj6"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.521947 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/de2fac75-67e1-47c9-9507-b8b5e5857c32-trusted-ca\") pod \"image-registry-697d97f7c8-cjkj6\" (UID: \"de2fac75-67e1-47c9-9507-b8b5e5857c32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cjkj6"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.522041 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ed7e49ef-0cce-4e76-b000-e3f6b9542246-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-2hgfm\" (UID: \"ed7e49ef-0cce-4e76-b000-e3f6b9542246\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2hgfm"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.522135 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/cf30c324-b218-45df-8462-1b76cc2825c2-csi-data-dir\") pod \"csi-hostpathplugin-z6v85\" (UID: \"cf30c324-b218-45df-8462-1b76cc2825c2\") " pod="hostpath-provisioner/csi-hostpathplugin-z6v85"
Mar 09 09:07:37 crc kubenswrapper[4861]: E0309 09:07:37.522194 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:07:38.02213967 +0000 UTC m=+101.107179071 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.522233 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4eed3eac-42f8-4683-9c1f-3733965e6af7-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-wdsnl\" (UID: \"4eed3eac-42f8-4683-9c1f-3733965e6af7\") " pod="openshift-multus/cni-sysctl-allowlist-ds-wdsnl"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.522269 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/eed5737e-ff85-457b-a0ed-6b0a750f68b1-srv-cert\") pod \"catalog-operator-68c6474976-qmmgv\" (UID: \"eed5737e-ff85-457b-a0ed-6b0a750f68b1\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qmmgv"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.522328 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/dea29153-0067-4ab9-b93c-54ad9fff1590-etcd-service-ca\") pod \"etcd-operator-b45778765-fsckq\" (UID: \"dea29153-0067-4ab9-b93c-54ad9fff1590\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fsckq"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.522377 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7m5cn\" (UniqueName: \"kubernetes.io/projected/41fbb42d-a8a5-4d95-ba2c-ceeac7e96bff-kube-api-access-7m5cn\") pod \"machine-config-operator-74547568cd-tm5tk\" (UID: \"41fbb42d-a8a5-4d95-ba2c-ceeac7e96bff\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tm5tk"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.522418 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/dea29153-0067-4ab9-b93c-54ad9fff1590-etcd-ca\") pod \"etcd-operator-b45778765-fsckq\" (UID: \"dea29153-0067-4ab9-b93c-54ad9fff1590\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fsckq"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.522437 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9r5p5\" (UniqueName: \"kubernetes.io/projected/3b714fda-1eb1-4637-9808-a2d0d47b4a91-kube-api-access-9r5p5\") pod \"ingress-canary-5t4mn\" (UID: \"3b714fda-1eb1-4637-9808-a2d0d47b4a91\") " pod="openshift-ingress-canary/ingress-canary-5t4mn"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.522479 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/de2fac75-67e1-47c9-9507-b8b5e5857c32-registry-tls\") pod \"image-registry-697d97f7c8-cjkj6\" (UID: \"de2fac75-67e1-47c9-9507-b8b5e5857c32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cjkj6"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.522497 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/abda7bc3-76f8-43a5-b0e6-0aa9271a5758-webhook-cert\") pod \"packageserver-d55dfcdfc-7mk6w\" (UID: \"abda7bc3-76f8-43a5-b0e6-0aa9271a5758\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7mk6w"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.522558 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/abda7bc3-76f8-43a5-b0e6-0aa9271a5758-tmpfs\") pod \"packageserver-d55dfcdfc-7mk6w\" (UID: \"abda7bc3-76f8-43a5-b0e6-0aa9271a5758\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7mk6w"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.522573 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/abda7bc3-76f8-43a5-b0e6-0aa9271a5758-apiservice-cert\") pod \"packageserver-d55dfcdfc-7mk6w\" (UID: \"abda7bc3-76f8-43a5-b0e6-0aa9271a5758\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7mk6w"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.522595 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/de2fac75-67e1-47c9-9507-b8b5e5857c32-bound-sa-token\") pod \"image-registry-697d97f7c8-cjkj6\" (UID: \"de2fac75-67e1-47c9-9507-b8b5e5857c32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cjkj6"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.522612 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/7044838f-643f-4f07-9f45-0468786d3798-node-bootstrap-token\") pod \"machine-config-server-jtqjq\" (UID: \"7044838f-643f-4f07-9f45-0468786d3798\") " pod="openshift-machine-config-operator/machine-config-server-jtqjq"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.522646 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3bffc0ff-7dbe-425b-b81d-bad8f9a42e12-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-zc95n\" (UID: \"3bffc0ff-7dbe-425b-b81d-bad8f9a42e12\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zc95n"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.522702 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6t4js\" (UniqueName: \"kubernetes.io/projected/dea29153-0067-4ab9-b93c-54ad9fff1590-kube-api-access-6t4js\") pod \"etcd-operator-b45778765-fsckq\" (UID: \"dea29153-0067-4ab9-b93c-54ad9fff1590\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fsckq"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.522722 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db5f0bfe-4d3e-4b72-9dba-437fc3e94b93-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-fhw9h\" (UID: \"db5f0bfe-4d3e-4b72-9dba-437fc3e94b93\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fhw9h"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.522765 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nplhw\" (UniqueName: \"kubernetes.io/projected/bf346b6e-1bfb-4753-b113-aa981202e7e7-kube-api-access-nplhw\") pod \"dns-default-zrbdj\" (UID: \"bf346b6e-1bfb-4753-b113-aa981202e7e7\") " pod="openshift-dns/dns-default-zrbdj"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.522838 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/d8afafe8-bf56-46a7-bab9-c5a1c221a740-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-trcr9\" (UID: \"d8afafe8-bf56-46a7-bab9-c5a1c221a740\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-trcr9"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.522857 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2r7p\" (UniqueName: \"kubernetes.io/projected/abda7bc3-76f8-43a5-b0e6-0aa9271a5758-kube-api-access-q2r7p\") pod \"packageserver-d55dfcdfc-7mk6w\" (UID: \"abda7bc3-76f8-43a5-b0e6-0aa9271a5758\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7mk6w"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.522874 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/cf30c324-b218-45df-8462-1b76cc2825c2-registration-dir\") pod \"csi-hostpathplugin-z6v85\" (UID: \"cf30c324-b218-45df-8462-1b76cc2825c2\") " pod="hostpath-provisioner/csi-hostpathplugin-z6v85"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.522925 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3d2a0407-66d1-4d05-9623-fe968aa3b516-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-k2z5n\" (UID: \"3d2a0407-66d1-4d05-9623-fe968aa3b516\") " pod="openshift-marketplace/marketplace-operator-79b997595-k2z5n"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.522955 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/41fbb42d-a8a5-4d95-ba2c-ceeac7e96bff-auth-proxy-config\") pod \"machine-config-operator-74547568cd-tm5tk\" (UID: \"41fbb42d-a8a5-4d95-ba2c-ceeac7e96bff\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tm5tk"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.522978 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7bed8f1-d158-4e04-aa35-99ff2c7cd59e-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-wr7xx\" (UID: \"c7bed8f1-d158-4e04-aa35-99ff2c7cd59e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-wr7xx"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.522996 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/036eac53-a3f2-4efc-bd3e-4616e47d8901-signing-key\") pod \"service-ca-9c57cc56f-kpns7\" (UID: \"036eac53-a3f2-4efc-bd3e-4616e47d8901\") " pod="openshift-service-ca/service-ca-9c57cc56f-kpns7"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.523012 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c553be85-38c4-45a7-9406-257b039d7734-config\") pod \"service-ca-operator-777779d784-dd22q\" (UID: \"c553be85-38c4-45a7-9406-257b039d7734\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-dd22q"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.523030 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/eed5737e-ff85-457b-a0ed-6b0a750f68b1-profile-collector-cert\") pod \"catalog-operator-68c6474976-qmmgv\" (UID: \"eed5737e-ff85-457b-a0ed-6b0a750f68b1\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qmmgv"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.523045 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7blm\" (UniqueName: \"kubernetes.io/projected/eed5737e-ff85-457b-a0ed-6b0a750f68b1-kube-api-access-m7blm\") pod \"catalog-operator-68c6474976-qmmgv\" (UID: \"eed5737e-ff85-457b-a0ed-6b0a750f68b1\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qmmgv"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.523061 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/4eed3eac-42f8-4683-9c1f-3733965e6af7-ready\") pod \"cni-sysctl-allowlist-ds-wdsnl\" (UID: \"4eed3eac-42f8-4683-9c1f-3733965e6af7\") " pod="openshift-multus/cni-sysctl-allowlist-ds-wdsnl"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.523083 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cjkj6\" (UID: \"de2fac75-67e1-47c9-9507-b8b5e5857c32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cjkj6"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.523099 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/db5f0bfe-4d3e-4b72-9dba-437fc3e94b93-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-fhw9h\" (UID: \"db5f0bfe-4d3e-4b72-9dba-437fc3e94b93\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fhw9h"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.523115 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dk4ng\" (UniqueName: \"kubernetes.io/projected/608b7b11-f38a-4c4b-9e61-dab4f84c34c1-kube-api-access-dk4ng\") pod \"collect-profiles-29550780-wpnp9\" (UID: \"608b7b11-f38a-4c4b-9e61-dab4f84c34c1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550780-wpnp9"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.523695 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/de2fac75-67e1-47c9-9507-b8b5e5857c32-trusted-ca\") pod \"image-registry-697d97f7c8-cjkj6\" (UID: \"de2fac75-67e1-47c9-9507-b8b5e5857c32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cjkj6"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.527593 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/de2fac75-67e1-47c9-9507-b8b5e5857c32-registry-certificates\") pod \"image-registry-697d97f7c8-cjkj6\" (UID: \"de2fac75-67e1-47c9-9507-b8b5e5857c32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cjkj6"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.528034 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/41fbb42d-a8a5-4d95-ba2c-ceeac7e96bff-images\") pod \"machine-config-operator-74547568cd-tm5tk\" (UID: \"41fbb42d-a8a5-4d95-ba2c-ceeac7e96bff\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tm5tk"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.532229 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/de2fac75-67e1-47c9-9507-b8b5e5857c32-installation-pull-secrets\") pod \"image-registry-697d97f7c8-cjkj6\" (UID: \"de2fac75-67e1-47c9-9507-b8b5e5857c32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cjkj6"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.532567 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/41fbb42d-a8a5-4d95-ba2c-ceeac7e96bff-auth-proxy-config\") pod \"machine-config-operator-74547568cd-tm5tk\" (UID: \"41fbb42d-a8a5-4d95-ba2c-ceeac7e96bff\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tm5tk"
Mar 09 09:07:37 crc kubenswrapper[4861]: E0309 09:07:37.533093 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:07:38.033077514 +0000 UTC m=+101.118116915 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cjkj6" (UID: "de2fac75-67e1-47c9-9507-b8b5e5857c32") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.533438 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3bffc0ff-7dbe-425b-b81d-bad8f9a42e12-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-zc95n\" (UID: \"3bffc0ff-7dbe-425b-b81d-bad8f9a42e12\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zc95n"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.533643 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-c57s5"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.533983 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/dea29153-0067-4ab9-b93c-54ad9fff1590-etcd-ca\") pod \"etcd-operator-b45778765-fsckq\" (UID: \"dea29153-0067-4ab9-b93c-54ad9fff1590\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fsckq"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.535651 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/dea29153-0067-4ab9-b93c-54ad9fff1590-etcd-service-ca\") pod \"etcd-operator-b45778765-fsckq\" (UID: \"dea29153-0067-4ab9-b93c-54ad9fff1590\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fsckq"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.535772 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed7e49ef-0cce-4e76-b000-e3f6b9542246-config\") pod \"kube-controller-manager-operator-78b949d7b-2hgfm\" (UID: \"ed7e49ef-0cce-4e76-b000-e3f6b9542246\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2hgfm"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.536130 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dea29153-0067-4ab9-b93c-54ad9fff1590-config\") pod \"etcd-operator-b45778765-fsckq\" (UID: \"dea29153-0067-4ab9-b93c-54ad9fff1590\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fsckq"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.538336 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dea29153-0067-4ab9-b93c-54ad9fff1590-serving-cert\") pod \"etcd-operator-b45778765-fsckq\" (UID: \"dea29153-0067-4ab9-b93c-54ad9fff1590\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fsckq"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.538778 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/41fbb42d-a8a5-4d95-ba2c-ceeac7e96bff-proxy-tls\") pod \"machine-config-operator-74547568cd-tm5tk\" (UID: \"41fbb42d-a8a5-4d95-ba2c-ceeac7e96bff\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tm5tk"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.542301 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/de2fac75-67e1-47c9-9507-b8b5e5857c32-registry-tls\") pod \"image-registry-697d97f7c8-cjkj6\" (UID: \"de2fac75-67e1-47c9-9507-b8b5e5857c32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cjkj6"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.542511 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/d8afafe8-bf56-46a7-bab9-c5a1c221a740-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-trcr9\" (UID: \"d8afafe8-bf56-46a7-bab9-c5a1c221a740\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-trcr9"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.543648 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/de2fac75-67e1-47c9-9507-b8b5e5857c32-ca-trust-extracted\") pod \"image-registry-697d97f7c8-cjkj6\" (UID: \"de2fac75-67e1-47c9-9507-b8b5e5857c32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cjkj6"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.545884 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ed7e49ef-0cce-4e76-b000-e3f6b9542246-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-2hgfm\" (UID: \"ed7e49ef-0cce-4e76-b000-e3f6b9542246\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2hgfm"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.549189 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/dea29153-0067-4ab9-b93c-54ad9fff1590-etcd-client\") pod \"etcd-operator-b45778765-fsckq\" (UID: \"dea29153-0067-4ab9-b93c-54ad9fff1590\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fsckq"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.560433 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ed7e49ef-0cce-4e76-b000-e3f6b9542246-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-2hgfm\" (UID: \"ed7e49ef-0cce-4e76-b000-e3f6b9542246\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2hgfm"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.609637 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jw22\" (UniqueName: \"kubernetes.io/projected/de2fac75-67e1-47c9-9507-b8b5e5857c32-kube-api-access-6jw22\") pod \"image-registry-697d97f7c8-cjkj6\" (UID: \"de2fac75-67e1-47c9-9507-b8b5e5857c32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cjkj6"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.623784 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.623942 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/608b7b11-f38a-4c4b-9e61-dab4f84c34c1-config-volume\") pod \"collect-profiles-29550780-wpnp9\" (UID: \"608b7b11-f38a-4c4b-9e61-dab4f84c34c1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550780-wpnp9"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.623961 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c7bed8f1-d158-4e04-aa35-99ff2c7cd59e-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-wr7xx\" (UID: \"c7bed8f1-d158-4e04-aa35-99ff2c7cd59e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-wr7xx"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.623977 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/3d2a0407-66d1-4d05-9623-fe968aa3b516-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-k2z5n\" (UID: \"3d2a0407-66d1-4d05-9623-fe968aa3b516\") " pod="openshift-marketplace/marketplace-operator-79b997595-k2z5n"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.624000 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3b714fda-1eb1-4637-9808-a2d0d47b4a91-cert\") pod \"ingress-canary-5t4mn\" (UID: \"3b714fda-1eb1-4637-9808-a2d0d47b4a91\") " pod="openshift-ingress-canary/ingress-canary-5t4mn"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.624016 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5mtx\" (UniqueName: \"kubernetes.io/projected/c7bed8f1-d158-4e04-aa35-99ff2c7cd59e-kube-api-access-n5mtx\") pod \"kube-storage-version-migrator-operator-b67b599dd-wr7xx\" (UID: \"c7bed8f1-d158-4e04-aa35-99ff2c7cd59e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-wr7xx"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.624031 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/17d9ce44-f425-4262-ab22-5edef1fad72e-bound-sa-token\") pod \"ingress-operator-5b745b69d9-hj4tg\" (UID: \"17d9ce44-f425-4262-ab22-5edef1fad72e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hj4tg"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.624048 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/608b7b11-f38a-4c4b-9e61-dab4f84c34c1-secret-volume\") pod \"collect-profiles-29550780-wpnp9\" (UID: \"608b7b11-f38a-4c4b-9e61-dab4f84c34c1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550780-wpnp9"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.624065 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/036eac53-a3f2-4efc-bd3e-4616e47d8901-signing-cabundle\") pod \"service-ca-9c57cc56f-kpns7\" (UID: \"036eac53-a3f2-4efc-bd3e-4616e47d8901\") " pod="openshift-service-ca/service-ca-9c57cc56f-kpns7"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.624085 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvsqs\" (UniqueName: \"kubernetes.io/projected/036eac53-a3f2-4efc-bd3e-4616e47d8901-kube-api-access-jvsqs\") pod \"service-ca-9c57cc56f-kpns7\" (UID: \"036eac53-a3f2-4efc-bd3e-4616e47d8901\") " pod="openshift-service-ca/service-ca-9c57cc56f-kpns7"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.624103 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xst5\" (UniqueName: \"kubernetes.io/projected/7044838f-643f-4f07-9f45-0468786d3798-kube-api-access-6xst5\") pod \"machine-config-server-jtqjq\" (UID: \"7044838f-643f-4f07-9f45-0468786d3798\") " pod="openshift-machine-config-operator/machine-config-server-jtqjq"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.624123 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9mdr\" (UniqueName: \"kubernetes.io/projected/17d9ce44-f425-4262-ab22-5edef1fad72e-kube-api-access-k9mdr\") pod \"ingress-operator-5b745b69d9-hj4tg\" (UID: \"17d9ce44-f425-4262-ab22-5edef1fad72e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hj4tg"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.624148 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c553be85-38c4-45a7-9406-257b039d7734-serving-cert\") pod \"service-ca-operator-777779d784-dd22q\" (UID: \"c553be85-38c4-45a7-9406-257b039d7734\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-dd22q"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.624163 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/cf30c324-b218-45df-8462-1b76cc2825c2-socket-dir\") pod \"csi-hostpathplugin-z6v85\" (UID: \"cf30c324-b218-45df-8462-1b76cc2825c2\") " pod="hostpath-provisioner/csi-hostpathplugin-z6v85"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.624193 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/cf30c324-b218-45df-8462-1b76cc2825c2-mountpoint-dir\") pod \"csi-hostpathplugin-z6v85\" (UID: \"cf30c324-b218-45df-8462-1b76cc2825c2\") " pod="hostpath-provisioner/csi-hostpathplugin-z6v85"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.624209 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/17d9ce44-f425-4262-ab22-5edef1fad72e-metrics-tls\") pod \"ingress-operator-5b745b69d9-hj4tg\" (UID: \"17d9ce44-f425-4262-ab22-5edef1fad72e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hj4tg"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.624228 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/cf30c324-b218-45df-8462-1b76cc2825c2-csi-data-dir\") pod \"csi-hostpathplugin-z6v85\" (UID: \"cf30c324-b218-45df-8462-1b76cc2825c2\") " pod="hostpath-provisioner/csi-hostpathplugin-z6v85"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.624245 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4eed3eac-42f8-4683-9c1f-3733965e6af7-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-wdsnl\" (UID: \"4eed3eac-42f8-4683-9c1f-3733965e6af7\") " pod="openshift-multus/cni-sysctl-allowlist-ds-wdsnl"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.624264 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/eed5737e-ff85-457b-a0ed-6b0a750f68b1-srv-cert\") pod \"catalog-operator-68c6474976-qmmgv\" (UID: \"eed5737e-ff85-457b-a0ed-6b0a750f68b1\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qmmgv"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.624288 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9r5p5\" (UniqueName: \"kubernetes.io/projected/3b714fda-1eb1-4637-9808-a2d0d47b4a91-kube-api-access-9r5p5\") pod \"ingress-canary-5t4mn\" (UID: \"3b714fda-1eb1-4637-9808-a2d0d47b4a91\") " pod="openshift-ingress-canary/ingress-canary-5t4mn"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.624305 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/abda7bc3-76f8-43a5-b0e6-0aa9271a5758-webhook-cert\") pod \"packageserver-d55dfcdfc-7mk6w\" (UID: \"abda7bc3-76f8-43a5-b0e6-0aa9271a5758\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7mk6w"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.624322 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/abda7bc3-76f8-43a5-b0e6-0aa9271a5758-tmpfs\") pod \"packageserver-d55dfcdfc-7mk6w\" (UID: \"abda7bc3-76f8-43a5-b0e6-0aa9271a5758\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7mk6w"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.624342 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/abda7bc3-76f8-43a5-b0e6-0aa9271a5758-apiservice-cert\") pod \"packageserver-d55dfcdfc-7mk6w\" (UID: \"abda7bc3-76f8-43a5-b0e6-0aa9271a5758\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7mk6w"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.624391 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/7044838f-643f-4f07-9f45-0468786d3798-node-bootstrap-token\") pod \"machine-config-server-jtqjq\" (UID: \"7044838f-643f-4f07-9f45-0468786d3798\") " pod="openshift-machine-config-operator/machine-config-server-jtqjq"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.624450 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db5f0bfe-4d3e-4b72-9dba-437fc3e94b93-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-fhw9h\" (UID: \"db5f0bfe-4d3e-4b72-9dba-437fc3e94b93\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fhw9h"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.624477 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nplhw\" (UniqueName: \"kubernetes.io/projected/bf346b6e-1bfb-4753-b113-aa981202e7e7-kube-api-access-nplhw\") pod \"dns-default-zrbdj\" (UID: \"bf346b6e-1bfb-4753-b113-aa981202e7e7\") " pod="openshift-dns/dns-default-zrbdj"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.624503 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2r7p\" (UniqueName: \"kubernetes.io/projected/abda7bc3-76f8-43a5-b0e6-0aa9271a5758-kube-api-access-q2r7p\") pod \"packageserver-d55dfcdfc-7mk6w\" (UID: \"abda7bc3-76f8-43a5-b0e6-0aa9271a5758\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7mk6w"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.624534 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/cf30c324-b218-45df-8462-1b76cc2825c2-registration-dir\") pod \"csi-hostpathplugin-z6v85\" (UID: \"cf30c324-b218-45df-8462-1b76cc2825c2\") " pod="hostpath-provisioner/csi-hostpathplugin-z6v85"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.624592 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3d2a0407-66d1-4d05-9623-fe968aa3b516-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-k2z5n\" (UID: \"3d2a0407-66d1-4d05-9623-fe968aa3b516\") " pod="openshift-marketplace/marketplace-operator-79b997595-k2z5n"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.624615 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7bed8f1-d158-4e04-aa35-99ff2c7cd59e-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-wr7xx\" (UID: \"c7bed8f1-d158-4e04-aa35-99ff2c7cd59e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-wr7xx"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.624630 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/036eac53-a3f2-4efc-bd3e-4616e47d8901-signing-key\") pod \"service-ca-9c57cc56f-kpns7\" (UID: \"036eac53-a3f2-4efc-bd3e-4616e47d8901\") " pod="openshift-service-ca/service-ca-9c57cc56f-kpns7"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.624651 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c553be85-38c4-45a7-9406-257b039d7734-config\") pod \"service-ca-operator-777779d784-dd22q\" (UID: \"c553be85-38c4-45a7-9406-257b039d7734\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-dd22q"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.624667 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/eed5737e-ff85-457b-a0ed-6b0a750f68b1-profile-collector-cert\") pod \"catalog-operator-68c6474976-qmmgv\" (UID: \"eed5737e-ff85-457b-a0ed-6b0a750f68b1\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qmmgv"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.624696 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7blm\" (UniqueName: \"kubernetes.io/projected/eed5737e-ff85-457b-a0ed-6b0a750f68b1-kube-api-access-m7blm\") pod \"catalog-operator-68c6474976-qmmgv\" (UID: \"eed5737e-ff85-457b-a0ed-6b0a750f68b1\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qmmgv"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.624714 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/4eed3eac-42f8-4683-9c1f-3733965e6af7-ready\") pod \"cni-sysctl-allowlist-ds-wdsnl\" (UID: \"4eed3eac-42f8-4683-9c1f-3733965e6af7\") " pod="openshift-multus/cni-sysctl-allowlist-ds-wdsnl"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.624736 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/db5f0bfe-4d3e-4b72-9dba-437fc3e94b93-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-fhw9h\" (UID: \"db5f0bfe-4d3e-4b72-9dba-437fc3e94b93\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fhw9h"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.624752 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dk4ng\" (UniqueName: \"kubernetes.io/projected/608b7b11-f38a-4c4b-9e61-dab4f84c34c1-kube-api-access-dk4ng\") pod \"collect-profiles-29550780-wpnp9\" (UID: \"608b7b11-f38a-4c4b-9e61-dab4f84c34c1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550780-wpnp9"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.624768 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/7672aac2-6c30-4cab-82aa-285ef39ea67d-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-68457\" (UID: \"7672aac2-6c30-4cab-82aa-285ef39ea67d\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-68457"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.624786 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dpbm2\" (UniqueName: \"kubernetes.io/projected/3d2a0407-66d1-4d05-9623-fe968aa3b516-kube-api-access-dpbm2\") pod \"marketplace-operator-79b997595-k2z5n\" (UID: \"3d2a0407-66d1-4d05-9623-fe968aa3b516\") " pod="openshift-marketplace/marketplace-operator-79b997595-k2z5n"
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309
09:07:37.624802 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ggs6q\" (UniqueName: \"kubernetes.io/projected/7672aac2-6c30-4cab-82aa-285ef39ea67d-kube-api-access-ggs6q\") pod \"multus-admission-controller-857f4d67dd-68457\" (UID: \"7672aac2-6c30-4cab-82aa-285ef39ea67d\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-68457" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.624818 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/bf346b6e-1bfb-4753-b113-aa981202e7e7-metrics-tls\") pod \"dns-default-zrbdj\" (UID: \"bf346b6e-1bfb-4753-b113-aa981202e7e7\") " pod="openshift-dns/dns-default-zrbdj" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.624839 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djxts\" (UniqueName: \"kubernetes.io/projected/4eed3eac-42f8-4683-9c1f-3733965e6af7-kube-api-access-djxts\") pod \"cni-sysctl-allowlist-ds-wdsnl\" (UID: \"4eed3eac-42f8-4683-9c1f-3733965e6af7\") " pod="openshift-multus/cni-sysctl-allowlist-ds-wdsnl" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.624856 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4nfpw\" (UniqueName: \"kubernetes.io/projected/c553be85-38c4-45a7-9406-257b039d7734-kube-api-access-4nfpw\") pod \"service-ca-operator-777779d784-dd22q\" (UID: \"c553be85-38c4-45a7-9406-257b039d7734\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-dd22q" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.624873 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/7044838f-643f-4f07-9f45-0468786d3798-certs\") pod \"machine-config-server-jtqjq\" (UID: \"7044838f-643f-4f07-9f45-0468786d3798\") " pod="openshift-machine-config-operator/machine-config-server-jtqjq" 
Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.624916 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/db5f0bfe-4d3e-4b72-9dba-437fc3e94b93-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-fhw9h\" (UID: \"db5f0bfe-4d3e-4b72-9dba-437fc3e94b93\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fhw9h" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.624938 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bf346b6e-1bfb-4753-b113-aa981202e7e7-config-volume\") pod \"dns-default-zrbdj\" (UID: \"bf346b6e-1bfb-4753-b113-aa981202e7e7\") " pod="openshift-dns/dns-default-zrbdj" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.624956 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vxzx\" (UniqueName: \"kubernetes.io/projected/cf30c324-b218-45df-8462-1b76cc2825c2-kube-api-access-6vxzx\") pod \"csi-hostpathplugin-z6v85\" (UID: \"cf30c324-b218-45df-8462-1b76cc2825c2\") " pod="hostpath-provisioner/csi-hostpathplugin-z6v85" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.624972 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/17d9ce44-f425-4262-ab22-5edef1fad72e-trusted-ca\") pod \"ingress-operator-5b745b69d9-hj4tg\" (UID: \"17d9ce44-f425-4262-ab22-5edef1fad72e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hj4tg" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.624988 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/cf30c324-b218-45df-8462-1b76cc2825c2-plugins-dir\") pod \"csi-hostpathplugin-z6v85\" (UID: \"cf30c324-b218-45df-8462-1b76cc2825c2\") " 
pod="hostpath-provisioner/csi-hostpathplugin-z6v85" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.625016 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4eed3eac-42f8-4683-9c1f-3733965e6af7-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-wdsnl\" (UID: \"4eed3eac-42f8-4683-9c1f-3733965e6af7\") " pod="openshift-multus/cni-sysctl-allowlist-ds-wdsnl" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.625142 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4eed3eac-42f8-4683-9c1f-3733965e6af7-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-wdsnl\" (UID: \"4eed3eac-42f8-4683-9c1f-3733965e6af7\") " pod="openshift-multus/cni-sysctl-allowlist-ds-wdsnl" Mar 09 09:07:37 crc kubenswrapper[4861]: E0309 09:07:37.625224 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:07:38.125208462 +0000 UTC m=+101.210247863 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.625943 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/608b7b11-f38a-4c4b-9e61-dab4f84c34c1-config-volume\") pod \"collect-profiles-29550780-wpnp9\" (UID: \"608b7b11-f38a-4c4b-9e61-dab4f84c34c1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550780-wpnp9" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.636967 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbkxb\" (UniqueName: \"kubernetes.io/projected/d8afafe8-bf56-46a7-bab9-c5a1c221a740-kube-api-access-kbkxb\") pod \"control-plane-machine-set-operator-78cbb6b69f-trcr9\" (UID: \"d8afafe8-bf56-46a7-bab9-c5a1c221a740\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-trcr9" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.639840 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6t4js\" (UniqueName: \"kubernetes.io/projected/dea29153-0067-4ab9-b93c-54ad9fff1590-kube-api-access-6t4js\") pod \"etcd-operator-b45778765-fsckq\" (UID: \"dea29153-0067-4ab9-b93c-54ad9fff1590\") " pod="openshift-etcd-operator/etcd-operator-b45778765-fsckq" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.639918 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/cf30c324-b218-45df-8462-1b76cc2825c2-mountpoint-dir\") pod 
\"csi-hostpathplugin-z6v85\" (UID: \"cf30c324-b218-45df-8462-1b76cc2825c2\") " pod="hostpath-provisioner/csi-hostpathplugin-z6v85" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.640086 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/7672aac2-6c30-4cab-82aa-285ef39ea67d-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-68457\" (UID: \"7672aac2-6c30-4cab-82aa-285ef39ea67d\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-68457" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.640766 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/cf30c324-b218-45df-8462-1b76cc2825c2-registration-dir\") pod \"csi-hostpathplugin-z6v85\" (UID: \"cf30c324-b218-45df-8462-1b76cc2825c2\") " pod="hostpath-provisioner/csi-hostpathplugin-z6v85" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.641982 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3d2a0407-66d1-4d05-9623-fe968aa3b516-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-k2z5n\" (UID: \"3d2a0407-66d1-4d05-9623-fe968aa3b516\") " pod="openshift-marketplace/marketplace-operator-79b997595-k2z5n" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.642302 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/17d9ce44-f425-4262-ab22-5edef1fad72e-trusted-ca\") pod \"ingress-operator-5b745b69d9-hj4tg\" (UID: \"17d9ce44-f425-4262-ab22-5edef1fad72e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hj4tg" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.642714 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7bed8f1-d158-4e04-aa35-99ff2c7cd59e-config\") pod 
\"kube-storage-version-migrator-operator-b67b599dd-wr7xx\" (UID: \"c7bed8f1-d158-4e04-aa35-99ff2c7cd59e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-wr7xx" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.642961 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/abda7bc3-76f8-43a5-b0e6-0aa9271a5758-tmpfs\") pod \"packageserver-d55dfcdfc-7mk6w\" (UID: \"abda7bc3-76f8-43a5-b0e6-0aa9271a5758\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7mk6w" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.643027 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/cf30c324-b218-45df-8462-1b76cc2825c2-plugins-dir\") pod \"csi-hostpathplugin-z6v85\" (UID: \"cf30c324-b218-45df-8462-1b76cc2825c2\") " pod="hostpath-provisioner/csi-hostpathplugin-z6v85" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.643125 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/cf30c324-b218-45df-8462-1b76cc2825c2-csi-data-dir\") pod \"csi-hostpathplugin-z6v85\" (UID: \"cf30c324-b218-45df-8462-1b76cc2825c2\") " pod="hostpath-provisioner/csi-hostpathplugin-z6v85" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.645221 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db5f0bfe-4d3e-4b72-9dba-437fc3e94b93-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-fhw9h\" (UID: \"db5f0bfe-4d3e-4b72-9dba-437fc3e94b93\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fhw9h" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.645614 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/bf346b6e-1bfb-4753-b113-aa981202e7e7-config-volume\") pod \"dns-default-zrbdj\" (UID: \"bf346b6e-1bfb-4753-b113-aa981202e7e7\") " pod="openshift-dns/dns-default-zrbdj" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.645880 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/036eac53-a3f2-4efc-bd3e-4616e47d8901-signing-key\") pod \"service-ca-9c57cc56f-kpns7\" (UID: \"036eac53-a3f2-4efc-bd3e-4616e47d8901\") " pod="openshift-service-ca/service-ca-9c57cc56f-kpns7" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.646262 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c553be85-38c4-45a7-9406-257b039d7734-config\") pod \"service-ca-operator-777779d784-dd22q\" (UID: \"c553be85-38c4-45a7-9406-257b039d7734\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-dd22q" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.646630 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4eed3eac-42f8-4683-9c1f-3733965e6af7-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-wdsnl\" (UID: \"4eed3eac-42f8-4683-9c1f-3733965e6af7\") " pod="openshift-multus/cni-sysctl-allowlist-ds-wdsnl" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.646661 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/036eac53-a3f2-4efc-bd3e-4616e47d8901-signing-cabundle\") pod \"service-ca-9c57cc56f-kpns7\" (UID: \"036eac53-a3f2-4efc-bd3e-4616e47d8901\") " pod="openshift-service-ca/service-ca-9c57cc56f-kpns7" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.646907 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/bf346b6e-1bfb-4753-b113-aa981202e7e7-metrics-tls\") pod \"dns-default-zrbdj\" (UID: \"bf346b6e-1bfb-4753-b113-aa981202e7e7\") " pod="openshift-dns/dns-default-zrbdj" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.647153 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/abda7bc3-76f8-43a5-b0e6-0aa9271a5758-webhook-cert\") pod \"packageserver-d55dfcdfc-7mk6w\" (UID: \"abda7bc3-76f8-43a5-b0e6-0aa9271a5758\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7mk6w" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.647184 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/4eed3eac-42f8-4683-9c1f-3733965e6af7-ready\") pod \"cni-sysctl-allowlist-ds-wdsnl\" (UID: \"4eed3eac-42f8-4683-9c1f-3733965e6af7\") " pod="openshift-multus/cni-sysctl-allowlist-ds-wdsnl" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.647260 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c7bed8f1-d158-4e04-aa35-99ff2c7cd59e-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-wr7xx\" (UID: \"c7bed8f1-d158-4e04-aa35-99ff2c7cd59e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-wr7xx" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.647782 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/17d9ce44-f425-4262-ab22-5edef1fad72e-metrics-tls\") pod \"ingress-operator-5b745b69d9-hj4tg\" (UID: \"17d9ce44-f425-4262-ab22-5edef1fad72e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hj4tg" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.648134 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: 
\"kubernetes.io/host-path/cf30c324-b218-45df-8462-1b76cc2825c2-socket-dir\") pod \"csi-hostpathplugin-z6v85\" (UID: \"cf30c324-b218-45df-8462-1b76cc2825c2\") " pod="hostpath-provisioner/csi-hostpathplugin-z6v85" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.648892 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/3d2a0407-66d1-4d05-9623-fe968aa3b516-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-k2z5n\" (UID: \"3d2a0407-66d1-4d05-9623-fe968aa3b516\") " pod="openshift-marketplace/marketplace-operator-79b997595-k2z5n" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.649664 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/eed5737e-ff85-457b-a0ed-6b0a750f68b1-profile-collector-cert\") pod \"catalog-operator-68c6474976-qmmgv\" (UID: \"eed5737e-ff85-457b-a0ed-6b0a750f68b1\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qmmgv" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.649750 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c553be85-38c4-45a7-9406-257b039d7734-serving-cert\") pod \"service-ca-operator-777779d784-dd22q\" (UID: \"c553be85-38c4-45a7-9406-257b039d7734\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-dd22q" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.649849 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/db5f0bfe-4d3e-4b72-9dba-437fc3e94b93-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-fhw9h\" (UID: \"db5f0bfe-4d3e-4b72-9dba-437fc3e94b93\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fhw9h" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.650349 4861 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/608b7b11-f38a-4c4b-9e61-dab4f84c34c1-secret-volume\") pod \"collect-profiles-29550780-wpnp9\" (UID: \"608b7b11-f38a-4c4b-9e61-dab4f84c34c1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550780-wpnp9" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.650857 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3b714fda-1eb1-4637-9808-a2d0d47b4a91-cert\") pod \"ingress-canary-5t4mn\" (UID: \"3b714fda-1eb1-4637-9808-a2d0d47b4a91\") " pod="openshift-ingress-canary/ingress-canary-5t4mn" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.650913 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/eed5737e-ff85-457b-a0ed-6b0a750f68b1-srv-cert\") pod \"catalog-operator-68c6474976-qmmgv\" (UID: \"eed5737e-ff85-457b-a0ed-6b0a750f68b1\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qmmgv" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.653930 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/abda7bc3-76f8-43a5-b0e6-0aa9271a5758-apiservice-cert\") pod \"packageserver-d55dfcdfc-7mk6w\" (UID: \"abda7bc3-76f8-43a5-b0e6-0aa9271a5758\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7mk6w" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.655225 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/7044838f-643f-4f07-9f45-0468786d3798-certs\") pod \"machine-config-server-jtqjq\" (UID: \"7044838f-643f-4f07-9f45-0468786d3798\") " pod="openshift-machine-config-operator/machine-config-server-jtqjq" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.661207 4861 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/7044838f-643f-4f07-9f45-0468786d3798-node-bootstrap-token\") pod \"machine-config-server-jtqjq\" (UID: \"7044838f-643f-4f07-9f45-0468786d3798\") " pod="openshift-machine-config-operator/machine-config-server-jtqjq" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.661602 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rqsq\" (UniqueName: \"kubernetes.io/projected/3bffc0ff-7dbe-425b-b81d-bad8f9a42e12-kube-api-access-4rqsq\") pod \"package-server-manager-789f6589d5-zc95n\" (UID: \"3bffc0ff-7dbe-425b-b81d-bad8f9a42e12\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zc95n" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.682464 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vznn2\" (UniqueName: \"kubernetes.io/projected/5d9e8554-0dac-404d-8be9-6bb656818cd2-kube-api-access-vznn2\") pod \"migrator-59844c95c7-96ksj\" (UID: \"5d9e8554-0dac-404d-8be9-6bb656818cd2\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-96ksj" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.700448 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/de2fac75-67e1-47c9-9507-b8b5e5857c32-bound-sa-token\") pod \"image-registry-697d97f7c8-cjkj6\" (UID: \"de2fac75-67e1-47c9-9507-b8b5e5857c32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cjkj6" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.726116 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cjkj6\" (UID: \"de2fac75-67e1-47c9-9507-b8b5e5857c32\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-cjkj6" Mar 09 09:07:37 crc kubenswrapper[4861]: E0309 09:07:37.726650 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:07:38.226638011 +0000 UTC m=+101.311677412 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cjkj6" (UID: "de2fac75-67e1-47c9-9507-b8b5e5857c32") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.727738 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7m5cn\" (UniqueName: \"kubernetes.io/projected/41fbb42d-a8a5-4d95-ba2c-ceeac7e96bff-kube-api-access-7m5cn\") pod \"machine-config-operator-74547568cd-tm5tk\" (UID: \"41fbb42d-a8a5-4d95-ba2c-ceeac7e96bff\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tm5tk" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.732106 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-5m9c6"] Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.743126 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-gkrbr"] Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.743186 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lj7m8"] Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.765171 4861 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nplhw\" (UniqueName: \"kubernetes.io/projected/bf346b6e-1bfb-4753-b113-aa981202e7e7-kube-api-access-nplhw\") pod \"dns-default-zrbdj\" (UID: \"bf346b6e-1bfb-4753-b113-aa981202e7e7\") " pod="openshift-dns/dns-default-zrbdj" Mar 09 09:07:37 crc kubenswrapper[4861]: W0309 09:07:37.765238 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod20b8695b_8129_4f02_824d_5ca2a451d899.slice/crio-32a528bff3599f95196bef59a995a79ac5f994c3a1fa22230888690c326f8578 WatchSource:0}: Error finding container 32a528bff3599f95196bef59a995a79ac5f994c3a1fa22230888690c326f8578: Status 404 returned error can't find the container with id 32a528bff3599f95196bef59a995a79ac5f994c3a1fa22230888690c326f8578 Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.784405 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-fsckq" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.791599 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2r7p\" (UniqueName: \"kubernetes.io/projected/abda7bc3-76f8-43a5-b0e6-0aa9271a5758-kube-api-access-q2r7p\") pod \"packageserver-d55dfcdfc-7mk6w\" (UID: \"abda7bc3-76f8-43a5-b0e6-0aa9271a5758\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7mk6w" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.795764 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dpbm2\" (UniqueName: \"kubernetes.io/projected/3d2a0407-66d1-4d05-9623-fe968aa3b516-kube-api-access-dpbm2\") pod \"marketplace-operator-79b997595-k2z5n\" (UID: \"3d2a0407-66d1-4d05-9623-fe968aa3b516\") " pod="openshift-marketplace/marketplace-operator-79b997595-k2z5n" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.810011 4861 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tm5tk" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.822728 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/db5f0bfe-4d3e-4b72-9dba-437fc3e94b93-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-fhw9h\" (UID: \"db5f0bfe-4d3e-4b72-9dba-437fc3e94b93\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fhw9h" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.823786 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zc95n" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.828819 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:07:37 crc kubenswrapper[4861]: E0309 09:07:37.829429 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:07:38.329411416 +0000 UTC m=+101.414450817 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.837423 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vxzx\" (UniqueName: \"kubernetes.io/projected/cf30c324-b218-45df-8462-1b76cc2825c2-kube-api-access-6vxzx\") pod \"csi-hostpathplugin-z6v85\" (UID: \"cf30c324-b218-45df-8462-1b76cc2825c2\") " pod="hostpath-provisioner/csi-hostpathplugin-z6v85" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.838002 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-96ksj" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.859423 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2hgfm" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.861494 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7blm\" (UniqueName: \"kubernetes.io/projected/eed5737e-ff85-457b-a0ed-6b0a750f68b1-kube-api-access-m7blm\") pod \"catalog-operator-68c6474976-qmmgv\" (UID: \"eed5737e-ff85-457b-a0ed-6b0a750f68b1\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qmmgv" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.863014 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-trcr9" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.880113 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-jtnvg"] Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.880930 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-gqnxr"] Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.882044 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dk4ng\" (UniqueName: \"kubernetes.io/projected/608b7b11-f38a-4c4b-9e61-dab4f84c34c1-kube-api-access-dk4ng\") pod \"collect-profiles-29550780-wpnp9\" (UID: \"608b7b11-f38a-4c4b-9e61-dab4f84c34c1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550780-wpnp9" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.882351 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ldhj8"] Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.884090 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qmmgv" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.887777 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-l2jkp"] Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.892689 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-h84v9"] Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.900774 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9mdr\" (UniqueName: \"kubernetes.io/projected/17d9ce44-f425-4262-ab22-5edef1fad72e-kube-api-access-k9mdr\") pod \"ingress-operator-5b745b69d9-hj4tg\" (UID: \"17d9ce44-f425-4262-ab22-5edef1fad72e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hj4tg" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.903901 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fhw9h" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.917212 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djxts\" (UniqueName: \"kubernetes.io/projected/4eed3eac-42f8-4683-9c1f-3733965e6af7-kube-api-access-djxts\") pod \"cni-sysctl-allowlist-ds-wdsnl\" (UID: \"4eed3eac-42f8-4683-9c1f-3733965e6af7\") " pod="openshift-multus/cni-sysctl-allowlist-ds-wdsnl" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.923140 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-k2z5n" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.932529 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cjkj6\" (UID: \"de2fac75-67e1-47c9-9507-b8b5e5857c32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cjkj6" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.933438 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7mk6w" Mar 09 09:07:37 crc kubenswrapper[4861]: E0309 09:07:37.935339 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:07:38.435320835 +0000 UTC m=+101.520360236 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cjkj6" (UID: "de2fac75-67e1-47c9-9507-b8b5e5857c32") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.940385 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggs6q\" (UniqueName: \"kubernetes.io/projected/7672aac2-6c30-4cab-82aa-285ef39ea67d-kube-api-access-ggs6q\") pod \"multus-admission-controller-857f4d67dd-68457\" (UID: \"7672aac2-6c30-4cab-82aa-285ef39ea67d\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-68457" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.940871 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-dwjwj"] Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.943743 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-zrbdj" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.960790 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550780-wpnp9" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.963074 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-s96f9"] Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.965462 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xst5\" (UniqueName: \"kubernetes.io/projected/7044838f-643f-4f07-9f45-0468786d3798-kube-api-access-6xst5\") pod \"machine-config-server-jtqjq\" (UID: \"7044838f-643f-4f07-9f45-0468786d3798\") " pod="openshift-machine-config-operator/machine-config-server-jtqjq" Mar 09 09:07:37 crc kubenswrapper[4861]: W0309 09:07:37.979216 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2afc7adc_5b22_4203_9265_2ea4293f132f.slice/crio-6fba3dde66eaf44651758d6aa3fc602c6b76a8520573bc790937fdfec9cb3ab4 WatchSource:0}: Error finding container 6fba3dde66eaf44651758d6aa3fc602c6b76a8520573bc790937fdfec9cb3ab4: Status 404 returned error can't find the container with id 6fba3dde66eaf44651758d6aa3fc602c6b76a8520573bc790937fdfec9cb3ab4 Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.981097 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5mtx\" (UniqueName: \"kubernetes.io/projected/c7bed8f1-d158-4e04-aa35-99ff2c7cd59e-kube-api-access-n5mtx\") pod \"kube-storage-version-migrator-operator-b67b599dd-wr7xx\" (UID: \"c7bed8f1-d158-4e04-aa35-99ff2c7cd59e\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-wr7xx" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.989454 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-wdsnl" Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.994837 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-h6j45"] Mar 09 09:07:37 crc kubenswrapper[4861]: I0309 09:07:37.996222 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-z6v85" Mar 09 09:07:38 crc kubenswrapper[4861]: I0309 09:07:38.002612 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/17d9ce44-f425-4262-ab22-5edef1fad72e-bound-sa-token\") pod \"ingress-operator-5b745b69d9-hj4tg\" (UID: \"17d9ce44-f425-4262-ab22-5edef1fad72e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hj4tg" Mar 09 09:07:38 crc kubenswrapper[4861]: I0309 09:07:38.017438 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvsqs\" (UniqueName: \"kubernetes.io/projected/036eac53-a3f2-4efc-bd3e-4616e47d8901-kube-api-access-jvsqs\") pod \"service-ca-9c57cc56f-kpns7\" (UID: \"036eac53-a3f2-4efc-bd3e-4616e47d8901\") " pod="openshift-service-ca/service-ca-9c57cc56f-kpns7" Mar 09 09:07:38 crc kubenswrapper[4861]: I0309 09:07:38.032581 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jzt4r"] Mar 09 09:07:38 crc kubenswrapper[4861]: I0309 09:07:38.033038 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:07:38 crc kubenswrapper[4861]: E0309 09:07:38.033141 4861 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:07:38.533125166 +0000 UTC m=+101.618164567 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:07:38 crc kubenswrapper[4861]: I0309 09:07:38.034302 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cjkj6\" (UID: \"de2fac75-67e1-47c9-9507-b8b5e5857c32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cjkj6" Mar 09 09:07:38 crc kubenswrapper[4861]: E0309 09:07:38.035421 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:07:38.535405617 +0000 UTC m=+101.620445018 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cjkj6" (UID: "de2fac75-67e1-47c9-9507-b8b5e5857c32") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:07:38 crc kubenswrapper[4861]: I0309 09:07:38.043521 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9r5p5\" (UniqueName: \"kubernetes.io/projected/3b714fda-1eb1-4637-9808-a2d0d47b4a91-kube-api-access-9r5p5\") pod \"ingress-canary-5t4mn\" (UID: \"3b714fda-1eb1-4637-9808-a2d0d47b4a91\") " pod="openshift-ingress-canary/ingress-canary-5t4mn" Mar 09 09:07:38 crc kubenswrapper[4861]: W0309 09:07:38.062025 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad46ee5f_9c08_438c_8284_aa488f48e522.slice/crio-dd41511ed0365463d236b40d45e6ecffd2b78c1251f4942f077ef256c5bef0f6 WatchSource:0}: Error finding container dd41511ed0365463d236b40d45e6ecffd2b78c1251f4942f077ef256c5bef0f6: Status 404 returned error can't find the container with id dd41511ed0365463d236b40d45e6ecffd2b78c1251f4942f077ef256c5bef0f6 Mar 09 09:07:38 crc kubenswrapper[4861]: I0309 09:07:38.064969 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nfpw\" (UniqueName: \"kubernetes.io/projected/c553be85-38c4-45a7-9406-257b039d7734-kube-api-access-4nfpw\") pod \"service-ca-operator-777779d784-dd22q\" (UID: \"c553be85-38c4-45a7-9406-257b039d7734\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-dd22q" Mar 09 09:07:38 crc kubenswrapper[4861]: I0309 09:07:38.080436 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-c57s5"] Mar 09 09:07:38 crc kubenswrapper[4861]: I0309 09:07:38.102135 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-q8fz7"] Mar 09 09:07:38 crc kubenswrapper[4861]: I0309 09:07:38.139079 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:07:38 crc kubenswrapper[4861]: E0309 09:07:38.144450 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:07:38.641888223 +0000 UTC m=+101.726927614 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:07:38 crc kubenswrapper[4861]: I0309 09:07:38.157439 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-68457" Mar 09 09:07:38 crc kubenswrapper[4861]: I0309 09:07:38.172362 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-dd22q" Mar 09 09:07:38 crc kubenswrapper[4861]: I0309 09:07:38.196610 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-wr7xx" Mar 09 09:07:38 crc kubenswrapper[4861]: I0309 09:07:38.198240 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-96ksj"] Mar 09 09:07:38 crc kubenswrapper[4861]: I0309 09:07:38.211536 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hj4tg" Mar 09 09:07:38 crc kubenswrapper[4861]: I0309 09:07:38.235037 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-fsckq"] Mar 09 09:07:38 crc kubenswrapper[4861]: I0309 09:07:38.242610 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cjkj6\" (UID: \"de2fac75-67e1-47c9-9507-b8b5e5857c32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cjkj6" Mar 09 09:07:38 crc kubenswrapper[4861]: E0309 09:07:38.243039 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:07:38.743026473 +0000 UTC m=+101.828065874 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cjkj6" (UID: "de2fac75-67e1-47c9-9507-b8b5e5857c32") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:07:38 crc kubenswrapper[4861]: I0309 09:07:38.251871 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-kpns7" Mar 09 09:07:38 crc kubenswrapper[4861]: I0309 09:07:38.279477 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-jtqjq" Mar 09 09:07:38 crc kubenswrapper[4861]: I0309 09:07:38.284422 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-5t4mn" Mar 09 09:07:38 crc kubenswrapper[4861]: W0309 09:07:38.314441 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5239e6e7_4afa_4b37_9582_e159b201453a.slice/crio-f1c16e1f96fecb5f8d89d31a1afa6d5834e85ba2463ac8b179cc87e64de783b3 WatchSource:0}: Error finding container f1c16e1f96fecb5f8d89d31a1afa6d5834e85ba2463ac8b179cc87e64de783b3: Status 404 returned error can't find the container with id f1c16e1f96fecb5f8d89d31a1afa6d5834e85ba2463ac8b179cc87e64de783b3 Mar 09 09:07:38 crc kubenswrapper[4861]: I0309 09:07:38.315900 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ldhj8" event={"ID":"271a4e3c-1aa2-4bf5-bdfe-0495910f75d6","Type":"ContainerStarted","Data":"1686dbfb5f1d15848a3cc8b2dfd24b343049895bfaea1444e79d09aff5d04c81"} Mar 09 09:07:38 crc kubenswrapper[4861]: I0309 09:07:38.338522 
4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-h84v9" event={"ID":"ba6c5b9f-7812-4b6d-998b-6f368a6edf83","Type":"ContainerStarted","Data":"88a7002d91ca75fb5f1504397b1aef71964957e788d306a9c8ee16e1cc6f3fe8"} Mar 09 09:07:38 crc kubenswrapper[4861]: I0309 09:07:38.344325 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:07:38 crc kubenswrapper[4861]: I0309 09:07:38.344395 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-dwjwj" event={"ID":"2a29855a-bbc0-458f-a9a0-0ddfd8763f2d","Type":"ContainerStarted","Data":"aa8e30dc51006693d5370c3aa2db1bce1aaa6fb70e7f9dc5286ab005f6885025"} Mar 09 09:07:38 crc kubenswrapper[4861]: E0309 09:07:38.344559 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:07:38.844544784 +0000 UTC m=+101.929584185 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:07:38 crc kubenswrapper[4861]: I0309 09:07:38.346949 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-97dkg" event={"ID":"eaee5667-e42c-4ef1-8c6b-279ee6fc171a","Type":"ContainerStarted","Data":"824adca6296b893cfd963278ecdc1fb80b2fa49869c278a275449de57a9ee084"} Mar 09 09:07:38 crc kubenswrapper[4861]: I0309 09:07:38.347474 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-97dkg" Mar 09 09:07:38 crc kubenswrapper[4861]: I0309 09:07:38.349561 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-gkrbr" event={"ID":"20b8695b-8129-4f02-824d-5ca2a451d899","Type":"ContainerStarted","Data":"32a528bff3599f95196bef59a995a79ac5f994c3a1fa22230888690c326f8578"} Mar 09 09:07:38 crc kubenswrapper[4861]: I0309 09:07:38.352161 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-l2jkp" event={"ID":"2afc7adc-5b22-4203-9265-2ea4293f132f","Type":"ContainerStarted","Data":"6fba3dde66eaf44651758d6aa3fc602c6b76a8520573bc790937fdfec9cb3ab4"} Mar 09 09:07:38 crc kubenswrapper[4861]: I0309 09:07:38.357195 4861 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-97dkg container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.25:6443/healthz\": dial tcp 10.217.0.25:6443: connect: connection refused" start-of-body= Mar 09 
09:07:38 crc kubenswrapper[4861]: I0309 09:07:38.357251 4861 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-97dkg" podUID="eaee5667-e42c-4ef1-8c6b-279ee6fc171a" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.25:6443/healthz\": dial tcp 10.217.0.25:6443: connect: connection refused" Mar 09 09:07:38 crc kubenswrapper[4861]: I0309 09:07:38.359207 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-qh9sg" event={"ID":"85a3bbcb-e663-4a97-980c-606c979409d7","Type":"ContainerStarted","Data":"b7d6201629b31402e71bf297f3c26e30d1eebf556b0e934c4a6cae1f37951cd1"} Mar 09 09:07:38 crc kubenswrapper[4861]: I0309 09:07:38.359269 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-qh9sg" event={"ID":"85a3bbcb-e663-4a97-980c-606c979409d7","Type":"ContainerStarted","Data":"7a6771f05c9be9277ced67aa4fa1ab94db723fa076712b772ef67122474e30f3"} Mar 09 09:07:38 crc kubenswrapper[4861]: I0309 09:07:38.395799 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qmmgv"] Mar 09 09:07:38 crc kubenswrapper[4861]: I0309 09:07:38.400113 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-c6sj6" event={"ID":"8ff37c1b-1688-42ce-8b0c-952d297ae4a0","Type":"ContainerStarted","Data":"4be4612c757c4d423518dc653d68ea44b821897164c4f6363a409a3d6d806a69"} Mar 09 09:07:38 crc kubenswrapper[4861]: I0309 09:07:38.400151 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-c6sj6" event={"ID":"8ff37c1b-1688-42ce-8b0c-952d297ae4a0","Type":"ContainerStarted","Data":"89e2b8e5f05601194c49b3a7a11cf543cd67e4effaa7b16fc89e753b050f6d1a"} Mar 09 09:07:38 crc kubenswrapper[4861]: I0309 09:07:38.403230 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-console/downloads-7954f5f757-h6j45" event={"ID":"566d8448-e794-44ee-9d17-e92493adcd87","Type":"ContainerStarted","Data":"05ae463d2e16bfacd5fcb9bdff052a779440b4416fcc67f7354e51aa765a3761"} Mar 09 09:07:38 crc kubenswrapper[4861]: I0309 09:07:38.404118 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-s96f9" event={"ID":"ad46ee5f-9c08-438c-8284-aa488f48e522","Type":"ContainerStarted","Data":"dd41511ed0365463d236b40d45e6ecffd2b78c1251f4942f077ef256c5bef0f6"} Mar 09 09:07:38 crc kubenswrapper[4861]: I0309 09:07:38.409813 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2hgfm"] Mar 09 09:07:38 crc kubenswrapper[4861]: I0309 09:07:38.420429 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-gqnxr" event={"ID":"da4004a6-c6fd-41d6-a651-b4aaec2d6454","Type":"ContainerStarted","Data":"7f31ed07a329d94a636470476e4280f7f8511087dd3343f4067ef55e286f8ae2"} Mar 09 09:07:38 crc kubenswrapper[4861]: I0309 09:07:38.434024 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-5m9c6" event={"ID":"b820aa76-e1f5-440b-b942-2a4468dc4d51","Type":"ContainerStarted","Data":"45927ceca568c2890024643ce9bed05adc301c4e4cc84ca99bc58851f3978595"} Mar 09 09:07:38 crc kubenswrapper[4861]: I0309 09:07:38.448138 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cjkj6\" (UID: \"de2fac75-67e1-47c9-9507-b8b5e5857c32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cjkj6" Mar 09 09:07:38 crc kubenswrapper[4861]: E0309 09:07:38.450826 4861 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:07:38.950807413 +0000 UTC m=+102.035846814 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cjkj6" (UID: "de2fac75-67e1-47c9-9507-b8b5e5857c32") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:07:38 crc kubenswrapper[4861]: I0309 09:07:38.468430 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-24x2s" event={"ID":"e3056772-5de8-4fed-9796-440422743470","Type":"ContainerStarted","Data":"275c111c05e4cf3a1b0e199b92a8f6f329eeab7b389d36cf5efeb1b4e0f51913"} Mar 09 09:07:38 crc kubenswrapper[4861]: I0309 09:07:38.468467 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-24x2s" event={"ID":"e3056772-5de8-4fed-9796-440422743470","Type":"ContainerStarted","Data":"968e48cf777609bd441962d2f66c5ef4bef60f664e6769c77997356a4cbf3421"} Mar 09 09:07:38 crc kubenswrapper[4861]: I0309 09:07:38.471978 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-tm5tk"] Mar 09 09:07:38 crc kubenswrapper[4861]: W0309 09:07:38.484035 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeed5737e_ff85_457b_a0ed_6b0a750f68b1.slice/crio-4684c0cd62c893d7a67e12aa3a53c866932e78f1576fb0348416ea74c5a3a221 WatchSource:0}: Error finding 
container 4684c0cd62c893d7a67e12aa3a53c866932e78f1576fb0348416ea74c5a3a221: Status 404 returned error can't find the container with id 4684c0cd62c893d7a67e12aa3a53c866932e78f1576fb0348416ea74c5a3a221 Mar 09 09:07:38 crc kubenswrapper[4861]: I0309 09:07:38.484418 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lj7m8" event={"ID":"8415c9be-22de-49bb-8b53-6f06a923ef33","Type":"ContainerStarted","Data":"1540081788a2d4f3d65d298fe0189fd64a0d779cae5e97f9fafd7017c4ebbbc1"} Mar 09 09:07:38 crc kubenswrapper[4861]: I0309 09:07:38.490192 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-75sbr" event={"ID":"54ee5cc6-676d-412c-b2e8-66308fcfc3d6","Type":"ContainerStarted","Data":"bbf6f2125df286303742ffdff78dcddf4fdc14d9d23c0287f2f81ff23a27f331"} Mar 09 09:07:38 crc kubenswrapper[4861]: I0309 09:07:38.494796 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-jtnvg" event={"ID":"f300ad16-f055-45cf-ac02-960f49b5d426","Type":"ContainerStarted","Data":"186e11df2d1a064873eddca42628740c570fa812a5a6afded4630bd3f9ec4891"} Mar 09 09:07:38 crc kubenswrapper[4861]: I0309 09:07:38.495652 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-jtnvg" Mar 09 09:07:38 crc kubenswrapper[4861]: I0309 09:07:38.496870 4861 patch_prober.go:28] interesting pod/console-operator-58897d9998-jtnvg container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/readyz\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Mar 09 09:07:38 crc kubenswrapper[4861]: I0309 09:07:38.496912 4861 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-jtnvg" 
podUID="f300ad16-f055-45cf-ac02-960f49b5d426" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.11:8443/readyz\": dial tcp 10.217.0.11:8443: connect: connection refused" Mar 09 09:07:38 crc kubenswrapper[4861]: W0309 09:07:38.497172 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddea29153_0067_4ab9_b93c_54ad9fff1590.slice/crio-1449d480e71e6565e7c64bcfacf6be904f12d9067521bcb2cad4d89cd941730d WatchSource:0}: Error finding container 1449d480e71e6565e7c64bcfacf6be904f12d9067521bcb2cad4d89cd941730d: Status 404 returned error can't find the container with id 1449d480e71e6565e7c64bcfacf6be904f12d9067521bcb2cad4d89cd941730d Mar 09 09:07:38 crc kubenswrapper[4861]: I0309 09:07:38.498916 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-trcr9"] Mar 09 09:07:38 crc kubenswrapper[4861]: I0309 09:07:38.531872 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-c6sj6" Mar 09 09:07:38 crc kubenswrapper[4861]: I0309 09:07:38.543285 4861 generic.go:334] "Generic (PLEG): container finished" podID="6273411b-70f9-4fdf-bf82-d156b10a5824" containerID="cf2af8adb555c78182406e665daa75ec356f326ac2b80b008a1375744a9c56e6" exitCode=0 Mar 09 09:07:38 crc kubenswrapper[4861]: I0309 09:07:38.543441 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hn7r4" event={"ID":"6273411b-70f9-4fdf-bf82-d156b10a5824","Type":"ContainerDied","Data":"cf2af8adb555c78182406e665daa75ec356f326ac2b80b008a1375744a9c56e6"} Mar 09 09:07:38 crc kubenswrapper[4861]: I0309 09:07:38.543478 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hn7r4" 
event={"ID":"6273411b-70f9-4fdf-bf82-d156b10a5824","Type":"ContainerStarted","Data":"c412164bd0809a2f41df29c6344ff28fd6762a46a60a5a0228dc741c4818c6a8"} Mar 09 09:07:38 crc kubenswrapper[4861]: I0309 09:07:38.549630 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:07:38 crc kubenswrapper[4861]: I0309 09:07:38.551257 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jzt4r" event={"ID":"8e2a968f-c9a7-415e-895a-3d1f8a8a3e0b","Type":"ContainerStarted","Data":"4216f9398863c8e8d0e86388b8670a655a5077d0eb4ceef5f164bf7b41d89306"} Mar 09 09:07:38 crc kubenswrapper[4861]: I0309 09:07:38.553666 4861 patch_prober.go:28] interesting pod/router-default-5444994796-c6sj6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 09 09:07:38 crc kubenswrapper[4861]: [-]has-synced failed: reason withheld Mar 09 09:07:38 crc kubenswrapper[4861]: [+]process-running ok Mar 09 09:07:38 crc kubenswrapper[4861]: healthz check failed Mar 09 09:07:38 crc kubenswrapper[4861]: I0309 09:07:38.553700 4861 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-c6sj6" podUID="8ff37c1b-1688-42ce-8b0c-952d297ae4a0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 09:07:38 crc kubenswrapper[4861]: E0309 09:07:38.558238 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:07:39.058168091 +0000 UTC m=+102.143207522 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:07:38 crc kubenswrapper[4861]: I0309 09:07:38.653572 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cjkj6\" (UID: \"de2fac75-67e1-47c9-9507-b8b5e5857c32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cjkj6" Mar 09 09:07:38 crc kubenswrapper[4861]: E0309 09:07:38.657207 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:07:39.157187005 +0000 UTC m=+102.242226406 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cjkj6" (UID: "de2fac75-67e1-47c9-9507-b8b5e5857c32") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:07:38 crc kubenswrapper[4861]: I0309 09:07:38.707348 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zc95n"] Mar 09 09:07:38 crc kubenswrapper[4861]: I0309 09:07:38.720239 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fhw9h"] Mar 09 09:07:38 crc kubenswrapper[4861]: I0309 09:07:38.768201 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:07:38 crc kubenswrapper[4861]: E0309 09:07:38.770534 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:07:39.270195565 +0000 UTC m=+102.355235086 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:07:38 crc kubenswrapper[4861]: I0309 09:07:38.853269 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550780-wpnp9"] Mar 09 09:07:38 crc kubenswrapper[4861]: I0309 09:07:38.876026 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cjkj6\" (UID: \"de2fac75-67e1-47c9-9507-b8b5e5857c32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cjkj6" Mar 09 09:07:38 crc kubenswrapper[4861]: E0309 09:07:38.876933 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:07:39.376918665 +0000 UTC m=+102.461958076 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cjkj6" (UID: "de2fac75-67e1-47c9-9507-b8b5e5857c32") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:07:38 crc kubenswrapper[4861]: I0309 09:07:38.880657 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-wr7xx"] Mar 09 09:07:38 crc kubenswrapper[4861]: I0309 09:07:38.894221 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7mk6w"] Mar 09 09:07:38 crc kubenswrapper[4861]: I0309 09:07:38.941016 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-z6v85"] Mar 09 09:07:38 crc kubenswrapper[4861]: W0309 09:07:38.948702 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod608b7b11_f38a_4c4b_9e61_dab4f84c34c1.slice/crio-7c31071fdbbf656b6b9449cb94bebb761b671f078696c06cd2a736d5dee90660 WatchSource:0}: Error finding container 7c31071fdbbf656b6b9449cb94bebb761b671f078696c06cd2a736d5dee90660: Status 404 returned error can't find the container with id 7c31071fdbbf656b6b9449cb94bebb761b671f078696c06cd2a736d5dee90660 Mar 09 09:07:38 crc kubenswrapper[4861]: W0309 09:07:38.962301 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podabda7bc3_76f8_43a5_b0e6_0aa9271a5758.slice/crio-c3f58314dbe70bd6a088bc38f834b9811853ce71fec8bc6f525083b3fb2787a1 WatchSource:0}: Error finding container 
c3f58314dbe70bd6a088bc38f834b9811853ce71fec8bc6f525083b3fb2787a1: Status 404 returned error can't find the container with id c3f58314dbe70bd6a088bc38f834b9811853ce71fec8bc6f525083b3fb2787a1 Mar 09 09:07:38 crc kubenswrapper[4861]: I0309 09:07:38.977469 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:07:38 crc kubenswrapper[4861]: E0309 09:07:38.977823 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:07:39.477807751 +0000 UTC m=+102.562847152 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:07:39 crc kubenswrapper[4861]: I0309 09:07:39.056919 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-k2z5n"] Mar 09 09:07:39 crc kubenswrapper[4861]: I0309 09:07:39.079748 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cjkj6\" (UID: 
\"de2fac75-67e1-47c9-9507-b8b5e5857c32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cjkj6" Mar 09 09:07:39 crc kubenswrapper[4861]: E0309 09:07:39.080193 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:07:39.580173074 +0000 UTC m=+102.665212475 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cjkj6" (UID: "de2fac75-67e1-47c9-9507-b8b5e5857c32") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:07:39 crc kubenswrapper[4861]: I0309 09:07:39.180833 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:07:39 crc kubenswrapper[4861]: E0309 09:07:39.181155 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:07:39.68113844 +0000 UTC m=+102.766177841 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:07:39 crc kubenswrapper[4861]: I0309 09:07:39.289940 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cjkj6\" (UID: \"de2fac75-67e1-47c9-9507-b8b5e5857c32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cjkj6" Mar 09 09:07:39 crc kubenswrapper[4861]: E0309 09:07:39.295405 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:07:39.795201428 +0000 UTC m=+102.880240829 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cjkj6" (UID: "de2fac75-67e1-47c9-9507-b8b5e5857c32") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:07:39 crc kubenswrapper[4861]: I0309 09:07:39.326967 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-zrbdj"] Mar 09 09:07:39 crc kubenswrapper[4861]: I0309 09:07:39.391382 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:07:39 crc kubenswrapper[4861]: E0309 09:07:39.391720 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:07:39.891705705 +0000 UTC m=+102.976745106 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:07:39 crc kubenswrapper[4861]: I0309 09:07:39.406542 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-68457"] Mar 09 09:07:39 crc kubenswrapper[4861]: I0309 09:07:39.438993 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-5t4mn"] Mar 09 09:07:39 crc kubenswrapper[4861]: I0309 09:07:39.464159 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-kpns7"] Mar 09 09:07:39 crc kubenswrapper[4861]: I0309 09:07:39.492485 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-lz22f" podStartSLOduration=59.492441805 podStartE2EDuration="59.492441805s" podCreationTimestamp="2026-03-09 09:06:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:07:39.476682301 +0000 UTC m=+102.561721702" watchObservedRunningTime="2026-03-09 09:07:39.492441805 +0000 UTC m=+102.577481196" Mar 09 09:07:39 crc kubenswrapper[4861]: I0309 09:07:39.506528 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cjkj6\" (UID: \"de2fac75-67e1-47c9-9507-b8b5e5857c32\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-cjkj6" Mar 09 09:07:39 crc kubenswrapper[4861]: E0309 09:07:39.506977 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:07:40.006961155 +0000 UTC m=+103.092000556 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cjkj6" (UID: "de2fac75-67e1-47c9-9507-b8b5e5857c32") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:07:39 crc kubenswrapper[4861]: I0309 09:07:39.537551 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-dd22q"] Mar 09 09:07:39 crc kubenswrapper[4861]: I0309 09:07:39.542859 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-hj4tg"] Mar 09 09:07:39 crc kubenswrapper[4861]: I0309 09:07:39.546124 4861 patch_prober.go:28] interesting pod/router-default-5444994796-c6sj6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 09 09:07:39 crc kubenswrapper[4861]: [-]has-synced failed: reason withheld Mar 09 09:07:39 crc kubenswrapper[4861]: [+]process-running ok Mar 09 09:07:39 crc kubenswrapper[4861]: healthz check failed Mar 09 09:07:39 crc kubenswrapper[4861]: I0309 09:07:39.546406 4861 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-c6sj6" podUID="8ff37c1b-1688-42ce-8b0c-952d297ae4a0" 
containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 09:07:39 crc kubenswrapper[4861]: I0309 09:07:39.608110 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:07:39 crc kubenswrapper[4861]: E0309 09:07:39.610135 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:07:40.11010998 +0000 UTC m=+103.195149381 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:07:39 crc kubenswrapper[4861]: I0309 09:07:39.611616 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ldhj8" event={"ID":"271a4e3c-1aa2-4bf5-bdfe-0495910f75d6","Type":"ContainerStarted","Data":"86f70952db71737189d7d22a1692dc1b4120fdd8e66a0b46b8312fe8f9666696"} Mar 09 09:07:39 crc kubenswrapper[4861]: I0309 09:07:39.670803 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-h84v9" 
event={"ID":"ba6c5b9f-7812-4b6d-998b-6f368a6edf83","Type":"ContainerStarted","Data":"899ea0ba0c4b5f62853960fa69081f92b17aed5305f09ed483af5fa1a0833ac6"} Mar 09 09:07:39 crc kubenswrapper[4861]: I0309 09:07:39.670868 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-h84v9" Mar 09 09:07:39 crc kubenswrapper[4861]: I0309 09:07:39.674140 4861 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-h84v9 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.10:8443/healthz\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body= Mar 09 09:07:39 crc kubenswrapper[4861]: I0309 09:07:39.674241 4861 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-h84v9" podUID="ba6c5b9f-7812-4b6d-998b-6f368a6edf83" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.10:8443/healthz\": dial tcp 10.217.0.10:8443: connect: connection refused" Mar 09 09:07:39 crc kubenswrapper[4861]: I0309 09:07:39.687355 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-h6j45" event={"ID":"566d8448-e794-44ee-9d17-e92493adcd87","Type":"ContainerStarted","Data":"30f39dc9224d58e70ed45fd8eb103f38b56b55f1f7f630c2e10906e65466a189"} Mar 09 09:07:39 crc kubenswrapper[4861]: I0309 09:07:39.688185 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-h6j45" Mar 09 09:07:39 crc kubenswrapper[4861]: I0309 09:07:39.689146 4861 patch_prober.go:28] interesting pod/downloads-7954f5f757-h6j45 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.5:8080/\": dial tcp 10.217.0.5:8080: connect: connection refused" start-of-body= Mar 
09 09:07:39 crc kubenswrapper[4861]: I0309 09:07:39.689180 4861 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-h6j45" podUID="566d8448-e794-44ee-9d17-e92493adcd87" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.5:8080/\": dial tcp 10.217.0.5:8080: connect: connection refused" Mar 09 09:07:39 crc kubenswrapper[4861]: I0309 09:07:39.698704 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-trcr9" event={"ID":"d8afafe8-bf56-46a7-bab9-c5a1c221a740","Type":"ContainerStarted","Data":"0f356639ed5036a6c6754a7a70356f01b6adfd60de16ad6bf6e480dfe5a83d99"} Mar 09 09:07:39 crc kubenswrapper[4861]: I0309 09:07:39.702625 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-75sbr" event={"ID":"54ee5cc6-676d-412c-b2e8-66308fcfc3d6","Type":"ContainerStarted","Data":"8a32c84d353c0edaf18972a411084e45db0bc125f957f2f8309e521eeb9c248d"} Mar 09 09:07:39 crc kubenswrapper[4861]: I0309 09:07:39.713109 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cjkj6\" (UID: \"de2fac75-67e1-47c9-9507-b8b5e5857c32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cjkj6" Mar 09 09:07:39 crc kubenswrapper[4861]: E0309 09:07:39.714942 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:07:40.21492379 +0000 UTC m=+103.299963181 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cjkj6" (UID: "de2fac75-67e1-47c9-9507-b8b5e5857c32") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:07:39 crc kubenswrapper[4861]: I0309 09:07:39.737884 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-gqnxr" event={"ID":"da4004a6-c6fd-41d6-a651-b4aaec2d6454","Type":"ContainerStarted","Data":"ea7b4e7c8ab930a8b87c23ba7eeed2c587138e5013f84b26c69d6b490a3ee5ed"} Mar 09 09:07:39 crc kubenswrapper[4861]: I0309 09:07:39.742095 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lj7m8" event={"ID":"8415c9be-22de-49bb-8b53-6f06a923ef33","Type":"ContainerStarted","Data":"a3dc043d1258ff475cee317b5deb7493febcfe8fc0edab6f777e75b14857e95f"} Mar 09 09:07:39 crc kubenswrapper[4861]: I0309 09:07:39.753253 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-fsckq" event={"ID":"dea29153-0067-4ab9-b93c-54ad9fff1590","Type":"ContainerStarted","Data":"1449d480e71e6565e7c64bcfacf6be904f12d9067521bcb2cad4d89cd941730d"} Mar 09 09:07:39 crc kubenswrapper[4861]: I0309 09:07:39.762289 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-24x2s" event={"ID":"e3056772-5de8-4fed-9796-440422743470","Type":"ContainerStarted","Data":"4d1ce2601cad052786827071d75393c6ccf6804a3e692457583394ae96cb1cfc"} Mar 09 09:07:39 crc kubenswrapper[4861]: I0309 09:07:39.811886 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-h84v9" podStartSLOduration=58.811831477 podStartE2EDuration="58.811831477s" podCreationTimestamp="2026-03-09 09:06:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:07:39.806914875 +0000 UTC m=+102.891954296" watchObservedRunningTime="2026-03-09 09:07:39.811831477 +0000 UTC m=+102.896870878" Mar 09 09:07:39 crc kubenswrapper[4861]: I0309 09:07:39.814060 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:07:39 crc kubenswrapper[4861]: E0309 09:07:39.814206 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:07:40.314182251 +0000 UTC m=+103.399221652 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:07:39 crc kubenswrapper[4861]: I0309 09:07:39.814914 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cjkj6\" (UID: \"de2fac75-67e1-47c9-9507-b8b5e5857c32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cjkj6" Mar 09 09:07:39 crc kubenswrapper[4861]: E0309 09:07:39.815236 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:07:40.315226659 +0000 UTC m=+103.400266060 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cjkj6" (UID: "de2fac75-67e1-47c9-9507-b8b5e5857c32") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:07:39 crc kubenswrapper[4861]: I0309 09:07:39.816576 4861 generic.go:334] "Generic (PLEG): container finished" podID="2a29855a-bbc0-458f-a9a0-0ddfd8763f2d" containerID="2625c0e819ec8a3279db408ee0a1b09c0b0286033d801f27bc6c528cf5458829" exitCode=0 Mar 09 09:07:39 crc kubenswrapper[4861]: I0309 09:07:39.816850 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-dwjwj" event={"ID":"2a29855a-bbc0-458f-a9a0-0ddfd8763f2d","Type":"ContainerDied","Data":"2625c0e819ec8a3279db408ee0a1b09c0b0286033d801f27bc6c528cf5458829"} Mar 09 09:07:39 crc kubenswrapper[4861]: I0309 09:07:39.845409 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-5t4mn" event={"ID":"3b714fda-1eb1-4637-9808-a2d0d47b4a91","Type":"ContainerStarted","Data":"2dcfadb182802d5f4b2c212bc4b605cdf3970b67000e7e24ac7c1681036a06d2"} Mar 09 09:07:39 crc kubenswrapper[4861]: I0309 09:07:39.882500 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tm5tk" event={"ID":"41fbb42d-a8a5-4d95-ba2c-ceeac7e96bff","Type":"ContainerStarted","Data":"75f79e3e4567baccbfb013a1609bf815cc1aa778c7edd62983cd3745a2bf3cc3"} Mar 09 09:07:39 crc kubenswrapper[4861]: I0309 09:07:39.893877 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ldhj8" podStartSLOduration=58.893857984 
podStartE2EDuration="58.893857984s" podCreationTimestamp="2026-03-09 09:06:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:07:39.892845487 +0000 UTC m=+102.977884898" watchObservedRunningTime="2026-03-09 09:07:39.893857984 +0000 UTC m=+102.978897385"
Mar 09 09:07:39 crc kubenswrapper[4861]: I0309 09:07:39.894151 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-5m9c6" podStartSLOduration=59.894146232 podStartE2EDuration="59.894146232s" podCreationTimestamp="2026-03-09 09:06:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:07:39.850321103 +0000 UTC m=+102.935360504" watchObservedRunningTime="2026-03-09 09:07:39.894146232 +0000 UTC m=+102.979185633"
Mar 09 09:07:39 crc kubenswrapper[4861]: I0309 09:07:39.906146 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-c57s5" event={"ID":"5239e6e7-4afa-4b37-9582-e159b201453a","Type":"ContainerStarted","Data":"f1c16e1f96fecb5f8d89d31a1afa6d5834e85ba2463ac8b179cc87e64de783b3"}
Mar 09 09:07:39 crc kubenswrapper[4861]: I0309 09:07:39.917126 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 09:07:39 crc kubenswrapper[4861]: E0309 09:07:39.917620 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:07:40.41750954 +0000 UTC m=+103.502548941 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 09:07:39 crc kubenswrapper[4861]: I0309 09:07:39.917941 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cjkj6\" (UID: \"de2fac75-67e1-47c9-9507-b8b5e5857c32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cjkj6"
Mar 09 09:07:39 crc kubenswrapper[4861]: E0309 09:07:39.922974 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:07:40.422951757 +0000 UTC m=+103.507991158 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cjkj6" (UID: "de2fac75-67e1-47c9-9507-b8b5e5857c32") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 09:07:39 crc kubenswrapper[4861]: I0309 09:07:39.972968 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-qh9sg" podStartSLOduration=58.972949751 podStartE2EDuration="58.972949751s" podCreationTimestamp="2026-03-09 09:06:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:07:39.922480684 +0000 UTC m=+103.007520095" watchObservedRunningTime="2026-03-09 09:07:39.972949751 +0000 UTC m=+103.057989152"
Mar 09 09:07:39 crc kubenswrapper[4861]: I0309 09:07:39.994352 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jzt4r" event={"ID":"8e2a968f-c9a7-415e-895a-3d1f8a8a3e0b","Type":"ContainerStarted","Data":"0c08f9d88cfe75035dc13909cb79be9c89c7abc14a5b52db24868f10ec61ca35"}
Mar 09 09:07:40 crc kubenswrapper[4861]: I0309 09:07:40.002457 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-5m9c6" event={"ID":"b820aa76-e1f5-440b-b942-2a4468dc4d51","Type":"ContainerStarted","Data":"f42ea82f7d32c93173be8978541099278a3164a359b2f5cb20c856d8c53705db"}
Mar 09 09:07:40 crc kubenswrapper[4861]: I0309 09:07:40.004276 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lj7m8" podStartSLOduration=60.004255594 podStartE2EDuration="1m0.004255594s" podCreationTimestamp="2026-03-09 09:06:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:07:40.000204185 +0000 UTC m=+103.085243596" watchObservedRunningTime="2026-03-09 09:07:40.004255594 +0000 UTC m=+103.089294995"
Mar 09 09:07:40 crc kubenswrapper[4861]: I0309 09:07:40.004509 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-97dkg" podStartSLOduration=60.004503241 podStartE2EDuration="1m0.004503241s" podCreationTimestamp="2026-03-09 09:06:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:07:39.971318138 +0000 UTC m=+103.056357539" watchObservedRunningTime="2026-03-09 09:07:40.004503241 +0000 UTC m=+103.089542642"
Mar 09 09:07:40 crc kubenswrapper[4861]: I0309 09:07:40.019524 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 09:07:40 crc kubenswrapper[4861]: E0309 09:07:40.019931 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:07:40.519907285 +0000 UTC m=+103.604946706 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 09:07:40 crc kubenswrapper[4861]: I0309 09:07:40.026441 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-q8fz7" event={"ID":"71340af0-2591-4b41-ae96-e7f5fada7318","Type":"ContainerStarted","Data":"0b52a537c749a2966f26d7d9020972ba0e6fdea47cae4f9943ad4ef6c65130ba"}
Mar 09 09:07:40 crc kubenswrapper[4861]: I0309 09:07:40.026485 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-q8fz7" event={"ID":"71340af0-2591-4b41-ae96-e7f5fada7318","Type":"ContainerStarted","Data":"b1794384438e68eec48fd08efa13e95432b0de4ecc368a7132faf1310c2df05d"}
Mar 09 09:07:40 crc kubenswrapper[4861]: I0309 09:07:40.053114 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-97dkg" event={"ID":"eaee5667-e42c-4ef1-8c6b-279ee6fc171a","Type":"ContainerStarted","Data":"1cc70892b0dd7216c8ade7ec994511e85d17018d17079894b1284beeef727892"}
Mar 09 09:07:40 crc kubenswrapper[4861]: I0309 09:07:40.069615 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-96ksj" event={"ID":"5d9e8554-0dac-404d-8be9-6bb656818cd2","Type":"ContainerStarted","Data":"36b4e38ce358a47ee9fb2b0c9e7b3c319fc1b27b3df6295199f361c0dc4822b1"}
Mar 09 09:07:40 crc kubenswrapper[4861]: I0309 09:07:40.071090 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fhw9h" event={"ID":"db5f0bfe-4d3e-4b72-9dba-437fc3e94b93","Type":"ContainerStarted","Data":"1923db9d1dd0cb064f7c81a7d45037d036ec700c759881da4c7558590f1abc5d"}
Mar 09 09:07:40 crc kubenswrapper[4861]: I0309 09:07:40.072477 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-c6sj6" podStartSLOduration=59.072448658 podStartE2EDuration="59.072448658s" podCreationTimestamp="2026-03-09 09:06:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:07:40.053417217 +0000 UTC m=+103.138456618" watchObservedRunningTime="2026-03-09 09:07:40.072448658 +0000 UTC m=+103.157488059"
Mar 09 09:07:40 crc kubenswrapper[4861]: I0309 09:07:40.077054 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-68457" event={"ID":"7672aac2-6c30-4cab-82aa-285ef39ea67d","Type":"ContainerStarted","Data":"c6ca8411cd2fbf894a23f7ac22272eb619166d9b7677180725c85c5647bb2343"}
Mar 09 09:07:40 crc kubenswrapper[4861]: I0309 09:07:40.082853 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-jtnvg" podStartSLOduration=59.082824858 podStartE2EDuration="59.082824858s" podCreationTimestamp="2026-03-09 09:06:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:07:40.082083587 +0000 UTC m=+103.167122998" watchObservedRunningTime="2026-03-09 09:07:40.082824858 +0000 UTC m=+103.167864259"
Mar 09 09:07:40 crc kubenswrapper[4861]: I0309 09:07:40.110261 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-97dkg"
Mar 09 09:07:40 crc kubenswrapper[4861]: I0309 09:07:40.121472 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-wr7xx" event={"ID":"c7bed8f1-d158-4e04-aa35-99ff2c7cd59e","Type":"ContainerStarted","Data":"715f399f9baf539880be21f033f300bc8366774bd16699ea0ab62b21da477e6f"}
Mar 09 09:07:40 crc kubenswrapper[4861]: I0309 09:07:40.122826 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cjkj6\" (UID: \"de2fac75-67e1-47c9-9507-b8b5e5857c32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cjkj6"
Mar 09 09:07:40 crc kubenswrapper[4861]: E0309 09:07:40.126994 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:07:40.626978155 +0000 UTC m=+103.712017556 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cjkj6" (UID: "de2fac75-67e1-47c9-9507-b8b5e5857c32") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 09:07:40 crc kubenswrapper[4861]: I0309 09:07:40.141139 4861 generic.go:334] "Generic (PLEG): container finished" podID="2afc7adc-5b22-4203-9265-2ea4293f132f" containerID="48de69ee563042c54ff3561fb441ade309623b8192b4ceb2d2258ed471fa496b" exitCode=0
Mar 09 09:07:40 crc kubenswrapper[4861]: I0309 09:07:40.141290 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-l2jkp" event={"ID":"2afc7adc-5b22-4203-9265-2ea4293f132f","Type":"ContainerDied","Data":"48de69ee563042c54ff3561fb441ade309623b8192b4ceb2d2258ed471fa496b"}
Mar 09 09:07:40 crc kubenswrapper[4861]: I0309 09:07:40.141302 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-h6j45" podStartSLOduration=59.14127468 podStartE2EDuration="59.14127468s" podCreationTimestamp="2026-03-09 09:06:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:07:40.130604423 +0000 UTC m=+103.215643834" watchObservedRunningTime="2026-03-09 09:07:40.14127468 +0000 UTC m=+103.226314081"
Mar 09 09:07:40 crc kubenswrapper[4861]: I0309 09:07:40.181777 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2hgfm" event={"ID":"ed7e49ef-0cce-4e76-b000-e3f6b9542246","Type":"ContainerStarted","Data":"1f128dc477a945ecf2d7982b9540d00732edbd98beea5c2f71fb1ce82e0238b2"}
Mar 09 09:07:40 crc kubenswrapper[4861]: I0309 09:07:40.181841 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2hgfm" event={"ID":"ed7e49ef-0cce-4e76-b000-e3f6b9542246","Type":"ContainerStarted","Data":"b0ed207c3ab11defc50548e503be7aaaab888afe25ec4b59dbaf7d78ffc4fb08"}
Mar 09 09:07:40 crc kubenswrapper[4861]: I0309 09:07:40.224719 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550780-wpnp9" event={"ID":"608b7b11-f38a-4c4b-9e61-dab4f84c34c1","Type":"ContainerStarted","Data":"7c31071fdbbf656b6b9449cb94bebb761b671f078696c06cd2a736d5dee90660"}
Mar 09 09:07:40 crc kubenswrapper[4861]: I0309 09:07:40.225788 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 09:07:40 crc kubenswrapper[4861]: E0309 09:07:40.226911 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:07:40.726889933 +0000 UTC m=+103.811929334 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 09:07:40 crc kubenswrapper[4861]: I0309 09:07:40.243102 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-z6v85" event={"ID":"cf30c324-b218-45df-8462-1b76cc2825c2","Type":"ContainerStarted","Data":"7c20f3b2585a5449e8cfb82b3dc0ee47227eb7d0a3afac35945b5b81955d7507"}
Mar 09 09:07:40 crc kubenswrapper[4861]: I0309 09:07:40.249877 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-wdsnl" event={"ID":"4eed3eac-42f8-4683-9c1f-3733965e6af7","Type":"ContainerStarted","Data":"bb7717cb2e1a96f62fe1e45a3412bd03575544eb76b70a878b5f4243d0307f93"}
Mar 09 09:07:40 crc kubenswrapper[4861]: I0309 09:07:40.278910 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qmmgv" event={"ID":"eed5737e-ff85-457b-a0ed-6b0a750f68b1","Type":"ContainerStarted","Data":"bb55986ca402c1298b09faa6d12e0bf282e6894c544f2fa7c03d3c73e22d59ee"}
Mar 09 09:07:40 crc kubenswrapper[4861]: I0309 09:07:40.278970 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qmmgv" event={"ID":"eed5737e-ff85-457b-a0ed-6b0a750f68b1","Type":"ContainerStarted","Data":"4684c0cd62c893d7a67e12aa3a53c866932e78f1576fb0348416ea74c5a3a221"}
Mar 09 09:07:40 crc kubenswrapper[4861]: I0309 09:07:40.280085 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qmmgv"
Mar 09 09:07:40 crc kubenswrapper[4861]: I0309 09:07:40.282250 4861 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-qmmgv container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.38:8443/healthz\": dial tcp 10.217.0.38:8443: connect: connection refused" start-of-body=
Mar 09 09:07:40 crc kubenswrapper[4861]: I0309 09:07:40.282296 4861 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qmmgv" podUID="eed5737e-ff85-457b-a0ed-6b0a750f68b1" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.38:8443/healthz\": dial tcp 10.217.0.38:8443: connect: connection refused"
Mar 09 09:07:40 crc kubenswrapper[4861]: I0309 09:07:40.291280 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-jtnvg" event={"ID":"f300ad16-f055-45cf-ac02-960f49b5d426","Type":"ContainerStarted","Data":"1913d64d82928a82b57cb1d61f1c57affb918353f9c67d06a2fc2d5f10fd2ce2"}
Mar 09 09:07:40 crc kubenswrapper[4861]: I0309 09:07:40.298719 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-24x2s" podStartSLOduration=59.298702755 podStartE2EDuration="59.298702755s" podCreationTimestamp="2026-03-09 09:06:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:07:40.278902023 +0000 UTC m=+103.363941424" watchObservedRunningTime="2026-03-09 09:07:40.298702755 +0000 UTC m=+103.383742146"
Mar 09 09:07:40 crc kubenswrapper[4861]: I0309 09:07:40.304262 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-jtqjq" event={"ID":"7044838f-643f-4f07-9f45-0468786d3798","Type":"ContainerStarted","Data":"2d5905e965d7c5d905d07b624945914891767a8fa306fad79869203ed9ad9cb3"}
Mar 09 09:07:40 crc kubenswrapper[4861]: I0309 09:07:40.313551 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-jtnvg"
Mar 09 09:07:40 crc kubenswrapper[4861]: I0309 09:07:40.327651 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cjkj6\" (UID: \"de2fac75-67e1-47c9-9507-b8b5e5857c32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cjkj6"
Mar 09 09:07:40 crc kubenswrapper[4861]: E0309 09:07:40.329824 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:07:40.829813372 +0000 UTC m=+103.914852773 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cjkj6" (UID: "de2fac75-67e1-47c9-9507-b8b5e5857c32") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 09:07:40 crc kubenswrapper[4861]: I0309 09:07:40.348428 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7mk6w" event={"ID":"abda7bc3-76f8-43a5-b0e6-0aa9271a5758","Type":"ContainerStarted","Data":"c3f58314dbe70bd6a088bc38f834b9811853ce71fec8bc6f525083b3fb2787a1"}
Mar 09 09:07:40 crc kubenswrapper[4861]: I0309 09:07:40.362067 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-k2z5n" event={"ID":"3d2a0407-66d1-4d05-9623-fe968aa3b516","Type":"ContainerStarted","Data":"16e36988113813a7e5ac0aeca14109571e1840ea060a7df6bbed04c2a67dde39"}
Mar 09 09:07:40 crc kubenswrapper[4861]: I0309 09:07:40.371848 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-s96f9" event={"ID":"ad46ee5f-9c08-438c-8284-aa488f48e522","Type":"ContainerStarted","Data":"4fb02491f1732659df06c0d1171a3136f0c6b4403ff15b826b1cb98ad715fb57"}
Mar 09 09:07:40 crc kubenswrapper[4861]: I0309 09:07:40.372257 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-s96f9"
Mar 09 09:07:40 crc kubenswrapper[4861]: I0309 09:07:40.382802 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-s96f9"
Mar 09 09:07:40 crc kubenswrapper[4861]: I0309 09:07:40.383231 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-gkrbr" event={"ID":"20b8695b-8129-4f02-824d-5ca2a451d899","Type":"ContainerStarted","Data":"46db51df03b63c7c935e50810a78502a1ddf34243cb4a0ff7a9d13373c6a62db"}
Mar 09 09:07:40 crc kubenswrapper[4861]: I0309 09:07:40.384263 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-gkrbr"
Mar 09 09:07:40 crc kubenswrapper[4861]: I0309 09:07:40.395093 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-jzt4r" podStartSLOduration=59.395075228 podStartE2EDuration="59.395075228s" podCreationTimestamp="2026-03-09 09:06:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:07:40.394473041 +0000 UTC m=+103.479512442" watchObservedRunningTime="2026-03-09 09:07:40.395075228 +0000 UTC m=+103.480114619"
Mar 09 09:07:40 crc kubenswrapper[4861]: I0309 09:07:40.398335 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-gkrbr"
Mar 09 09:07:40 crc kubenswrapper[4861]: I0309 09:07:40.398384 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-zrbdj" event={"ID":"bf346b6e-1bfb-4753-b113-aa981202e7e7","Type":"ContainerStarted","Data":"e5a42371f33e312657c612ed5b7c4eccf4a1d47c90a278cb73d0ad5fb53e09f2"}
Mar 09 09:07:40 crc kubenswrapper[4861]: I0309 09:07:40.418468 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zc95n" event={"ID":"3bffc0ff-7dbe-425b-b81d-bad8f9a42e12","Type":"ContainerStarted","Data":"7809181ae3fbe796032cadb0380a5b2e884a759558e1929698393c33794f0a47"}
Mar 09 09:07:40 crc kubenswrapper[4861]: I0309 09:07:40.439060 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 09:07:40 crc kubenswrapper[4861]: E0309 09:07:40.439980 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:07:40.939965376 +0000 UTC m=+104.025004777 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 09:07:40 crc kubenswrapper[4861]: I0309 09:07:40.541979 4861 patch_prober.go:28] interesting pod/router-default-5444994796-c6sj6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 09 09:07:40 crc kubenswrapper[4861]: [-]has-synced failed: reason withheld
Mar 09 09:07:40 crc kubenswrapper[4861]: [+]process-running ok
Mar 09 09:07:40 crc kubenswrapper[4861]: healthz check failed
Mar 09 09:07:40 crc kubenswrapper[4861]: I0309 09:07:40.542272 4861 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-c6sj6" podUID="8ff37c1b-1688-42ce-8b0c-952d297ae4a0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 09 09:07:40 crc kubenswrapper[4861]: I0309 09:07:40.543033 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cjkj6\" (UID: \"de2fac75-67e1-47c9-9507-b8b5e5857c32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cjkj6"
Mar 09 09:07:40 crc kubenswrapper[4861]: I0309 09:07:40.543113 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1ce13338-8caa-4be7-80e7-791207626053-metrics-certs\") pod \"network-metrics-daemon-pp5xh\" (UID: \"1ce13338-8caa-4be7-80e7-791207626053\") " pod="openshift-multus/network-metrics-daemon-pp5xh"
Mar 09 09:07:40 crc kubenswrapper[4861]: E0309 09:07:40.545536 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:07:41.045525155 +0000 UTC m=+104.130564556 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cjkj6" (UID: "de2fac75-67e1-47c9-9507-b8b5e5857c32") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 09:07:40 crc kubenswrapper[4861]: I0309 09:07:40.564743 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1ce13338-8caa-4be7-80e7-791207626053-metrics-certs\") pod \"network-metrics-daemon-pp5xh\" (UID: \"1ce13338-8caa-4be7-80e7-791207626053\") " pod="openshift-multus/network-metrics-daemon-pp5xh"
Mar 09 09:07:40 crc kubenswrapper[4861]: I0309 09:07:40.595415 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2hgfm" podStartSLOduration=59.595399997 podStartE2EDuration="59.595399997s" podCreationTimestamp="2026-03-09 09:06:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:07:40.595188962 +0000 UTC m=+103.680228363" watchObservedRunningTime="2026-03-09 09:07:40.595399997 +0000 UTC m=+103.680439398"
Mar 09 09:07:40 crc kubenswrapper[4861]: I0309 09:07:40.644028 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 09:07:40 crc kubenswrapper[4861]: E0309 09:07:40.646213 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:07:41.146182013 +0000 UTC m=+104.231221414 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 09:07:40 crc kubenswrapper[4861]: I0309 09:07:40.646528 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qmmgv" podStartSLOduration=59.646512552 podStartE2EDuration="59.646512552s" podCreationTimestamp="2026-03-09 09:06:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:07:40.646504621 +0000 UTC m=+103.731544022" watchObservedRunningTime="2026-03-09 09:07:40.646512552 +0000 UTC m=+103.731551953"
Mar 09 09:07:40 crc kubenswrapper[4861]: I0309 09:07:40.669993 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-gkrbr" podStartSLOduration=59.669972313 podStartE2EDuration="59.669972313s" podCreationTimestamp="2026-03-09 09:06:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:07:40.669601083 +0000 UTC m=+103.754640504" watchObservedRunningTime="2026-03-09 09:07:40.669972313 +0000 UTC m=+103.755011724"
Mar 09 09:07:40 crc kubenswrapper[4861]: I0309 09:07:40.747152 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cjkj6\" (UID: \"de2fac75-67e1-47c9-9507-b8b5e5857c32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cjkj6"
Mar 09 09:07:40 crc kubenswrapper[4861]: E0309 09:07:40.747476 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:07:41.247463657 +0000 UTC m=+104.332503058 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cjkj6" (UID: "de2fac75-67e1-47c9-9507-b8b5e5857c32") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 09:07:40 crc kubenswrapper[4861]: I0309 09:07:40.775619 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pp5xh"
Mar 09 09:07:40 crc kubenswrapper[4861]: I0309 09:07:40.853899 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 09:07:40 crc kubenswrapper[4861]: E0309 09:07:40.854358 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:07:41.354339613 +0000 UTC m=+104.439379014 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 09:07:40 crc kubenswrapper[4861]: I0309 09:07:40.955435 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cjkj6\" (UID: \"de2fac75-67e1-47c9-9507-b8b5e5857c32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cjkj6"
Mar 09 09:07:40 crc kubenswrapper[4861]: E0309 09:07:40.955934 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:07:41.455921245 +0000 UTC m=+104.540960646 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cjkj6" (UID: "de2fac75-67e1-47c9-9507-b8b5e5857c32") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 09:07:41 crc kubenswrapper[4861]: I0309 09:07:41.057464 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 09:07:41 crc kubenswrapper[4861]: E0309 09:07:41.057884 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:07:41.557868148 +0000 UTC m=+104.642907549 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 09:07:41 crc kubenswrapper[4861]: I0309 09:07:41.163326 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cjkj6\" (UID: \"de2fac75-67e1-47c9-9507-b8b5e5857c32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cjkj6"
Mar 09 09:07:41 crc kubenswrapper[4861]: E0309 09:07:41.163793 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:07:41.663780377 +0000 UTC m=+104.748819778 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cjkj6" (UID: "de2fac75-67e1-47c9-9507-b8b5e5857c32") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 09:07:41 crc kubenswrapper[4861]: I0309 09:07:41.265547 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 09:07:41 crc kubenswrapper[4861]: E0309 09:07:41.266092 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:07:41.766075879 +0000 UTC m=+104.851115280 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:07:41 crc kubenswrapper[4861]: I0309 09:07:41.367357 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cjkj6\" (UID: \"de2fac75-67e1-47c9-9507-b8b5e5857c32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cjkj6" Mar 09 09:07:41 crc kubenswrapper[4861]: E0309 09:07:41.367730 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:07:41.867716623 +0000 UTC m=+104.952756024 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cjkj6" (UID: "de2fac75-67e1-47c9-9507-b8b5e5857c32") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:07:41 crc kubenswrapper[4861]: I0309 09:07:41.463876 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fhw9h" event={"ID":"db5f0bfe-4d3e-4b72-9dba-437fc3e94b93","Type":"ContainerStarted","Data":"8bae33b038dbccd7e323dafdf2c72a7136035ebe499e8ac8d7e6ff2ae5aaf28d"} Mar 09 09:07:41 crc kubenswrapper[4861]: I0309 09:07:41.470839 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:07:41 crc kubenswrapper[4861]: E0309 09:07:41.471203 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:07:41.971182717 +0000 UTC m=+105.056222118 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:07:41 crc kubenswrapper[4861]: I0309 09:07:41.493292 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-trcr9" event={"ID":"d8afafe8-bf56-46a7-bab9-c5a1c221a740","Type":"ContainerStarted","Data":"7199f6a775b6e4a8ccd93f7cb64a7c8ed292d09dccf51d872fcb647f5c68f35a"} Mar 09 09:07:41 crc kubenswrapper[4861]: I0309 09:07:41.512813 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-s96f9" podStartSLOduration=60.512798397 podStartE2EDuration="1m0.512798397s" podCreationTimestamp="2026-03-09 09:06:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:07:40.77134022 +0000 UTC m=+103.856379621" watchObservedRunningTime="2026-03-09 09:07:41.512798397 +0000 UTC m=+104.597837798" Mar 09 09:07:41 crc kubenswrapper[4861]: I0309 09:07:41.516323 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-fhw9h" podStartSLOduration=60.516314032 podStartE2EDuration="1m0.516314032s" podCreationTimestamp="2026-03-09 09:06:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:07:41.512018286 +0000 UTC m=+104.597057687" watchObservedRunningTime="2026-03-09 09:07:41.516314032 +0000 UTC 
m=+104.601353433" Mar 09 09:07:41 crc kubenswrapper[4861]: I0309 09:07:41.542179 4861 patch_prober.go:28] interesting pod/router-default-5444994796-c6sj6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 09 09:07:41 crc kubenswrapper[4861]: [-]has-synced failed: reason withheld Mar 09 09:07:41 crc kubenswrapper[4861]: [+]process-running ok Mar 09 09:07:41 crc kubenswrapper[4861]: healthz check failed Mar 09 09:07:41 crc kubenswrapper[4861]: I0309 09:07:41.542542 4861 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-c6sj6" podUID="8ff37c1b-1688-42ce-8b0c-952d297ae4a0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 09:07:41 crc kubenswrapper[4861]: I0309 09:07:41.542264 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-q8fz7" event={"ID":"71340af0-2591-4b41-ae96-e7f5fada7318","Type":"ContainerStarted","Data":"098dc83e95b89664fa0bd42699b85cf06dfdcb15cf56a18955ff27a01ea533dd"} Mar 09 09:07:41 crc kubenswrapper[4861]: I0309 09:07:41.565276 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-kpns7" event={"ID":"036eac53-a3f2-4efc-bd3e-4616e47d8901","Type":"ContainerStarted","Data":"a3a40691a8c906f623e2989907ffd0e45f1bccce5ebdfe9756057a98b671fabd"} Mar 09 09:07:41 crc kubenswrapper[4861]: I0309 09:07:41.571745 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-kpns7" event={"ID":"036eac53-a3f2-4efc-bd3e-4616e47d8901","Type":"ContainerStarted","Data":"8107cfb5416bec5598508f8a8ce9ae03c601eb26614ee64896e1bd97047a4b7e"} Mar 09 09:07:41 crc kubenswrapper[4861]: I0309 09:07:41.572767 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cjkj6\" (UID: \"de2fac75-67e1-47c9-9507-b8b5e5857c32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cjkj6" Mar 09 09:07:41 crc kubenswrapper[4861]: E0309 09:07:41.574534 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:07:42.074523247 +0000 UTC m=+105.159562648 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cjkj6" (UID: "de2fac75-67e1-47c9-9507-b8b5e5857c32") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:07:41 crc kubenswrapper[4861]: I0309 09:07:41.589299 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-trcr9" podStartSLOduration=60.589279814 podStartE2EDuration="1m0.589279814s" podCreationTimestamp="2026-03-09 09:06:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:07:41.560746686 +0000 UTC m=+104.645786087" watchObservedRunningTime="2026-03-09 09:07:41.589279814 +0000 UTC m=+104.674319215" Mar 09 09:07:41 crc kubenswrapper[4861]: I0309 09:07:41.639050 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-jtqjq" 
event={"ID":"7044838f-643f-4f07-9f45-0468786d3798","Type":"ContainerStarted","Data":"cad633688d9a70b7290f4ccdd498b72134ea9e872f235e0a2304ea7d8312d85b"} Mar 09 09:07:41 crc kubenswrapper[4861]: I0309 09:07:41.640798 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-q8fz7" podStartSLOduration=60.640789789 podStartE2EDuration="1m0.640789789s" podCreationTimestamp="2026-03-09 09:06:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:07:41.639568117 +0000 UTC m=+104.724607528" watchObservedRunningTime="2026-03-09 09:07:41.640789789 +0000 UTC m=+104.725829190" Mar 09 09:07:41 crc kubenswrapper[4861]: I0309 09:07:41.673297 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:07:41 crc kubenswrapper[4861]: E0309 09:07:41.673941 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:07:42.173912921 +0000 UTC m=+105.258952322 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:07:41 crc kubenswrapper[4861]: I0309 09:07:41.704298 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-fsckq" event={"ID":"dea29153-0067-4ab9-b93c-54ad9fff1590","Type":"ContainerStarted","Data":"4c251a0e018157b7e6dc2381e4da438b03f2ec7a61f315b8ad5aedd0aa41d00b"} Mar 09 09:07:41 crc kubenswrapper[4861]: I0309 09:07:41.707746 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-kpns7" podStartSLOduration=60.70772668 podStartE2EDuration="1m0.70772668s" podCreationTimestamp="2026-03-09 09:06:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:07:41.705784068 +0000 UTC m=+104.790823469" watchObservedRunningTime="2026-03-09 09:07:41.70772668 +0000 UTC m=+104.792766081" Mar 09 09:07:41 crc kubenswrapper[4861]: I0309 09:07:41.718486 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hj4tg" event={"ID":"17d9ce44-f425-4262-ab22-5edef1fad72e","Type":"ContainerStarted","Data":"c51d3a8645053dd0bb7f89c103c67248e551cb5642c0f81bd8b250dd7409e451"} Mar 09 09:07:41 crc kubenswrapper[4861]: I0309 09:07:41.718563 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hj4tg" 
event={"ID":"17d9ce44-f425-4262-ab22-5edef1fad72e","Type":"ContainerStarted","Data":"1cc433937b74a8834379a258885f79abdd4c1a8891d5c31d761e83e871e7d5e2"} Mar 09 09:07:41 crc kubenswrapper[4861]: I0309 09:07:41.749180 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-jtqjq" podStartSLOduration=7.7491623050000005 podStartE2EDuration="7.749162305s" podCreationTimestamp="2026-03-09 09:07:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:07:41.74860544 +0000 UTC m=+104.833644831" watchObservedRunningTime="2026-03-09 09:07:41.749162305 +0000 UTC m=+104.834201706" Mar 09 09:07:41 crc kubenswrapper[4861]: I0309 09:07:41.763590 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zc95n" event={"ID":"3bffc0ff-7dbe-425b-b81d-bad8f9a42e12","Type":"ContainerStarted","Data":"f38dc5bceb438a293beb977cd39a1d8d87496177fb96b20ebf18ad30e94f5c0c"} Mar 09 09:07:41 crc kubenswrapper[4861]: I0309 09:07:41.764401 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zc95n" Mar 09 09:07:41 crc kubenswrapper[4861]: I0309 09:07:41.778565 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cjkj6\" (UID: \"de2fac75-67e1-47c9-9507-b8b5e5857c32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cjkj6" Mar 09 09:07:41 crc kubenswrapper[4861]: E0309 09:07:41.778964 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: 
nodeName:}" failed. No retries permitted until 2026-03-09 09:07:42.278948366 +0000 UTC m=+105.363987767 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cjkj6" (UID: "de2fac75-67e1-47c9-9507-b8b5e5857c32") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:07:41 crc kubenswrapper[4861]: I0309 09:07:41.797002 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-75sbr" event={"ID":"54ee5cc6-676d-412c-b2e8-66308fcfc3d6","Type":"ContainerStarted","Data":"fb5209b4ac2f0d0d2dce51bbd4049f24991445193a4160066362f19c736d340d"} Mar 09 09:07:41 crc kubenswrapper[4861]: I0309 09:07:41.800484 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-wdsnl" event={"ID":"4eed3eac-42f8-4683-9c1f-3733965e6af7","Type":"ContainerStarted","Data":"60d212b5a01a67560a782f62fd8dcd1024f51380874431f6df6a561a0cc668d3"} Mar 09 09:07:41 crc kubenswrapper[4861]: I0309 09:07:41.800624 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-multus/cni-sysctl-allowlist-ds-wdsnl" Mar 09 09:07:41 crc kubenswrapper[4861]: I0309 09:07:41.813977 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-fsckq" podStartSLOduration=60.813946058 podStartE2EDuration="1m0.813946058s" podCreationTimestamp="2026-03-09 09:06:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:07:41.797890426 +0000 UTC m=+104.882929827" watchObservedRunningTime="2026-03-09 09:07:41.813946058 +0000 UTC 
m=+104.898985459" Mar 09 09:07:41 crc kubenswrapper[4861]: I0309 09:07:41.818138 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550780-wpnp9" event={"ID":"608b7b11-f38a-4c4b-9e61-dab4f84c34c1","Type":"ContainerStarted","Data":"32abcafa9c87fb0e758e3fc32846e0cfb9678f874a5c21171ddebf46b78f0189"} Mar 09 09:07:41 crc kubenswrapper[4861]: I0309 09:07:41.850625 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-gqnxr" event={"ID":"da4004a6-c6fd-41d6-a651-b4aaec2d6454","Type":"ContainerStarted","Data":"d9dd85fa0f214c95ea6defee74aa76c3985c64ea1988bf36e236385fc280958e"} Mar 09 09:07:41 crc kubenswrapper[4861]: I0309 09:07:41.851523 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hj4tg" podStartSLOduration=60.851501588 podStartE2EDuration="1m0.851501588s" podCreationTimestamp="2026-03-09 09:06:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:07:41.840461801 +0000 UTC m=+104.925501202" watchObservedRunningTime="2026-03-09 09:07:41.851501588 +0000 UTC m=+104.936540989" Mar 09 09:07:41 crc kubenswrapper[4861]: I0309 09:07:41.872255 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-wr7xx" event={"ID":"c7bed8f1-d158-4e04-aa35-99ff2c7cd59e","Type":"ContainerStarted","Data":"c199cae3f41d51bc663ca34ff8d031f385cd397a960168929730f44ff314fd17"} Mar 09 09:07:41 crc kubenswrapper[4861]: I0309 09:07:41.879665 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:07:41 crc kubenswrapper[4861]: E0309 09:07:41.880311 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:07:42.380278963 +0000 UTC m=+105.465318364 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:07:41 crc kubenswrapper[4861]: I0309 09:07:41.895438 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29550780-wpnp9" podStartSLOduration=60.895409369 podStartE2EDuration="1m0.895409369s" podCreationTimestamp="2026-03-09 09:06:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:07:41.877866878 +0000 UTC m=+104.962906279" watchObservedRunningTime="2026-03-09 09:07:41.895409369 +0000 UTC m=+104.980448770" Mar 09 09:07:41 crc kubenswrapper[4861]: I0309 09:07:41.905327 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-75sbr" podStartSLOduration=60.905308375 podStartE2EDuration="1m0.905308375s" podCreationTimestamp="2026-03-09 09:06:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 
09:07:41.903027935 +0000 UTC m=+104.988067336" watchObservedRunningTime="2026-03-09 09:07:41.905308375 +0000 UTC m=+104.990347766" Mar 09 09:07:41 crc kubenswrapper[4861]: I0309 09:07:41.913931 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-multus/cni-sysctl-allowlist-ds-wdsnl" Mar 09 09:07:41 crc kubenswrapper[4861]: I0309 09:07:41.925817 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-c57s5" event={"ID":"5239e6e7-4afa-4b37-9582-e159b201453a","Type":"ContainerStarted","Data":"b64fb4d6a85410d56580a24ce2564e610759d4445f2c83f95aeb1b52d9ec475b"} Mar 09 09:07:41 crc kubenswrapper[4861]: I0309 09:07:41.954573 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zc95n" podStartSLOduration=60.954555221 podStartE2EDuration="1m0.954555221s" podCreationTimestamp="2026-03-09 09:06:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:07:41.954443827 +0000 UTC m=+105.039483238" watchObservedRunningTime="2026-03-09 09:07:41.954555221 +0000 UTC m=+105.039594622" Mar 09 09:07:41 crc kubenswrapper[4861]: I0309 09:07:41.967346 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-pp5xh"] Mar 09 09:07:41 crc kubenswrapper[4861]: I0309 09:07:41.968363 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hn7r4" event={"ID":"6273411b-70f9-4fdf-bf82-d156b10a5824","Type":"ContainerStarted","Data":"c7d1797b92e94ceb0fe085e3e11eb1ad15cf9e8a2e607b3435234350bbd7a4ff"} Mar 09 09:07:42 crc kubenswrapper[4861]: I0309 09:07:41.994819 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cjkj6\" (UID: \"de2fac75-67e1-47c9-9507-b8b5e5857c32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cjkj6" Mar 09 09:07:42 crc kubenswrapper[4861]: E0309 09:07:41.998470 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:07:42.498457392 +0000 UTC m=+105.583496793 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cjkj6" (UID: "de2fac75-67e1-47c9-9507-b8b5e5857c32") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:07:42 crc kubenswrapper[4861]: I0309 09:07:42.008304 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/cni-sysctl-allowlist-ds-wdsnl" podStartSLOduration=8.008283216 podStartE2EDuration="8.008283216s" podCreationTimestamp="2026-03-09 09:07:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:07:42.004546075 +0000 UTC m=+105.089585476" watchObservedRunningTime="2026-03-09 09:07:42.008283216 +0000 UTC m=+105.093322617" Mar 09 09:07:42 crc kubenswrapper[4861]: I0309 09:07:42.040163 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-68457" event={"ID":"7672aac2-6c30-4cab-82aa-285ef39ea67d","Type":"ContainerStarted","Data":"8745065cb6258361a24099104f35c1c468a273e8d7b9d6ff17b82bd3580513e6"} Mar 09 09:07:42 
crc kubenswrapper[4861]: I0309 09:07:42.060351 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-c57s5" podStartSLOduration=61.060256594 podStartE2EDuration="1m1.060256594s" podCreationTimestamp="2026-03-09 09:06:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:07:42.059911755 +0000 UTC m=+105.144951156" watchObservedRunningTime="2026-03-09 09:07:42.060256594 +0000 UTC m=+105.145295995" Mar 09 09:07:42 crc kubenswrapper[4861]: I0309 09:07:42.097783 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:07:42 crc kubenswrapper[4861]: E0309 09:07:42.098230 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:07:42.598201195 +0000 UTC m=+105.683240596 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:07:42 crc kubenswrapper[4861]: I0309 09:07:42.134637 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-dd22q" event={"ID":"c553be85-38c4-45a7-9406-257b039d7734","Type":"ContainerStarted","Data":"522f8e298d6e14a89b5d591aa3210c97dc8e45bf0efb04d0c30ae18bbb34e244"} Mar 09 09:07:42 crc kubenswrapper[4861]: I0309 09:07:42.134694 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-dd22q" event={"ID":"c553be85-38c4-45a7-9406-257b039d7734","Type":"ContainerStarted","Data":"6c1e19b64d7b29833cdeb1ce93fd9e68ec61de35f6d9378f1c56df9822c5788f"} Mar 09 09:07:42 crc kubenswrapper[4861]: I0309 09:07:42.162061 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-wr7xx" podStartSLOduration=61.162023952 podStartE2EDuration="1m1.162023952s" podCreationTimestamp="2026-03-09 09:06:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:07:42.142074856 +0000 UTC m=+105.227114287" watchObservedRunningTime="2026-03-09 09:07:42.162023952 +0000 UTC m=+105.247063353" Mar 09 09:07:42 crc kubenswrapper[4861]: I0309 09:07:42.204145 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cjkj6\" (UID: \"de2fac75-67e1-47c9-9507-b8b5e5857c32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cjkj6" Mar 09 09:07:42 crc kubenswrapper[4861]: E0309 09:07:42.206397 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:07:42.706358934 +0000 UTC m=+105.791398335 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cjkj6" (UID: "de2fac75-67e1-47c9-9507-b8b5e5857c32") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:07:42 crc kubenswrapper[4861]: I0309 09:07:42.225716 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-zrbdj" event={"ID":"bf346b6e-1bfb-4753-b113-aa981202e7e7","Type":"ContainerStarted","Data":"c44614f2501ad4ca77fcb9c7100fc38a2c2f3e3998eb8759a8bb9ed61cb600cc"} Mar 09 09:07:42 crc kubenswrapper[4861]: I0309 09:07:42.230723 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hn7r4" podStartSLOduration=61.23068447 podStartE2EDuration="1m1.23068447s" podCreationTimestamp="2026-03-09 09:06:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:07:42.227259287 +0000 UTC m=+105.312298688" watchObservedRunningTime="2026-03-09 09:07:42.23068447 +0000 UTC m=+105.315723871" Mar 09 09:07:42 crc kubenswrapper[4861]: 
I0309 09:07:42.260871 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-gqnxr" podStartSLOduration=61.260856551 podStartE2EDuration="1m1.260856551s" podCreationTimestamp="2026-03-09 09:06:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:07:42.258814346 +0000 UTC m=+105.343853747" watchObservedRunningTime="2026-03-09 09:07:42.260856551 +0000 UTC m=+105.345895952" Mar 09 09:07:42 crc kubenswrapper[4861]: I0309 09:07:42.271403 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7mk6w" event={"ID":"abda7bc3-76f8-43a5-b0e6-0aa9271a5758","Type":"ContainerStarted","Data":"bf35324f7f4ba334003ce69ef658a779a146ce4cd5b9ca0a2569103498ef0029"} Mar 09 09:07:42 crc kubenswrapper[4861]: I0309 09:07:42.273433 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7mk6w" Mar 09 09:07:42 crc kubenswrapper[4861]: I0309 09:07:42.281509 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-5t4mn" event={"ID":"3b714fda-1eb1-4637-9808-a2d0d47b4a91","Type":"ContainerStarted","Data":"5b0e5853c20df9bf82b0ae6f95005597286bd2e8baeaa2bc6c7f3ebefc12f827"} Mar 09 09:07:42 crc kubenswrapper[4861]: I0309 09:07:42.309343 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:07:42 crc kubenswrapper[4861]: E0309 09:07:42.309483 4861 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:07:42.809455068 +0000 UTC m=+105.894494469 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:07:42 crc kubenswrapper[4861]: I0309 09:07:42.309752 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cjkj6\" (UID: \"de2fac75-67e1-47c9-9507-b8b5e5857c32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cjkj6" Mar 09 09:07:42 crc kubenswrapper[4861]: E0309 09:07:42.311285 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:07:42.811266697 +0000 UTC m=+105.896306098 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cjkj6" (UID: "de2fac75-67e1-47c9-9507-b8b5e5857c32") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:07:42 crc kubenswrapper[4861]: I0309 09:07:42.311833 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-k2z5n" event={"ID":"3d2a0407-66d1-4d05-9623-fe968aa3b516","Type":"ContainerStarted","Data":"33d40b96522231571b97073f146c87cba1bd5191afe63d1eae02ecdf7d93b9ea"} Mar 09 09:07:42 crc kubenswrapper[4861]: I0309 09:07:42.312295 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-k2z5n" Mar 09 09:07:42 crc kubenswrapper[4861]: I0309 09:07:42.313911 4861 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-k2z5n container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.28:8080/healthz\": dial tcp 10.217.0.28:8080: connect: connection refused" start-of-body= Mar 09 09:07:42 crc kubenswrapper[4861]: I0309 09:07:42.313955 4861 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-k2z5n" podUID="3d2a0407-66d1-4d05-9623-fe968aa3b516" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.28:8080/healthz\": dial tcp 10.217.0.28:8080: connect: connection refused" Mar 09 09:07:42 crc kubenswrapper[4861]: I0309 09:07:42.339807 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tm5tk" 
event={"ID":"41fbb42d-a8a5-4d95-ba2c-ceeac7e96bff","Type":"ContainerStarted","Data":"64d08a44ec7b38684d55f2dc20c106ce5386fc950921c37399bb303261b12a0b"} Mar 09 09:07:42 crc kubenswrapper[4861]: I0309 09:07:42.342331 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-96ksj" event={"ID":"5d9e8554-0dac-404d-8be9-6bb656818cd2","Type":"ContainerStarted","Data":"88cfedb6007ceff42484722243ce2f11cb09a48296babae72fbea45b8ec4400d"} Mar 09 09:07:42 crc kubenswrapper[4861]: I0309 09:07:42.344091 4861 patch_prober.go:28] interesting pod/downloads-7954f5f757-h6j45 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.5:8080/\": dial tcp 10.217.0.5:8080: connect: connection refused" start-of-body= Mar 09 09:07:42 crc kubenswrapper[4861]: I0309 09:07:42.344126 4861 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-h6j45" podUID="566d8448-e794-44ee-9d17-e92493adcd87" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.5:8080/\": dial tcp 10.217.0.5:8080: connect: connection refused" Mar 09 09:07:42 crc kubenswrapper[4861]: I0309 09:07:42.358626 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-qmmgv" Mar 09 09:07:42 crc kubenswrapper[4861]: I0309 09:07:42.368447 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-h84v9" Mar 09 09:07:42 crc kubenswrapper[4861]: I0309 09:07:42.412142 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:07:42 crc kubenswrapper[4861]: E0309 09:07:42.415095 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:07:42.915070479 +0000 UTC m=+106.000109880 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:07:42 crc kubenswrapper[4861]: I0309 09:07:42.432489 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-dd22q" podStartSLOduration=61.432468898 podStartE2EDuration="1m1.432468898s" podCreationTimestamp="2026-03-09 09:06:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:07:42.430834354 +0000 UTC m=+105.515873775" watchObservedRunningTime="2026-03-09 09:07:42.432468898 +0000 UTC m=+105.517508299" Mar 09 09:07:42 crc kubenswrapper[4861]: I0309 09:07:42.516081 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cjkj6\" (UID: \"de2fac75-67e1-47c9-9507-b8b5e5857c32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cjkj6" Mar 09 09:07:42 crc kubenswrapper[4861]: E0309 
09:07:42.516592 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:07:43.01657917 +0000 UTC m=+106.101618571 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cjkj6" (UID: "de2fac75-67e1-47c9-9507-b8b5e5857c32") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:07:42 crc kubenswrapper[4861]: I0309 09:07:42.537516 4861 patch_prober.go:28] interesting pod/router-default-5444994796-c6sj6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 09 09:07:42 crc kubenswrapper[4861]: [-]has-synced failed: reason withheld Mar 09 09:07:42 crc kubenswrapper[4861]: [+]process-running ok Mar 09 09:07:42 crc kubenswrapper[4861]: healthz check failed Mar 09 09:07:42 crc kubenswrapper[4861]: I0309 09:07:42.537803 4861 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-c6sj6" podUID="8ff37c1b-1688-42ce-8b0c-952d297ae4a0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 09:07:42 crc kubenswrapper[4861]: I0309 09:07:42.582900 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tm5tk" podStartSLOduration=61.582874833 podStartE2EDuration="1m1.582874833s" podCreationTimestamp="2026-03-09 09:06:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:07:42.579213336 +0000 UTC m=+105.664252747" watchObservedRunningTime="2026-03-09 09:07:42.582874833 +0000 UTC m=+105.667914234" Mar 09 09:07:42 crc kubenswrapper[4861]: I0309 09:07:42.583515 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-96ksj" podStartSLOduration=61.583507221 podStartE2EDuration="1m1.583507221s" podCreationTimestamp="2026-03-09 09:06:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:07:42.488014192 +0000 UTC m=+105.573053593" watchObservedRunningTime="2026-03-09 09:07:42.583507221 +0000 UTC m=+105.668546632" Mar 09 09:07:42 crc kubenswrapper[4861]: I0309 09:07:42.613087 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-5t4mn" podStartSLOduration=8.613052316 podStartE2EDuration="8.613052316s" podCreationTimestamp="2026-03-09 09:07:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:07:42.60763227 +0000 UTC m=+105.692671661" watchObservedRunningTime="2026-03-09 09:07:42.613052316 +0000 UTC m=+105.698091717" Mar 09 09:07:42 crc kubenswrapper[4861]: I0309 09:07:42.620156 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:07:42 crc kubenswrapper[4861]: E0309 09:07:42.620462 4861 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:07:43.120448034 +0000 UTC m=+106.205487435 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:07:42 crc kubenswrapper[4861]: I0309 09:07:42.654358 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-k2z5n" podStartSLOduration=61.654340946 podStartE2EDuration="1m1.654340946s" podCreationTimestamp="2026-03-09 09:06:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:07:42.651806689 +0000 UTC m=+105.736846090" watchObservedRunningTime="2026-03-09 09:07:42.654340946 +0000 UTC m=+105.739380347" Mar 09 09:07:42 crc kubenswrapper[4861]: I0309 09:07:42.721067 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cjkj6\" (UID: \"de2fac75-67e1-47c9-9507-b8b5e5857c32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cjkj6" Mar 09 09:07:42 crc kubenswrapper[4861]: E0309 09:07:42.721462 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:07:43.221449872 +0000 UTC m=+106.306489273 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cjkj6" (UID: "de2fac75-67e1-47c9-9507-b8b5e5857c32") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:07:42 crc kubenswrapper[4861]: I0309 09:07:42.723882 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-wdsnl"] Mar 09 09:07:42 crc kubenswrapper[4861]: I0309 09:07:42.770534 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7mk6w" podStartSLOduration=61.770506281 podStartE2EDuration="1m1.770506281s" podCreationTimestamp="2026-03-09 09:06:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:07:42.768228011 +0000 UTC m=+105.853267412" watchObservedRunningTime="2026-03-09 09:07:42.770506281 +0000 UTC m=+105.855545682" Mar 09 09:07:42 crc kubenswrapper[4861]: I0309 09:07:42.826403 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:07:42 crc kubenswrapper[4861]: E0309 09:07:42.855072 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:07:43.355025856 +0000 UTC m=+106.440065257 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:07:42 crc kubenswrapper[4861]: I0309 09:07:42.929606 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cjkj6\" (UID: \"de2fac75-67e1-47c9-9507-b8b5e5857c32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cjkj6" Mar 09 09:07:42 crc kubenswrapper[4861]: E0309 09:07:42.930219 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:07:43.430198527 +0000 UTC m=+106.515237928 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cjkj6" (UID: "de2fac75-67e1-47c9-9507-b8b5e5857c32") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:07:43 crc kubenswrapper[4861]: I0309 09:07:43.030907 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:07:43 crc kubenswrapper[4861]: E0309 09:07:43.031324 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:07:43.531307928 +0000 UTC m=+106.616347329 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:07:43 crc kubenswrapper[4861]: I0309 09:07:43.135018 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cjkj6\" (UID: \"de2fac75-67e1-47c9-9507-b8b5e5857c32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cjkj6" Mar 09 09:07:43 crc kubenswrapper[4861]: E0309 09:07:43.135466 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:07:43.635359257 +0000 UTC m=+106.720398658 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cjkj6" (UID: "de2fac75-67e1-47c9-9507-b8b5e5857c32") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:07:43 crc kubenswrapper[4861]: I0309 09:07:43.236230 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:07:43 crc kubenswrapper[4861]: E0309 09:07:43.236563 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:07:43.736527238 +0000 UTC m=+106.821566639 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:07:43 crc kubenswrapper[4861]: I0309 09:07:43.236788 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cjkj6\" (UID: \"de2fac75-67e1-47c9-9507-b8b5e5857c32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cjkj6" Mar 09 09:07:43 crc kubenswrapper[4861]: E0309 09:07:43.237109 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:07:43.737096744 +0000 UTC m=+106.822136145 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cjkj6" (UID: "de2fac75-67e1-47c9-9507-b8b5e5857c32") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:07:43 crc kubenswrapper[4861]: I0309 09:07:43.273247 4861 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-7mk6w container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.40:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 09:07:43 crc kubenswrapper[4861]: I0309 09:07:43.273493 4861 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7mk6w" podUID="abda7bc3-76f8-43a5-b0e6-0aa9271a5758" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.40:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 09:07:43 crc kubenswrapper[4861]: I0309 09:07:43.312259 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-cjwtv"] Mar 09 09:07:43 crc kubenswrapper[4861]: I0309 09:07:43.313498 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cjwtv" Mar 09 09:07:43 crc kubenswrapper[4861]: I0309 09:07:43.337882 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:07:43 crc kubenswrapper[4861]: E0309 09:07:43.338038 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:07:43.838015929 +0000 UTC m=+106.923055330 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:07:43 crc kubenswrapper[4861]: I0309 09:07:43.338074 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cjkj6\" (UID: \"de2fac75-67e1-47c9-9507-b8b5e5857c32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cjkj6" Mar 09 09:07:43 crc kubenswrapper[4861]: I0309 09:07:43.338106 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/3ad8332f-9ca2-4dd0-903f-2bf5723aa51e-catalog-content\") pod \"community-operators-cjwtv\" (UID: \"3ad8332f-9ca2-4dd0-903f-2bf5723aa51e\") " pod="openshift-marketplace/community-operators-cjwtv" Mar 09 09:07:43 crc kubenswrapper[4861]: I0309 09:07:43.338175 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zsg92\" (UniqueName: \"kubernetes.io/projected/3ad8332f-9ca2-4dd0-903f-2bf5723aa51e-kube-api-access-zsg92\") pod \"community-operators-cjwtv\" (UID: \"3ad8332f-9ca2-4dd0-903f-2bf5723aa51e\") " pod="openshift-marketplace/community-operators-cjwtv" Mar 09 09:07:43 crc kubenswrapper[4861]: I0309 09:07:43.338223 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ad8332f-9ca2-4dd0-903f-2bf5723aa51e-utilities\") pod \"community-operators-cjwtv\" (UID: \"3ad8332f-9ca2-4dd0-903f-2bf5723aa51e\") " pod="openshift-marketplace/community-operators-cjwtv" Mar 09 09:07:43 crc kubenswrapper[4861]: E0309 09:07:43.338514 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:07:43.838500912 +0000 UTC m=+106.923540313 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cjkj6" (UID: "de2fac75-67e1-47c9-9507-b8b5e5857c32") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 09:07:43 crc kubenswrapper[4861]: I0309 09:07:43.355240 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Mar 09 09:07:43 crc kubenswrapper[4861]: I0309 09:07:43.365853 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-96ksj" event={"ID":"5d9e8554-0dac-404d-8be9-6bb656818cd2","Type":"ContainerStarted","Data":"9b2662b0f918ee6a622d84a9dc96180797ef71d1b36c2dfc6ef6be748a81132c"}
Mar 09 09:07:43 crc kubenswrapper[4861]: I0309 09:07:43.371009 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-hj4tg" event={"ID":"17d9ce44-f425-4262-ab22-5edef1fad72e","Type":"ContainerStarted","Data":"b2b150011ac715bf8657c168afca61b235bc8a0ae86414822093c00939192381"}
Mar 09 09:07:43 crc kubenswrapper[4861]: I0309 09:07:43.372522 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cjwtv"]
Mar 09 09:07:43 crc kubenswrapper[4861]: I0309 09:07:43.376553 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zc95n" event={"ID":"3bffc0ff-7dbe-425b-b81d-bad8f9a42e12","Type":"ContainerStarted","Data":"8255710396fc4f0ae50b0dda6341eafb882c9f75d7333ee825a0e02341ebe792"}
Mar 09 09:07:43 crc kubenswrapper[4861]: I0309 09:07:43.380751 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-tm5tk" event={"ID":"41fbb42d-a8a5-4d95-ba2c-ceeac7e96bff","Type":"ContainerStarted","Data":"c6c901c5aacd8ac6a9f2b3d1d8f56f978b90d0dc590afe185adc927aeb094e8d"}
Mar 09 09:07:43 crc kubenswrapper[4861]: I0309 09:07:43.390241 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-68457" event={"ID":"7672aac2-6c30-4cab-82aa-285ef39ea67d","Type":"ContainerStarted","Data":"d84674a0fe830fe32d1bf2edd53d7f5d0450ede6e18f11b60f35c3c5b588ff13"}
Mar 09 09:07:43 crc kubenswrapper[4861]: I0309 09:07:43.395183 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-pp5xh" event={"ID":"1ce13338-8caa-4be7-80e7-791207626053","Type":"ContainerStarted","Data":"6a973c69e816207f5a3c301ac971118636529ac07575ed3b109ce5ba1a6c3556"}
Mar 09 09:07:43 crc kubenswrapper[4861]: I0309 09:07:43.395222 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-pp5xh" event={"ID":"1ce13338-8caa-4be7-80e7-791207626053","Type":"ContainerStarted","Data":"09f33b89df7c1aae26ab8e084086c867eb938a960ac2984906ebd6e95f21e5b9"}
Mar 09 09:07:43 crc kubenswrapper[4861]: I0309 09:07:43.395234 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-pp5xh" event={"ID":"1ce13338-8caa-4be7-80e7-791207626053","Type":"ContainerStarted","Data":"9417447864d3879f14bccff6d059ad21f9a87f2fd0b2551f9dd65c06901fd769"}
Mar 09 09:07:43 crc kubenswrapper[4861]: I0309 09:07:43.399413 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-zrbdj" event={"ID":"bf346b6e-1bfb-4753-b113-aa981202e7e7","Type":"ContainerStarted","Data":"1f574f5c05e4ca5d1cf0dafd64a893c2870d15d95b11fe5ca5b6a0a32e612ce0"}
Mar 09 09:07:43 crc kubenswrapper[4861]: I0309 09:07:43.399912 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-zrbdj"
Mar 09 09:07:43 crc kubenswrapper[4861]: I0309 09:07:43.406010 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-z6v85" event={"ID":"cf30c324-b218-45df-8462-1b76cc2825c2","Type":"ContainerStarted","Data":"4011d574795506bec93b1328ba68e65df9a7db6369bc051ff839edbbbfc6a551"}
Mar 09 09:07:43 crc kubenswrapper[4861]: I0309 09:07:43.416391 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-dwjwj" event={"ID":"2a29855a-bbc0-458f-a9a0-0ddfd8763f2d","Type":"ContainerStarted","Data":"01570b11f061efb23a6a803b120d525a74476702d11e6fcb94140e617396c015"}
Mar 09 09:07:43 crc kubenswrapper[4861]: I0309 09:07:43.416521 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-dwjwj"
Mar 09 09:07:43 crc kubenswrapper[4861]: I0309 09:07:43.418242 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-l2jkp" event={"ID":"2afc7adc-5b22-4203-9265-2ea4293f132f","Type":"ContainerStarted","Data":"c0329b48b12696a14aa9af2e02ee3d9fd12cbfd999a82bd98c2f562461cadf1e"}
Mar 09 09:07:43 crc kubenswrapper[4861]: I0309 09:07:43.418277 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-l2jkp" event={"ID":"2afc7adc-5b22-4203-9265-2ea4293f132f","Type":"ContainerStarted","Data":"83afadeeae6b00206704da7cce5470b32616d039fbe8feaa0fa3ec4000c665f9"}
Mar 09 09:07:43 crc kubenswrapper[4861]: I0309 09:07:43.419731 4861 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-k2z5n container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.28:8080/healthz\": dial tcp 10.217.0.28:8080: connect: connection refused" start-of-body=
Mar 09 09:07:43 crc kubenswrapper[4861]: I0309 09:07:43.419772 4861 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-k2z5n" podUID="3d2a0407-66d1-4d05-9623-fe968aa3b516" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.28:8080/healthz\": dial tcp 10.217.0.28:8080: connect: connection refused"
Mar 09 09:07:43 crc kubenswrapper[4861]: I0309 09:07:43.419820 4861 patch_prober.go:28] interesting pod/downloads-7954f5f757-h6j45 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.5:8080/\": dial tcp 10.217.0.5:8080: connect: connection refused" start-of-body=
Mar 09 09:07:43 crc kubenswrapper[4861]: I0309 09:07:43.419879 4861 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-h6j45" podUID="566d8448-e794-44ee-9d17-e92493adcd87" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.5:8080/\": dial tcp 10.217.0.5:8080: connect: connection refused"
Mar 09 09:07:43 crc kubenswrapper[4861]: I0309 09:07:43.444828 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 09:07:43 crc kubenswrapper[4861]: I0309 09:07:43.445093 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ad8332f-9ca2-4dd0-903f-2bf5723aa51e-utilities\") pod \"community-operators-cjwtv\" (UID: \"3ad8332f-9ca2-4dd0-903f-2bf5723aa51e\") " pod="openshift-marketplace/community-operators-cjwtv"
Mar 09 09:07:43 crc kubenswrapper[4861]: I0309 09:07:43.445158 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ad8332f-9ca2-4dd0-903f-2bf5723aa51e-catalog-content\") pod \"community-operators-cjwtv\" (UID: \"3ad8332f-9ca2-4dd0-903f-2bf5723aa51e\") " pod="openshift-marketplace/community-operators-cjwtv"
Mar 09 09:07:43 crc kubenswrapper[4861]: E0309 09:07:43.445292 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:07:43.945245283 +0000 UTC m=+107.030284694 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 09:07:43 crc kubenswrapper[4861]: I0309 09:07:43.445699 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ad8332f-9ca2-4dd0-903f-2bf5723aa51e-catalog-content\") pod \"community-operators-cjwtv\" (UID: \"3ad8332f-9ca2-4dd0-903f-2bf5723aa51e\") " pod="openshift-marketplace/community-operators-cjwtv"
Mar 09 09:07:43 crc kubenswrapper[4861]: I0309 09:07:43.445747 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ad8332f-9ca2-4dd0-903f-2bf5723aa51e-utilities\") pod \"community-operators-cjwtv\" (UID: \"3ad8332f-9ca2-4dd0-903f-2bf5723aa51e\") " pod="openshift-marketplace/community-operators-cjwtv"
Mar 09 09:07:43 crc kubenswrapper[4861]: I0309 09:07:43.450498 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zsg92\" (UniqueName: \"kubernetes.io/projected/3ad8332f-9ca2-4dd0-903f-2bf5723aa51e-kube-api-access-zsg92\") pod \"community-operators-cjwtv\" (UID: \"3ad8332f-9ca2-4dd0-903f-2bf5723aa51e\") " pod="openshift-marketplace/community-operators-cjwtv"
Mar 09 09:07:43 crc kubenswrapper[4861]: I0309 09:07:43.456576 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7mk6w"
Mar 09 09:07:43 crc kubenswrapper[4861]: I0309 09:07:43.458416 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-gcb6x"]
Mar 09 09:07:43 crc kubenswrapper[4861]: I0309 09:07:43.459351 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gcb6x"
Mar 09 09:07:43 crc kubenswrapper[4861]: I0309 09:07:43.465267 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Mar 09 09:07:43 crc kubenswrapper[4861]: I0309 09:07:43.480172 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gcb6x"]
Mar 09 09:07:43 crc kubenswrapper[4861]: I0309 09:07:43.487147 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-dwjwj" podStartSLOduration=62.48713146 podStartE2EDuration="1m2.48713146s" podCreationTimestamp="2026-03-09 09:06:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:07:43.487005097 +0000 UTC m=+106.572044498" watchObservedRunningTime="2026-03-09 09:07:43.48713146 +0000 UTC m=+106.572170861"
Mar 09 09:07:43 crc kubenswrapper[4861]: I0309 09:07:43.505566 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zsg92\" (UniqueName: \"kubernetes.io/projected/3ad8332f-9ca2-4dd0-903f-2bf5723aa51e-kube-api-access-zsg92\") pod \"community-operators-cjwtv\" (UID: \"3ad8332f-9ca2-4dd0-903f-2bf5723aa51e\") " pod="openshift-marketplace/community-operators-cjwtv"
Mar 09 09:07:43 crc kubenswrapper[4861]: I0309 09:07:43.533502 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-pp5xh" podStartSLOduration=62.533485517 podStartE2EDuration="1m2.533485517s" podCreationTimestamp="2026-03-09 09:06:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:07:43.532541782 +0000 UTC m=+106.617581193" watchObservedRunningTime="2026-03-09 09:07:43.533485517 +0000 UTC m=+106.618524918"
Mar 09 09:07:43 crc kubenswrapper[4861]: I0309 09:07:43.548726 4861 patch_prober.go:28] interesting pod/router-default-5444994796-c6sj6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 09 09:07:43 crc kubenswrapper[4861]: [-]has-synced failed: reason withheld
Mar 09 09:07:43 crc kubenswrapper[4861]: [+]process-running ok
Mar 09 09:07:43 crc kubenswrapper[4861]: healthz check failed
Mar 09 09:07:43 crc kubenswrapper[4861]: I0309 09:07:43.548805 4861 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-c6sj6" podUID="8ff37c1b-1688-42ce-8b0c-952d297ae4a0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 09 09:07:43 crc kubenswrapper[4861]: I0309 09:07:43.559074 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7595\" (UniqueName: \"kubernetes.io/projected/245d74cf-545f-43d3-ad40-5260aef18260-kube-api-access-t7595\") pod \"certified-operators-gcb6x\" (UID: \"245d74cf-545f-43d3-ad40-5260aef18260\") " pod="openshift-marketplace/certified-operators-gcb6x"
Mar 09 09:07:43 crc kubenswrapper[4861]: I0309 09:07:43.559121 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/245d74cf-545f-43d3-ad40-5260aef18260-catalog-content\") pod \"certified-operators-gcb6x\" (UID: \"245d74cf-545f-43d3-ad40-5260aef18260\") " pod="openshift-marketplace/certified-operators-gcb6x"
Mar 09 09:07:43 crc kubenswrapper[4861]: I0309 09:07:43.559566 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cjkj6\" (UID: \"de2fac75-67e1-47c9-9507-b8b5e5857c32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cjkj6"
Mar 09 09:07:43 crc kubenswrapper[4861]: I0309 09:07:43.559753 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/245d74cf-545f-43d3-ad40-5260aef18260-utilities\") pod \"certified-operators-gcb6x\" (UID: \"245d74cf-545f-43d3-ad40-5260aef18260\") " pod="openshift-marketplace/certified-operators-gcb6x"
Mar 09 09:07:43 crc kubenswrapper[4861]: E0309 09:07:43.580662 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:07:44.080638446 +0000 UTC m=+107.165677847 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cjkj6" (UID: "de2fac75-67e1-47c9-9507-b8b5e5857c32") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 09:07:43 crc kubenswrapper[4861]: I0309 09:07:43.599025 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-l2jkp" podStartSLOduration=63.59899776 podStartE2EDuration="1m3.59899776s" podCreationTimestamp="2026-03-09 09:06:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:07:43.582025243 +0000 UTC m=+106.667064644" watchObservedRunningTime="2026-03-09 09:07:43.59899776 +0000 UTC m=+106.684037151"
Mar 09 09:07:43 crc kubenswrapper[4861]: I0309 09:07:43.600319 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-mlqmj"]
Mar 09 09:07:43 crc kubenswrapper[4861]: I0309 09:07:43.608567 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mlqmj"
Mar 09 09:07:43 crc kubenswrapper[4861]: I0309 09:07:43.625705 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cjwtv"
Mar 09 09:07:43 crc kubenswrapper[4861]: I0309 09:07:43.638650 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mlqmj"]
Mar 09 09:07:43 crc kubenswrapper[4861]: I0309 09:07:43.651214 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-gkrbr"]
Mar 09 09:07:43 crc kubenswrapper[4861]: I0309 09:07:43.652099 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-68457" podStartSLOduration=62.652089218 podStartE2EDuration="1m2.652089218s" podCreationTimestamp="2026-03-09 09:06:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:07:43.605441984 +0000 UTC m=+106.690481385" watchObservedRunningTime="2026-03-09 09:07:43.652089218 +0000 UTC m=+106.737128619"
Mar 09 09:07:43 crc kubenswrapper[4861]: I0309 09:07:43.669647 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 09:07:43 crc kubenswrapper[4861]: I0309 09:07:43.669909 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/245d74cf-545f-43d3-ad40-5260aef18260-utilities\") pod \"certified-operators-gcb6x\" (UID: \"245d74cf-545f-43d3-ad40-5260aef18260\") " pod="openshift-marketplace/certified-operators-gcb6x"
Mar 09 09:07:43 crc kubenswrapper[4861]: I0309 09:07:43.669977 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8af01b5-95f0-43a2-b228-675b98c6203f-catalog-content\") pod \"community-operators-mlqmj\" (UID: \"e8af01b5-95f0-43a2-b228-675b98c6203f\") " pod="openshift-marketplace/community-operators-mlqmj"
Mar 09 09:07:43 crc kubenswrapper[4861]: I0309 09:07:43.670026 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7595\" (UniqueName: \"kubernetes.io/projected/245d74cf-545f-43d3-ad40-5260aef18260-kube-api-access-t7595\") pod \"certified-operators-gcb6x\" (UID: \"245d74cf-545f-43d3-ad40-5260aef18260\") " pod="openshift-marketplace/certified-operators-gcb6x"
Mar 09 09:07:43 crc kubenswrapper[4861]: I0309 09:07:43.670059 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/245d74cf-545f-43d3-ad40-5260aef18260-catalog-content\") pod \"certified-operators-gcb6x\" (UID: \"245d74cf-545f-43d3-ad40-5260aef18260\") " pod="openshift-marketplace/certified-operators-gcb6x"
Mar 09 09:07:43 crc kubenswrapper[4861]: I0309 09:07:43.670075 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8af01b5-95f0-43a2-b228-675b98c6203f-utilities\") pod \"community-operators-mlqmj\" (UID: \"e8af01b5-95f0-43a2-b228-675b98c6203f\") " pod="openshift-marketplace/community-operators-mlqmj"
Mar 09 09:07:43 crc kubenswrapper[4861]: I0309 09:07:43.670142 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rq2j\" (UniqueName: \"kubernetes.io/projected/e8af01b5-95f0-43a2-b228-675b98c6203f-kube-api-access-8rq2j\") pod \"community-operators-mlqmj\" (UID: \"e8af01b5-95f0-43a2-b228-675b98c6203f\") " pod="openshift-marketplace/community-operators-mlqmj"
Mar 09 09:07:43 crc kubenswrapper[4861]: E0309 09:07:43.670264 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:07:44.170246867 +0000 UTC m=+107.255286268 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 09:07:43 crc kubenswrapper[4861]: I0309 09:07:43.670660 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/245d74cf-545f-43d3-ad40-5260aef18260-utilities\") pod \"certified-operators-gcb6x\" (UID: \"245d74cf-545f-43d3-ad40-5260aef18260\") " pod="openshift-marketplace/certified-operators-gcb6x"
Mar 09 09:07:43 crc kubenswrapper[4861]: I0309 09:07:43.671079 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/245d74cf-545f-43d3-ad40-5260aef18260-catalog-content\") pod \"certified-operators-gcb6x\" (UID: \"245d74cf-545f-43d3-ad40-5260aef18260\") " pod="openshift-marketplace/certified-operators-gcb6x"
Mar 09 09:07:43 crc kubenswrapper[4861]: I0309 09:07:43.706967 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7595\" (UniqueName: \"kubernetes.io/projected/245d74cf-545f-43d3-ad40-5260aef18260-kube-api-access-t7595\") pod \"certified-operators-gcb6x\" (UID: \"245d74cf-545f-43d3-ad40-5260aef18260\") " pod="openshift-marketplace/certified-operators-gcb6x"
Mar 09 09:07:43 crc kubenswrapper[4861]: I0309 09:07:43.717589 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-zrbdj" podStartSLOduration=9.7175727 podStartE2EDuration="9.7175727s" podCreationTimestamp="2026-03-09 09:07:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:07:43.678464898 +0000 UTC m=+106.763504289" watchObservedRunningTime="2026-03-09 09:07:43.7175727 +0000 UTC m=+106.802612101"
Mar 09 09:07:43 crc kubenswrapper[4861]: I0309 09:07:43.718402 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-h84v9"]
Mar 09 09:07:43 crc kubenswrapper[4861]: I0309 09:07:43.779869 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-48m9h"]
Mar 09 09:07:43 crc kubenswrapper[4861]: I0309 09:07:43.782069 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cjkj6\" (UID: \"de2fac75-67e1-47c9-9507-b8b5e5857c32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cjkj6"
Mar 09 09:07:43 crc kubenswrapper[4861]: I0309 09:07:43.782410 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8af01b5-95f0-43a2-b228-675b98c6203f-catalog-content\") pod \"community-operators-mlqmj\" (UID: \"e8af01b5-95f0-43a2-b228-675b98c6203f\") " pod="openshift-marketplace/community-operators-mlqmj"
Mar 09 09:07:43 crc kubenswrapper[4861]: I0309 09:07:43.782454 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8af01b5-95f0-43a2-b228-675b98c6203f-utilities\") pod \"community-operators-mlqmj\" (UID: \"e8af01b5-95f0-43a2-b228-675b98c6203f\") " pod="openshift-marketplace/community-operators-mlqmj"
Mar 09 09:07:43 crc kubenswrapper[4861]: I0309 09:07:43.782495 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rq2j\" (UniqueName: \"kubernetes.io/projected/e8af01b5-95f0-43a2-b228-675b98c6203f-kube-api-access-8rq2j\") pod \"community-operators-mlqmj\" (UID: \"e8af01b5-95f0-43a2-b228-675b98c6203f\") " pod="openshift-marketplace/community-operators-mlqmj"
Mar 09 09:07:43 crc kubenswrapper[4861]: I0309 09:07:43.782636 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-48m9h"
Mar 09 09:07:43 crc kubenswrapper[4861]: E0309 09:07:43.782997 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:07:44.282986219 +0000 UTC m=+107.368025620 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cjkj6" (UID: "de2fac75-67e1-47c9-9507-b8b5e5857c32") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 09:07:43 crc kubenswrapper[4861]: I0309 09:07:43.783524 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8af01b5-95f0-43a2-b228-675b98c6203f-catalog-content\") pod \"community-operators-mlqmj\" (UID: \"e8af01b5-95f0-43a2-b228-675b98c6203f\") " pod="openshift-marketplace/community-operators-mlqmj"
Mar 09 09:07:43 crc kubenswrapper[4861]: I0309 09:07:43.786842 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8af01b5-95f0-43a2-b228-675b98c6203f-utilities\") pod \"community-operators-mlqmj\" (UID: \"e8af01b5-95f0-43a2-b228-675b98c6203f\") " pod="openshift-marketplace/community-operators-mlqmj"
Mar 09 09:07:43 crc kubenswrapper[4861]: I0309 09:07:43.787206 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gcb6x"
Mar 09 09:07:43 crc kubenswrapper[4861]: I0309 09:07:43.816426 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-48m9h"]
Mar 09 09:07:43 crc kubenswrapper[4861]: I0309 09:07:43.859293 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rq2j\" (UniqueName: \"kubernetes.io/projected/e8af01b5-95f0-43a2-b228-675b98c6203f-kube-api-access-8rq2j\") pod \"community-operators-mlqmj\" (UID: \"e8af01b5-95f0-43a2-b228-675b98c6203f\") " pod="openshift-marketplace/community-operators-mlqmj"
Mar 09 09:07:43 crc kubenswrapper[4861]: I0309 09:07:43.883708 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 09:07:43 crc kubenswrapper[4861]: I0309 09:07:43.884009 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6c8pq\" (UniqueName: \"kubernetes.io/projected/56bad3ba-c8af-4b69-96ae-93311a9d6151-kube-api-access-6c8pq\") pod \"certified-operators-48m9h\" (UID: \"56bad3ba-c8af-4b69-96ae-93311a9d6151\") " pod="openshift-marketplace/certified-operators-48m9h"
Mar 09 09:07:43 crc kubenswrapper[4861]: I0309 09:07:43.884036 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56bad3ba-c8af-4b69-96ae-93311a9d6151-utilities\") pod \"certified-operators-48m9h\" (UID: \"56bad3ba-c8af-4b69-96ae-93311a9d6151\") " pod="openshift-marketplace/certified-operators-48m9h"
Mar 09 09:07:43 crc kubenswrapper[4861]: I0309 09:07:43.884076 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56bad3ba-c8af-4b69-96ae-93311a9d6151-catalog-content\") pod \"certified-operators-48m9h\" (UID: \"56bad3ba-c8af-4b69-96ae-93311a9d6151\") " pod="openshift-marketplace/certified-operators-48m9h"
Mar 09 09:07:43 crc kubenswrapper[4861]: E0309 09:07:43.884224 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:07:44.384205453 +0000 UTC m=+107.469244854 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 09:07:43 crc kubenswrapper[4861]: I0309 09:07:43.946916 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mlqmj"
Mar 09 09:07:43 crc kubenswrapper[4861]: I0309 09:07:43.986140 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6c8pq\" (UniqueName: \"kubernetes.io/projected/56bad3ba-c8af-4b69-96ae-93311a9d6151-kube-api-access-6c8pq\") pod \"certified-operators-48m9h\" (UID: \"56bad3ba-c8af-4b69-96ae-93311a9d6151\") " pod="openshift-marketplace/certified-operators-48m9h"
Mar 09 09:07:43 crc kubenswrapper[4861]: I0309 09:07:43.986642 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56bad3ba-c8af-4b69-96ae-93311a9d6151-utilities\") pod \"certified-operators-48m9h\" (UID: \"56bad3ba-c8af-4b69-96ae-93311a9d6151\") " pod="openshift-marketplace/certified-operators-48m9h"
Mar 09 09:07:43 crc kubenswrapper[4861]: I0309 09:07:43.986696 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56bad3ba-c8af-4b69-96ae-93311a9d6151-catalog-content\") pod \"certified-operators-48m9h\" (UID: \"56bad3ba-c8af-4b69-96ae-93311a9d6151\") " pod="openshift-marketplace/certified-operators-48m9h"
Mar 09 09:07:43 crc kubenswrapper[4861]: I0309 09:07:43.986764 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cjkj6\" (UID: \"de2fac75-67e1-47c9-9507-b8b5e5857c32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cjkj6"
Mar 09 09:07:43 crc kubenswrapper[4861]: E0309 09:07:43.987167 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:07:44.487150032 +0000 UTC m=+107.572189433 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cjkj6" (UID: "de2fac75-67e1-47c9-9507-b8b5e5857c32") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 09:07:43 crc kubenswrapper[4861]: I0309 09:07:43.988257 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56bad3ba-c8af-4b69-96ae-93311a9d6151-utilities\") pod \"certified-operators-48m9h\" (UID: \"56bad3ba-c8af-4b69-96ae-93311a9d6151\") " pod="openshift-marketplace/certified-operators-48m9h"
Mar 09 09:07:43 crc kubenswrapper[4861]: I0309 09:07:43.988552 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56bad3ba-c8af-4b69-96ae-93311a9d6151-catalog-content\") pod \"certified-operators-48m9h\" (UID: \"56bad3ba-c8af-4b69-96ae-93311a9d6151\") " pod="openshift-marketplace/certified-operators-48m9h"
Mar 09 09:07:44 crc kubenswrapper[4861]: I0309 09:07:44.039140 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6c8pq\" (UniqueName: \"kubernetes.io/projected/56bad3ba-c8af-4b69-96ae-93311a9d6151-kube-api-access-6c8pq\") pod \"certified-operators-48m9h\" (UID: \"56bad3ba-c8af-4b69-96ae-93311a9d6151\") " pod="openshift-marketplace/certified-operators-48m9h"
Mar 09 09:07:44 crc kubenswrapper[4861]: I0309 09:07:44.087749 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 09:07:44 crc kubenswrapper[4861]: E0309 09:07:44.087894 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:07:44.587869781 +0000 UTC m=+107.672909182 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 09:07:44 crc kubenswrapper[4861]: I0309 09:07:44.088050 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cjkj6\" (UID: \"de2fac75-67e1-47c9-9507-b8b5e5857c32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cjkj6"
Mar 09 09:07:44 crc kubenswrapper[4861]: E0309 09:07:44.088384 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:07:44.588361955 +0000 UTC m=+107.673401356 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cjkj6" (UID: "de2fac75-67e1-47c9-9507-b8b5e5857c32") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 09:07:44 crc kubenswrapper[4861]: I0309 09:07:44.158199 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-48m9h"
Mar 09 09:07:44 crc kubenswrapper[4861]: I0309 09:07:44.203344 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 09:07:44 crc kubenswrapper[4861]: E0309 09:07:44.203886 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:07:44.703844911 +0000 UTC m=+107.788884312 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 09:07:44 crc kubenswrapper[4861]: I0309 09:07:44.264559 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cjwtv"]
Mar 09 09:07:44 crc kubenswrapper[4861]: I0309 09:07:44.305413 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cjkj6\" (UID: \"de2fac75-67e1-47c9-9507-b8b5e5857c32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cjkj6"
Mar 09 09:07:44 crc kubenswrapper[4861]: E0309 09:07:44.305805 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:07:44.805792014 +0000 UTC m=+107.890831415 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cjkj6" (UID: "de2fac75-67e1-47c9-9507-b8b5e5857c32") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:07:44 crc kubenswrapper[4861]: W0309 09:07:44.349556 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3ad8332f_9ca2_4dd0_903f_2bf5723aa51e.slice/crio-a27e8c3afe1afaa886e4e2322205334f8246ea013e7a8dec44b7b6f5eeb2e3e9 WatchSource:0}: Error finding container a27e8c3afe1afaa886e4e2322205334f8246ea013e7a8dec44b7b6f5eeb2e3e9: Status 404 returned error can't find the container with id a27e8c3afe1afaa886e4e2322205334f8246ea013e7a8dec44b7b6f5eeb2e3e9 Mar 09 09:07:44 crc kubenswrapper[4861]: I0309 09:07:44.406227 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:07:44 crc kubenswrapper[4861]: E0309 09:07:44.406522 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:07:44.906508004 +0000 UTC m=+107.991547405 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:07:44 crc kubenswrapper[4861]: I0309 09:07:44.463270 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cjwtv" event={"ID":"3ad8332f-9ca2-4dd0-903f-2bf5723aa51e","Type":"ContainerStarted","Data":"a27e8c3afe1afaa886e4e2322205334f8246ea013e7a8dec44b7b6f5eeb2e3e9"} Mar 09 09:07:44 crc kubenswrapper[4861]: I0309 09:07:44.491062 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-z6v85" event={"ID":"cf30c324-b218-45df-8462-1b76cc2825c2","Type":"ContainerStarted","Data":"c2c68b97c39fb3d3cc3bbfda20e1e3ef15d98dae1caf9a05855cb1fe5dbc540c"} Mar 09 09:07:44 crc kubenswrapper[4861]: I0309 09:07:44.493870 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-h84v9" podUID="ba6c5b9f-7812-4b6d-998b-6f368a6edf83" containerName="route-controller-manager" containerID="cri-o://899ea0ba0c4b5f62853960fa69081f92b17aed5305f09ed483af5fa1a0833ac6" gracePeriod=30 Mar 09 09:07:44 crc kubenswrapper[4861]: I0309 09:07:44.494093 4861 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-k2z5n container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.28:8080/healthz\": dial tcp 10.217.0.28:8080: connect: connection refused" start-of-body= Mar 09 09:07:44 crc kubenswrapper[4861]: I0309 09:07:44.494129 4861 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-marketplace/marketplace-operator-79b997595-k2z5n" podUID="3d2a0407-66d1-4d05-9623-fe968aa3b516" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.28:8080/healthz\": dial tcp 10.217.0.28:8080: connect: connection refused" Mar 09 09:07:44 crc kubenswrapper[4861]: I0309 09:07:44.494312 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-multus/cni-sysctl-allowlist-ds-wdsnl" podUID="4eed3eac-42f8-4683-9c1f-3733965e6af7" containerName="kube-multus-additional-cni-plugins" containerID="cri-o://60d212b5a01a67560a782f62fd8dcd1024f51380874431f6df6a561a0cc668d3" gracePeriod=30 Mar 09 09:07:44 crc kubenswrapper[4861]: I0309 09:07:44.496760 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-gkrbr" podUID="20b8695b-8129-4f02-824d-5ca2a451d899" containerName="controller-manager" containerID="cri-o://46db51df03b63c7c935e50810a78502a1ddf34243cb4a0ff7a9d13373c6a62db" gracePeriod=30 Mar 09 09:07:44 crc kubenswrapper[4861]: I0309 09:07:44.507192 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cjkj6\" (UID: \"de2fac75-67e1-47c9-9507-b8b5e5857c32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cjkj6" Mar 09 09:07:44 crc kubenswrapper[4861]: E0309 09:07:44.507448 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:07:45.007437929 +0000 UTC m=+108.092477330 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cjkj6" (UID: "de2fac75-67e1-47c9-9507-b8b5e5857c32") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:07:44 crc kubenswrapper[4861]: I0309 09:07:44.539879 4861 patch_prober.go:28] interesting pod/router-default-5444994796-c6sj6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 09 09:07:44 crc kubenswrapper[4861]: [-]has-synced failed: reason withheld Mar 09 09:07:44 crc kubenswrapper[4861]: [+]process-running ok Mar 09 09:07:44 crc kubenswrapper[4861]: healthz check failed Mar 09 09:07:44 crc kubenswrapper[4861]: I0309 09:07:44.539937 4861 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-c6sj6" podUID="8ff37c1b-1688-42ce-8b0c-952d297ae4a0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 09:07:44 crc kubenswrapper[4861]: I0309 09:07:44.596916 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gcb6x"] Mar 09 09:07:44 crc kubenswrapper[4861]: I0309 09:07:44.608581 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:07:44 crc kubenswrapper[4861]: E0309 09:07:44.609977 4861 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:07:45.109961397 +0000 UTC m=+108.195000798 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:07:44 crc kubenswrapper[4861]: I0309 09:07:44.690518 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-48m9h"] Mar 09 09:07:44 crc kubenswrapper[4861]: I0309 09:07:44.712159 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cjkj6\" (UID: \"de2fac75-67e1-47c9-9507-b8b5e5857c32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cjkj6" Mar 09 09:07:44 crc kubenswrapper[4861]: E0309 09:07:44.712464 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:07:45.212451894 +0000 UTC m=+108.297491295 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cjkj6" (UID: "de2fac75-67e1-47c9-9507-b8b5e5857c32") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:07:44 crc kubenswrapper[4861]: W0309 09:07:44.721746 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56bad3ba_c8af_4b69_96ae_93311a9d6151.slice/crio-da8faef02597c2965cc7577d9c3211e38a595d8ccb3e164bdcd2e162901057cd WatchSource:0}: Error finding container da8faef02597c2965cc7577d9c3211e38a595d8ccb3e164bdcd2e162901057cd: Status 404 returned error can't find the container with id da8faef02597c2965cc7577d9c3211e38a595d8ccb3e164bdcd2e162901057cd Mar 09 09:07:44 crc kubenswrapper[4861]: I0309 09:07:44.813301 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:07:44 crc kubenswrapper[4861]: E0309 09:07:44.813546 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:07:45.313509642 +0000 UTC m=+108.398549043 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:07:44 crc kubenswrapper[4861]: I0309 09:07:44.813623 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cjkj6\" (UID: \"de2fac75-67e1-47c9-9507-b8b5e5857c32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cjkj6" Mar 09 09:07:44 crc kubenswrapper[4861]: E0309 09:07:44.813945 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:07:45.313937205 +0000 UTC m=+108.398976606 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cjkj6" (UID: "de2fac75-67e1-47c9-9507-b8b5e5857c32") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:07:44 crc kubenswrapper[4861]: I0309 09:07:44.888701 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-dwjwj" Mar 09 09:07:44 crc kubenswrapper[4861]: I0309 09:07:44.900131 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mlqmj"] Mar 09 09:07:44 crc kubenswrapper[4861]: I0309 09:07:44.915514 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:07:44 crc kubenswrapper[4861]: E0309 09:07:44.916470 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:07:45.416327159 +0000 UTC m=+108.501366560 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.017137 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cjkj6\" (UID: \"de2fac75-67e1-47c9-9507-b8b5e5857c32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cjkj6" Mar 09 09:07:45 crc kubenswrapper[4861]: E0309 09:07:45.017433 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:07:45.517421098 +0000 UTC m=+108.602460499 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cjkj6" (UID: "de2fac75-67e1-47c9-9507-b8b5e5857c32") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.100434 4861 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.119741 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:07:45 crc kubenswrapper[4861]: E0309 09:07:45.119885 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:07:45.619850994 +0000 UTC m=+108.704890395 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.120146 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cjkj6\" (UID: \"de2fac75-67e1-47c9-9507-b8b5e5857c32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cjkj6" Mar 09 09:07:45 crc kubenswrapper[4861]: E0309 09:07:45.120657 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:07:45.620640766 +0000 UTC m=+108.705680157 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cjkj6" (UID: "de2fac75-67e1-47c9-9507-b8b5e5857c32") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.161077 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-fnk25"] Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.162061 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fnk25" Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.164172 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.175280 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fnk25"] Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.221127 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:07:45 crc kubenswrapper[4861]: E0309 09:07:45.221308 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:07:45.721281993 +0000 UTC m=+108.806321394 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.221399 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cjkj6\" (UID: \"de2fac75-67e1-47c9-9507-b8b5e5857c32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cjkj6" Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.221432 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhs5c\" (UniqueName: \"kubernetes.io/projected/0eaf35dc-b198-4fbb-9e43-ddac97f1f62b-kube-api-access-fhs5c\") pod \"redhat-marketplace-fnk25\" (UID: \"0eaf35dc-b198-4fbb-9e43-ddac97f1f62b\") " pod="openshift-marketplace/redhat-marketplace-fnk25" Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.221488 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0eaf35dc-b198-4fbb-9e43-ddac97f1f62b-utilities\") pod \"redhat-marketplace-fnk25\" (UID: \"0eaf35dc-b198-4fbb-9e43-ddac97f1f62b\") " pod="openshift-marketplace/redhat-marketplace-fnk25" Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.221510 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/0eaf35dc-b198-4fbb-9e43-ddac97f1f62b-catalog-content\") pod \"redhat-marketplace-fnk25\" (UID: \"0eaf35dc-b198-4fbb-9e43-ddac97f1f62b\") " pod="openshift-marketplace/redhat-marketplace-fnk25" Mar 09 09:07:45 crc kubenswrapper[4861]: E0309 09:07:45.221721 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:07:45.721714084 +0000 UTC m=+108.806753485 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cjkj6" (UID: "de2fac75-67e1-47c9-9507-b8b5e5857c32") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.323133 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:07:45 crc kubenswrapper[4861]: E0309 09:07:45.323382 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:07:45.823333719 +0000 UTC m=+108.908373130 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.323560 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0eaf35dc-b198-4fbb-9e43-ddac97f1f62b-utilities\") pod \"redhat-marketplace-fnk25\" (UID: \"0eaf35dc-b198-4fbb-9e43-ddac97f1f62b\") " pod="openshift-marketplace/redhat-marketplace-fnk25" Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.323598 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0eaf35dc-b198-4fbb-9e43-ddac97f1f62b-catalog-content\") pod \"redhat-marketplace-fnk25\" (UID: \"0eaf35dc-b198-4fbb-9e43-ddac97f1f62b\") " pod="openshift-marketplace/redhat-marketplace-fnk25" Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.323685 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cjkj6\" (UID: \"de2fac75-67e1-47c9-9507-b8b5e5857c32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cjkj6" Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.323734 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fhs5c\" (UniqueName: \"kubernetes.io/projected/0eaf35dc-b198-4fbb-9e43-ddac97f1f62b-kube-api-access-fhs5c\") pod \"redhat-marketplace-fnk25\" (UID: 
\"0eaf35dc-b198-4fbb-9e43-ddac97f1f62b\") " pod="openshift-marketplace/redhat-marketplace-fnk25"
Mar 09 09:07:45 crc kubenswrapper[4861]: E0309 09:07:45.324006 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:07:45.823992656 +0000 UTC m=+108.909032057 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cjkj6" (UID: "de2fac75-67e1-47c9-9507-b8b5e5857c32") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.324203 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0eaf35dc-b198-4fbb-9e43-ddac97f1f62b-utilities\") pod \"redhat-marketplace-fnk25\" (UID: \"0eaf35dc-b198-4fbb-9e43-ddac97f1f62b\") " pod="openshift-marketplace/redhat-marketplace-fnk25"
Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.324230 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0eaf35dc-b198-4fbb-9e43-ddac97f1f62b-catalog-content\") pod \"redhat-marketplace-fnk25\" (UID: \"0eaf35dc-b198-4fbb-9e43-ddac97f1f62b\") " pod="openshift-marketplace/redhat-marketplace-fnk25"
Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.352196 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhs5c\" (UniqueName: \"kubernetes.io/projected/0eaf35dc-b198-4fbb-9e43-ddac97f1f62b-kube-api-access-fhs5c\") pod \"redhat-marketplace-fnk25\" (UID: \"0eaf35dc-b198-4fbb-9e43-ddac97f1f62b\") " pod="openshift-marketplace/redhat-marketplace-fnk25"
Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.425211 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 09:07:45 crc kubenswrapper[4861]: E0309 09:07:45.425461 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:07:45.925416925 +0000 UTC m=+109.010456326 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.425689 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cjkj6\" (UID: \"de2fac75-67e1-47c9-9507-b8b5e5857c32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cjkj6"
Mar 09 09:07:45 crc kubenswrapper[4861]: E0309 09:07:45.426061 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:07:45.926045511 +0000 UTC m=+109.011084912 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cjkj6" (UID: "de2fac75-67e1-47c9-9507-b8b5e5857c32") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.495019 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-gkrbr"
Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.509110 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-z6v85" event={"ID":"cf30c324-b218-45df-8462-1b76cc2825c2","Type":"ContainerStarted","Data":"e1641046690c03c7c7b139e8a901d7528d85c06b8b79c5402b579ce4fee4abc3"}
Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.509168 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-z6v85" event={"ID":"cf30c324-b218-45df-8462-1b76cc2825c2","Type":"ContainerStarted","Data":"beab83ac784c1f88f62c8db0c0ecea37c457965ac0992ccc801448be5792bd5a"}
Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.513171 4861 generic.go:334] "Generic (PLEG): container finished" podID="245d74cf-545f-43d3-ad40-5260aef18260" containerID="dfbeac2019ba0e86c10d28619a8d9daf60a74580546b76ca5ed8562bcbb84c7f" exitCode=0
Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.513740 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gcb6x" event={"ID":"245d74cf-545f-43d3-ad40-5260aef18260","Type":"ContainerDied","Data":"dfbeac2019ba0e86c10d28619a8d9daf60a74580546b76ca5ed8562bcbb84c7f"}
Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.513790 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gcb6x" event={"ID":"245d74cf-545f-43d3-ad40-5260aef18260","Type":"ContainerStarted","Data":"5abe4c2768b598d5b864e92b23d9817f646b967b24e4afc434a15d4ea8ba4db2"}
Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.515340 4861 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.515472 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-h84v9"
Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.525632 4861 generic.go:334] "Generic (PLEG): container finished" podID="e8af01b5-95f0-43a2-b228-675b98c6203f" containerID="d1ac58f66678ed0efa4f1a5b7ccefaea6deda777c9f9db670dce14c3a9eb20f7" exitCode=0
Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.525719 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mlqmj" event={"ID":"e8af01b5-95f0-43a2-b228-675b98c6203f","Type":"ContainerDied","Data":"d1ac58f66678ed0efa4f1a5b7ccefaea6deda777c9f9db670dce14c3a9eb20f7"}
Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.525746 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mlqmj" event={"ID":"e8af01b5-95f0-43a2-b228-675b98c6203f","Type":"ContainerStarted","Data":"68989ef329cac07fd635ca2c012abc573f1bd9468a9fc65d744edfa3c6db3a54"}
Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.527601 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.527635 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20b8695b-8129-4f02-824d-5ca2a451d899-config\") pod \"20b8695b-8129-4f02-824d-5ca2a451d899\" (UID: \"20b8695b-8129-4f02-824d-5ca2a451d899\") "
Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.527664 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ba6c5b9f-7812-4b6d-998b-6f368a6edf83-client-ca\") pod \"ba6c5b9f-7812-4b6d-998b-6f368a6edf83\" (UID: \"ba6c5b9f-7812-4b6d-998b-6f368a6edf83\") "
Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.527689 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mj42q\" (UniqueName: \"kubernetes.io/projected/ba6c5b9f-7812-4b6d-998b-6f368a6edf83-kube-api-access-mj42q\") pod \"ba6c5b9f-7812-4b6d-998b-6f368a6edf83\" (UID: \"ba6c5b9f-7812-4b6d-998b-6f368a6edf83\") "
Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.527716 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ba6c5b9f-7812-4b6d-998b-6f368a6edf83-serving-cert\") pod \"ba6c5b9f-7812-4b6d-998b-6f368a6edf83\" (UID: \"ba6c5b9f-7812-4b6d-998b-6f368a6edf83\") "
Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.527743 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dhdr9\" (UniqueName: \"kubernetes.io/projected/20b8695b-8129-4f02-824d-5ca2a451d899-kube-api-access-dhdr9\") pod \"20b8695b-8129-4f02-824d-5ca2a451d899\" (UID: \"20b8695b-8129-4f02-824d-5ca2a451d899\") "
Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.527841 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/20b8695b-8129-4f02-824d-5ca2a451d899-proxy-ca-bundles\") pod \"20b8695b-8129-4f02-824d-5ca2a451d899\" (UID: \"20b8695b-8129-4f02-824d-5ca2a451d899\") "
Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.527870 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/20b8695b-8129-4f02-824d-5ca2a451d899-serving-cert\") pod \"20b8695b-8129-4f02-824d-5ca2a451d899\" (UID: \"20b8695b-8129-4f02-824d-5ca2a451d899\") "
Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.527898 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba6c5b9f-7812-4b6d-998b-6f368a6edf83-config\") pod \"ba6c5b9f-7812-4b6d-998b-6f368a6edf83\" (UID: \"ba6c5b9f-7812-4b6d-998b-6f368a6edf83\") "
Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.527919 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/20b8695b-8129-4f02-824d-5ca2a451d899-client-ca\") pod \"20b8695b-8129-4f02-824d-5ca2a451d899\" (UID: \"20b8695b-8129-4f02-824d-5ca2a451d899\") "
Mar 09 09:07:45 crc kubenswrapper[4861]: E0309 09:07:45.528772 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:07:46.028757034 +0000 UTC m=+109.113796435 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.529941 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/20b8695b-8129-4f02-824d-5ca2a451d899-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "20b8695b-8129-4f02-824d-5ca2a451d899" (UID: "20b8695b-8129-4f02-824d-5ca2a451d899"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.530614 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba6c5b9f-7812-4b6d-998b-6f368a6edf83-config" (OuterVolumeSpecName: "config") pod "ba6c5b9f-7812-4b6d-998b-6f368a6edf83" (UID: "ba6c5b9f-7812-4b6d-998b-6f368a6edf83"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.530988 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/20b8695b-8129-4f02-824d-5ca2a451d899-client-ca" (OuterVolumeSpecName: "client-ca") pod "20b8695b-8129-4f02-824d-5ca2a451d899" (UID: "20b8695b-8129-4f02-824d-5ca2a451d899"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.530996 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba6c5b9f-7812-4b6d-998b-6f368a6edf83-client-ca" (OuterVolumeSpecName: "client-ca") pod "ba6c5b9f-7812-4b6d-998b-6f368a6edf83" (UID: "ba6c5b9f-7812-4b6d-998b-6f368a6edf83"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.533938 4861 generic.go:334] "Generic (PLEG): container finished" podID="ba6c5b9f-7812-4b6d-998b-6f368a6edf83" containerID="899ea0ba0c4b5f62853960fa69081f92b17aed5305f09ed483af5fa1a0833ac6" exitCode=0
Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.534051 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-h84v9" event={"ID":"ba6c5b9f-7812-4b6d-998b-6f368a6edf83","Type":"ContainerDied","Data":"899ea0ba0c4b5f62853960fa69081f92b17aed5305f09ed483af5fa1a0833ac6"}
Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.534094 4861 scope.go:117] "RemoveContainer" containerID="899ea0ba0c4b5f62853960fa69081f92b17aed5305f09ed483af5fa1a0833ac6"
Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.534109 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-h84v9"
Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.535348 4861 patch_prober.go:28] interesting pod/router-default-5444994796-c6sj6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 09 09:07:45 crc kubenswrapper[4861]: [-]has-synced failed: reason withheld
Mar 09 09:07:45 crc kubenswrapper[4861]: [+]process-running ok
Mar 09 09:07:45 crc kubenswrapper[4861]: healthz check failed
Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.535415 4861 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-c6sj6" podUID="8ff37c1b-1688-42ce-8b0c-952d297ae4a0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.537694 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-748457ff4-72cdj"]
Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.537847 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba6c5b9f-7812-4b6d-998b-6f368a6edf83-kube-api-access-mj42q" (OuterVolumeSpecName: "kube-api-access-mj42q") pod "ba6c5b9f-7812-4b6d-998b-6f368a6edf83" (UID: "ba6c5b9f-7812-4b6d-998b-6f368a6edf83"). InnerVolumeSpecName "kube-api-access-mj42q". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:07:45 crc kubenswrapper[4861]: E0309 09:07:45.537892 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba6c5b9f-7812-4b6d-998b-6f368a6edf83" containerName="route-controller-manager"
Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.537902 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba6c5b9f-7812-4b6d-998b-6f368a6edf83" containerName="route-controller-manager"
Mar 09 09:07:45 crc kubenswrapper[4861]: E0309 09:07:45.537916 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20b8695b-8129-4f02-824d-5ca2a451d899" containerName="controller-manager"
Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.537922 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="20b8695b-8129-4f02-824d-5ca2a451d899" containerName="controller-manager"
Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.538036 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="20b8695b-8129-4f02-824d-5ca2a451d899" containerName="controller-manager"
Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.538052 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba6c5b9f-7812-4b6d-998b-6f368a6edf83" containerName="route-controller-manager"
Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.538228 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/20b8695b-8129-4f02-824d-5ca2a451d899-config" (OuterVolumeSpecName: "config") pod "20b8695b-8129-4f02-824d-5ca2a451d899" (UID: "20b8695b-8129-4f02-824d-5ca2a451d899"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.538496 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b8695b-8129-4f02-824d-5ca2a451d899-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "20b8695b-8129-4f02-824d-5ca2a451d899" (UID: "20b8695b-8129-4f02-824d-5ca2a451d899"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.538699 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba6c5b9f-7812-4b6d-998b-6f368a6edf83-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ba6c5b9f-7812-4b6d-998b-6f368a6edf83" (UID: "ba6c5b9f-7812-4b6d-998b-6f368a6edf83"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.538724 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-748457ff4-72cdj"
Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.541905 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b8695b-8129-4f02-824d-5ca2a451d899-kube-api-access-dhdr9" (OuterVolumeSpecName: "kube-api-access-dhdr9") pod "20b8695b-8129-4f02-824d-5ca2a451d899" (UID: "20b8695b-8129-4f02-824d-5ca2a451d899"). InnerVolumeSpecName "kube-api-access-dhdr9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.543845 4861 generic.go:334] "Generic (PLEG): container finished" podID="20b8695b-8129-4f02-824d-5ca2a451d899" containerID="46db51df03b63c7c935e50810a78502a1ddf34243cb4a0ff7a9d13373c6a62db" exitCode=0
Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.543939 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-gkrbr" event={"ID":"20b8695b-8129-4f02-824d-5ca2a451d899","Type":"ContainerDied","Data":"46db51df03b63c7c935e50810a78502a1ddf34243cb4a0ff7a9d13373c6a62db"}
Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.543986 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-gkrbr" event={"ID":"20b8695b-8129-4f02-824d-5ca2a451d899","Type":"ContainerDied","Data":"32a528bff3599f95196bef59a995a79ac5f994c3a1fa22230888690c326f8578"}
Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.543959 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-gkrbr"
Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.553089 4861 generic.go:334] "Generic (PLEG): container finished" podID="3ad8332f-9ca2-4dd0-903f-2bf5723aa51e" containerID="c405bdf8a6e7ed5d4413daa15ca151b9128a666746fe5186517f5506ec79d414" exitCode=0
Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.553215 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cjwtv" event={"ID":"3ad8332f-9ca2-4dd0-903f-2bf5723aa51e","Type":"ContainerDied","Data":"c405bdf8a6e7ed5d4413daa15ca151b9128a666746fe5186517f5506ec79d414"}
Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.559846 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-748457ff4-72cdj"]
Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.560340 4861 generic.go:334] "Generic (PLEG): container finished" podID="56bad3ba-c8af-4b69-96ae-93311a9d6151" containerID="23b47bbc46d7c47f0edd12e592c4cd6b4b26f00019821676601739aeeb3778d9" exitCode=0
Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.560489 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-48m9h" event={"ID":"56bad3ba-c8af-4b69-96ae-93311a9d6151","Type":"ContainerDied","Data":"23b47bbc46d7c47f0edd12e592c4cd6b4b26f00019821676601739aeeb3778d9"}
Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.560555 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-48m9h" event={"ID":"56bad3ba-c8af-4b69-96ae-93311a9d6151","Type":"ContainerStarted","Data":"da8faef02597c2965cc7577d9c3211e38a595d8ccb3e164bdcd2e162901057cd"}
Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.572915 4861 scope.go:117] "RemoveContainer" containerID="46db51df03b63c7c935e50810a78502a1ddf34243cb4a0ff7a9d13373c6a62db"
Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.575966 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-wcp7q"]
Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.577979 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wcp7q"
Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.580208 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fnk25"
Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.616930 4861 scope.go:117] "RemoveContainer" containerID="46db51df03b63c7c935e50810a78502a1ddf34243cb4a0ff7a9d13373c6a62db"
Mar 09 09:07:45 crc kubenswrapper[4861]: E0309 09:07:45.622726 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46db51df03b63c7c935e50810a78502a1ddf34243cb4a0ff7a9d13373c6a62db\": container with ID starting with 46db51df03b63c7c935e50810a78502a1ddf34243cb4a0ff7a9d13373c6a62db not found: ID does not exist" containerID="46db51df03b63c7c935e50810a78502a1ddf34243cb4a0ff7a9d13373c6a62db"
Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.622785 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46db51df03b63c7c935e50810a78502a1ddf34243cb4a0ff7a9d13373c6a62db"} err="failed to get container status \"46db51df03b63c7c935e50810a78502a1ddf34243cb4a0ff7a9d13373c6a62db\": rpc error: code = NotFound desc = could not find container \"46db51df03b63c7c935e50810a78502a1ddf34243cb4a0ff7a9d13373c6a62db\": container with ID starting with 46db51df03b63c7c935e50810a78502a1ddf34243cb4a0ff7a9d13373c6a62db not found: ID does not exist"
Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.629333 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7727f6b7-e2a9-4554-9224-a4600931b08a-serving-cert\") pod \"controller-manager-748457ff4-72cdj\" (UID: \"7727f6b7-e2a9-4554-9224-a4600931b08a\") " pod="openshift-controller-manager/controller-manager-748457ff4-72cdj"
Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.629459 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7727f6b7-e2a9-4554-9224-a4600931b08a-proxy-ca-bundles\") pod \"controller-manager-748457ff4-72cdj\" (UID: \"7727f6b7-e2a9-4554-9224-a4600931b08a\") " pod="openshift-controller-manager/controller-manager-748457ff4-72cdj"
Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.629530 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0398ae40-9658-4a9a-949c-4419bb1ca9bf-utilities\") pod \"redhat-marketplace-wcp7q\" (UID: \"0398ae40-9658-4a9a-949c-4419bb1ca9bf\") " pod="openshift-marketplace/redhat-marketplace-wcp7q"
Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.629591 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cjkj6\" (UID: \"de2fac75-67e1-47c9-9507-b8b5e5857c32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cjkj6"
Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.629617 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfzmv\" (UniqueName: \"kubernetes.io/projected/0398ae40-9658-4a9a-949c-4419bb1ca9bf-kube-api-access-vfzmv\") pod \"redhat-marketplace-wcp7q\" (UID: \"0398ae40-9658-4a9a-949c-4419bb1ca9bf\") " pod="openshift-marketplace/redhat-marketplace-wcp7q"
Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.629683 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0398ae40-9658-4a9a-949c-4419bb1ca9bf-catalog-content\") pod \"redhat-marketplace-wcp7q\" (UID: \"0398ae40-9658-4a9a-949c-4419bb1ca9bf\") " pod="openshift-marketplace/redhat-marketplace-wcp7q"
Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.629780 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7727f6b7-e2a9-4554-9224-a4600931b08a-config\") pod \"controller-manager-748457ff4-72cdj\" (UID: \"7727f6b7-e2a9-4554-9224-a4600931b08a\") " pod="openshift-controller-manager/controller-manager-748457ff4-72cdj"
Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.629797 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7727f6b7-e2a9-4554-9224-a4600931b08a-client-ca\") pod \"controller-manager-748457ff4-72cdj\" (UID: \"7727f6b7-e2a9-4554-9224-a4600931b08a\") " pod="openshift-controller-manager/controller-manager-748457ff4-72cdj"
Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.629842 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ww2x6\" (UniqueName: \"kubernetes.io/projected/7727f6b7-e2a9-4554-9224-a4600931b08a-kube-api-access-ww2x6\") pod \"controller-manager-748457ff4-72cdj\" (UID: \"7727f6b7-e2a9-4554-9224-a4600931b08a\") " pod="openshift-controller-manager/controller-manager-748457ff4-72cdj"
Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.629901 4861 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/20b8695b-8129-4f02-824d-5ca2a451d899-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.629914 4861 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/20b8695b-8129-4f02-824d-5ca2a451d899-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.629924 4861 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba6c5b9f-7812-4b6d-998b-6f368a6edf83-config\") on node \"crc\" DevicePath \"\""
Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.629936 4861 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/20b8695b-8129-4f02-824d-5ca2a451d899-client-ca\") on node \"crc\" DevicePath \"\""
Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.629945 4861 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20b8695b-8129-4f02-824d-5ca2a451d899-config\") on node \"crc\" DevicePath \"\""
Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.629955 4861 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ba6c5b9f-7812-4b6d-998b-6f368a6edf83-client-ca\") on node \"crc\" DevicePath \"\""
Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.629964 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mj42q\" (UniqueName: \"kubernetes.io/projected/ba6c5b9f-7812-4b6d-998b-6f368a6edf83-kube-api-access-mj42q\") on node \"crc\" DevicePath \"\""
Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.629976 4861 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ba6c5b9f-7812-4b6d-998b-6f368a6edf83-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.629986 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dhdr9\" (UniqueName: \"kubernetes.io/projected/20b8695b-8129-4f02-824d-5ca2a451d899-kube-api-access-dhdr9\") on node \"crc\" DevicePath \"\""
Mar 09 09:07:45 crc kubenswrapper[4861]: E0309 09:07:45.631066 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:07:46.131051826 +0000 UTC m=+109.216091227 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cjkj6" (UID: "de2fac75-67e1-47c9-9507-b8b5e5857c32") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.655451 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wcp7q"]
Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.656717 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-z6v85" podStartSLOduration=11.656702367 podStartE2EDuration="11.656702367s" podCreationTimestamp="2026-03-09 09:07:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:07:45.617667817 +0000 UTC m=+108.702707208" watchObservedRunningTime="2026-03-09 09:07:45.656702367 +0000 UTC m=+108.741741768"
Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.678318 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-gkrbr"]
Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.683080 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-gkrbr"]
Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.730893 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 09:07:45 crc kubenswrapper[4861]: E0309 09:07:45.731460 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:07:46.231441177 +0000 UTC m=+109.316480578 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.731506 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cjkj6\" (UID: \"de2fac75-67e1-47c9-9507-b8b5e5857c32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cjkj6"
Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.731537 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vfzmv\" (UniqueName: \"kubernetes.io/projected/0398ae40-9658-4a9a-949c-4419bb1ca9bf-kube-api-access-vfzmv\") pod \"redhat-marketplace-wcp7q\" (UID: \"0398ae40-9658-4a9a-949c-4419bb1ca9bf\") " pod="openshift-marketplace/redhat-marketplace-wcp7q"
Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.731565 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0398ae40-9658-4a9a-949c-4419bb1ca9bf-catalog-content\") pod \"redhat-marketplace-wcp7q\" (UID: \"0398ae40-9658-4a9a-949c-4419bb1ca9bf\") " pod="openshift-marketplace/redhat-marketplace-wcp7q"
Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.731616 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7727f6b7-e2a9-4554-9224-a4600931b08a-config\") pod \"controller-manager-748457ff4-72cdj\" (UID: \"7727f6b7-e2a9-4554-9224-a4600931b08a\") " pod="openshift-controller-manager/controller-manager-748457ff4-72cdj"
Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.731631 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7727f6b7-e2a9-4554-9224-a4600931b08a-client-ca\") pod \"controller-manager-748457ff4-72cdj\" (UID: \"7727f6b7-e2a9-4554-9224-a4600931b08a\") " pod="openshift-controller-manager/controller-manager-748457ff4-72cdj"
Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.731648 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ww2x6\" (UniqueName: \"kubernetes.io/projected/7727f6b7-e2a9-4554-9224-a4600931b08a-kube-api-access-ww2x6\") pod \"controller-manager-748457ff4-72cdj\" (UID: \"7727f6b7-e2a9-4554-9224-a4600931b08a\") " pod="openshift-controller-manager/controller-manager-748457ff4-72cdj"
Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.731695 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7727f6b7-e2a9-4554-9224-a4600931b08a-serving-cert\") pod \"controller-manager-748457ff4-72cdj\" (UID: \"7727f6b7-e2a9-4554-9224-a4600931b08a\") " pod="openshift-controller-manager/controller-manager-748457ff4-72cdj"
Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.731725 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7727f6b7-e2a9-4554-9224-a4600931b08a-proxy-ca-bundles\") pod \"controller-manager-748457ff4-72cdj\" (UID: \"7727f6b7-e2a9-4554-9224-a4600931b08a\") " pod="openshift-controller-manager/controller-manager-748457ff4-72cdj"
Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.731751 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0398ae40-9658-4a9a-949c-4419bb1ca9bf-utilities\") pod \"redhat-marketplace-wcp7q\" (UID: \"0398ae40-9658-4a9a-949c-4419bb1ca9bf\") " pod="openshift-marketplace/redhat-marketplace-wcp7q"
Mar 09 09:07:45 crc kubenswrapper[4861]: E0309 09:07:45.732826 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 09:07:46.232808644 +0000 UTC m=+109.317848045 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-cjkj6" (UID: "de2fac75-67e1-47c9-9507-b8b5e5857c32") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.733394 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0398ae40-9658-4a9a-949c-4419bb1ca9bf-utilities\") pod \"redhat-marketplace-wcp7q\" (UID: \"0398ae40-9658-4a9a-949c-4419bb1ca9bf\") " pod="openshift-marketplace/redhat-marketplace-wcp7q"
Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.733536 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0398ae40-9658-4a9a-949c-4419bb1ca9bf-catalog-content\") pod \"redhat-marketplace-wcp7q\" (UID: \"0398ae40-9658-4a9a-949c-4419bb1ca9bf\") " pod="openshift-marketplace/redhat-marketplace-wcp7q"
Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.734168 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7727f6b7-e2a9-4554-9224-a4600931b08a-proxy-ca-bundles\") pod \"controller-manager-748457ff4-72cdj\" (UID: \"7727f6b7-e2a9-4554-9224-a4600931b08a\") " pod="openshift-controller-manager/controller-manager-748457ff4-72cdj"
Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.734520 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7727f6b7-e2a9-4554-9224-a4600931b08a-client-ca\") pod \"controller-manager-748457ff4-72cdj\" (UID: \"7727f6b7-e2a9-4554-9224-a4600931b08a\") "
pod="openshift-controller-manager/controller-manager-748457ff4-72cdj" Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.735380 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7727f6b7-e2a9-4554-9224-a4600931b08a-config\") pod \"controller-manager-748457ff4-72cdj\" (UID: \"7727f6b7-e2a9-4554-9224-a4600931b08a\") " pod="openshift-controller-manager/controller-manager-748457ff4-72cdj" Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.785542 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7727f6b7-e2a9-4554-9224-a4600931b08a-serving-cert\") pod \"controller-manager-748457ff4-72cdj\" (UID: \"7727f6b7-e2a9-4554-9224-a4600931b08a\") " pod="openshift-controller-manager/controller-manager-748457ff4-72cdj" Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.789077 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfzmv\" (UniqueName: \"kubernetes.io/projected/0398ae40-9658-4a9a-949c-4419bb1ca9bf-kube-api-access-vfzmv\") pod \"redhat-marketplace-wcp7q\" (UID: \"0398ae40-9658-4a9a-949c-4419bb1ca9bf\") " pod="openshift-marketplace/redhat-marketplace-wcp7q" Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.789838 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ww2x6\" (UniqueName: \"kubernetes.io/projected/7727f6b7-e2a9-4554-9224-a4600931b08a-kube-api-access-ww2x6\") pod \"controller-manager-748457ff4-72cdj\" (UID: \"7727f6b7-e2a9-4554-9224-a4600931b08a\") " pod="openshift-controller-manager/controller-manager-748457ff4-72cdj" Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.832507 4861 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-03-09T09:07:45.100454492Z","Handler":null,"Name":""} Mar 
09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.833146 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:07:45 crc kubenswrapper[4861]: E0309 09:07:45.833532 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 09:07:46.333515273 +0000 UTC m=+109.418554674 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.840080 4861 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.840154 4861 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.861065 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-h84v9"] Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.861339 4861 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-748457ff4-72cdj" Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.878700 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-h84v9"] Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.881328 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.882311 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.886210 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.888480 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.889261 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.919893 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fnk25"] Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.935856 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wcp7q" Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.936214 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cjkj6\" (UID: \"de2fac75-67e1-47c9-9507-b8b5e5857c32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cjkj6" Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.936260 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/808431cc-74ea-4377-8c81-189b70544602-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"808431cc-74ea-4377-8c81-189b70544602\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.936332 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/808431cc-74ea-4377-8c81-189b70544602-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"808431cc-74ea-4377-8c81-189b70544602\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.941007 4861 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 09 09:07:45 crc kubenswrapper[4861]: I0309 09:07:45.941032 4861 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cjkj6\" (UID: \"de2fac75-67e1-47c9-9507-b8b5e5857c32\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-cjkj6" Mar 09 09:07:46 crc kubenswrapper[4861]: I0309 09:07:46.001276 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-cjkj6\" (UID: \"de2fac75-67e1-47c9-9507-b8b5e5857c32\") " pod="openshift-image-registry/image-registry-697d97f7c8-cjkj6" Mar 09 09:07:46 crc kubenswrapper[4861]: I0309 09:07:46.037748 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 09:07:46 crc kubenswrapper[4861]: I0309 09:07:46.037933 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/808431cc-74ea-4377-8c81-189b70544602-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"808431cc-74ea-4377-8c81-189b70544602\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 09 09:07:46 crc kubenswrapper[4861]: I0309 09:07:46.038008 4861 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/808431cc-74ea-4377-8c81-189b70544602-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"808431cc-74ea-4377-8c81-189b70544602\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 09 09:07:46 crc kubenswrapper[4861]: I0309 09:07:46.038117 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/808431cc-74ea-4377-8c81-189b70544602-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"808431cc-74ea-4377-8c81-189b70544602\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 09 09:07:46 crc kubenswrapper[4861]: I0309 09:07:46.058842 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 09 09:07:46 crc kubenswrapper[4861]: I0309 09:07:46.064136 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/808431cc-74ea-4377-8c81-189b70544602-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"808431cc-74ea-4377-8c81-189b70544602\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 09 09:07:46 crc kubenswrapper[4861]: I0309 09:07:46.089957 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-748457ff4-72cdj"] Mar 09 09:07:46 crc kubenswrapper[4861]: I0309 09:07:46.190812 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-cjkj6" Mar 09 09:07:46 crc kubenswrapper[4861]: I0309 09:07:46.210221 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 09 09:07:46 crc kubenswrapper[4861]: I0309 09:07:46.258107 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wcp7q"] Mar 09 09:07:46 crc kubenswrapper[4861]: W0309 09:07:46.275937 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0398ae40_9658_4a9a_949c_4419bb1ca9bf.slice/crio-d3acd6bcc671bf38e672e530f0be65aba4844418fba614c03065a562388d4d84 WatchSource:0}: Error finding container d3acd6bcc671bf38e672e530f0be65aba4844418fba614c03065a562388d4d84: Status 404 returned error can't find the container with id d3acd6bcc671bf38e672e530f0be65aba4844418fba614c03065a562388d4d84 Mar 09 09:07:46 crc kubenswrapper[4861]: I0309 09:07:46.488510 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-cjkj6"] Mar 09 09:07:46 crc kubenswrapper[4861]: W0309 09:07:46.521596 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde2fac75_67e1_47c9_9507_b8b5e5857c32.slice/crio-b621a4388d578ca3799f69b1114c535158e30133e975e54e5ce33728b658a1e2 WatchSource:0}: Error finding container b621a4388d578ca3799f69b1114c535158e30133e975e54e5ce33728b658a1e2: Status 404 returned error can't find the container with id b621a4388d578ca3799f69b1114c535158e30133e975e54e5ce33728b658a1e2 Mar 09 09:07:46 crc kubenswrapper[4861]: I0309 09:07:46.547737 4861 patch_prober.go:28] interesting pod/router-default-5444994796-c6sj6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" 
start-of-body=[-]backend-http failed: reason withheld Mar 09 09:07:46 crc kubenswrapper[4861]: [-]has-synced failed: reason withheld Mar 09 09:07:46 crc kubenswrapper[4861]: [+]process-running ok Mar 09 09:07:46 crc kubenswrapper[4861]: healthz check failed Mar 09 09:07:46 crc kubenswrapper[4861]: I0309 09:07:46.547823 4861 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-c6sj6" podUID="8ff37c1b-1688-42ce-8b0c-952d297ae4a0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 09:07:46 crc kubenswrapper[4861]: I0309 09:07:46.586065 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-j6fcf"] Mar 09 09:07:46 crc kubenswrapper[4861]: I0309 09:07:46.587194 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-j6fcf" Mar 09 09:07:46 crc kubenswrapper[4861]: I0309 09:07:46.588778 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-j6fcf"] Mar 09 09:07:46 crc kubenswrapper[4861]: I0309 09:07:46.591250 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 09 09:07:46 crc kubenswrapper[4861]: I0309 09:07:46.622041 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-cjkj6" event={"ID":"de2fac75-67e1-47c9-9507-b8b5e5857c32","Type":"ContainerStarted","Data":"b621a4388d578ca3799f69b1114c535158e30133e975e54e5ce33728b658a1e2"} Mar 09 09:07:46 crc kubenswrapper[4861]: I0309 09:07:46.651672 4861 generic.go:334] "Generic (PLEG): container finished" podID="0eaf35dc-b198-4fbb-9e43-ddac97f1f62b" containerID="4aeb6ff17984f2b9890f04f5eda9b1b52a4da37cd6841d01f3aae50d87d0ac2e" exitCode=0 Mar 09 09:07:46 crc kubenswrapper[4861]: I0309 09:07:46.651749 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-fnk25" event={"ID":"0eaf35dc-b198-4fbb-9e43-ddac97f1f62b","Type":"ContainerDied","Data":"4aeb6ff17984f2b9890f04f5eda9b1b52a4da37cd6841d01f3aae50d87d0ac2e"} Mar 09 09:07:46 crc kubenswrapper[4861]: I0309 09:07:46.651781 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fnk25" event={"ID":"0eaf35dc-b198-4fbb-9e43-ddac97f1f62b","Type":"ContainerStarted","Data":"e0f982a3110fccbe81488a0747b1400ad370f4b47fa76b07317c97e98cc950e5"} Mar 09 09:07:46 crc kubenswrapper[4861]: I0309 09:07:46.653752 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33158a78-8c1c-4aa1-9c51-66e21d0e8ae6-catalog-content\") pod \"redhat-operators-j6fcf\" (UID: \"33158a78-8c1c-4aa1-9c51-66e21d0e8ae6\") " pod="openshift-marketplace/redhat-operators-j6fcf" Mar 09 09:07:46 crc kubenswrapper[4861]: I0309 09:07:46.653820 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33158a78-8c1c-4aa1-9c51-66e21d0e8ae6-utilities\") pod \"redhat-operators-j6fcf\" (UID: \"33158a78-8c1c-4aa1-9c51-66e21d0e8ae6\") " pod="openshift-marketplace/redhat-operators-j6fcf" Mar 09 09:07:46 crc kubenswrapper[4861]: I0309 09:07:46.653946 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-md8cx\" (UniqueName: \"kubernetes.io/projected/33158a78-8c1c-4aa1-9c51-66e21d0e8ae6-kube-api-access-md8cx\") pod \"redhat-operators-j6fcf\" (UID: \"33158a78-8c1c-4aa1-9c51-66e21d0e8ae6\") " pod="openshift-marketplace/redhat-operators-j6fcf" Mar 09 09:07:46 crc kubenswrapper[4861]: I0309 09:07:46.659078 4861 scope.go:117] "RemoveContainer" containerID="f88f2e129d4bfc17b0ac4416da5c5096bf314e097dfa40a48858a83425ca91e1" Mar 09 09:07:46 crc kubenswrapper[4861]: I0309 09:07:46.670731 
4861 generic.go:334] "Generic (PLEG): container finished" podID="0398ae40-9658-4a9a-949c-4419bb1ca9bf" containerID="db0199b19b12f610f5e637145782ea749a5d98fe1c2e92629063b7700da24cf1" exitCode=0 Mar 09 09:07:46 crc kubenswrapper[4861]: I0309 09:07:46.670800 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wcp7q" event={"ID":"0398ae40-9658-4a9a-949c-4419bb1ca9bf","Type":"ContainerDied","Data":"db0199b19b12f610f5e637145782ea749a5d98fe1c2e92629063b7700da24cf1"} Mar 09 09:07:46 crc kubenswrapper[4861]: I0309 09:07:46.670828 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wcp7q" event={"ID":"0398ae40-9658-4a9a-949c-4419bb1ca9bf","Type":"ContainerStarted","Data":"d3acd6bcc671bf38e672e530f0be65aba4844418fba614c03065a562388d4d84"} Mar 09 09:07:46 crc kubenswrapper[4861]: I0309 09:07:46.690229 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Mar 09 09:07:46 crc kubenswrapper[4861]: I0309 09:07:46.703858 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-748457ff4-72cdj" event={"ID":"7727f6b7-e2a9-4554-9224-a4600931b08a","Type":"ContainerStarted","Data":"02258d50bdab0b7d450ab4919645b266eba5654e6192f751394ccaa5699570f5"} Mar 09 09:07:46 crc kubenswrapper[4861]: I0309 09:07:46.703896 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-748457ff4-72cdj" event={"ID":"7727f6b7-e2a9-4554-9224-a4600931b08a","Type":"ContainerStarted","Data":"36acfb6bac664e13af285b17638ea22829a9e47517fc2952b3a367bddac703c3"} Mar 09 09:07:46 crc kubenswrapper[4861]: I0309 09:07:46.705466 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-748457ff4-72cdj" Mar 09 09:07:46 crc kubenswrapper[4861]: I0309 09:07:46.721710 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-controller-manager/controller-manager-748457ff4-72cdj" Mar 09 09:07:46 crc kubenswrapper[4861]: I0309 09:07:46.725420 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-748457ff4-72cdj" podStartSLOduration=2.725401107 podStartE2EDuration="2.725401107s" podCreationTimestamp="2026-03-09 09:07:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:07:46.722284653 +0000 UTC m=+109.807324054" watchObservedRunningTime="2026-03-09 09:07:46.725401107 +0000 UTC m=+109.810440508" Mar 09 09:07:46 crc kubenswrapper[4861]: I0309 09:07:46.756594 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33158a78-8c1c-4aa1-9c51-66e21d0e8ae6-catalog-content\") pod \"redhat-operators-j6fcf\" (UID: \"33158a78-8c1c-4aa1-9c51-66e21d0e8ae6\") " pod="openshift-marketplace/redhat-operators-j6fcf" Mar 09 09:07:46 crc kubenswrapper[4861]: I0309 09:07:46.756634 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33158a78-8c1c-4aa1-9c51-66e21d0e8ae6-utilities\") pod \"redhat-operators-j6fcf\" (UID: \"33158a78-8c1c-4aa1-9c51-66e21d0e8ae6\") " pod="openshift-marketplace/redhat-operators-j6fcf" Mar 09 09:07:46 crc kubenswrapper[4861]: I0309 09:07:46.756691 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-md8cx\" (UniqueName: \"kubernetes.io/projected/33158a78-8c1c-4aa1-9c51-66e21d0e8ae6-kube-api-access-md8cx\") pod \"redhat-operators-j6fcf\" (UID: \"33158a78-8c1c-4aa1-9c51-66e21d0e8ae6\") " pod="openshift-marketplace/redhat-operators-j6fcf" Mar 09 09:07:46 crc kubenswrapper[4861]: I0309 09:07:46.757853 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33158a78-8c1c-4aa1-9c51-66e21d0e8ae6-utilities\") pod \"redhat-operators-j6fcf\" (UID: \"33158a78-8c1c-4aa1-9c51-66e21d0e8ae6\") " pod="openshift-marketplace/redhat-operators-j6fcf" Mar 09 09:07:46 crc kubenswrapper[4861]: I0309 09:07:46.757964 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33158a78-8c1c-4aa1-9c51-66e21d0e8ae6-catalog-content\") pod \"redhat-operators-j6fcf\" (UID: \"33158a78-8c1c-4aa1-9c51-66e21d0e8ae6\") " pod="openshift-marketplace/redhat-operators-j6fcf" Mar 09 09:07:46 crc kubenswrapper[4861]: I0309 09:07:46.759222 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=0.759200526 podStartE2EDuration="759.200526ms" podCreationTimestamp="2026-03-09 09:07:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:07:46.755849276 +0000 UTC m=+109.840888677" watchObservedRunningTime="2026-03-09 09:07:46.759200526 +0000 UTC m=+109.844239927" Mar 09 09:07:46 crc kubenswrapper[4861]: I0309 09:07:46.766282 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 09 09:07:46 crc kubenswrapper[4861]: I0309 09:07:46.806128 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-md8cx\" (UniqueName: \"kubernetes.io/projected/33158a78-8c1c-4aa1-9c51-66e21d0e8ae6-kube-api-access-md8cx\") pod \"redhat-operators-j6fcf\" (UID: \"33158a78-8c1c-4aa1-9c51-66e21d0e8ae6\") " pod="openshift-marketplace/redhat-operators-j6fcf" Mar 09 09:07:46 crc kubenswrapper[4861]: I0309 09:07:46.854904 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hn7r4" Mar 09 09:07:46 crc kubenswrapper[4861]: I0309 
09:07:46.855166 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hn7r4" Mar 09 09:07:46 crc kubenswrapper[4861]: I0309 09:07:46.871297 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hn7r4" Mar 09 09:07:46 crc kubenswrapper[4861]: I0309 09:07:46.883514 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-qh9sg" Mar 09 09:07:46 crc kubenswrapper[4861]: I0309 09:07:46.883788 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-qh9sg" Mar 09 09:07:46 crc kubenswrapper[4861]: I0309 09:07:46.886633 4861 patch_prober.go:28] interesting pod/console-f9d7485db-qh9sg container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.15:8443/health\": dial tcp 10.217.0.15:8443: connect: connection refused" start-of-body= Mar 09 09:07:46 crc kubenswrapper[4861]: I0309 09:07:46.886670 4861 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-qh9sg" podUID="85a3bbcb-e663-4a97-980c-606c979409d7" containerName="console" probeResult="failure" output="Get \"https://10.217.0.15:8443/health\": dial tcp 10.217.0.15:8443: connect: connection refused" Mar 09 09:07:46 crc kubenswrapper[4861]: I0309 09:07:46.923948 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-j6fcf" Mar 09 09:07:46 crc kubenswrapper[4861]: I0309 09:07:46.967209 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zkngt"] Mar 09 09:07:46 crc kubenswrapper[4861]: I0309 09:07:46.968422 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zkngt" Mar 09 09:07:46 crc kubenswrapper[4861]: I0309 09:07:46.993299 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zkngt"] Mar 09 09:07:47 crc kubenswrapper[4861]: I0309 09:07:47.068129 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6eac3eed-7721-4030-b1e3-9dd28fea2e49-catalog-content\") pod \"redhat-operators-zkngt\" (UID: \"6eac3eed-7721-4030-b1e3-9dd28fea2e49\") " pod="openshift-marketplace/redhat-operators-zkngt" Mar 09 09:07:47 crc kubenswrapper[4861]: I0309 09:07:47.068183 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kh8vm\" (UniqueName: \"kubernetes.io/projected/6eac3eed-7721-4030-b1e3-9dd28fea2e49-kube-api-access-kh8vm\") pod \"redhat-operators-zkngt\" (UID: \"6eac3eed-7721-4030-b1e3-9dd28fea2e49\") " pod="openshift-marketplace/redhat-operators-zkngt" Mar 09 09:07:47 crc kubenswrapper[4861]: I0309 09:07:47.068247 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6eac3eed-7721-4030-b1e3-9dd28fea2e49-utilities\") pod \"redhat-operators-zkngt\" (UID: \"6eac3eed-7721-4030-b1e3-9dd28fea2e49\") " pod="openshift-marketplace/redhat-operators-zkngt" Mar 09 09:07:47 crc kubenswrapper[4861]: I0309 09:07:47.135500 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-l2jkp" Mar 09 09:07:47 crc kubenswrapper[4861]: I0309 09:07:47.136759 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-l2jkp" Mar 09 09:07:47 crc kubenswrapper[4861]: I0309 09:07:47.150561 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-apiserver/apiserver-76f77b778f-l2jkp" Mar 09 09:07:47 crc kubenswrapper[4861]: I0309 09:07:47.173533 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6eac3eed-7721-4030-b1e3-9dd28fea2e49-catalog-content\") pod \"redhat-operators-zkngt\" (UID: \"6eac3eed-7721-4030-b1e3-9dd28fea2e49\") " pod="openshift-marketplace/redhat-operators-zkngt" Mar 09 09:07:47 crc kubenswrapper[4861]: I0309 09:07:47.173598 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kh8vm\" (UniqueName: \"kubernetes.io/projected/6eac3eed-7721-4030-b1e3-9dd28fea2e49-kube-api-access-kh8vm\") pod \"redhat-operators-zkngt\" (UID: \"6eac3eed-7721-4030-b1e3-9dd28fea2e49\") " pod="openshift-marketplace/redhat-operators-zkngt" Mar 09 09:07:47 crc kubenswrapper[4861]: I0309 09:07:47.173700 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6eac3eed-7721-4030-b1e3-9dd28fea2e49-utilities\") pod \"redhat-operators-zkngt\" (UID: \"6eac3eed-7721-4030-b1e3-9dd28fea2e49\") " pod="openshift-marketplace/redhat-operators-zkngt" Mar 09 09:07:47 crc kubenswrapper[4861]: I0309 09:07:47.174958 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6eac3eed-7721-4030-b1e3-9dd28fea2e49-utilities\") pod \"redhat-operators-zkngt\" (UID: \"6eac3eed-7721-4030-b1e3-9dd28fea2e49\") " pod="openshift-marketplace/redhat-operators-zkngt" Mar 09 09:07:47 crc kubenswrapper[4861]: I0309 09:07:47.175196 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6eac3eed-7721-4030-b1e3-9dd28fea2e49-catalog-content\") pod \"redhat-operators-zkngt\" (UID: \"6eac3eed-7721-4030-b1e3-9dd28fea2e49\") " pod="openshift-marketplace/redhat-operators-zkngt" Mar 09 09:07:47 crc 
kubenswrapper[4861]: I0309 09:07:47.209746 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 09 09:07:47 crc kubenswrapper[4861]: I0309 09:07:47.210677 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 09 09:07:47 crc kubenswrapper[4861]: I0309 09:07:47.213442 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 09 09:07:47 crc kubenswrapper[4861]: I0309 09:07:47.213599 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 09 09:07:47 crc kubenswrapper[4861]: I0309 09:07:47.223695 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 09 09:07:47 crc kubenswrapper[4861]: I0309 09:07:47.224673 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kh8vm\" (UniqueName: \"kubernetes.io/projected/6eac3eed-7721-4030-b1e3-9dd28fea2e49-kube-api-access-kh8vm\") pod \"redhat-operators-zkngt\" (UID: \"6eac3eed-7721-4030-b1e3-9dd28fea2e49\") " pod="openshift-marketplace/redhat-operators-zkngt" Mar 09 09:07:47 crc kubenswrapper[4861]: I0309 09:07:47.263072 4861 patch_prober.go:28] interesting pod/downloads-7954f5f757-h6j45 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.5:8080/\": dial tcp 10.217.0.5:8080: connect: connection refused" start-of-body= Mar 09 09:07:47 crc kubenswrapper[4861]: I0309 09:07:47.263110 4861 patch_prober.go:28] interesting pod/downloads-7954f5f757-h6j45 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.5:8080/\": dial tcp 10.217.0.5:8080: connect: connection refused" start-of-body= Mar 09 09:07:47 crc kubenswrapper[4861]: I0309 09:07:47.263128 4861 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-h6j45" podUID="566d8448-e794-44ee-9d17-e92493adcd87" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.5:8080/\": dial tcp 10.217.0.5:8080: connect: connection refused" Mar 09 09:07:47 crc kubenswrapper[4861]: I0309 09:07:47.263181 4861 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-h6j45" podUID="566d8448-e794-44ee-9d17-e92493adcd87" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.5:8080/\": dial tcp 10.217.0.5:8080: connect: connection refused" Mar 09 09:07:47 crc kubenswrapper[4861]: I0309 09:07:47.315522 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zkngt" Mar 09 09:07:47 crc kubenswrapper[4861]: I0309 09:07:47.378157 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a1e43c63-f141-4254-b2e9-3102f26254ae-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"a1e43c63-f141-4254-b2e9-3102f26254ae\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 09 09:07:47 crc kubenswrapper[4861]: I0309 09:07:47.378258 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a1e43c63-f141-4254-b2e9-3102f26254ae-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"a1e43c63-f141-4254-b2e9-3102f26254ae\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 09 09:07:47 crc kubenswrapper[4861]: I0309 09:07:47.479101 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a1e43c63-f141-4254-b2e9-3102f26254ae-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"a1e43c63-f141-4254-b2e9-3102f26254ae\") " 
pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 09 09:07:47 crc kubenswrapper[4861]: I0309 09:07:47.479226 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a1e43c63-f141-4254-b2e9-3102f26254ae-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"a1e43c63-f141-4254-b2e9-3102f26254ae\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 09 09:07:47 crc kubenswrapper[4861]: I0309 09:07:47.479558 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a1e43c63-f141-4254-b2e9-3102f26254ae-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"a1e43c63-f141-4254-b2e9-3102f26254ae\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 09 09:07:47 crc kubenswrapper[4861]: I0309 09:07:47.492729 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-j6fcf"] Mar 09 09:07:47 crc kubenswrapper[4861]: W0309 09:07:47.505067 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33158a78_8c1c_4aa1_9c51_66e21d0e8ae6.slice/crio-836787c6e1446ba45b92a6a440d102a09879f66c9140757741215da862ee2b2f WatchSource:0}: Error finding container 836787c6e1446ba45b92a6a440d102a09879f66c9140757741215da862ee2b2f: Status 404 returned error can't find the container with id 836787c6e1446ba45b92a6a440d102a09879f66c9140757741215da862ee2b2f Mar 09 09:07:47 crc kubenswrapper[4861]: I0309 09:07:47.505436 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a1e43c63-f141-4254-b2e9-3102f26254ae-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"a1e43c63-f141-4254-b2e9-3102f26254ae\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 09 09:07:47 crc kubenswrapper[4861]: I0309 09:07:47.530045 4861 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-c6sj6" Mar 09 09:07:47 crc kubenswrapper[4861]: I0309 09:07:47.533683 4861 patch_prober.go:28] interesting pod/router-default-5444994796-c6sj6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 09 09:07:47 crc kubenswrapper[4861]: [-]has-synced failed: reason withheld Mar 09 09:07:47 crc kubenswrapper[4861]: [+]process-running ok Mar 09 09:07:47 crc kubenswrapper[4861]: healthz check failed Mar 09 09:07:47 crc kubenswrapper[4861]: I0309 09:07:47.533722 4861 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-c6sj6" podUID="8ff37c1b-1688-42ce-8b0c-952d297ae4a0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 09:07:47 crc kubenswrapper[4861]: I0309 09:07:47.534647 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 09 09:07:47 crc kubenswrapper[4861]: I0309 09:07:47.668807 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b8695b-8129-4f02-824d-5ca2a451d899" path="/var/lib/kubelet/pods/20b8695b-8129-4f02-824d-5ca2a451d899/volumes" Mar 09 09:07:47 crc kubenswrapper[4861]: I0309 09:07:47.670761 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Mar 09 09:07:47 crc kubenswrapper[4861]: I0309 09:07:47.683999 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba6c5b9f-7812-4b6d-998b-6f368a6edf83" path="/var/lib/kubelet/pods/ba6c5b9f-7812-4b6d-998b-6f368a6edf83/volumes" Mar 09 09:07:47 crc kubenswrapper[4861]: I0309 09:07:47.852672 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 09 09:07:47 crc kubenswrapper[4861]: I0309 09:07:47.855167 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"f9d12c95c652ca0a136cf2124f76dce6920bef3013aa0ed2eabd6a1deaf707ed"} Mar 09 09:07:47 crc kubenswrapper[4861]: I0309 09:07:47.856145 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 09:07:47 crc kubenswrapper[4861]: I0309 09:07:47.911335 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=25.91131056 podStartE2EDuration="25.91131056s" podCreationTimestamp="2026-03-09 09:07:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-09 09:07:47.903641174 +0000 UTC m=+110.988680575" watchObservedRunningTime="2026-03-09 09:07:47.91131056 +0000 UTC m=+110.996349961" Mar 09 09:07:47 crc kubenswrapper[4861]: I0309 09:07:47.916478 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"808431cc-74ea-4377-8c81-189b70544602","Type":"ContainerStarted","Data":"a876cb80251d3ef3627388dd1489c1ebf68c6008b6a635fba12572a7fa62d6af"} Mar 09 09:07:47 crc kubenswrapper[4861]: I0309 09:07:47.916532 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"808431cc-74ea-4377-8c81-189b70544602","Type":"ContainerStarted","Data":"c55dbb65cfca1b1b42f0489d392965fb6c4fe2203444df3c83588c36054b3eca"} Mar 09 09:07:47 crc kubenswrapper[4861]: I0309 09:07:47.934319 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-k2z5n" Mar 09 09:07:47 crc kubenswrapper[4861]: I0309 09:07:47.939486 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=2.939463268 podStartE2EDuration="2.939463268s" podCreationTimestamp="2026-03-09 09:07:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:07:47.93845865 +0000 UTC m=+111.023498051" watchObservedRunningTime="2026-03-09 09:07:47.939463268 +0000 UTC m=+111.024502669" Mar 09 09:07:47 crc kubenswrapper[4861]: I0309 09:07:47.945220 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j6fcf" event={"ID":"33158a78-8c1c-4aa1-9c51-66e21d0e8ae6","Type":"ContainerStarted","Data":"836787c6e1446ba45b92a6a440d102a09879f66c9140757741215da862ee2b2f"} Mar 09 09:07:47 crc kubenswrapper[4861]: I0309 09:07:47.949975 
4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-cjkj6" event={"ID":"de2fac75-67e1-47c9-9507-b8b5e5857c32","Type":"ContainerStarted","Data":"177973148e294570ba55cc2ef874b58295e992029884438b97035e4bcf3d4111"} Mar 09 09:07:47 crc kubenswrapper[4861]: I0309 09:07:47.964696 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-l2jkp" Mar 09 09:07:47 crc kubenswrapper[4861]: I0309 09:07:47.975584 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-hn7r4" Mar 09 09:07:48 crc kubenswrapper[4861]: I0309 09:07:48.022294 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-cjkj6" podStartSLOduration=67.022276165 podStartE2EDuration="1m7.022276165s" podCreationTimestamp="2026-03-09 09:06:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:07:48.021853384 +0000 UTC m=+111.106892775" watchObservedRunningTime="2026-03-09 09:07:48.022276165 +0000 UTC m=+111.107315566" Mar 09 09:07:48 crc kubenswrapper[4861]: E0309 09:07:48.035513 4861 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="60d212b5a01a67560a782f62fd8dcd1024f51380874431f6df6a561a0cc668d3" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 09 09:07:48 crc kubenswrapper[4861]: E0309 09:07:48.043029 4861 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="60d212b5a01a67560a782f62fd8dcd1024f51380874431f6df6a561a0cc668d3" cmd=["/bin/bash","-c","test 
-f /ready/ready"] Mar 09 09:07:48 crc kubenswrapper[4861]: E0309 09:07:48.063681 4861 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="60d212b5a01a67560a782f62fd8dcd1024f51380874431f6df6a561a0cc668d3" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 09 09:07:48 crc kubenswrapper[4861]: E0309 09:07:48.063748 4861 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-wdsnl" podUID="4eed3eac-42f8-4683-9c1f-3733965e6af7" containerName="kube-multus-additional-cni-plugins" Mar 09 09:07:48 crc kubenswrapper[4861]: E0309 09:07:48.210037 4861 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-pod808431cc_74ea_4377_8c81_189b70544602.slice/crio-a876cb80251d3ef3627388dd1489c1ebf68c6008b6a635fba12572a7fa62d6af.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-pod808431cc_74ea_4377_8c81_189b70544602.slice/crio-conmon-a876cb80251d3ef3627388dd1489c1ebf68c6008b6a635fba12572a7fa62d6af.scope\": RecentStats: unable to find data in memory cache]" Mar 09 09:07:48 crc kubenswrapper[4861]: I0309 09:07:48.256616 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-69b899dc9d-p2hwl"] Mar 09 09:07:48 crc kubenswrapper[4861]: I0309 09:07:48.258999 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-69b899dc9d-p2hwl" Mar 09 09:07:48 crc kubenswrapper[4861]: I0309 09:07:48.261768 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 09 09:07:48 crc kubenswrapper[4861]: I0309 09:07:48.271228 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zkngt"] Mar 09 09:07:48 crc kubenswrapper[4861]: I0309 09:07:48.273829 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 09 09:07:48 crc kubenswrapper[4861]: I0309 09:07:48.274064 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 09 09:07:48 crc kubenswrapper[4861]: I0309 09:07:48.274111 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 09 09:07:48 crc kubenswrapper[4861]: I0309 09:07:48.274332 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 09 09:07:48 crc kubenswrapper[4861]: I0309 09:07:48.275145 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 09 09:07:48 crc kubenswrapper[4861]: I0309 09:07:48.285316 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-69b899dc9d-p2hwl"] Mar 09 09:07:48 crc kubenswrapper[4861]: I0309 09:07:48.327245 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 09 09:07:48 crc kubenswrapper[4861]: I0309 09:07:48.424269 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/be4aeccd-be26-4a9e-9451-da6c51b2f597-config\") pod \"route-controller-manager-69b899dc9d-p2hwl\" (UID: \"be4aeccd-be26-4a9e-9451-da6c51b2f597\") " pod="openshift-route-controller-manager/route-controller-manager-69b899dc9d-p2hwl" Mar 09 09:07:48 crc kubenswrapper[4861]: I0309 09:07:48.424328 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5847f\" (UniqueName: \"kubernetes.io/projected/be4aeccd-be26-4a9e-9451-da6c51b2f597-kube-api-access-5847f\") pod \"route-controller-manager-69b899dc9d-p2hwl\" (UID: \"be4aeccd-be26-4a9e-9451-da6c51b2f597\") " pod="openshift-route-controller-manager/route-controller-manager-69b899dc9d-p2hwl" Mar 09 09:07:48 crc kubenswrapper[4861]: I0309 09:07:48.424360 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/be4aeccd-be26-4a9e-9451-da6c51b2f597-serving-cert\") pod \"route-controller-manager-69b899dc9d-p2hwl\" (UID: \"be4aeccd-be26-4a9e-9451-da6c51b2f597\") " pod="openshift-route-controller-manager/route-controller-manager-69b899dc9d-p2hwl" Mar 09 09:07:48 crc kubenswrapper[4861]: I0309 09:07:48.424516 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/be4aeccd-be26-4a9e-9451-da6c51b2f597-client-ca\") pod \"route-controller-manager-69b899dc9d-p2hwl\" (UID: \"be4aeccd-be26-4a9e-9451-da6c51b2f597\") " pod="openshift-route-controller-manager/route-controller-manager-69b899dc9d-p2hwl" Mar 09 09:07:48 crc kubenswrapper[4861]: W0309 09:07:48.429157 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-poda1e43c63_f141_4254_b2e9_3102f26254ae.slice/crio-52dbd5a1557fa30a90397b6a491d2690ce9ee40ab595c45170ad56213d5179ba WatchSource:0}: Error finding container 
52dbd5a1557fa30a90397b6a491d2690ce9ee40ab595c45170ad56213d5179ba: Status 404 returned error can't find the container with id 52dbd5a1557fa30a90397b6a491d2690ce9ee40ab595c45170ad56213d5179ba Mar 09 09:07:48 crc kubenswrapper[4861]: I0309 09:07:48.526993 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be4aeccd-be26-4a9e-9451-da6c51b2f597-config\") pod \"route-controller-manager-69b899dc9d-p2hwl\" (UID: \"be4aeccd-be26-4a9e-9451-da6c51b2f597\") " pod="openshift-route-controller-manager/route-controller-manager-69b899dc9d-p2hwl" Mar 09 09:07:48 crc kubenswrapper[4861]: I0309 09:07:48.527039 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5847f\" (UniqueName: \"kubernetes.io/projected/be4aeccd-be26-4a9e-9451-da6c51b2f597-kube-api-access-5847f\") pod \"route-controller-manager-69b899dc9d-p2hwl\" (UID: \"be4aeccd-be26-4a9e-9451-da6c51b2f597\") " pod="openshift-route-controller-manager/route-controller-manager-69b899dc9d-p2hwl" Mar 09 09:07:48 crc kubenswrapper[4861]: I0309 09:07:48.527082 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/be4aeccd-be26-4a9e-9451-da6c51b2f597-serving-cert\") pod \"route-controller-manager-69b899dc9d-p2hwl\" (UID: \"be4aeccd-be26-4a9e-9451-da6c51b2f597\") " pod="openshift-route-controller-manager/route-controller-manager-69b899dc9d-p2hwl" Mar 09 09:07:48 crc kubenswrapper[4861]: I0309 09:07:48.527104 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/be4aeccd-be26-4a9e-9451-da6c51b2f597-client-ca\") pod \"route-controller-manager-69b899dc9d-p2hwl\" (UID: \"be4aeccd-be26-4a9e-9451-da6c51b2f597\") " pod="openshift-route-controller-manager/route-controller-manager-69b899dc9d-p2hwl" Mar 09 09:07:48 crc kubenswrapper[4861]: I0309 09:07:48.528016 4861 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/be4aeccd-be26-4a9e-9451-da6c51b2f597-client-ca\") pod \"route-controller-manager-69b899dc9d-p2hwl\" (UID: \"be4aeccd-be26-4a9e-9451-da6c51b2f597\") " pod="openshift-route-controller-manager/route-controller-manager-69b899dc9d-p2hwl" Mar 09 09:07:48 crc kubenswrapper[4861]: I0309 09:07:48.528472 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be4aeccd-be26-4a9e-9451-da6c51b2f597-config\") pod \"route-controller-manager-69b899dc9d-p2hwl\" (UID: \"be4aeccd-be26-4a9e-9451-da6c51b2f597\") " pod="openshift-route-controller-manager/route-controller-manager-69b899dc9d-p2hwl" Mar 09 09:07:48 crc kubenswrapper[4861]: I0309 09:07:48.536491 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/be4aeccd-be26-4a9e-9451-da6c51b2f597-serving-cert\") pod \"route-controller-manager-69b899dc9d-p2hwl\" (UID: \"be4aeccd-be26-4a9e-9451-da6c51b2f597\") " pod="openshift-route-controller-manager/route-controller-manager-69b899dc9d-p2hwl" Mar 09 09:07:48 crc kubenswrapper[4861]: I0309 09:07:48.540323 4861 patch_prober.go:28] interesting pod/router-default-5444994796-c6sj6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 09 09:07:48 crc kubenswrapper[4861]: [-]has-synced failed: reason withheld Mar 09 09:07:48 crc kubenswrapper[4861]: [+]process-running ok Mar 09 09:07:48 crc kubenswrapper[4861]: healthz check failed Mar 09 09:07:48 crc kubenswrapper[4861]: I0309 09:07:48.540403 4861 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-c6sj6" podUID="8ff37c1b-1688-42ce-8b0c-952d297ae4a0" containerName="router" probeResult="failure" output="HTTP probe 
failed with statuscode: 500" Mar 09 09:07:48 crc kubenswrapper[4861]: I0309 09:07:48.556336 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5847f\" (UniqueName: \"kubernetes.io/projected/be4aeccd-be26-4a9e-9451-da6c51b2f597-kube-api-access-5847f\") pod \"route-controller-manager-69b899dc9d-p2hwl\" (UID: \"be4aeccd-be26-4a9e-9451-da6c51b2f597\") " pod="openshift-route-controller-manager/route-controller-manager-69b899dc9d-p2hwl" Mar 09 09:07:48 crc kubenswrapper[4861]: I0309 09:07:48.594225 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-69b899dc9d-p2hwl" Mar 09 09:07:49 crc kubenswrapper[4861]: I0309 09:07:49.029857 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"a1e43c63-f141-4254-b2e9-3102f26254ae","Type":"ContainerStarted","Data":"52dbd5a1557fa30a90397b6a491d2690ce9ee40ab595c45170ad56213d5179ba"} Mar 09 09:07:49 crc kubenswrapper[4861]: I0309 09:07:49.036563 4861 generic.go:334] "Generic (PLEG): container finished" podID="808431cc-74ea-4377-8c81-189b70544602" containerID="a876cb80251d3ef3627388dd1489c1ebf68c6008b6a635fba12572a7fa62d6af" exitCode=0 Mar 09 09:07:49 crc kubenswrapper[4861]: I0309 09:07:49.037077 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"808431cc-74ea-4377-8c81-189b70544602","Type":"ContainerDied","Data":"a876cb80251d3ef3627388dd1489c1ebf68c6008b6a635fba12572a7fa62d6af"} Mar 09 09:07:49 crc kubenswrapper[4861]: I0309 09:07:49.046459 4861 generic.go:334] "Generic (PLEG): container finished" podID="33158a78-8c1c-4aa1-9c51-66e21d0e8ae6" containerID="7e1bb6d2bee4e18b8b866e5e2b2fcf1b9f4a74322d6ea9aa7af13b82e7b69b3c" exitCode=0 Mar 09 09:07:49 crc kubenswrapper[4861]: I0309 09:07:49.046538 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-j6fcf" event={"ID":"33158a78-8c1c-4aa1-9c51-66e21d0e8ae6","Type":"ContainerDied","Data":"7e1bb6d2bee4e18b8b866e5e2b2fcf1b9f4a74322d6ea9aa7af13b82e7b69b3c"} Mar 09 09:07:49 crc kubenswrapper[4861]: I0309 09:07:49.052032 4861 generic.go:334] "Generic (PLEG): container finished" podID="6eac3eed-7721-4030-b1e3-9dd28fea2e49" containerID="40bf8b5e100c32cd1129b993c6b7bc18d00a45daea9d9338afae72575f43af1a" exitCode=0 Mar 09 09:07:49 crc kubenswrapper[4861]: I0309 09:07:49.052209 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zkngt" event={"ID":"6eac3eed-7721-4030-b1e3-9dd28fea2e49","Type":"ContainerDied","Data":"40bf8b5e100c32cd1129b993c6b7bc18d00a45daea9d9338afae72575f43af1a"} Mar 09 09:07:49 crc kubenswrapper[4861]: I0309 09:07:49.052264 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zkngt" event={"ID":"6eac3eed-7721-4030-b1e3-9dd28fea2e49","Type":"ContainerStarted","Data":"1695d96a7543fbd3121585572761ea892924ee4533e8b4563882d1bd9d24c289"} Mar 09 09:07:49 crc kubenswrapper[4861]: I0309 09:07:49.055013 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-cjkj6" Mar 09 09:07:49 crc kubenswrapper[4861]: I0309 09:07:49.256982 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-69b899dc9d-p2hwl"] Mar 09 09:07:49 crc kubenswrapper[4861]: I0309 09:07:49.543465 4861 patch_prober.go:28] interesting pod/router-default-5444994796-c6sj6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 09 09:07:49 crc kubenswrapper[4861]: [-]has-synced failed: reason withheld Mar 09 09:07:49 crc kubenswrapper[4861]: [+]process-running ok Mar 09 09:07:49 crc kubenswrapper[4861]: healthz 
check failed Mar 09 09:07:49 crc kubenswrapper[4861]: I0309 09:07:49.544128 4861 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-c6sj6" podUID="8ff37c1b-1688-42ce-8b0c-952d297ae4a0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 09:07:50 crc kubenswrapper[4861]: I0309 09:07:50.070174 4861 generic.go:334] "Generic (PLEG): container finished" podID="608b7b11-f38a-4c4b-9e61-dab4f84c34c1" containerID="32abcafa9c87fb0e758e3fc32846e0cfb9678f874a5c21171ddebf46b78f0189" exitCode=0 Mar 09 09:07:50 crc kubenswrapper[4861]: I0309 09:07:50.070233 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550780-wpnp9" event={"ID":"608b7b11-f38a-4c4b-9e61-dab4f84c34c1","Type":"ContainerDied","Data":"32abcafa9c87fb0e758e3fc32846e0cfb9678f874a5c21171ddebf46b78f0189"} Mar 09 09:07:50 crc kubenswrapper[4861]: I0309 09:07:50.081126 4861 generic.go:334] "Generic (PLEG): container finished" podID="a1e43c63-f141-4254-b2e9-3102f26254ae" containerID="de632d46e7e4b61551504eae76fe0b5a72635ef3440142aea6283b259abbe9dc" exitCode=0 Mar 09 09:07:50 crc kubenswrapper[4861]: I0309 09:07:50.081187 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"a1e43c63-f141-4254-b2e9-3102f26254ae","Type":"ContainerDied","Data":"de632d46e7e4b61551504eae76fe0b5a72635ef3440142aea6283b259abbe9dc"} Mar 09 09:07:50 crc kubenswrapper[4861]: I0309 09:07:50.086864 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-69b899dc9d-p2hwl" event={"ID":"be4aeccd-be26-4a9e-9451-da6c51b2f597","Type":"ContainerStarted","Data":"0d59c062d6216061b83f037f329eedeee1aea0904ab5a597d17d2a1e1be39826"} Mar 09 09:07:50 crc kubenswrapper[4861]: I0309 09:07:50.086885 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-69b899dc9d-p2hwl" event={"ID":"be4aeccd-be26-4a9e-9451-da6c51b2f597","Type":"ContainerStarted","Data":"61a18e5bde6f81ebbf8d4b71f2fc7a646e68abf552da0d7d9a66b2a4340c2659"} Mar 09 09:07:50 crc kubenswrapper[4861]: I0309 09:07:50.086898 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-69b899dc9d-p2hwl" Mar 09 09:07:50 crc kubenswrapper[4861]: I0309 09:07:50.101829 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-69b899dc9d-p2hwl" Mar 09 09:07:50 crc kubenswrapper[4861]: I0309 09:07:50.122046 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-69b899dc9d-p2hwl" podStartSLOduration=6.122030584 podStartE2EDuration="6.122030584s" podCreationTimestamp="2026-03-09 09:07:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:07:50.12151001 +0000 UTC m=+113.206549411" watchObservedRunningTime="2026-03-09 09:07:50.122030584 +0000 UTC m=+113.207069985" Mar 09 09:07:50 crc kubenswrapper[4861]: I0309 09:07:50.467406 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 09 09:07:50 crc kubenswrapper[4861]: I0309 09:07:50.510665 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 09:07:50 crc kubenswrapper[4861]: I0309 09:07:50.510793 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 09:07:50 crc kubenswrapper[4861]: I0309 09:07:50.510820 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 09:07:50 crc kubenswrapper[4861]: I0309 09:07:50.510841 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 09:07:50 crc kubenswrapper[4861]: I0309 09:07:50.516394 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" 
(UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 09:07:50 crc kubenswrapper[4861]: I0309 09:07:50.517486 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 09:07:50 crc kubenswrapper[4861]: I0309 09:07:50.532599 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 09:07:50 crc kubenswrapper[4861]: I0309 09:07:50.533853 4861 patch_prober.go:28] interesting pod/router-default-5444994796-c6sj6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 09 09:07:50 crc kubenswrapper[4861]: [-]has-synced failed: reason withheld Mar 09 09:07:50 crc kubenswrapper[4861]: [+]process-running ok Mar 09 09:07:50 crc kubenswrapper[4861]: healthz check failed Mar 09 09:07:50 crc kubenswrapper[4861]: I0309 09:07:50.533926 4861 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-c6sj6" podUID="8ff37c1b-1688-42ce-8b0c-952d297ae4a0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 09:07:50 crc 
kubenswrapper[4861]: I0309 09:07:50.611422 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 09:07:50 crc kubenswrapper[4861]: I0309 09:07:50.611705 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/808431cc-74ea-4377-8c81-189b70544602-kubelet-dir\") pod \"808431cc-74ea-4377-8c81-189b70544602\" (UID: \"808431cc-74ea-4377-8c81-189b70544602\") " Mar 09 09:07:50 crc kubenswrapper[4861]: I0309 09:07:50.611839 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/808431cc-74ea-4377-8c81-189b70544602-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "808431cc-74ea-4377-8c81-189b70544602" (UID: "808431cc-74ea-4377-8c81-189b70544602"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 09:07:50 crc kubenswrapper[4861]: I0309 09:07:50.611966 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/808431cc-74ea-4377-8c81-189b70544602-kube-api-access\") pod \"808431cc-74ea-4377-8c81-189b70544602\" (UID: \"808431cc-74ea-4377-8c81-189b70544602\") " Mar 09 09:07:50 crc kubenswrapper[4861]: I0309 09:07:50.612343 4861 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/808431cc-74ea-4377-8c81-189b70544602-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:50 crc kubenswrapper[4861]: I0309 09:07:50.616508 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/808431cc-74ea-4377-8c81-189b70544602-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "808431cc-74ea-4377-8c81-189b70544602" (UID: "808431cc-74ea-4377-8c81-189b70544602"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:07:50 crc kubenswrapper[4861]: I0309 09:07:50.629297 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 09:07:50 crc kubenswrapper[4861]: I0309 09:07:50.683878 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 09:07:50 crc kubenswrapper[4861]: I0309 09:07:50.715090 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/808431cc-74ea-4377-8c81-189b70544602-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:50 crc kubenswrapper[4861]: I0309 09:07:50.733988 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 09:07:51 crc kubenswrapper[4861]: I0309 09:07:51.179292 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"808431cc-74ea-4377-8c81-189b70544602","Type":"ContainerDied","Data":"c55dbb65cfca1b1b42f0489d392965fb6c4fe2203444df3c83588c36054b3eca"} Mar 09 09:07:51 crc kubenswrapper[4861]: I0309 09:07:51.179715 4861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c55dbb65cfca1b1b42f0489d392965fb6c4fe2203444df3c83588c36054b3eca" Mar 09 09:07:51 crc kubenswrapper[4861]: I0309 09:07:51.179835 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 09 09:07:51 crc kubenswrapper[4861]: I0309 09:07:51.183646 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"f2d1ff3b76da623cf9ab9e114c89157ee102cd76212b6b6215b38f5d1f20cca3"} Mar 09 09:07:51 crc kubenswrapper[4861]: I0309 09:07:51.538397 4861 patch_prober.go:28] interesting pod/router-default-5444994796-c6sj6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 09 09:07:51 crc kubenswrapper[4861]: [-]has-synced failed: reason withheld Mar 09 09:07:51 crc kubenswrapper[4861]: [+]process-running ok Mar 09 09:07:51 crc kubenswrapper[4861]: healthz check failed Mar 09 09:07:51 crc kubenswrapper[4861]: I0309 09:07:51.538680 4861 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-c6sj6" podUID="8ff37c1b-1688-42ce-8b0c-952d297ae4a0" containerName="router" probeResult="failure" output="HTTP probe 
failed with statuscode: 500" Mar 09 09:07:51 crc kubenswrapper[4861]: I0309 09:07:51.544030 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550780-wpnp9" Mar 09 09:07:51 crc kubenswrapper[4861]: I0309 09:07:51.548705 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 09 09:07:51 crc kubenswrapper[4861]: I0309 09:07:51.650460 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a1e43c63-f141-4254-b2e9-3102f26254ae-kubelet-dir\") pod \"a1e43c63-f141-4254-b2e9-3102f26254ae\" (UID: \"a1e43c63-f141-4254-b2e9-3102f26254ae\") " Mar 09 09:07:51 crc kubenswrapper[4861]: I0309 09:07:51.650532 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dk4ng\" (UniqueName: \"kubernetes.io/projected/608b7b11-f38a-4c4b-9e61-dab4f84c34c1-kube-api-access-dk4ng\") pod \"608b7b11-f38a-4c4b-9e61-dab4f84c34c1\" (UID: \"608b7b11-f38a-4c4b-9e61-dab4f84c34c1\") " Mar 09 09:07:51 crc kubenswrapper[4861]: I0309 09:07:51.650870 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/608b7b11-f38a-4c4b-9e61-dab4f84c34c1-secret-volume\") pod \"608b7b11-f38a-4c4b-9e61-dab4f84c34c1\" (UID: \"608b7b11-f38a-4c4b-9e61-dab4f84c34c1\") " Mar 09 09:07:51 crc kubenswrapper[4861]: I0309 09:07:51.650969 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a1e43c63-f141-4254-b2e9-3102f26254ae-kube-api-access\") pod \"a1e43c63-f141-4254-b2e9-3102f26254ae\" (UID: \"a1e43c63-f141-4254-b2e9-3102f26254ae\") " Mar 09 09:07:51 crc kubenswrapper[4861]: I0309 09:07:51.650991 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/608b7b11-f38a-4c4b-9e61-dab4f84c34c1-config-volume\") pod \"608b7b11-f38a-4c4b-9e61-dab4f84c34c1\" (UID: \"608b7b11-f38a-4c4b-9e61-dab4f84c34c1\") " Mar 09 09:07:51 crc kubenswrapper[4861]: I0309 09:07:51.651557 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a1e43c63-f141-4254-b2e9-3102f26254ae-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "a1e43c63-f141-4254-b2e9-3102f26254ae" (UID: "a1e43c63-f141-4254-b2e9-3102f26254ae"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 09:07:51 crc kubenswrapper[4861]: I0309 09:07:51.651871 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/608b7b11-f38a-4c4b-9e61-dab4f84c34c1-config-volume" (OuterVolumeSpecName: "config-volume") pod "608b7b11-f38a-4c4b-9e61-dab4f84c34c1" (UID: "608b7b11-f38a-4c4b-9e61-dab4f84c34c1"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:07:51 crc kubenswrapper[4861]: I0309 09:07:51.657247 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1e43c63-f141-4254-b2e9-3102f26254ae-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "a1e43c63-f141-4254-b2e9-3102f26254ae" (UID: "a1e43c63-f141-4254-b2e9-3102f26254ae"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:07:51 crc kubenswrapper[4861]: I0309 09:07:51.657326 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/608b7b11-f38a-4c4b-9e61-dab4f84c34c1-kube-api-access-dk4ng" (OuterVolumeSpecName: "kube-api-access-dk4ng") pod "608b7b11-f38a-4c4b-9e61-dab4f84c34c1" (UID: "608b7b11-f38a-4c4b-9e61-dab4f84c34c1"). InnerVolumeSpecName "kube-api-access-dk4ng". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:07:51 crc kubenswrapper[4861]: I0309 09:07:51.658013 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/608b7b11-f38a-4c4b-9e61-dab4f84c34c1-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "608b7b11-f38a-4c4b-9e61-dab4f84c34c1" (UID: "608b7b11-f38a-4c4b-9e61-dab4f84c34c1"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:07:51 crc kubenswrapper[4861]: I0309 09:07:51.762909 4861 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/608b7b11-f38a-4c4b-9e61-dab4f84c34c1-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:51 crc kubenswrapper[4861]: I0309 09:07:51.763580 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a1e43c63-f141-4254-b2e9-3102f26254ae-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:51 crc kubenswrapper[4861]: I0309 09:07:51.763598 4861 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/608b7b11-f38a-4c4b-9e61-dab4f84c34c1-config-volume\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:51 crc kubenswrapper[4861]: I0309 09:07:51.763612 4861 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a1e43c63-f141-4254-b2e9-3102f26254ae-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:51 crc kubenswrapper[4861]: I0309 09:07:51.763628 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dk4ng\" (UniqueName: \"kubernetes.io/projected/608b7b11-f38a-4c4b-9e61-dab4f84c34c1-kube-api-access-dk4ng\") on node \"crc\" DevicePath \"\"" Mar 09 09:07:52 crc kubenswrapper[4861]: I0309 09:07:52.225764 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550780-wpnp9" Mar 09 09:07:52 crc kubenswrapper[4861]: I0309 09:07:52.225801 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550780-wpnp9" event={"ID":"608b7b11-f38a-4c4b-9e61-dab4f84c34c1","Type":"ContainerDied","Data":"7c31071fdbbf656b6b9449cb94bebb761b671f078696c06cd2a736d5dee90660"} Mar 09 09:07:52 crc kubenswrapper[4861]: I0309 09:07:52.225849 4861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7c31071fdbbf656b6b9449cb94bebb761b671f078696c06cd2a736d5dee90660" Mar 09 09:07:52 crc kubenswrapper[4861]: I0309 09:07:52.235904 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"a1e43c63-f141-4254-b2e9-3102f26254ae","Type":"ContainerDied","Data":"52dbd5a1557fa30a90397b6a491d2690ce9ee40ab595c45170ad56213d5179ba"} Mar 09 09:07:52 crc kubenswrapper[4861]: I0309 09:07:52.235957 4861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="52dbd5a1557fa30a90397b6a491d2690ce9ee40ab595c45170ad56213d5179ba" Mar 09 09:07:52 crc kubenswrapper[4861]: I0309 09:07:52.236043 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 09 09:07:52 crc kubenswrapper[4861]: I0309 09:07:52.245422 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"f4959a0acebc33107504fb5f6b75149880a582af55bc3128a98afc24a1e967a8"} Mar 09 09:07:52 crc kubenswrapper[4861]: I0309 09:07:52.245476 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"756d3ac11ce192323b0d89c066abff079ebb931a3b324317e11a46569bdc0370"} Mar 09 09:07:52 crc kubenswrapper[4861]: I0309 09:07:52.262867 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"626bff39c6e191859a576c5c6417bfe60f765e26daa43035c05ddccfe73ad184"} Mar 09 09:07:52 crc kubenswrapper[4861]: I0309 09:07:52.287965 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"f39f82163a7371a1c07bc285e624975bfcc7f07d3e65be7e5e9e605bc35dc077"} Mar 09 09:07:52 crc kubenswrapper[4861]: I0309 09:07:52.288022 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"6289f2cad71335bb4da5886381547971e3048a75ff6c7853449864dda5d1c306"} Mar 09 09:07:52 crc kubenswrapper[4861]: I0309 09:07:52.288333 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 09:07:52 crc 
kubenswrapper[4861]: I0309 09:07:52.533153 4861 patch_prober.go:28] interesting pod/router-default-5444994796-c6sj6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 09 09:07:52 crc kubenswrapper[4861]: [-]has-synced failed: reason withheld Mar 09 09:07:52 crc kubenswrapper[4861]: [+]process-running ok Mar 09 09:07:52 crc kubenswrapper[4861]: healthz check failed Mar 09 09:07:52 crc kubenswrapper[4861]: I0309 09:07:52.533213 4861 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-c6sj6" podUID="8ff37c1b-1688-42ce-8b0c-952d297ae4a0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 09:07:52 crc kubenswrapper[4861]: I0309 09:07:52.946964 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-zrbdj" Mar 09 09:07:53 crc kubenswrapper[4861]: I0309 09:07:53.535913 4861 patch_prober.go:28] interesting pod/router-default-5444994796-c6sj6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 09 09:07:53 crc kubenswrapper[4861]: [-]has-synced failed: reason withheld Mar 09 09:07:53 crc kubenswrapper[4861]: [+]process-running ok Mar 09 09:07:53 crc kubenswrapper[4861]: healthz check failed Mar 09 09:07:53 crc kubenswrapper[4861]: I0309 09:07:53.536304 4861 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-c6sj6" podUID="8ff37c1b-1688-42ce-8b0c-952d297ae4a0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 09:07:54 crc kubenswrapper[4861]: I0309 09:07:54.533111 4861 patch_prober.go:28] interesting pod/router-default-5444994796-c6sj6 container/router namespace/openshift-ingress: 
Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 09 09:07:54 crc kubenswrapper[4861]: [-]has-synced failed: reason withheld Mar 09 09:07:54 crc kubenswrapper[4861]: [+]process-running ok Mar 09 09:07:54 crc kubenswrapper[4861]: healthz check failed Mar 09 09:07:54 crc kubenswrapper[4861]: I0309 09:07:54.533169 4861 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-c6sj6" podUID="8ff37c1b-1688-42ce-8b0c-952d297ae4a0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 09:07:54 crc kubenswrapper[4861]: I0309 09:07:54.705935 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-kmjsq" Mar 09 09:07:55 crc kubenswrapper[4861]: I0309 09:07:55.534994 4861 patch_prober.go:28] interesting pod/router-default-5444994796-c6sj6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 09 09:07:55 crc kubenswrapper[4861]: [-]has-synced failed: reason withheld Mar 09 09:07:55 crc kubenswrapper[4861]: [+]process-running ok Mar 09 09:07:55 crc kubenswrapper[4861]: healthz check failed Mar 09 09:07:55 crc kubenswrapper[4861]: I0309 09:07:55.535094 4861 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-c6sj6" podUID="8ff37c1b-1688-42ce-8b0c-952d297ae4a0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 09:07:56 crc kubenswrapper[4861]: I0309 09:07:56.532284 4861 patch_prober.go:28] interesting pod/router-default-5444994796-c6sj6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 09 09:07:56 crc 
kubenswrapper[4861]: [-]has-synced failed: reason withheld Mar 09 09:07:56 crc kubenswrapper[4861]: [+]process-running ok Mar 09 09:07:56 crc kubenswrapper[4861]: healthz check failed Mar 09 09:07:56 crc kubenswrapper[4861]: I0309 09:07:56.532335 4861 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-c6sj6" podUID="8ff37c1b-1688-42ce-8b0c-952d297ae4a0" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 09:07:56 crc kubenswrapper[4861]: I0309 09:07:56.881305 4861 patch_prober.go:28] interesting pod/console-f9d7485db-qh9sg container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.15:8443/health\": dial tcp 10.217.0.15:8443: connect: connection refused" start-of-body= Mar 09 09:07:56 crc kubenswrapper[4861]: I0309 09:07:56.881361 4861 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-qh9sg" podUID="85a3bbcb-e663-4a97-980c-606c979409d7" containerName="console" probeResult="failure" output="Get \"https://10.217.0.15:8443/health\": dial tcp 10.217.0.15:8443: connect: connection refused" Mar 09 09:07:57 crc kubenswrapper[4861]: I0309 09:07:57.269582 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-h6j45" Mar 09 09:07:57 crc kubenswrapper[4861]: I0309 09:07:57.532482 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-c6sj6" Mar 09 09:07:57 crc kubenswrapper[4861]: I0309 09:07:57.534183 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-c6sj6" Mar 09 09:07:58 crc kubenswrapper[4861]: E0309 09:07:58.009440 4861 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , 
stderr: , exit code -1" containerID="60d212b5a01a67560a782f62fd8dcd1024f51380874431f6df6a561a0cc668d3" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 09 09:07:58 crc kubenswrapper[4861]: E0309 09:07:58.012057 4861 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="60d212b5a01a67560a782f62fd8dcd1024f51380874431f6df6a561a0cc668d3" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 09 09:07:58 crc kubenswrapper[4861]: E0309 09:07:58.016351 4861 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="60d212b5a01a67560a782f62fd8dcd1024f51380874431f6df6a561a0cc668d3" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 09 09:07:58 crc kubenswrapper[4861]: E0309 09:07:58.016439 4861 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-wdsnl" podUID="4eed3eac-42f8-4683-9c1f-3733965e6af7" containerName="kube-multus-additional-cni-plugins" Mar 09 09:08:00 crc kubenswrapper[4861]: I0309 09:08:00.125931 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550788-wz6bg"] Mar 09 09:08:00 crc kubenswrapper[4861]: E0309 09:08:00.126596 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1e43c63-f141-4254-b2e9-3102f26254ae" containerName="pruner" Mar 09 09:08:00 crc kubenswrapper[4861]: I0309 09:08:00.126609 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1e43c63-f141-4254-b2e9-3102f26254ae" containerName="pruner" Mar 09 09:08:00 crc kubenswrapper[4861]: E0309 09:08:00.126618 4861 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="608b7b11-f38a-4c4b-9e61-dab4f84c34c1" containerName="collect-profiles" Mar 09 09:08:00 crc kubenswrapper[4861]: I0309 09:08:00.126624 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="608b7b11-f38a-4c4b-9e61-dab4f84c34c1" containerName="collect-profiles" Mar 09 09:08:00 crc kubenswrapper[4861]: E0309 09:08:00.126636 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="808431cc-74ea-4377-8c81-189b70544602" containerName="pruner" Mar 09 09:08:00 crc kubenswrapper[4861]: I0309 09:08:00.126644 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="808431cc-74ea-4377-8c81-189b70544602" containerName="pruner" Mar 09 09:08:00 crc kubenswrapper[4861]: I0309 09:08:00.126742 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="808431cc-74ea-4377-8c81-189b70544602" containerName="pruner" Mar 09 09:08:00 crc kubenswrapper[4861]: I0309 09:08:00.126754 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1e43c63-f141-4254-b2e9-3102f26254ae" containerName="pruner" Mar 09 09:08:00 crc kubenswrapper[4861]: I0309 09:08:00.126761 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="608b7b11-f38a-4c4b-9e61-dab4f84c34c1" containerName="collect-profiles" Mar 09 09:08:00 crc kubenswrapper[4861]: I0309 09:08:00.127106 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550788-wz6bg" Mar 09 09:08:00 crc kubenswrapper[4861]: I0309 09:08:00.129026 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k86l8" Mar 09 09:08:00 crc kubenswrapper[4861]: I0309 09:08:00.136041 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 09:08:00 crc kubenswrapper[4861]: I0309 09:08:00.142730 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550788-wz6bg"] Mar 09 09:08:00 crc kubenswrapper[4861]: I0309 09:08:00.144263 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 09:08:00 crc kubenswrapper[4861]: I0309 09:08:00.219179 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvrzs\" (UniqueName: \"kubernetes.io/projected/6effa8f1-34f2-4a9e-b5cb-71a02695603e-kube-api-access-dvrzs\") pod \"auto-csr-approver-29550788-wz6bg\" (UID: \"6effa8f1-34f2-4a9e-b5cb-71a02695603e\") " pod="openshift-infra/auto-csr-approver-29550788-wz6bg" Mar 09 09:08:00 crc kubenswrapper[4861]: I0309 09:08:00.320294 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvrzs\" (UniqueName: \"kubernetes.io/projected/6effa8f1-34f2-4a9e-b5cb-71a02695603e-kube-api-access-dvrzs\") pod \"auto-csr-approver-29550788-wz6bg\" (UID: \"6effa8f1-34f2-4a9e-b5cb-71a02695603e\") " pod="openshift-infra/auto-csr-approver-29550788-wz6bg" Mar 09 09:08:00 crc kubenswrapper[4861]: I0309 09:08:00.341994 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvrzs\" (UniqueName: \"kubernetes.io/projected/6effa8f1-34f2-4a9e-b5cb-71a02695603e-kube-api-access-dvrzs\") pod \"auto-csr-approver-29550788-wz6bg\" (UID: \"6effa8f1-34f2-4a9e-b5cb-71a02695603e\") " 
pod="openshift-infra/auto-csr-approver-29550788-wz6bg" Mar 09 09:08:00 crc kubenswrapper[4861]: I0309 09:08:00.449493 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550788-wz6bg" Mar 09 09:08:00 crc kubenswrapper[4861]: I0309 09:08:00.506624 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 09:08:02 crc kubenswrapper[4861]: I0309 09:08:02.290435 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550788-wz6bg"] Mar 09 09:08:02 crc kubenswrapper[4861]: I0309 09:08:02.384168 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550788-wz6bg" event={"ID":"6effa8f1-34f2-4a9e-b5cb-71a02695603e","Type":"ContainerStarted","Data":"937675e303e874c10f135e48a4ad8e4d0be11c7a949c0d88f6d2342e66c19273"} Mar 09 09:08:02 crc kubenswrapper[4861]: I0309 09:08:02.657690 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-748457ff4-72cdj"] Mar 09 09:08:02 crc kubenswrapper[4861]: I0309 09:08:02.658863 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-748457ff4-72cdj" podUID="7727f6b7-e2a9-4554-9224-a4600931b08a" containerName="controller-manager" containerID="cri-o://02258d50bdab0b7d450ab4919645b266eba5654e6192f751394ccaa5699570f5" gracePeriod=30 Mar 09 09:08:02 crc kubenswrapper[4861]: I0309 09:08:02.682574 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-69b899dc9d-p2hwl"] Mar 09 09:08:02 crc kubenswrapper[4861]: I0309 09:08:02.683163 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-69b899dc9d-p2hwl" podUID="be4aeccd-be26-4a9e-9451-da6c51b2f597" 
containerName="route-controller-manager" containerID="cri-o://0d59c062d6216061b83f037f329eedeee1aea0904ab5a597d17d2a1e1be39826" gracePeriod=30 Mar 09 09:08:03 crc kubenswrapper[4861]: I0309 09:08:03.395809 4861 generic.go:334] "Generic (PLEG): container finished" podID="be4aeccd-be26-4a9e-9451-da6c51b2f597" containerID="0d59c062d6216061b83f037f329eedeee1aea0904ab5a597d17d2a1e1be39826" exitCode=0 Mar 09 09:08:03 crc kubenswrapper[4861]: I0309 09:08:03.395998 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-69b899dc9d-p2hwl" event={"ID":"be4aeccd-be26-4a9e-9451-da6c51b2f597","Type":"ContainerDied","Data":"0d59c062d6216061b83f037f329eedeee1aea0904ab5a597d17d2a1e1be39826"} Mar 09 09:08:03 crc kubenswrapper[4861]: I0309 09:08:03.399827 4861 generic.go:334] "Generic (PLEG): container finished" podID="7727f6b7-e2a9-4554-9224-a4600931b08a" containerID="02258d50bdab0b7d450ab4919645b266eba5654e6192f751394ccaa5699570f5" exitCode=0 Mar 09 09:08:03 crc kubenswrapper[4861]: I0309 09:08:03.399881 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-748457ff4-72cdj" event={"ID":"7727f6b7-e2a9-4554-9224-a4600931b08a","Type":"ContainerDied","Data":"02258d50bdab0b7d450ab4919645b266eba5654e6192f751394ccaa5699570f5"} Mar 09 09:08:03 crc kubenswrapper[4861]: I0309 09:08:03.684118 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Mar 09 09:08:06 crc kubenswrapper[4861]: I0309 09:08:06.200194 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-cjkj6" Mar 09 09:08:06 crc kubenswrapper[4861]: I0309 09:08:06.226119 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=3.226070778 podStartE2EDuration="3.226070778s" 
podCreationTimestamp="2026-03-09 09:08:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:08:06.219932992 +0000 UTC m=+129.304972413" watchObservedRunningTime="2026-03-09 09:08:06.226070778 +0000 UTC m=+129.311110189" Mar 09 09:08:06 crc kubenswrapper[4861]: I0309 09:08:06.864521 4861 patch_prober.go:28] interesting pod/controller-manager-748457ff4-72cdj container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.49:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 09:08:06 crc kubenswrapper[4861]: I0309 09:08:06.864581 4861 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-748457ff4-72cdj" podUID="7727f6b7-e2a9-4554-9224-a4600931b08a" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.49:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 09:08:06 crc kubenswrapper[4861]: I0309 09:08:06.883798 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-qh9sg" Mar 09 09:08:06 crc kubenswrapper[4861]: I0309 09:08:06.888319 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-qh9sg" Mar 09 09:08:07 crc kubenswrapper[4861]: E0309 09:08:07.997645 4861 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="60d212b5a01a67560a782f62fd8dcd1024f51380874431f6df6a561a0cc668d3" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 09 09:08:08 crc kubenswrapper[4861]: E0309 09:08:08.001191 
4861 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="60d212b5a01a67560a782f62fd8dcd1024f51380874431f6df6a561a0cc668d3" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 09 09:08:08 crc kubenswrapper[4861]: E0309 09:08:08.003325 4861 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="60d212b5a01a67560a782f62fd8dcd1024f51380874431f6df6a561a0cc668d3" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 09 09:08:08 crc kubenswrapper[4861]: E0309 09:08:08.003436 4861 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-wdsnl" podUID="4eed3eac-42f8-4683-9c1f-3733965e6af7" containerName="kube-multus-additional-cni-plugins" Mar 09 09:08:09 crc kubenswrapper[4861]: I0309 09:08:09.598860 4861 patch_prober.go:28] interesting pod/route-controller-manager-69b899dc9d-p2hwl container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.55:8443/healthz\": dial tcp 10.217.0.55:8443: i/o timeout" start-of-body= Mar 09 09:08:09 crc kubenswrapper[4861]: I0309 09:08:09.599267 4861 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-69b899dc9d-p2hwl" podUID="be4aeccd-be26-4a9e-9451-da6c51b2f597" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.55:8443/healthz\": dial tcp 10.217.0.55:8443: i/o timeout" Mar 09 09:08:11 crc kubenswrapper[4861]: I0309 09:08:11.368734 4861 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-69b899dc9d-p2hwl" Mar 09 09:08:11 crc kubenswrapper[4861]: I0309 09:08:11.394353 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-684f6f5f69-tqhd5"] Mar 09 09:08:11 crc kubenswrapper[4861]: E0309 09:08:11.396757 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be4aeccd-be26-4a9e-9451-da6c51b2f597" containerName="route-controller-manager" Mar 09 09:08:11 crc kubenswrapper[4861]: I0309 09:08:11.396892 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="be4aeccd-be26-4a9e-9451-da6c51b2f597" containerName="route-controller-manager" Mar 09 09:08:11 crc kubenswrapper[4861]: I0309 09:08:11.397096 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="be4aeccd-be26-4a9e-9451-da6c51b2f597" containerName="route-controller-manager" Mar 09 09:08:11 crc kubenswrapper[4861]: I0309 09:08:11.397725 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-684f6f5f69-tqhd5" Mar 09 09:08:11 crc kubenswrapper[4861]: I0309 09:08:11.398162 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/be4aeccd-be26-4a9e-9451-da6c51b2f597-client-ca\") pod \"be4aeccd-be26-4a9e-9451-da6c51b2f597\" (UID: \"be4aeccd-be26-4a9e-9451-da6c51b2f597\") " Mar 09 09:08:11 crc kubenswrapper[4861]: I0309 09:08:11.398234 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be4aeccd-be26-4a9e-9451-da6c51b2f597-config\") pod \"be4aeccd-be26-4a9e-9451-da6c51b2f597\" (UID: \"be4aeccd-be26-4a9e-9451-da6c51b2f597\") " Mar 09 09:08:11 crc kubenswrapper[4861]: I0309 09:08:11.398268 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5847f\" (UniqueName: \"kubernetes.io/projected/be4aeccd-be26-4a9e-9451-da6c51b2f597-kube-api-access-5847f\") pod \"be4aeccd-be26-4a9e-9451-da6c51b2f597\" (UID: \"be4aeccd-be26-4a9e-9451-da6c51b2f597\") " Mar 09 09:08:11 crc kubenswrapper[4861]: I0309 09:08:11.398310 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/be4aeccd-be26-4a9e-9451-da6c51b2f597-serving-cert\") pod \"be4aeccd-be26-4a9e-9451-da6c51b2f597\" (UID: \"be4aeccd-be26-4a9e-9451-da6c51b2f597\") " Mar 09 09:08:11 crc kubenswrapper[4861]: I0309 09:08:11.399454 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be4aeccd-be26-4a9e-9451-da6c51b2f597-config" (OuterVolumeSpecName: "config") pod "be4aeccd-be26-4a9e-9451-da6c51b2f597" (UID: "be4aeccd-be26-4a9e-9451-da6c51b2f597"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:08:11 crc kubenswrapper[4861]: I0309 09:08:11.401836 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be4aeccd-be26-4a9e-9451-da6c51b2f597-client-ca" (OuterVolumeSpecName: "client-ca") pod "be4aeccd-be26-4a9e-9451-da6c51b2f597" (UID: "be4aeccd-be26-4a9e-9451-da6c51b2f597"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:08:11 crc kubenswrapper[4861]: I0309 09:08:11.406627 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-684f6f5f69-tqhd5"] Mar 09 09:08:11 crc kubenswrapper[4861]: I0309 09:08:11.407743 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be4aeccd-be26-4a9e-9451-da6c51b2f597-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "be4aeccd-be26-4a9e-9451-da6c51b2f597" (UID: "be4aeccd-be26-4a9e-9451-da6c51b2f597"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:08:11 crc kubenswrapper[4861]: I0309 09:08:11.410455 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be4aeccd-be26-4a9e-9451-da6c51b2f597-kube-api-access-5847f" (OuterVolumeSpecName: "kube-api-access-5847f") pod "be4aeccd-be26-4a9e-9451-da6c51b2f597" (UID: "be4aeccd-be26-4a9e-9451-da6c51b2f597"). InnerVolumeSpecName "kube-api-access-5847f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:08:11 crc kubenswrapper[4861]: I0309 09:08:11.459157 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-69b899dc9d-p2hwl" event={"ID":"be4aeccd-be26-4a9e-9451-da6c51b2f597","Type":"ContainerDied","Data":"61a18e5bde6f81ebbf8d4b71f2fc7a646e68abf552da0d7d9a66b2a4340c2659"} Mar 09 09:08:11 crc kubenswrapper[4861]: I0309 09:08:11.459221 4861 scope.go:117] "RemoveContainer" containerID="0d59c062d6216061b83f037f329eedeee1aea0904ab5a597d17d2a1e1be39826" Mar 09 09:08:11 crc kubenswrapper[4861]: I0309 09:08:11.459217 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-69b899dc9d-p2hwl" Mar 09 09:08:11 crc kubenswrapper[4861]: I0309 09:08:11.488763 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-69b899dc9d-p2hwl"] Mar 09 09:08:11 crc kubenswrapper[4861]: I0309 09:08:11.491696 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-69b899dc9d-p2hwl"] Mar 09 09:08:11 crc kubenswrapper[4861]: I0309 09:08:11.499730 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fc67w\" (UniqueName: \"kubernetes.io/projected/29d571aa-5de1-4779-acd7-b4f39c96e386-kube-api-access-fc67w\") pod \"route-controller-manager-684f6f5f69-tqhd5\" (UID: \"29d571aa-5de1-4779-acd7-b4f39c96e386\") " pod="openshift-route-controller-manager/route-controller-manager-684f6f5f69-tqhd5" Mar 09 09:08:11 crc kubenswrapper[4861]: I0309 09:08:11.499779 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/29d571aa-5de1-4779-acd7-b4f39c96e386-serving-cert\") pod 
\"route-controller-manager-684f6f5f69-tqhd5\" (UID: \"29d571aa-5de1-4779-acd7-b4f39c96e386\") " pod="openshift-route-controller-manager/route-controller-manager-684f6f5f69-tqhd5" Mar 09 09:08:11 crc kubenswrapper[4861]: I0309 09:08:11.499818 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/29d571aa-5de1-4779-acd7-b4f39c96e386-client-ca\") pod \"route-controller-manager-684f6f5f69-tqhd5\" (UID: \"29d571aa-5de1-4779-acd7-b4f39c96e386\") " pod="openshift-route-controller-manager/route-controller-manager-684f6f5f69-tqhd5" Mar 09 09:08:11 crc kubenswrapper[4861]: I0309 09:08:11.499878 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29d571aa-5de1-4779-acd7-b4f39c96e386-config\") pod \"route-controller-manager-684f6f5f69-tqhd5\" (UID: \"29d571aa-5de1-4779-acd7-b4f39c96e386\") " pod="openshift-route-controller-manager/route-controller-manager-684f6f5f69-tqhd5" Mar 09 09:08:11 crc kubenswrapper[4861]: I0309 09:08:11.499913 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5847f\" (UniqueName: \"kubernetes.io/projected/be4aeccd-be26-4a9e-9451-da6c51b2f597-kube-api-access-5847f\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:11 crc kubenswrapper[4861]: I0309 09:08:11.499940 4861 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/be4aeccd-be26-4a9e-9451-da6c51b2f597-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:11 crc kubenswrapper[4861]: I0309 09:08:11.499949 4861 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/be4aeccd-be26-4a9e-9451-da6c51b2f597-client-ca\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:11 crc kubenswrapper[4861]: I0309 09:08:11.499959 4861 reconciler_common.go:293] "Volume detached for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/be4aeccd-be26-4a9e-9451-da6c51b2f597-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:11 crc kubenswrapper[4861]: I0309 09:08:11.600830 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fc67w\" (UniqueName: \"kubernetes.io/projected/29d571aa-5de1-4779-acd7-b4f39c96e386-kube-api-access-fc67w\") pod \"route-controller-manager-684f6f5f69-tqhd5\" (UID: \"29d571aa-5de1-4779-acd7-b4f39c96e386\") " pod="openshift-route-controller-manager/route-controller-manager-684f6f5f69-tqhd5" Mar 09 09:08:11 crc kubenswrapper[4861]: I0309 09:08:11.600875 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/29d571aa-5de1-4779-acd7-b4f39c96e386-serving-cert\") pod \"route-controller-manager-684f6f5f69-tqhd5\" (UID: \"29d571aa-5de1-4779-acd7-b4f39c96e386\") " pod="openshift-route-controller-manager/route-controller-manager-684f6f5f69-tqhd5" Mar 09 09:08:11 crc kubenswrapper[4861]: I0309 09:08:11.600919 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/29d571aa-5de1-4779-acd7-b4f39c96e386-client-ca\") pod \"route-controller-manager-684f6f5f69-tqhd5\" (UID: \"29d571aa-5de1-4779-acd7-b4f39c96e386\") " pod="openshift-route-controller-manager/route-controller-manager-684f6f5f69-tqhd5" Mar 09 09:08:11 crc kubenswrapper[4861]: I0309 09:08:11.600973 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29d571aa-5de1-4779-acd7-b4f39c96e386-config\") pod \"route-controller-manager-684f6f5f69-tqhd5\" (UID: \"29d571aa-5de1-4779-acd7-b4f39c96e386\") " pod="openshift-route-controller-manager/route-controller-manager-684f6f5f69-tqhd5" Mar 09 09:08:11 crc kubenswrapper[4861]: I0309 09:08:11.602642 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/29d571aa-5de1-4779-acd7-b4f39c96e386-config\") pod \"route-controller-manager-684f6f5f69-tqhd5\" (UID: \"29d571aa-5de1-4779-acd7-b4f39c96e386\") " pod="openshift-route-controller-manager/route-controller-manager-684f6f5f69-tqhd5" Mar 09 09:08:11 crc kubenswrapper[4861]: I0309 09:08:11.603796 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/29d571aa-5de1-4779-acd7-b4f39c96e386-client-ca\") pod \"route-controller-manager-684f6f5f69-tqhd5\" (UID: \"29d571aa-5de1-4779-acd7-b4f39c96e386\") " pod="openshift-route-controller-manager/route-controller-manager-684f6f5f69-tqhd5" Mar 09 09:08:11 crc kubenswrapper[4861]: I0309 09:08:11.605749 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/29d571aa-5de1-4779-acd7-b4f39c96e386-serving-cert\") pod \"route-controller-manager-684f6f5f69-tqhd5\" (UID: \"29d571aa-5de1-4779-acd7-b4f39c96e386\") " pod="openshift-route-controller-manager/route-controller-manager-684f6f5f69-tqhd5" Mar 09 09:08:11 crc kubenswrapper[4861]: I0309 09:08:11.617712 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fc67w\" (UniqueName: \"kubernetes.io/projected/29d571aa-5de1-4779-acd7-b4f39c96e386-kube-api-access-fc67w\") pod \"route-controller-manager-684f6f5f69-tqhd5\" (UID: \"29d571aa-5de1-4779-acd7-b4f39c96e386\") " pod="openshift-route-controller-manager/route-controller-manager-684f6f5f69-tqhd5" Mar 09 09:08:11 crc kubenswrapper[4861]: I0309 09:08:11.664432 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be4aeccd-be26-4a9e-9451-da6c51b2f597" path="/var/lib/kubelet/pods/be4aeccd-be26-4a9e-9451-da6c51b2f597/volumes" Mar 09 09:08:11 crc kubenswrapper[4861]: I0309 09:08:11.736937 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-684f6f5f69-tqhd5" Mar 09 09:08:13 crc kubenswrapper[4861]: I0309 09:08:13.136527 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-748457ff4-72cdj" Mar 09 09:08:13 crc kubenswrapper[4861]: I0309 09:08:13.327358 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7727f6b7-e2a9-4554-9224-a4600931b08a-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7727f6b7-e2a9-4554-9224-a4600931b08a" (UID: "7727f6b7-e2a9-4554-9224-a4600931b08a"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:08:13 crc kubenswrapper[4861]: I0309 09:08:13.327648 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7727f6b7-e2a9-4554-9224-a4600931b08a-proxy-ca-bundles\") pod \"7727f6b7-e2a9-4554-9224-a4600931b08a\" (UID: \"7727f6b7-e2a9-4554-9224-a4600931b08a\") " Mar 09 09:08:13 crc kubenswrapper[4861]: I0309 09:08:13.327715 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7727f6b7-e2a9-4554-9224-a4600931b08a-config\") pod \"7727f6b7-e2a9-4554-9224-a4600931b08a\" (UID: \"7727f6b7-e2a9-4554-9224-a4600931b08a\") " Mar 09 09:08:13 crc kubenswrapper[4861]: I0309 09:08:13.327741 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7727f6b7-e2a9-4554-9224-a4600931b08a-client-ca\") pod \"7727f6b7-e2a9-4554-9224-a4600931b08a\" (UID: \"7727f6b7-e2a9-4554-9224-a4600931b08a\") " Mar 09 09:08:13 crc kubenswrapper[4861]: I0309 09:08:13.328108 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/7727f6b7-e2a9-4554-9224-a4600931b08a-client-ca" (OuterVolumeSpecName: "client-ca") pod "7727f6b7-e2a9-4554-9224-a4600931b08a" (UID: "7727f6b7-e2a9-4554-9224-a4600931b08a"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:08:13 crc kubenswrapper[4861]: I0309 09:08:13.328230 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7727f6b7-e2a9-4554-9224-a4600931b08a-config" (OuterVolumeSpecName: "config") pod "7727f6b7-e2a9-4554-9224-a4600931b08a" (UID: "7727f6b7-e2a9-4554-9224-a4600931b08a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:08:13 crc kubenswrapper[4861]: I0309 09:08:13.328298 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ww2x6\" (UniqueName: \"kubernetes.io/projected/7727f6b7-e2a9-4554-9224-a4600931b08a-kube-api-access-ww2x6\") pod \"7727f6b7-e2a9-4554-9224-a4600931b08a\" (UID: \"7727f6b7-e2a9-4554-9224-a4600931b08a\") " Mar 09 09:08:13 crc kubenswrapper[4861]: I0309 09:08:13.328329 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7727f6b7-e2a9-4554-9224-a4600931b08a-serving-cert\") pod \"7727f6b7-e2a9-4554-9224-a4600931b08a\" (UID: \"7727f6b7-e2a9-4554-9224-a4600931b08a\") " Mar 09 09:08:13 crc kubenswrapper[4861]: I0309 09:08:13.329021 4861 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7727f6b7-e2a9-4554-9224-a4600931b08a-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:13 crc kubenswrapper[4861]: I0309 09:08:13.329039 4861 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7727f6b7-e2a9-4554-9224-a4600931b08a-client-ca\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:13 crc kubenswrapper[4861]: I0309 09:08:13.329049 4861 
reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7727f6b7-e2a9-4554-9224-a4600931b08a-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:13 crc kubenswrapper[4861]: I0309 09:08:13.334230 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7727f6b7-e2a9-4554-9224-a4600931b08a-kube-api-access-ww2x6" (OuterVolumeSpecName: "kube-api-access-ww2x6") pod "7727f6b7-e2a9-4554-9224-a4600931b08a" (UID: "7727f6b7-e2a9-4554-9224-a4600931b08a"). InnerVolumeSpecName "kube-api-access-ww2x6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:08:13 crc kubenswrapper[4861]: I0309 09:08:13.334318 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7727f6b7-e2a9-4554-9224-a4600931b08a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7727f6b7-e2a9-4554-9224-a4600931b08a" (UID: "7727f6b7-e2a9-4554-9224-a4600931b08a"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:08:13 crc kubenswrapper[4861]: I0309 09:08:13.430210 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ww2x6\" (UniqueName: \"kubernetes.io/projected/7727f6b7-e2a9-4554-9224-a4600931b08a-kube-api-access-ww2x6\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:13 crc kubenswrapper[4861]: I0309 09:08:13.430246 4861 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7727f6b7-e2a9-4554-9224-a4600931b08a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:13 crc kubenswrapper[4861]: I0309 09:08:13.473274 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-748457ff4-72cdj" event={"ID":"7727f6b7-e2a9-4554-9224-a4600931b08a","Type":"ContainerDied","Data":"36acfb6bac664e13af285b17638ea22829a9e47517fc2952b3a367bddac703c3"} Mar 09 09:08:13 crc kubenswrapper[4861]: I0309 09:08:13.473340 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-748457ff4-72cdj" Mar 09 09:08:13 crc kubenswrapper[4861]: I0309 09:08:13.508115 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-748457ff4-72cdj"] Mar 09 09:08:13 crc kubenswrapper[4861]: I0309 09:08:13.513810 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-748457ff4-72cdj"] Mar 09 09:08:13 crc kubenswrapper[4861]: I0309 09:08:13.666435 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7727f6b7-e2a9-4554-9224-a4600931b08a" path="/var/lib/kubelet/pods/7727f6b7-e2a9-4554-9224-a4600931b08a/volumes" Mar 09 09:08:13 crc kubenswrapper[4861]: I0309 09:08:13.671939 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 09 09:08:14 crc kubenswrapper[4861]: I0309 09:08:14.278433 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-9788bd4c9-j8pf5"] Mar 09 09:08:14 crc kubenswrapper[4861]: E0309 09:08:14.278855 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7727f6b7-e2a9-4554-9224-a4600931b08a" containerName="controller-manager" Mar 09 09:08:14 crc kubenswrapper[4861]: I0309 09:08:14.278871 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="7727f6b7-e2a9-4554-9224-a4600931b08a" containerName="controller-manager" Mar 09 09:08:14 crc kubenswrapper[4861]: I0309 09:08:14.279082 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="7727f6b7-e2a9-4554-9224-a4600931b08a" containerName="controller-manager" Mar 09 09:08:14 crc kubenswrapper[4861]: I0309 09:08:14.279885 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-9788bd4c9-j8pf5" Mar 09 09:08:14 crc kubenswrapper[4861]: I0309 09:08:14.284440 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 09 09:08:14 crc kubenswrapper[4861]: I0309 09:08:14.284762 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 09 09:08:14 crc kubenswrapper[4861]: I0309 09:08:14.284880 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 09 09:08:14 crc kubenswrapper[4861]: I0309 09:08:14.285771 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 09 09:08:14 crc kubenswrapper[4861]: I0309 09:08:14.285898 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 09 09:08:14 crc kubenswrapper[4861]: I0309 09:08:14.288251 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 09 09:08:14 crc kubenswrapper[4861]: I0309 09:08:14.290901 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-9788bd4c9-j8pf5"] Mar 09 09:08:14 crc kubenswrapper[4861]: I0309 09:08:14.297800 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 09 09:08:14 crc kubenswrapper[4861]: I0309 09:08:14.353360 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b17abf38-2942-4438-842d-91c589bbc03f-serving-cert\") pod \"controller-manager-9788bd4c9-j8pf5\" (UID: \"b17abf38-2942-4438-842d-91c589bbc03f\") " 
pod="openshift-controller-manager/controller-manager-9788bd4c9-j8pf5" Mar 09 09:08:14 crc kubenswrapper[4861]: I0309 09:08:14.353440 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6lb4\" (UniqueName: \"kubernetes.io/projected/b17abf38-2942-4438-842d-91c589bbc03f-kube-api-access-w6lb4\") pod \"controller-manager-9788bd4c9-j8pf5\" (UID: \"b17abf38-2942-4438-842d-91c589bbc03f\") " pod="openshift-controller-manager/controller-manager-9788bd4c9-j8pf5" Mar 09 09:08:14 crc kubenswrapper[4861]: I0309 09:08:14.353490 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b17abf38-2942-4438-842d-91c589bbc03f-proxy-ca-bundles\") pod \"controller-manager-9788bd4c9-j8pf5\" (UID: \"b17abf38-2942-4438-842d-91c589bbc03f\") " pod="openshift-controller-manager/controller-manager-9788bd4c9-j8pf5" Mar 09 09:08:14 crc kubenswrapper[4861]: I0309 09:08:14.353668 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b17abf38-2942-4438-842d-91c589bbc03f-client-ca\") pod \"controller-manager-9788bd4c9-j8pf5\" (UID: \"b17abf38-2942-4438-842d-91c589bbc03f\") " pod="openshift-controller-manager/controller-manager-9788bd4c9-j8pf5" Mar 09 09:08:14 crc kubenswrapper[4861]: I0309 09:08:14.353710 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b17abf38-2942-4438-842d-91c589bbc03f-config\") pod \"controller-manager-9788bd4c9-j8pf5\" (UID: \"b17abf38-2942-4438-842d-91c589bbc03f\") " pod="openshift-controller-manager/controller-manager-9788bd4c9-j8pf5" Mar 09 09:08:14 crc kubenswrapper[4861]: I0309 09:08:14.455551 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/b17abf38-2942-4438-842d-91c589bbc03f-client-ca\") pod \"controller-manager-9788bd4c9-j8pf5\" (UID: \"b17abf38-2942-4438-842d-91c589bbc03f\") " pod="openshift-controller-manager/controller-manager-9788bd4c9-j8pf5" Mar 09 09:08:14 crc kubenswrapper[4861]: I0309 09:08:14.455934 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b17abf38-2942-4438-842d-91c589bbc03f-config\") pod \"controller-manager-9788bd4c9-j8pf5\" (UID: \"b17abf38-2942-4438-842d-91c589bbc03f\") " pod="openshift-controller-manager/controller-manager-9788bd4c9-j8pf5" Mar 09 09:08:14 crc kubenswrapper[4861]: I0309 09:08:14.455973 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b17abf38-2942-4438-842d-91c589bbc03f-serving-cert\") pod \"controller-manager-9788bd4c9-j8pf5\" (UID: \"b17abf38-2942-4438-842d-91c589bbc03f\") " pod="openshift-controller-manager/controller-manager-9788bd4c9-j8pf5" Mar 09 09:08:14 crc kubenswrapper[4861]: I0309 09:08:14.455999 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6lb4\" (UniqueName: \"kubernetes.io/projected/b17abf38-2942-4438-842d-91c589bbc03f-kube-api-access-w6lb4\") pod \"controller-manager-9788bd4c9-j8pf5\" (UID: \"b17abf38-2942-4438-842d-91c589bbc03f\") " pod="openshift-controller-manager/controller-manager-9788bd4c9-j8pf5" Mar 09 09:08:14 crc kubenswrapper[4861]: I0309 09:08:14.456030 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b17abf38-2942-4438-842d-91c589bbc03f-proxy-ca-bundles\") pod \"controller-manager-9788bd4c9-j8pf5\" (UID: \"b17abf38-2942-4438-842d-91c589bbc03f\") " pod="openshift-controller-manager/controller-manager-9788bd4c9-j8pf5" Mar 09 09:08:14 crc kubenswrapper[4861]: I0309 09:08:14.456679 4861 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b17abf38-2942-4438-842d-91c589bbc03f-client-ca\") pod \"controller-manager-9788bd4c9-j8pf5\" (UID: \"b17abf38-2942-4438-842d-91c589bbc03f\") " pod="openshift-controller-manager/controller-manager-9788bd4c9-j8pf5" Mar 09 09:08:14 crc kubenswrapper[4861]: I0309 09:08:14.457043 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b17abf38-2942-4438-842d-91c589bbc03f-proxy-ca-bundles\") pod \"controller-manager-9788bd4c9-j8pf5\" (UID: \"b17abf38-2942-4438-842d-91c589bbc03f\") " pod="openshift-controller-manager/controller-manager-9788bd4c9-j8pf5" Mar 09 09:08:14 crc kubenswrapper[4861]: I0309 09:08:14.457617 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b17abf38-2942-4438-842d-91c589bbc03f-config\") pod \"controller-manager-9788bd4c9-j8pf5\" (UID: \"b17abf38-2942-4438-842d-91c589bbc03f\") " pod="openshift-controller-manager/controller-manager-9788bd4c9-j8pf5" Mar 09 09:08:14 crc kubenswrapper[4861]: I0309 09:08:14.461261 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b17abf38-2942-4438-842d-91c589bbc03f-serving-cert\") pod \"controller-manager-9788bd4c9-j8pf5\" (UID: \"b17abf38-2942-4438-842d-91c589bbc03f\") " pod="openshift-controller-manager/controller-manager-9788bd4c9-j8pf5" Mar 09 09:08:14 crc kubenswrapper[4861]: I0309 09:08:14.481262 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6lb4\" (UniqueName: \"kubernetes.io/projected/b17abf38-2942-4438-842d-91c589bbc03f-kube-api-access-w6lb4\") pod \"controller-manager-9788bd4c9-j8pf5\" (UID: \"b17abf38-2942-4438-842d-91c589bbc03f\") " pod="openshift-controller-manager/controller-manager-9788bd4c9-j8pf5" Mar 09 09:08:14 crc 
kubenswrapper[4861]: I0309 09:08:14.610465 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-9788bd4c9-j8pf5" Mar 09 09:08:15 crc kubenswrapper[4861]: I0309 09:08:15.507052 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-wdsnl_4eed3eac-42f8-4683-9c1f-3733965e6af7/kube-multus-additional-cni-plugins/0.log" Mar 09 09:08:15 crc kubenswrapper[4861]: I0309 09:08:15.507363 4861 generic.go:334] "Generic (PLEG): container finished" podID="4eed3eac-42f8-4683-9c1f-3733965e6af7" containerID="60d212b5a01a67560a782f62fd8dcd1024f51380874431f6df6a561a0cc668d3" exitCode=137 Mar 09 09:08:15 crc kubenswrapper[4861]: I0309 09:08:15.507427 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-wdsnl" event={"ID":"4eed3eac-42f8-4683-9c1f-3733965e6af7","Type":"ContainerDied","Data":"60d212b5a01a67560a782f62fd8dcd1024f51380874431f6df6a561a0cc668d3"} Mar 09 09:08:16 crc kubenswrapper[4861]: E0309 09:08:16.988244 4861 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 09 09:08:16 crc kubenswrapper[4861]: E0309 09:08:16.988766 4861 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kh8vm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-zkngt_openshift-marketplace(6eac3eed-7721-4030-b1e3-9dd28fea2e49): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 09 09:08:16 crc kubenswrapper[4861]: E0309 09:08:16.989951 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-zkngt" podUID="6eac3eed-7721-4030-b1e3-9dd28fea2e49" Mar 09 09:08:17 crc 
kubenswrapper[4861]: I0309 09:08:17.171094 4861 scope.go:117] "RemoveContainer" containerID="02258d50bdab0b7d450ab4919645b266eba5654e6192f751394ccaa5699570f5" Mar 09 09:08:17 crc kubenswrapper[4861]: I0309 09:08:17.245071 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-wdsnl_4eed3eac-42f8-4683-9c1f-3733965e6af7/kube-multus-additional-cni-plugins/0.log" Mar 09 09:08:17 crc kubenswrapper[4861]: I0309 09:08:17.245135 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-wdsnl" Mar 09 09:08:17 crc kubenswrapper[4861]: I0309 09:08:17.256632 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=4.256617996 podStartE2EDuration="4.256617996s" podCreationTimestamp="2026-03-09 09:08:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:08:14.3220092 +0000 UTC m=+137.407048601" watchObservedRunningTime="2026-03-09 09:08:17.256617996 +0000 UTC m=+140.341657397" Mar 09 09:08:17 crc kubenswrapper[4861]: I0309 09:08:17.390408 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/4eed3eac-42f8-4683-9c1f-3733965e6af7-ready\") pod \"4eed3eac-42f8-4683-9c1f-3733965e6af7\" (UID: \"4eed3eac-42f8-4683-9c1f-3733965e6af7\") " Mar 09 09:08:17 crc kubenswrapper[4861]: I0309 09:08:17.390767 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-djxts\" (UniqueName: \"kubernetes.io/projected/4eed3eac-42f8-4683-9c1f-3733965e6af7-kube-api-access-djxts\") pod \"4eed3eac-42f8-4683-9c1f-3733965e6af7\" (UID: \"4eed3eac-42f8-4683-9c1f-3733965e6af7\") " Mar 09 09:08:17 crc kubenswrapper[4861]: I0309 09:08:17.390803 4861 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4eed3eac-42f8-4683-9c1f-3733965e6af7-tuning-conf-dir\") pod \"4eed3eac-42f8-4683-9c1f-3733965e6af7\" (UID: \"4eed3eac-42f8-4683-9c1f-3733965e6af7\") " Mar 09 09:08:17 crc kubenswrapper[4861]: I0309 09:08:17.391032 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4eed3eac-42f8-4683-9c1f-3733965e6af7-tuning-conf-dir" (OuterVolumeSpecName: "tuning-conf-dir") pod "4eed3eac-42f8-4683-9c1f-3733965e6af7" (UID: "4eed3eac-42f8-4683-9c1f-3733965e6af7"). InnerVolumeSpecName "tuning-conf-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 09:08:17 crc kubenswrapper[4861]: I0309 09:08:17.391822 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4eed3eac-42f8-4683-9c1f-3733965e6af7-ready" (OuterVolumeSpecName: "ready") pod "4eed3eac-42f8-4683-9c1f-3733965e6af7" (UID: "4eed3eac-42f8-4683-9c1f-3733965e6af7"). InnerVolumeSpecName "ready". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:08:17 crc kubenswrapper[4861]: I0309 09:08:17.392210 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4eed3eac-42f8-4683-9c1f-3733965e6af7-cni-sysctl-allowlist\") pod \"4eed3eac-42f8-4683-9c1f-3733965e6af7\" (UID: \"4eed3eac-42f8-4683-9c1f-3733965e6af7\") " Mar 09 09:08:17 crc kubenswrapper[4861]: I0309 09:08:17.392514 4861 reconciler_common.go:293] "Volume detached for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/4eed3eac-42f8-4683-9c1f-3733965e6af7-ready\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:17 crc kubenswrapper[4861]: I0309 09:08:17.392533 4861 reconciler_common.go:293] "Volume detached for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4eed3eac-42f8-4683-9c1f-3733965e6af7-tuning-conf-dir\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:17 crc kubenswrapper[4861]: I0309 09:08:17.393508 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4eed3eac-42f8-4683-9c1f-3733965e6af7-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "4eed3eac-42f8-4683-9c1f-3733965e6af7" (UID: "4eed3eac-42f8-4683-9c1f-3733965e6af7"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:08:17 crc kubenswrapper[4861]: I0309 09:08:17.401457 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4eed3eac-42f8-4683-9c1f-3733965e6af7-kube-api-access-djxts" (OuterVolumeSpecName: "kube-api-access-djxts") pod "4eed3eac-42f8-4683-9c1f-3733965e6af7" (UID: "4eed3eac-42f8-4683-9c1f-3733965e6af7"). InnerVolumeSpecName "kube-api-access-djxts". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:08:17 crc kubenswrapper[4861]: I0309 09:08:17.493359 4861 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4eed3eac-42f8-4683-9c1f-3733965e6af7-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:17 crc kubenswrapper[4861]: I0309 09:08:17.493402 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-djxts\" (UniqueName: \"kubernetes.io/projected/4eed3eac-42f8-4683-9c1f-3733965e6af7-kube-api-access-djxts\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:17 crc kubenswrapper[4861]: I0309 09:08:17.530177 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-wdsnl_4eed3eac-42f8-4683-9c1f-3733965e6af7/kube-multus-additional-cni-plugins/0.log" Mar 09 09:08:17 crc kubenswrapper[4861]: I0309 09:08:17.530276 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-wdsnl" event={"ID":"4eed3eac-42f8-4683-9c1f-3733965e6af7","Type":"ContainerDied","Data":"bb7717cb2e1a96f62fe1e45a3412bd03575544eb76b70a878b5f4243d0307f93"} Mar 09 09:08:17 crc kubenswrapper[4861]: I0309 09:08:17.530317 4861 scope.go:117] "RemoveContainer" containerID="60d212b5a01a67560a782f62fd8dcd1024f51380874431f6df6a561a0cc668d3" Mar 09 09:08:17 crc kubenswrapper[4861]: I0309 09:08:17.530394 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-wdsnl" Mar 09 09:08:17 crc kubenswrapper[4861]: I0309 09:08:17.533022 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fnk25" event={"ID":"0eaf35dc-b198-4fbb-9e43-ddac97f1f62b","Type":"ContainerStarted","Data":"116063dd19dc24cf5aa601ea2099ce41cc034ee16f5034f138594a61ea756abe"} Mar 09 09:08:17 crc kubenswrapper[4861]: I0309 09:08:17.539881 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mlqmj" event={"ID":"e8af01b5-95f0-43a2-b228-675b98c6203f","Type":"ContainerStarted","Data":"f414447491b9eefbf4159f870988b48e34e9a4679d32bb9fb74f8d47b48ebbd8"} Mar 09 09:08:17 crc kubenswrapper[4861]: I0309 09:08:17.549161 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wcp7q" event={"ID":"0398ae40-9658-4a9a-949c-4419bb1ca9bf","Type":"ContainerStarted","Data":"8006f8540682bd81fefde850328666dfbfe111f5c351cd75b5c9586fd473951e"} Mar 09 09:08:17 crc kubenswrapper[4861]: E0309 09:08:17.582106 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-zkngt" podUID="6eac3eed-7721-4030-b1e3-9dd28fea2e49" Mar 09 09:08:17 crc kubenswrapper[4861]: I0309 09:08:17.649117 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-wdsnl"] Mar 09 09:08:17 crc kubenswrapper[4861]: I0309 09:08:17.655919 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-wdsnl"] Mar 09 09:08:17 crc kubenswrapper[4861]: I0309 09:08:17.667557 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4eed3eac-42f8-4683-9c1f-3733965e6af7" 
path="/var/lib/kubelet/pods/4eed3eac-42f8-4683-9c1f-3733965e6af7/volumes" Mar 09 09:08:17 crc kubenswrapper[4861]: I0309 09:08:17.735432 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-684f6f5f69-tqhd5"] Mar 09 09:08:17 crc kubenswrapper[4861]: W0309 09:08:17.761271 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod29d571aa_5de1_4779_acd7_b4f39c96e386.slice/crio-d354c7ff4b6c2a86e6a7e91c895036c1923aa74c505feb3a218ec3d9212db3f8 WatchSource:0}: Error finding container d354c7ff4b6c2a86e6a7e91c895036c1923aa74c505feb3a218ec3d9212db3f8: Status 404 returned error can't find the container with id d354c7ff4b6c2a86e6a7e91c895036c1923aa74c505feb3a218ec3d9212db3f8 Mar 09 09:08:17 crc kubenswrapper[4861]: I0309 09:08:17.812522 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-9788bd4c9-j8pf5"] Mar 09 09:08:17 crc kubenswrapper[4861]: I0309 09:08:17.831279 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zc95n" Mar 09 09:08:18 crc kubenswrapper[4861]: I0309 09:08:18.572691 4861 generic.go:334] "Generic (PLEG): container finished" podID="0398ae40-9658-4a9a-949c-4419bb1ca9bf" containerID="8006f8540682bd81fefde850328666dfbfe111f5c351cd75b5c9586fd473951e" exitCode=0 Mar 09 09:08:18 crc kubenswrapper[4861]: I0309 09:08:18.572776 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wcp7q" event={"ID":"0398ae40-9658-4a9a-949c-4419bb1ca9bf","Type":"ContainerDied","Data":"8006f8540682bd81fefde850328666dfbfe111f5c351cd75b5c9586fd473951e"} Mar 09 09:08:18 crc kubenswrapper[4861]: I0309 09:08:18.575579 4861 generic.go:334] "Generic (PLEG): container finished" podID="56bad3ba-c8af-4b69-96ae-93311a9d6151" 
containerID="6064ed91f9d379a74b24659a7000396365a464df9e2c7b043f72df218bede7b7" exitCode=0 Mar 09 09:08:18 crc kubenswrapper[4861]: I0309 09:08:18.575610 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-48m9h" event={"ID":"56bad3ba-c8af-4b69-96ae-93311a9d6151","Type":"ContainerDied","Data":"6064ed91f9d379a74b24659a7000396365a464df9e2c7b043f72df218bede7b7"} Mar 09 09:08:18 crc kubenswrapper[4861]: I0309 09:08:18.582661 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j6fcf" event={"ID":"33158a78-8c1c-4aa1-9c51-66e21d0e8ae6","Type":"ContainerStarted","Data":"d054bf6ca4163e20e2b1e243eb355bf50d30f6de8f2c9cdf71ca57b9d18f600c"} Mar 09 09:08:18 crc kubenswrapper[4861]: I0309 09:08:18.589666 4861 generic.go:334] "Generic (PLEG): container finished" podID="0eaf35dc-b198-4fbb-9e43-ddac97f1f62b" containerID="116063dd19dc24cf5aa601ea2099ce41cc034ee16f5034f138594a61ea756abe" exitCode=0 Mar 09 09:08:18 crc kubenswrapper[4861]: I0309 09:08:18.589977 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fnk25" event={"ID":"0eaf35dc-b198-4fbb-9e43-ddac97f1f62b","Type":"ContainerDied","Data":"116063dd19dc24cf5aa601ea2099ce41cc034ee16f5034f138594a61ea756abe"} Mar 09 09:08:18 crc kubenswrapper[4861]: I0309 09:08:18.596124 4861 generic.go:334] "Generic (PLEG): container finished" podID="e8af01b5-95f0-43a2-b228-675b98c6203f" containerID="f414447491b9eefbf4159f870988b48e34e9a4679d32bb9fb74f8d47b48ebbd8" exitCode=0 Mar 09 09:08:18 crc kubenswrapper[4861]: I0309 09:08:18.596615 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mlqmj" event={"ID":"e8af01b5-95f0-43a2-b228-675b98c6203f","Type":"ContainerDied","Data":"f414447491b9eefbf4159f870988b48e34e9a4679d32bb9fb74f8d47b48ebbd8"} Mar 09 09:08:18 crc kubenswrapper[4861]: I0309 09:08:18.599246 4861 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-route-controller-manager/route-controller-manager-684f6f5f69-tqhd5" event={"ID":"29d571aa-5de1-4779-acd7-b4f39c96e386","Type":"ContainerStarted","Data":"a75304d454d2e91b361a678778adc87f9497d5f5c334ba188f38d9d05d5edc6d"} Mar 09 09:08:18 crc kubenswrapper[4861]: I0309 09:08:18.599294 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-684f6f5f69-tqhd5" event={"ID":"29d571aa-5de1-4779-acd7-b4f39c96e386","Type":"ContainerStarted","Data":"d354c7ff4b6c2a86e6a7e91c895036c1923aa74c505feb3a218ec3d9212db3f8"} Mar 09 09:08:18 crc kubenswrapper[4861]: I0309 09:08:18.606166 4861 generic.go:334] "Generic (PLEG): container finished" podID="245d74cf-545f-43d3-ad40-5260aef18260" containerID="ab8a12c5254f416fb5a2a4a147b4cc56c5c02f1759f3c4b4d624ea331c9cf149" exitCode=0 Mar 09 09:08:18 crc kubenswrapper[4861]: I0309 09:08:18.606233 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gcb6x" event={"ID":"245d74cf-545f-43d3-ad40-5260aef18260","Type":"ContainerDied","Data":"ab8a12c5254f416fb5a2a4a147b4cc56c5c02f1759f3c4b4d624ea331c9cf149"} Mar 09 09:08:18 crc kubenswrapper[4861]: I0309 09:08:18.609210 4861 generic.go:334] "Generic (PLEG): container finished" podID="3ad8332f-9ca2-4dd0-903f-2bf5723aa51e" containerID="0a3af934f28ecbd012dfa212f71d8fbd8229c69c129928c202abb78cdeb3bfef" exitCode=0 Mar 09 09:08:18 crc kubenswrapper[4861]: I0309 09:08:18.609277 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cjwtv" event={"ID":"3ad8332f-9ca2-4dd0-903f-2bf5723aa51e","Type":"ContainerDied","Data":"0a3af934f28ecbd012dfa212f71d8fbd8229c69c129928c202abb78cdeb3bfef"} Mar 09 09:08:18 crc kubenswrapper[4861]: I0309 09:08:18.611708 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-9788bd4c9-j8pf5" 
event={"ID":"b17abf38-2942-4438-842d-91c589bbc03f","Type":"ContainerStarted","Data":"6a11c52601c89082eca65eec926e1bc57c11c527fd469ea399d0c85aad3b35fc"} Mar 09 09:08:18 crc kubenswrapper[4861]: I0309 09:08:18.611760 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-9788bd4c9-j8pf5" event={"ID":"b17abf38-2942-4438-842d-91c589bbc03f","Type":"ContainerStarted","Data":"b61fb338b5603a0577de366660bc1a2f2d37c9def056bf4d92b12b1bafa7d95d"} Mar 09 09:08:19 crc kubenswrapper[4861]: I0309 09:08:19.632796 4861 generic.go:334] "Generic (PLEG): container finished" podID="33158a78-8c1c-4aa1-9c51-66e21d0e8ae6" containerID="d054bf6ca4163e20e2b1e243eb355bf50d30f6de8f2c9cdf71ca57b9d18f600c" exitCode=0 Mar 09 09:08:19 crc kubenswrapper[4861]: I0309 09:08:19.633463 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j6fcf" event={"ID":"33158a78-8c1c-4aa1-9c51-66e21d0e8ae6","Type":"ContainerDied","Data":"d054bf6ca4163e20e2b1e243eb355bf50d30f6de8f2c9cdf71ca57b9d18f600c"} Mar 09 09:08:19 crc kubenswrapper[4861]: I0309 09:08:19.634204 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-9788bd4c9-j8pf5" Mar 09 09:08:19 crc kubenswrapper[4861]: I0309 09:08:19.634440 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-684f6f5f69-tqhd5" Mar 09 09:08:19 crc kubenswrapper[4861]: I0309 09:08:19.640683 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-684f6f5f69-tqhd5" Mar 09 09:08:19 crc kubenswrapper[4861]: I0309 09:08:19.640731 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-9788bd4c9-j8pf5" Mar 09 09:08:19 crc kubenswrapper[4861]: I0309 09:08:19.650727 4861 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-684f6f5f69-tqhd5" podStartSLOduration=17.650708579 podStartE2EDuration="17.650708579s" podCreationTimestamp="2026-03-09 09:08:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:08:19.6503947 +0000 UTC m=+142.735434111" watchObservedRunningTime="2026-03-09 09:08:19.650708579 +0000 UTC m=+142.735747980" Mar 09 09:08:19 crc kubenswrapper[4861]: I0309 09:08:19.714662 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-9788bd4c9-j8pf5" podStartSLOduration=17.714631373 podStartE2EDuration="17.714631373s" podCreationTimestamp="2026-03-09 09:08:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:08:19.705945744 +0000 UTC m=+142.790985165" watchObservedRunningTime="2026-03-09 09:08:19.714631373 +0000 UTC m=+142.799670784" Mar 09 09:08:21 crc kubenswrapper[4861]: I0309 09:08:21.202568 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 09 09:08:21 crc kubenswrapper[4861]: E0309 09:08:21.202986 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4eed3eac-42f8-4683-9c1f-3733965e6af7" containerName="kube-multus-additional-cni-plugins" Mar 09 09:08:21 crc kubenswrapper[4861]: I0309 09:08:21.202997 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="4eed3eac-42f8-4683-9c1f-3733965e6af7" containerName="kube-multus-additional-cni-plugins" Mar 09 09:08:21 crc kubenswrapper[4861]: I0309 09:08:21.203095 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="4eed3eac-42f8-4683-9c1f-3733965e6af7" containerName="kube-multus-additional-cni-plugins" Mar 09 09:08:21 crc kubenswrapper[4861]: 
I0309 09:08:21.203460 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 09 09:08:21 crc kubenswrapper[4861]: I0309 09:08:21.204975 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 09 09:08:21 crc kubenswrapper[4861]: I0309 09:08:21.205392 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 09 09:08:21 crc kubenswrapper[4861]: I0309 09:08:21.215736 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 09 09:08:21 crc kubenswrapper[4861]: I0309 09:08:21.349833 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bc724060-a3ff-4a81-b417-a860a0a07045-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"bc724060-a3ff-4a81-b417-a860a0a07045\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 09 09:08:21 crc kubenswrapper[4861]: I0309 09:08:21.349960 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bc724060-a3ff-4a81-b417-a860a0a07045-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"bc724060-a3ff-4a81-b417-a860a0a07045\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 09 09:08:21 crc kubenswrapper[4861]: I0309 09:08:21.450778 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bc724060-a3ff-4a81-b417-a860a0a07045-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"bc724060-a3ff-4a81-b417-a860a0a07045\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 09 09:08:21 crc kubenswrapper[4861]: I0309 09:08:21.450890 4861 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bc724060-a3ff-4a81-b417-a860a0a07045-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"bc724060-a3ff-4a81-b417-a860a0a07045\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 09 09:08:21 crc kubenswrapper[4861]: I0309 09:08:21.451066 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bc724060-a3ff-4a81-b417-a860a0a07045-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"bc724060-a3ff-4a81-b417-a860a0a07045\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 09 09:08:21 crc kubenswrapper[4861]: I0309 09:08:21.475753 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bc724060-a3ff-4a81-b417-a860a0a07045-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"bc724060-a3ff-4a81-b417-a860a0a07045\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 09 09:08:21 crc kubenswrapper[4861]: I0309 09:08:21.524205 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 09 09:08:22 crc kubenswrapper[4861]: I0309 09:08:22.579745 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-9788bd4c9-j8pf5"] Mar 09 09:08:22 crc kubenswrapper[4861]: I0309 09:08:22.657404 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-9788bd4c9-j8pf5" podUID="b17abf38-2942-4438-842d-91c589bbc03f" containerName="controller-manager" containerID="cri-o://6a11c52601c89082eca65eec926e1bc57c11c527fd469ea399d0c85aad3b35fc" gracePeriod=30 Mar 09 09:08:22 crc kubenswrapper[4861]: I0309 09:08:22.685592 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-684f6f5f69-tqhd5"] Mar 09 09:08:22 crc kubenswrapper[4861]: I0309 09:08:22.685784 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-684f6f5f69-tqhd5" podUID="29d571aa-5de1-4779-acd7-b4f39c96e386" containerName="route-controller-manager" containerID="cri-o://a75304d454d2e91b361a678778adc87f9497d5f5c334ba188f38d9d05d5edc6d" gracePeriod=30 Mar 09 09:08:23 crc kubenswrapper[4861]: I0309 09:08:23.667938 4861 generic.go:334] "Generic (PLEG): container finished" podID="b17abf38-2942-4438-842d-91c589bbc03f" containerID="6a11c52601c89082eca65eec926e1bc57c11c527fd469ea399d0c85aad3b35fc" exitCode=0 Mar 09 09:08:23 crc kubenswrapper[4861]: I0309 09:08:23.668032 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-9788bd4c9-j8pf5" event={"ID":"b17abf38-2942-4438-842d-91c589bbc03f","Type":"ContainerDied","Data":"6a11c52601c89082eca65eec926e1bc57c11c527fd469ea399d0c85aad3b35fc"} Mar 09 09:08:23 crc kubenswrapper[4861]: I0309 09:08:23.670285 4861 generic.go:334] "Generic (PLEG): container finished" 
podID="29d571aa-5de1-4779-acd7-b4f39c96e386" containerID="a75304d454d2e91b361a678778adc87f9497d5f5c334ba188f38d9d05d5edc6d" exitCode=0 Mar 09 09:08:23 crc kubenswrapper[4861]: I0309 09:08:23.670314 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-684f6f5f69-tqhd5" event={"ID":"29d571aa-5de1-4779-acd7-b4f39c96e386","Type":"ContainerDied","Data":"a75304d454d2e91b361a678778adc87f9497d5f5c334ba188f38d9d05d5edc6d"} Mar 09 09:08:25 crc kubenswrapper[4861]: I0309 09:08:25.341602 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-684f6f5f69-tqhd5" Mar 09 09:08:25 crc kubenswrapper[4861]: I0309 09:08:25.350829 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-9788bd4c9-j8pf5" Mar 09 09:08:25 crc kubenswrapper[4861]: I0309 09:08:25.383718 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-59b5dbf48f-tbq65"] Mar 09 09:08:25 crc kubenswrapper[4861]: E0309 09:08:25.384045 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29d571aa-5de1-4779-acd7-b4f39c96e386" containerName="route-controller-manager" Mar 09 09:08:25 crc kubenswrapper[4861]: I0309 09:08:25.384061 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="29d571aa-5de1-4779-acd7-b4f39c96e386" containerName="route-controller-manager" Mar 09 09:08:25 crc kubenswrapper[4861]: E0309 09:08:25.384084 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b17abf38-2942-4438-842d-91c589bbc03f" containerName="controller-manager" Mar 09 09:08:25 crc kubenswrapper[4861]: I0309 09:08:25.384091 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="b17abf38-2942-4438-842d-91c589bbc03f" containerName="controller-manager" Mar 09 09:08:25 crc kubenswrapper[4861]: I0309 
09:08:25.384206 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="b17abf38-2942-4438-842d-91c589bbc03f" containerName="controller-manager" Mar 09 09:08:25 crc kubenswrapper[4861]: I0309 09:08:25.384225 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="29d571aa-5de1-4779-acd7-b4f39c96e386" containerName="route-controller-manager" Mar 09 09:08:25 crc kubenswrapper[4861]: I0309 09:08:25.384671 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-59b5dbf48f-tbq65" Mar 09 09:08:25 crc kubenswrapper[4861]: I0309 09:08:25.389691 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-59b5dbf48f-tbq65"] Mar 09 09:08:25 crc kubenswrapper[4861]: I0309 09:08:25.401970 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b17abf38-2942-4438-842d-91c589bbc03f-client-ca\") pod \"b17abf38-2942-4438-842d-91c589bbc03f\" (UID: \"b17abf38-2942-4438-842d-91c589bbc03f\") " Mar 09 09:08:25 crc kubenswrapper[4861]: I0309 09:08:25.402009 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b17abf38-2942-4438-842d-91c589bbc03f-proxy-ca-bundles\") pod \"b17abf38-2942-4438-842d-91c589bbc03f\" (UID: \"b17abf38-2942-4438-842d-91c589bbc03f\") " Mar 09 09:08:25 crc kubenswrapper[4861]: I0309 09:08:25.403891 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b17abf38-2942-4438-842d-91c589bbc03f-serving-cert\") pod \"b17abf38-2942-4438-842d-91c589bbc03f\" (UID: \"b17abf38-2942-4438-842d-91c589bbc03f\") " Mar 09 09:08:25 crc kubenswrapper[4861]: I0309 09:08:25.403957 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/29d571aa-5de1-4779-acd7-b4f39c96e386-serving-cert\") pod \"29d571aa-5de1-4779-acd7-b4f39c96e386\" (UID: \"29d571aa-5de1-4779-acd7-b4f39c96e386\") " Mar 09 09:08:25 crc kubenswrapper[4861]: I0309 09:08:25.403988 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w6lb4\" (UniqueName: \"kubernetes.io/projected/b17abf38-2942-4438-842d-91c589bbc03f-kube-api-access-w6lb4\") pod \"b17abf38-2942-4438-842d-91c589bbc03f\" (UID: \"b17abf38-2942-4438-842d-91c589bbc03f\") " Mar 09 09:08:25 crc kubenswrapper[4861]: I0309 09:08:25.404021 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/29d571aa-5de1-4779-acd7-b4f39c96e386-client-ca\") pod \"29d571aa-5de1-4779-acd7-b4f39c96e386\" (UID: \"29d571aa-5de1-4779-acd7-b4f39c96e386\") " Mar 09 09:08:25 crc kubenswrapper[4861]: I0309 09:08:25.404050 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29d571aa-5de1-4779-acd7-b4f39c96e386-config\") pod \"29d571aa-5de1-4779-acd7-b4f39c96e386\" (UID: \"29d571aa-5de1-4779-acd7-b4f39c96e386\") " Mar 09 09:08:25 crc kubenswrapper[4861]: I0309 09:08:25.404073 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b17abf38-2942-4438-842d-91c589bbc03f-config\") pod \"b17abf38-2942-4438-842d-91c589bbc03f\" (UID: \"b17abf38-2942-4438-842d-91c589bbc03f\") " Mar 09 09:08:25 crc kubenswrapper[4861]: I0309 09:08:25.404085 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b17abf38-2942-4438-842d-91c589bbc03f-client-ca" (OuterVolumeSpecName: "client-ca") pod "b17abf38-2942-4438-842d-91c589bbc03f" (UID: "b17abf38-2942-4438-842d-91c589bbc03f"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:08:25 crc kubenswrapper[4861]: I0309 09:08:25.404103 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fc67w\" (UniqueName: \"kubernetes.io/projected/29d571aa-5de1-4779-acd7-b4f39c96e386-kube-api-access-fc67w\") pod \"29d571aa-5de1-4779-acd7-b4f39c96e386\" (UID: \"29d571aa-5de1-4779-acd7-b4f39c96e386\") "
Mar 09 09:08:25 crc kubenswrapper[4861]: I0309 09:08:25.404869 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/89f0faed-9686-4f8f-af5d-ef542aa0f161-client-ca\") pod \"route-controller-manager-59b5dbf48f-tbq65\" (UID: \"89f0faed-9686-4f8f-af5d-ef542aa0f161\") " pod="openshift-route-controller-manager/route-controller-manager-59b5dbf48f-tbq65"
Mar 09 09:08:25 crc kubenswrapper[4861]: I0309 09:08:25.404907 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89f0faed-9686-4f8f-af5d-ef542aa0f161-config\") pod \"route-controller-manager-59b5dbf48f-tbq65\" (UID: \"89f0faed-9686-4f8f-af5d-ef542aa0f161\") " pod="openshift-route-controller-manager/route-controller-manager-59b5dbf48f-tbq65"
Mar 09 09:08:25 crc kubenswrapper[4861]: I0309 09:08:25.404991 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bvv2\" (UniqueName: \"kubernetes.io/projected/89f0faed-9686-4f8f-af5d-ef542aa0f161-kube-api-access-9bvv2\") pod \"route-controller-manager-59b5dbf48f-tbq65\" (UID: \"89f0faed-9686-4f8f-af5d-ef542aa0f161\") " pod="openshift-route-controller-manager/route-controller-manager-59b5dbf48f-tbq65"
Mar 09 09:08:25 crc kubenswrapper[4861]: I0309 09:08:25.405047 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/89f0faed-9686-4f8f-af5d-ef542aa0f161-serving-cert\") pod \"route-controller-manager-59b5dbf48f-tbq65\" (UID: \"89f0faed-9686-4f8f-af5d-ef542aa0f161\") " pod="openshift-route-controller-manager/route-controller-manager-59b5dbf48f-tbq65"
Mar 09 09:08:25 crc kubenswrapper[4861]: I0309 09:08:25.405137 4861 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b17abf38-2942-4438-842d-91c589bbc03f-client-ca\") on node \"crc\" DevicePath \"\""
Mar 09 09:08:25 crc kubenswrapper[4861]: I0309 09:08:25.405573 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29d571aa-5de1-4779-acd7-b4f39c96e386-client-ca" (OuterVolumeSpecName: "client-ca") pod "29d571aa-5de1-4779-acd7-b4f39c96e386" (UID: "29d571aa-5de1-4779-acd7-b4f39c96e386"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:08:25 crc kubenswrapper[4861]: I0309 09:08:25.405850 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b17abf38-2942-4438-842d-91c589bbc03f-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "b17abf38-2942-4438-842d-91c589bbc03f" (UID: "b17abf38-2942-4438-842d-91c589bbc03f"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:08:25 crc kubenswrapper[4861]: I0309 09:08:25.406875 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29d571aa-5de1-4779-acd7-b4f39c96e386-config" (OuterVolumeSpecName: "config") pod "29d571aa-5de1-4779-acd7-b4f39c96e386" (UID: "29d571aa-5de1-4779-acd7-b4f39c96e386"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:08:25 crc kubenswrapper[4861]: I0309 09:08:25.412739 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b17abf38-2942-4438-842d-91c589bbc03f-config" (OuterVolumeSpecName: "config") pod "b17abf38-2942-4438-842d-91c589bbc03f" (UID: "b17abf38-2942-4438-842d-91c589bbc03f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:08:25 crc kubenswrapper[4861]: I0309 09:08:25.424619 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29d571aa-5de1-4779-acd7-b4f39c96e386-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "29d571aa-5de1-4779-acd7-b4f39c96e386" (UID: "29d571aa-5de1-4779-acd7-b4f39c96e386"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:08:25 crc kubenswrapper[4861]: I0309 09:08:25.425073 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b17abf38-2942-4438-842d-91c589bbc03f-kube-api-access-w6lb4" (OuterVolumeSpecName: "kube-api-access-w6lb4") pod "b17abf38-2942-4438-842d-91c589bbc03f" (UID: "b17abf38-2942-4438-842d-91c589bbc03f"). InnerVolumeSpecName "kube-api-access-w6lb4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:08:25 crc kubenswrapper[4861]: I0309 09:08:25.426618 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b17abf38-2942-4438-842d-91c589bbc03f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "b17abf38-2942-4438-842d-91c589bbc03f" (UID: "b17abf38-2942-4438-842d-91c589bbc03f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:08:25 crc kubenswrapper[4861]: I0309 09:08:25.433871 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29d571aa-5de1-4779-acd7-b4f39c96e386-kube-api-access-fc67w" (OuterVolumeSpecName: "kube-api-access-fc67w") pod "29d571aa-5de1-4779-acd7-b4f39c96e386" (UID: "29d571aa-5de1-4779-acd7-b4f39c96e386"). InnerVolumeSpecName "kube-api-access-fc67w". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:08:25 crc kubenswrapper[4861]: I0309 09:08:25.505739 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bvv2\" (UniqueName: \"kubernetes.io/projected/89f0faed-9686-4f8f-af5d-ef542aa0f161-kube-api-access-9bvv2\") pod \"route-controller-manager-59b5dbf48f-tbq65\" (UID: \"89f0faed-9686-4f8f-af5d-ef542aa0f161\") " pod="openshift-route-controller-manager/route-controller-manager-59b5dbf48f-tbq65"
Mar 09 09:08:25 crc kubenswrapper[4861]: I0309 09:08:25.505810 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/89f0faed-9686-4f8f-af5d-ef542aa0f161-serving-cert\") pod \"route-controller-manager-59b5dbf48f-tbq65\" (UID: \"89f0faed-9686-4f8f-af5d-ef542aa0f161\") " pod="openshift-route-controller-manager/route-controller-manager-59b5dbf48f-tbq65"
Mar 09 09:08:25 crc kubenswrapper[4861]: I0309 09:08:25.505884 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/89f0faed-9686-4f8f-af5d-ef542aa0f161-client-ca\") pod \"route-controller-manager-59b5dbf48f-tbq65\" (UID: \"89f0faed-9686-4f8f-af5d-ef542aa0f161\") " pod="openshift-route-controller-manager/route-controller-manager-59b5dbf48f-tbq65"
Mar 09 09:08:25 crc kubenswrapper[4861]: I0309 09:08:25.505913 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89f0faed-9686-4f8f-af5d-ef542aa0f161-config\") pod \"route-controller-manager-59b5dbf48f-tbq65\" (UID: \"89f0faed-9686-4f8f-af5d-ef542aa0f161\") " pod="openshift-route-controller-manager/route-controller-manager-59b5dbf48f-tbq65"
Mar 09 09:08:25 crc kubenswrapper[4861]: I0309 09:08:25.505971 4861 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b17abf38-2942-4438-842d-91c589bbc03f-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 09 09:08:25 crc kubenswrapper[4861]: I0309 09:08:25.505990 4861 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b17abf38-2942-4438-842d-91c589bbc03f-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 09 09:08:25 crc kubenswrapper[4861]: I0309 09:08:25.506003 4861 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/29d571aa-5de1-4779-acd7-b4f39c96e386-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 09 09:08:25 crc kubenswrapper[4861]: I0309 09:08:25.506016 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w6lb4\" (UniqueName: \"kubernetes.io/projected/b17abf38-2942-4438-842d-91c589bbc03f-kube-api-access-w6lb4\") on node \"crc\" DevicePath \"\""
Mar 09 09:08:25 crc kubenswrapper[4861]: I0309 09:08:25.506031 4861 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/29d571aa-5de1-4779-acd7-b4f39c96e386-client-ca\") on node \"crc\" DevicePath \"\""
Mar 09 09:08:25 crc kubenswrapper[4861]: I0309 09:08:25.506044 4861 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29d571aa-5de1-4779-acd7-b4f39c96e386-config\") on node \"crc\" DevicePath \"\""
Mar 09 09:08:25 crc kubenswrapper[4861]: I0309 09:08:25.506058 4861 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b17abf38-2942-4438-842d-91c589bbc03f-config\") on node \"crc\" DevicePath \"\""
Mar 09 09:08:25 crc kubenswrapper[4861]: I0309 09:08:25.506070 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fc67w\" (UniqueName: \"kubernetes.io/projected/29d571aa-5de1-4779-acd7-b4f39c96e386-kube-api-access-fc67w\") on node \"crc\" DevicePath \"\""
Mar 09 09:08:25 crc kubenswrapper[4861]: I0309 09:08:25.507104 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/89f0faed-9686-4f8f-af5d-ef542aa0f161-client-ca\") pod \"route-controller-manager-59b5dbf48f-tbq65\" (UID: \"89f0faed-9686-4f8f-af5d-ef542aa0f161\") " pod="openshift-route-controller-manager/route-controller-manager-59b5dbf48f-tbq65"
Mar 09 09:08:25 crc kubenswrapper[4861]: I0309 09:08:25.507332 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89f0faed-9686-4f8f-af5d-ef542aa0f161-config\") pod \"route-controller-manager-59b5dbf48f-tbq65\" (UID: \"89f0faed-9686-4f8f-af5d-ef542aa0f161\") " pod="openshift-route-controller-manager/route-controller-manager-59b5dbf48f-tbq65"
Mar 09 09:08:25 crc kubenswrapper[4861]: I0309 09:08:25.512168 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/89f0faed-9686-4f8f-af5d-ef542aa0f161-serving-cert\") pod \"route-controller-manager-59b5dbf48f-tbq65\" (UID: \"89f0faed-9686-4f8f-af5d-ef542aa0f161\") " pod="openshift-route-controller-manager/route-controller-manager-59b5dbf48f-tbq65"
Mar 09 09:08:25 crc kubenswrapper[4861]: I0309 09:08:25.522011 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bvv2\" (UniqueName: \"kubernetes.io/projected/89f0faed-9686-4f8f-af5d-ef542aa0f161-kube-api-access-9bvv2\") pod \"route-controller-manager-59b5dbf48f-tbq65\" (UID: \"89f0faed-9686-4f8f-af5d-ef542aa0f161\") " pod="openshift-route-controller-manager/route-controller-manager-59b5dbf48f-tbq65"
Mar 09 09:08:25 crc kubenswrapper[4861]: I0309 09:08:25.611991 4861 patch_prober.go:28] interesting pod/controller-manager-9788bd4c9-j8pf5 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.58:8443/healthz\": dial tcp 10.217.0.58:8443: i/o timeout" start-of-body=
Mar 09 09:08:25 crc kubenswrapper[4861]: I0309 09:08:25.612041 4861 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-9788bd4c9-j8pf5" podUID="b17abf38-2942-4438-842d-91c589bbc03f" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.58:8443/healthz\": dial tcp 10.217.0.58:8443: i/o timeout"
Mar 09 09:08:25 crc kubenswrapper[4861]: I0309 09:08:25.696611 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Mar 09 09:08:25 crc kubenswrapper[4861]: W0309 09:08:25.711378 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podbc724060_a3ff_4a81_b417_a860a0a07045.slice/crio-531333315ec28b571e635141e46a35a56ba0f1eb405326f78b6a683d62f1b0f2 WatchSource:0}: Error finding container 531333315ec28b571e635141e46a35a56ba0f1eb405326f78b6a683d62f1b0f2: Status 404 returned error can't find the container with id 531333315ec28b571e635141e46a35a56ba0f1eb405326f78b6a683d62f1b0f2
Mar 09 09:08:25 crc kubenswrapper[4861]: I0309 09:08:25.712298 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-59b5dbf48f-tbq65"
Mar 09 09:08:25 crc kubenswrapper[4861]: I0309 09:08:25.715972 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fnk25" event={"ID":"0eaf35dc-b198-4fbb-9e43-ddac97f1f62b","Type":"ContainerStarted","Data":"525d4211dcbfe48d20481224873f6a6e2f939f304d71679efe94cf1c25471299"}
Mar 09 09:08:25 crc kubenswrapper[4861]: I0309 09:08:25.767998 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-wcp7q" podStartSLOduration=1.9488107220000002 podStartE2EDuration="40.767973603s" podCreationTimestamp="2026-03-09 09:07:45 +0000 UTC" firstStartedPulling="2026-03-09 09:07:46.675444323 +0000 UTC m=+109.760483724" lastFinishedPulling="2026-03-09 09:08:25.494607204 +0000 UTC m=+148.579646605" observedRunningTime="2026-03-09 09:08:25.757063211 +0000 UTC m=+148.842102672" watchObservedRunningTime="2026-03-09 09:08:25.767973603 +0000 UTC m=+148.853013004"
Mar 09 09:08:25 crc kubenswrapper[4861]: I0309 09:08:25.769428 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-9788bd4c9-j8pf5"
Mar 09 09:08:25 crc kubenswrapper[4861]: I0309 09:08:25.770085 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-9788bd4c9-j8pf5" event={"ID":"b17abf38-2942-4438-842d-91c589bbc03f","Type":"ContainerDied","Data":"b61fb338b5603a0577de366660bc1a2f2d37c9def056bf4d92b12b1bafa7d95d"}
Mar 09 09:08:25 crc kubenswrapper[4861]: I0309 09:08:25.770123 4861 scope.go:117] "RemoveContainer" containerID="6a11c52601c89082eca65eec926e1bc57c11c527fd469ea399d0c85aad3b35fc"
Mar 09 09:08:25 crc kubenswrapper[4861]: I0309 09:08:25.781581 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-cjwtv" podStartSLOduration=2.859877726 podStartE2EDuration="42.781563387s" podCreationTimestamp="2026-03-09 09:07:43 +0000 UTC" firstStartedPulling="2026-03-09 09:07:45.573014145 +0000 UTC m=+108.658053546" lastFinishedPulling="2026-03-09 09:08:25.494699806 +0000 UTC m=+148.579739207" observedRunningTime="2026-03-09 09:08:25.780176719 +0000 UTC m=+148.865216140" watchObservedRunningTime="2026-03-09 09:08:25.781563387 +0000 UTC m=+148.866602788"
Mar 09 09:08:25 crc kubenswrapper[4861]: I0309 09:08:25.783544 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-48m9h" event={"ID":"56bad3ba-c8af-4b69-96ae-93311a9d6151","Type":"ContainerStarted","Data":"65f2365bad210a7f5548f172a7f7cd07840e4e141ce9925a653919999a46b898"}
Mar 09 09:08:25 crc kubenswrapper[4861]: I0309 09:08:25.791992 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-9788bd4c9-j8pf5"]
Mar 09 09:08:25 crc kubenswrapper[4861]: I0309 09:08:25.796987 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-684f6f5f69-tqhd5" event={"ID":"29d571aa-5de1-4779-acd7-b4f39c96e386","Type":"ContainerDied","Data":"d354c7ff4b6c2a86e6a7e91c895036c1923aa74c505feb3a218ec3d9212db3f8"}
Mar 09 09:08:25 crc kubenswrapper[4861]: I0309 09:08:25.797099 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-684f6f5f69-tqhd5"
Mar 09 09:08:25 crc kubenswrapper[4861]: I0309 09:08:25.800091 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gcb6x" event={"ID":"245d74cf-545f-43d3-ad40-5260aef18260","Type":"ContainerStarted","Data":"5cc27015cd8e7901d288e5fdcef535feb4e96147fb697254c6ae19cea8c1f72e"}
Mar 09 09:08:25 crc kubenswrapper[4861]: I0309 09:08:25.806085 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-9788bd4c9-j8pf5"]
Mar 09 09:08:25 crc kubenswrapper[4861]: I0309 09:08:25.813702 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-48m9h" podStartSLOduration=3.092717135 podStartE2EDuration="42.813684334s" podCreationTimestamp="2026-03-09 09:07:43 +0000 UTC" firstStartedPulling="2026-03-09 09:07:45.572991275 +0000 UTC m=+108.658030686" lastFinishedPulling="2026-03-09 09:08:25.293958484 +0000 UTC m=+148.378997885" observedRunningTime="2026-03-09 09:08:25.813220842 +0000 UTC m=+148.898260243" watchObservedRunningTime="2026-03-09 09:08:25.813684334 +0000 UTC m=+148.898723735"
Mar 09 09:08:25 crc kubenswrapper[4861]: I0309 09:08:25.839689 4861 scope.go:117] "RemoveContainer" containerID="a75304d454d2e91b361a678778adc87f9497d5f5c334ba188f38d9d05d5edc6d"
Mar 09 09:08:25 crc kubenswrapper[4861]: I0309 09:08:25.842655 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-gcb6x" podStartSLOduration=3.050836289 podStartE2EDuration="42.842641314s" podCreationTimestamp="2026-03-09 09:07:43 +0000 UTC" firstStartedPulling="2026-03-09 09:07:45.515001424 +0000 UTC m=+108.600040825" lastFinishedPulling="2026-03-09 09:08:25.306806449 +0000 UTC m=+148.391845850" observedRunningTime="2026-03-09 09:08:25.838264643 +0000 UTC m=+148.923304054" watchObservedRunningTime="2026-03-09 09:08:25.842641314 +0000 UTC m=+148.927680715"
Mar 09 09:08:25 crc kubenswrapper[4861]: I0309 09:08:25.865839 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-684f6f5f69-tqhd5"]
Mar 09 09:08:25 crc kubenswrapper[4861]: I0309 09:08:25.865880 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-684f6f5f69-tqhd5"]
Mar 09 09:08:25 crc kubenswrapper[4861]: I0309 09:08:25.936417 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-wcp7q"
Mar 09 09:08:25 crc kubenswrapper[4861]: I0309 09:08:25.936491 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-wcp7q"
Mar 09 09:08:26 crc kubenswrapper[4861]: I0309 09:08:26.166000 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-59b5dbf48f-tbq65"]
Mar 09 09:08:26 crc kubenswrapper[4861]: W0309 09:08:26.173115 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod89f0faed_9686_4f8f_af5d_ef542aa0f161.slice/crio-ab97f5145d3626c811d0d7f531202ceb328413ba602688cc9ecff55291553b14 WatchSource:0}: Error finding container ab97f5145d3626c811d0d7f531202ceb328413ba602688cc9ecff55291553b14: Status 404 returned error can't find the container with id ab97f5145d3626c811d0d7f531202ceb328413ba602688cc9ecff55291553b14
Mar 09 09:08:26 crc kubenswrapper[4861]: I0309 09:08:26.809966 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cjwtv" event={"ID":"3ad8332f-9ca2-4dd0-903f-2bf5723aa51e","Type":"ContainerStarted","Data":"5fc761449bed87a2d5eb929cd50f3ddd1ddbaa199788d60e479ae409c70f1227"}
Mar 09 09:08:26 crc kubenswrapper[4861]: I0309 09:08:26.818002 4861 generic.go:334] "Generic (PLEG): container finished" podID="6effa8f1-34f2-4a9e-b5cb-71a02695603e" containerID="35bf0fee9e9c674042ed4c4fbdb5bad9779dc06cb0cae1aac4678ac51c9c8ed8" exitCode=0
Mar 09 09:08:26 crc kubenswrapper[4861]: I0309 09:08:26.818077 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550788-wz6bg" event={"ID":"6effa8f1-34f2-4a9e-b5cb-71a02695603e","Type":"ContainerDied","Data":"35bf0fee9e9c674042ed4c4fbdb5bad9779dc06cb0cae1aac4678ac51c9c8ed8"}
Mar 09 09:08:26 crc kubenswrapper[4861]: I0309 09:08:26.819650 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-59b5dbf48f-tbq65" event={"ID":"89f0faed-9686-4f8f-af5d-ef542aa0f161","Type":"ContainerStarted","Data":"c6c9884d96bce2917b8e3ff927f81abdb311f213d05c592dbdbe190ae232f9c5"}
Mar 09 09:08:26 crc kubenswrapper[4861]: I0309 09:08:26.819689 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-59b5dbf48f-tbq65" event={"ID":"89f0faed-9686-4f8f-af5d-ef542aa0f161","Type":"ContainerStarted","Data":"ab97f5145d3626c811d0d7f531202ceb328413ba602688cc9ecff55291553b14"}
Mar 09 09:08:26 crc kubenswrapper[4861]: I0309 09:08:26.819934 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-59b5dbf48f-tbq65"
Mar 09 09:08:26 crc kubenswrapper[4861]: I0309 09:08:26.835203 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j6fcf" event={"ID":"33158a78-8c1c-4aa1-9c51-66e21d0e8ae6","Type":"ContainerStarted","Data":"146a4ed3edae2c1ad1b1d9047b6e1bdc80e7e837c091f12a65496e962099ee40"}
Mar 09 09:08:26 crc kubenswrapper[4861]: I0309 09:08:26.843521 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mlqmj" event={"ID":"e8af01b5-95f0-43a2-b228-675b98c6203f","Type":"ContainerStarted","Data":"14bf210afb405bac1772e678cdca30ca2b20935e864c470c73a0d4d066443f3c"}
Mar 09 09:08:26 crc kubenswrapper[4861]: I0309 09:08:26.849083 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-59b5dbf48f-tbq65" podStartSLOduration=4.849067322 podStartE2EDuration="4.849067322s" podCreationTimestamp="2026-03-09 09:08:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:08:26.848292071 +0000 UTC m=+149.933331472" watchObservedRunningTime="2026-03-09 09:08:26.849067322 +0000 UTC m=+149.934106723"
Mar 09 09:08:26 crc kubenswrapper[4861]: I0309 09:08:26.850555 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wcp7q" event={"ID":"0398ae40-9658-4a9a-949c-4419bb1ca9bf","Type":"ContainerStarted","Data":"d2d42a3e5e633101270ecaec42e27e4983bcb1e91f98a021783ceb2ebb7abd0d"}
Mar 09 09:08:26 crc kubenswrapper[4861]: I0309 09:08:26.852245 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"bc724060-a3ff-4a81-b417-a860a0a07045","Type":"ContainerStarted","Data":"a784b30f9ae5a82b0927e34dd0ee678c95f0c3b2f18675511c3e7445cfb66f90"}
Mar 09 09:08:26 crc kubenswrapper[4861]: I0309 09:08:26.852282 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"bc724060-a3ff-4a81-b417-a860a0a07045","Type":"ContainerStarted","Data":"531333315ec28b571e635141e46a35a56ba0f1eb405326f78b6a683d62f1b0f2"}
Mar 09 09:08:26 crc kubenswrapper[4861]: I0309 09:08:26.878578 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-mlqmj" podStartSLOduration=3.934982314 podStartE2EDuration="43.878560497s" podCreationTimestamp="2026-03-09 09:07:43 +0000 UTC" firstStartedPulling="2026-03-09 09:07:45.527027608 +0000 UTC m=+108.612067009" lastFinishedPulling="2026-03-09 09:08:25.470605791 +0000 UTC m=+148.555645192" observedRunningTime="2026-03-09 09:08:26.876394336 +0000 UTC m=+149.961433757" watchObservedRunningTime="2026-03-09 09:08:26.878560497 +0000 UTC m=+149.963599898"
Mar 09 09:08:26 crc kubenswrapper[4861]: I0309 09:08:26.921132 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-j6fcf" podStartSLOduration=3.374369791 podStartE2EDuration="40.921114781s" podCreationTimestamp="2026-03-09 09:07:46 +0000 UTC" firstStartedPulling="2026-03-09 09:07:47.947831943 +0000 UTC m=+111.032871344" lastFinishedPulling="2026-03-09 09:08:25.494576933 +0000 UTC m=+148.579616334" observedRunningTime="2026-03-09 09:08:26.905925692 +0000 UTC m=+149.990965093" watchObservedRunningTime="2026-03-09 09:08:26.921114781 +0000 UTC m=+150.006154182"
Mar 09 09:08:26 crc kubenswrapper[4861]: I0309 09:08:26.924585 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-j6fcf"
Mar 09 09:08:26 crc kubenswrapper[4861]: I0309 09:08:26.924835 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-j6fcf"
Mar 09 09:08:26 crc kubenswrapper[4861]: I0309 09:08:26.943790 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-fnk25" podStartSLOduration=3.305465044 podStartE2EDuration="41.943771917s" podCreationTimestamp="2026-03-09 09:07:45 +0000 UTC" firstStartedPulling="2026-03-09 09:07:46.668474465 +0000 UTC m=+109.753513866" lastFinishedPulling="2026-03-09 09:08:25.306781318 +0000 UTC m=+148.391820739" observedRunningTime="2026-03-09 09:08:26.941982867 +0000 UTC m=+150.027022278" watchObservedRunningTime="2026-03-09 09:08:26.943771917 +0000 UTC m=+150.028811318"
Mar 09 09:08:26 crc kubenswrapper[4861]: I0309 09:08:26.957469 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-59b5dbf48f-tbq65"
Mar 09 09:08:27 crc kubenswrapper[4861]: I0309 09:08:27.090712 4861 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-wcp7q" podUID="0398ae40-9658-4a9a-949c-4419bb1ca9bf" containerName="registry-server" probeResult="failure" output=<
Mar 09 09:08:27 crc kubenswrapper[4861]: timeout: failed to connect service ":50051" within 1s
Mar 09 09:08:27 crc kubenswrapper[4861]: >
Mar 09 09:08:27 crc kubenswrapper[4861]: I0309 09:08:27.664415 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29d571aa-5de1-4779-acd7-b4f39c96e386" path="/var/lib/kubelet/pods/29d571aa-5de1-4779-acd7-b4f39c96e386/volumes"
Mar 09 09:08:27 crc kubenswrapper[4861]: I0309 09:08:27.665035 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b17abf38-2942-4438-842d-91c589bbc03f" path="/var/lib/kubelet/pods/b17abf38-2942-4438-842d-91c589bbc03f/volumes"
Mar 09 09:08:27 crc kubenswrapper[4861]: I0309 09:08:27.859167 4861 generic.go:334] "Generic (PLEG): container finished" podID="bc724060-a3ff-4a81-b417-a860a0a07045" containerID="a784b30f9ae5a82b0927e34dd0ee678c95f0c3b2f18675511c3e7445cfb66f90" exitCode=0
Mar 09 09:08:27 crc kubenswrapper[4861]: I0309 09:08:27.860028 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"bc724060-a3ff-4a81-b417-a860a0a07045","Type":"ContainerDied","Data":"a784b30f9ae5a82b0927e34dd0ee678c95f0c3b2f18675511c3e7445cfb66f90"}
Mar 09 09:08:28 crc kubenswrapper[4861]: I0309 09:08:28.010892 4861 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-j6fcf" podUID="33158a78-8c1c-4aa1-9c51-66e21d0e8ae6" containerName="registry-server" probeResult="failure" output=<
Mar 09 09:08:28 crc kubenswrapper[4861]: timeout: failed to connect service ":50051" within 1s
Mar 09 09:08:28 crc kubenswrapper[4861]: >
Mar 09 09:08:28 crc kubenswrapper[4861]: I0309 09:08:28.141106 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 09 09:08:28 crc kubenswrapper[4861]: I0309 09:08:28.246395 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550788-wz6bg"
Mar 09 09:08:28 crc kubenswrapper[4861]: I0309 09:08:28.247086 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bc724060-a3ff-4a81-b417-a860a0a07045-kube-api-access\") pod \"bc724060-a3ff-4a81-b417-a860a0a07045\" (UID: \"bc724060-a3ff-4a81-b417-a860a0a07045\") "
Mar 09 09:08:28 crc kubenswrapper[4861]: I0309 09:08:28.247181 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bc724060-a3ff-4a81-b417-a860a0a07045-kubelet-dir\") pod \"bc724060-a3ff-4a81-b417-a860a0a07045\" (UID: \"bc724060-a3ff-4a81-b417-a860a0a07045\") "
Mar 09 09:08:28 crc kubenswrapper[4861]: I0309 09:08:28.247275 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bc724060-a3ff-4a81-b417-a860a0a07045-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "bc724060-a3ff-4a81-b417-a860a0a07045" (UID: "bc724060-a3ff-4a81-b417-a860a0a07045"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 09 09:08:28 crc kubenswrapper[4861]: I0309 09:08:28.247418 4861 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bc724060-a3ff-4a81-b417-a860a0a07045-kubelet-dir\") on node \"crc\" DevicePath \"\""
Mar 09 09:08:28 crc kubenswrapper[4861]: I0309 09:08:28.251784 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc724060-a3ff-4a81-b417-a860a0a07045-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "bc724060-a3ff-4a81-b417-a860a0a07045" (UID: "bc724060-a3ff-4a81-b417-a860a0a07045"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:08:28 crc kubenswrapper[4861]: I0309 09:08:28.288987 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-769f74488b-4xx9h"]
Mar 09 09:08:28 crc kubenswrapper[4861]: E0309 09:08:28.289279 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6effa8f1-34f2-4a9e-b5cb-71a02695603e" containerName="oc"
Mar 09 09:08:28 crc kubenswrapper[4861]: I0309 09:08:28.289299 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="6effa8f1-34f2-4a9e-b5cb-71a02695603e" containerName="oc"
Mar 09 09:08:28 crc kubenswrapper[4861]: E0309 09:08:28.289316 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc724060-a3ff-4a81-b417-a860a0a07045" containerName="pruner"
Mar 09 09:08:28 crc kubenswrapper[4861]: I0309 09:08:28.289323 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc724060-a3ff-4a81-b417-a860a0a07045" containerName="pruner"
Mar 09 09:08:28 crc kubenswrapper[4861]: I0309 09:08:28.289450 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="6effa8f1-34f2-4a9e-b5cb-71a02695603e" containerName="oc"
Mar 09 09:08:28 crc kubenswrapper[4861]: I0309 09:08:28.289464 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc724060-a3ff-4a81-b417-a860a0a07045" containerName="pruner"
Mar 09 09:08:28 crc kubenswrapper[4861]: I0309 09:08:28.289946 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-769f74488b-4xx9h"
Mar 09 09:08:28 crc kubenswrapper[4861]: I0309 09:08:28.294030 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 09 09:08:28 crc kubenswrapper[4861]: I0309 09:08:28.294279 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 09 09:08:28 crc kubenswrapper[4861]: I0309 09:08:28.294286 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 09 09:08:28 crc kubenswrapper[4861]: I0309 09:08:28.294462 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 09 09:08:28 crc kubenswrapper[4861]: I0309 09:08:28.294584 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 09 09:08:28 crc kubenswrapper[4861]: I0309 09:08:28.299889 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 09 09:08:28 crc kubenswrapper[4861]: I0309 09:08:28.302215 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-769f74488b-4xx9h"]
Mar 09 09:08:28 crc kubenswrapper[4861]: I0309 09:08:28.306147 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 09 09:08:28 crc kubenswrapper[4861]: I0309 09:08:28.348281 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dvrzs\" (UniqueName: \"kubernetes.io/projected/6effa8f1-34f2-4a9e-b5cb-71a02695603e-kube-api-access-dvrzs\") pod \"6effa8f1-34f2-4a9e-b5cb-71a02695603e\" (UID: \"6effa8f1-34f2-4a9e-b5cb-71a02695603e\") "
Mar 09 09:08:28 crc kubenswrapper[4861]: I0309 09:08:28.348727 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebace06e-6730-48a4-bee8-ebdbcc3d8f78-config\") pod \"controller-manager-769f74488b-4xx9h\" (UID: \"ebace06e-6730-48a4-bee8-ebdbcc3d8f78\") " pod="openshift-controller-manager/controller-manager-769f74488b-4xx9h"
Mar 09 09:08:28 crc kubenswrapper[4861]: I0309 09:08:28.348802 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbd4c\" (UniqueName: \"kubernetes.io/projected/ebace06e-6730-48a4-bee8-ebdbcc3d8f78-kube-api-access-kbd4c\") pod \"controller-manager-769f74488b-4xx9h\" (UID: \"ebace06e-6730-48a4-bee8-ebdbcc3d8f78\") " pod="openshift-controller-manager/controller-manager-769f74488b-4xx9h"
Mar 09 09:08:28 crc kubenswrapper[4861]: I0309 09:08:28.348850 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ebace06e-6730-48a4-bee8-ebdbcc3d8f78-serving-cert\") pod \"controller-manager-769f74488b-4xx9h\" (UID: \"ebace06e-6730-48a4-bee8-ebdbcc3d8f78\") " pod="openshift-controller-manager/controller-manager-769f74488b-4xx9h"
Mar 09 09:08:28 crc kubenswrapper[4861]: I0309 09:08:28.348883 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ebace06e-6730-48a4-bee8-ebdbcc3d8f78-client-ca\") pod \"controller-manager-769f74488b-4xx9h\" (UID: \"ebace06e-6730-48a4-bee8-ebdbcc3d8f78\") " pod="openshift-controller-manager/controller-manager-769f74488b-4xx9h"
Mar 09 09:08:28 crc kubenswrapper[4861]: I0309 09:08:28.348947 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ebace06e-6730-48a4-bee8-ebdbcc3d8f78-proxy-ca-bundles\") pod \"controller-manager-769f74488b-4xx9h\" (UID: \"ebace06e-6730-48a4-bee8-ebdbcc3d8f78\") " pod="openshift-controller-manager/controller-manager-769f74488b-4xx9h"
Mar 09 09:08:28 crc kubenswrapper[4861]: I0309 09:08:28.349055 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bc724060-a3ff-4a81-b417-a860a0a07045-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 09 09:08:28 crc kubenswrapper[4861]: I0309 09:08:28.351421 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6effa8f1-34f2-4a9e-b5cb-71a02695603e-kube-api-access-dvrzs" (OuterVolumeSpecName: "kube-api-access-dvrzs") pod "6effa8f1-34f2-4a9e-b5cb-71a02695603e" (UID: "6effa8f1-34f2-4a9e-b5cb-71a02695603e"). InnerVolumeSpecName "kube-api-access-dvrzs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:08:28 crc kubenswrapper[4861]: I0309 09:08:28.449494 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kbd4c\" (UniqueName: \"kubernetes.io/projected/ebace06e-6730-48a4-bee8-ebdbcc3d8f78-kube-api-access-kbd4c\") pod \"controller-manager-769f74488b-4xx9h\" (UID: \"ebace06e-6730-48a4-bee8-ebdbcc3d8f78\") " pod="openshift-controller-manager/controller-manager-769f74488b-4xx9h"
Mar 09 09:08:28 crc kubenswrapper[4861]: I0309 09:08:28.449552 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ebace06e-6730-48a4-bee8-ebdbcc3d8f78-serving-cert\") pod \"controller-manager-769f74488b-4xx9h\" (UID: \"ebace06e-6730-48a4-bee8-ebdbcc3d8f78\") " pod="openshift-controller-manager/controller-manager-769f74488b-4xx9h"
Mar 09 09:08:28 crc kubenswrapper[4861]: I0309 09:08:28.449584 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ebace06e-6730-48a4-bee8-ebdbcc3d8f78-client-ca\") pod \"controller-manager-769f74488b-4xx9h\" (UID: \"ebace06e-6730-48a4-bee8-ebdbcc3d8f78\") " pod="openshift-controller-manager/controller-manager-769f74488b-4xx9h"
Mar 09 09:08:28 crc kubenswrapper[4861]: I0309 09:08:28.449616 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ebace06e-6730-48a4-bee8-ebdbcc3d8f78-proxy-ca-bundles\") pod \"controller-manager-769f74488b-4xx9h\" (UID: \"ebace06e-6730-48a4-bee8-ebdbcc3d8f78\") " pod="openshift-controller-manager/controller-manager-769f74488b-4xx9h"
Mar 09 09:08:28 crc kubenswrapper[4861]: I0309 09:08:28.449641 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebace06e-6730-48a4-bee8-ebdbcc3d8f78-config\") pod
\"controller-manager-769f74488b-4xx9h\" (UID: \"ebace06e-6730-48a4-bee8-ebdbcc3d8f78\") " pod="openshift-controller-manager/controller-manager-769f74488b-4xx9h" Mar 09 09:08:28 crc kubenswrapper[4861]: I0309 09:08:28.449692 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dvrzs\" (UniqueName: \"kubernetes.io/projected/6effa8f1-34f2-4a9e-b5cb-71a02695603e-kube-api-access-dvrzs\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:28 crc kubenswrapper[4861]: I0309 09:08:28.450939 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ebace06e-6730-48a4-bee8-ebdbcc3d8f78-client-ca\") pod \"controller-manager-769f74488b-4xx9h\" (UID: \"ebace06e-6730-48a4-bee8-ebdbcc3d8f78\") " pod="openshift-controller-manager/controller-manager-769f74488b-4xx9h" Mar 09 09:08:28 crc kubenswrapper[4861]: I0309 09:08:28.450992 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebace06e-6730-48a4-bee8-ebdbcc3d8f78-config\") pod \"controller-manager-769f74488b-4xx9h\" (UID: \"ebace06e-6730-48a4-bee8-ebdbcc3d8f78\") " pod="openshift-controller-manager/controller-manager-769f74488b-4xx9h" Mar 09 09:08:28 crc kubenswrapper[4861]: I0309 09:08:28.451504 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ebace06e-6730-48a4-bee8-ebdbcc3d8f78-proxy-ca-bundles\") pod \"controller-manager-769f74488b-4xx9h\" (UID: \"ebace06e-6730-48a4-bee8-ebdbcc3d8f78\") " pod="openshift-controller-manager/controller-manager-769f74488b-4xx9h" Mar 09 09:08:28 crc kubenswrapper[4861]: I0309 09:08:28.455406 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ebace06e-6730-48a4-bee8-ebdbcc3d8f78-serving-cert\") pod \"controller-manager-769f74488b-4xx9h\" (UID: \"ebace06e-6730-48a4-bee8-ebdbcc3d8f78\") " 
pod="openshift-controller-manager/controller-manager-769f74488b-4xx9h" Mar 09 09:08:28 crc kubenswrapper[4861]: I0309 09:08:28.468705 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbd4c\" (UniqueName: \"kubernetes.io/projected/ebace06e-6730-48a4-bee8-ebdbcc3d8f78-kube-api-access-kbd4c\") pod \"controller-manager-769f74488b-4xx9h\" (UID: \"ebace06e-6730-48a4-bee8-ebdbcc3d8f78\") " pod="openshift-controller-manager/controller-manager-769f74488b-4xx9h" Mar 09 09:08:28 crc kubenswrapper[4861]: I0309 09:08:28.611528 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-769f74488b-4xx9h" Mar 09 09:08:28 crc kubenswrapper[4861]: I0309 09:08:28.865970 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550788-wz6bg" event={"ID":"6effa8f1-34f2-4a9e-b5cb-71a02695603e","Type":"ContainerDied","Data":"937675e303e874c10f135e48a4ad8e4d0be11c7a949c0d88f6d2342e66c19273"} Mar 09 09:08:28 crc kubenswrapper[4861]: I0309 09:08:28.866167 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550788-wz6bg" Mar 09 09:08:28 crc kubenswrapper[4861]: I0309 09:08:28.866556 4861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="937675e303e874c10f135e48a4ad8e4d0be11c7a949c0d88f6d2342e66c19273" Mar 09 09:08:28 crc kubenswrapper[4861]: I0309 09:08:28.867350 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"bc724060-a3ff-4a81-b417-a860a0a07045","Type":"ContainerDied","Data":"531333315ec28b571e635141e46a35a56ba0f1eb405326f78b6a683d62f1b0f2"} Mar 09 09:08:28 crc kubenswrapper[4861]: I0309 09:08:28.867411 4861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="531333315ec28b571e635141e46a35a56ba0f1eb405326f78b6a683d62f1b0f2" Mar 09 09:08:28 crc kubenswrapper[4861]: I0309 09:08:28.867465 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 09 09:08:29 crc kubenswrapper[4861]: I0309 09:08:29.067502 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-769f74488b-4xx9h"] Mar 09 09:08:29 crc kubenswrapper[4861]: W0309 09:08:29.074152 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podebace06e_6730_48a4_bee8_ebdbcc3d8f78.slice/crio-cf125631c4d89bff3e97cb5b1a3eff16cd9782c05d2ea74da352977bc7b559ca WatchSource:0}: Error finding container cf125631c4d89bff3e97cb5b1a3eff16cd9782c05d2ea74da352977bc7b559ca: Status 404 returned error can't find the container with id cf125631c4d89bff3e97cb5b1a3eff16cd9782c05d2ea74da352977bc7b559ca Mar 09 09:08:29 crc kubenswrapper[4861]: I0309 09:08:29.872962 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-769f74488b-4xx9h" 
event={"ID":"ebace06e-6730-48a4-bee8-ebdbcc3d8f78","Type":"ContainerStarted","Data":"23ad2ab4a4dbb0d18edf7f03fe76242001ae46394013ca055e6890fcb7364153"} Mar 09 09:08:29 crc kubenswrapper[4861]: I0309 09:08:29.873226 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-769f74488b-4xx9h" Mar 09 09:08:29 crc kubenswrapper[4861]: I0309 09:08:29.873237 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-769f74488b-4xx9h" event={"ID":"ebace06e-6730-48a4-bee8-ebdbcc3d8f78","Type":"ContainerStarted","Data":"cf125631c4d89bff3e97cb5b1a3eff16cd9782c05d2ea74da352977bc7b559ca"} Mar 09 09:08:29 crc kubenswrapper[4861]: I0309 09:08:29.877631 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-97dkg"] Mar 09 09:08:29 crc kubenswrapper[4861]: I0309 09:08:29.881880 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-769f74488b-4xx9h" Mar 09 09:08:29 crc kubenswrapper[4861]: I0309 09:08:29.896035 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-769f74488b-4xx9h" podStartSLOduration=7.896017061 podStartE2EDuration="7.896017061s" podCreationTimestamp="2026-03-09 09:08:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:08:29.892907166 +0000 UTC m=+152.977946567" watchObservedRunningTime="2026-03-09 09:08:29.896017061 +0000 UTC m=+152.981056472" Mar 09 09:08:30 crc kubenswrapper[4861]: I0309 09:08:30.695011 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 09:08:33 crc kubenswrapper[4861]: I0309 09:08:33.626131 4861 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/community-operators-cjwtv" Mar 09 09:08:33 crc kubenswrapper[4861]: I0309 09:08:33.626948 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-cjwtv" Mar 09 09:08:33 crc kubenswrapper[4861]: I0309 09:08:33.684575 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-cjwtv" Mar 09 09:08:33 crc kubenswrapper[4861]: I0309 09:08:33.788613 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-gcb6x" Mar 09 09:08:33 crc kubenswrapper[4861]: I0309 09:08:33.788659 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-gcb6x" Mar 09 09:08:33 crc kubenswrapper[4861]: I0309 09:08:33.836283 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-gcb6x" Mar 09 09:08:33 crc kubenswrapper[4861]: I0309 09:08:33.894448 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zkngt" event={"ID":"6eac3eed-7721-4030-b1e3-9dd28fea2e49","Type":"ContainerStarted","Data":"ac828fd1aa0f1b74d8fba5f0eb43f48c9ce9a8f13dbc669fd6e18a8fb33507cd"} Mar 09 09:08:33 crc kubenswrapper[4861]: I0309 09:08:33.933153 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-cjwtv" Mar 09 09:08:33 crc kubenswrapper[4861]: I0309 09:08:33.948006 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-mlqmj" Mar 09 09:08:33 crc kubenswrapper[4861]: I0309 09:08:33.948779 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-mlqmj" Mar 09 09:08:33 crc kubenswrapper[4861]: I0309 09:08:33.951846 4861 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-gcb6x" Mar 09 09:08:34 crc kubenswrapper[4861]: I0309 09:08:34.006866 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-mlqmj" Mar 09 09:08:34 crc kubenswrapper[4861]: I0309 09:08:34.158714 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-48m9h" Mar 09 09:08:34 crc kubenswrapper[4861]: I0309 09:08:34.159194 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-48m9h" Mar 09 09:08:34 crc kubenswrapper[4861]: I0309 09:08:34.199268 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-48m9h" Mar 09 09:08:34 crc kubenswrapper[4861]: I0309 09:08:34.903978 4861 generic.go:334] "Generic (PLEG): container finished" podID="6eac3eed-7721-4030-b1e3-9dd28fea2e49" containerID="ac828fd1aa0f1b74d8fba5f0eb43f48c9ce9a8f13dbc669fd6e18a8fb33507cd" exitCode=0 Mar 09 09:08:34 crc kubenswrapper[4861]: I0309 09:08:34.904666 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zkngt" event={"ID":"6eac3eed-7721-4030-b1e3-9dd28fea2e49","Type":"ContainerDied","Data":"ac828fd1aa0f1b74d8fba5f0eb43f48c9ce9a8f13dbc669fd6e18a8fb33507cd"} Mar 09 09:08:34 crc kubenswrapper[4861]: I0309 09:08:34.956128 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-mlqmj" Mar 09 09:08:34 crc kubenswrapper[4861]: I0309 09:08:34.963986 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-48m9h" Mar 09 09:08:35 crc kubenswrapper[4861]: I0309 09:08:35.202264 4861 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-kube-apiserver/installer-9-crc"] Mar 09 09:08:35 crc kubenswrapper[4861]: I0309 09:08:35.203947 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 09 09:08:35 crc kubenswrapper[4861]: I0309 09:08:35.205867 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 09 09:08:35 crc kubenswrapper[4861]: I0309 09:08:35.206429 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 09 09:08:35 crc kubenswrapper[4861]: I0309 09:08:35.210077 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 09 09:08:35 crc kubenswrapper[4861]: I0309 09:08:35.349416 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6be593ec-c05d-4500-961d-74aa43fa5ff7-kube-api-access\") pod \"installer-9-crc\" (UID: \"6be593ec-c05d-4500-961d-74aa43fa5ff7\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 09 09:08:35 crc kubenswrapper[4861]: I0309 09:08:35.349506 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6be593ec-c05d-4500-961d-74aa43fa5ff7-kubelet-dir\") pod \"installer-9-crc\" (UID: \"6be593ec-c05d-4500-961d-74aa43fa5ff7\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 09 09:08:35 crc kubenswrapper[4861]: I0309 09:08:35.349540 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6be593ec-c05d-4500-961d-74aa43fa5ff7-var-lock\") pod \"installer-9-crc\" (UID: \"6be593ec-c05d-4500-961d-74aa43fa5ff7\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 09 09:08:35 crc kubenswrapper[4861]: I0309 
09:08:35.450436 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6be593ec-c05d-4500-961d-74aa43fa5ff7-kubelet-dir\") pod \"installer-9-crc\" (UID: \"6be593ec-c05d-4500-961d-74aa43fa5ff7\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 09 09:08:35 crc kubenswrapper[4861]: I0309 09:08:35.450489 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6be593ec-c05d-4500-961d-74aa43fa5ff7-var-lock\") pod \"installer-9-crc\" (UID: \"6be593ec-c05d-4500-961d-74aa43fa5ff7\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 09 09:08:35 crc kubenswrapper[4861]: I0309 09:08:35.450540 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6be593ec-c05d-4500-961d-74aa43fa5ff7-kube-api-access\") pod \"installer-9-crc\" (UID: \"6be593ec-c05d-4500-961d-74aa43fa5ff7\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 09 09:08:35 crc kubenswrapper[4861]: I0309 09:08:35.450586 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6be593ec-c05d-4500-961d-74aa43fa5ff7-kubelet-dir\") pod \"installer-9-crc\" (UID: \"6be593ec-c05d-4500-961d-74aa43fa5ff7\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 09 09:08:35 crc kubenswrapper[4861]: I0309 09:08:35.450645 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6be593ec-c05d-4500-961d-74aa43fa5ff7-var-lock\") pod \"installer-9-crc\" (UID: \"6be593ec-c05d-4500-961d-74aa43fa5ff7\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 09 09:08:35 crc kubenswrapper[4861]: I0309 09:08:35.479357 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/6be593ec-c05d-4500-961d-74aa43fa5ff7-kube-api-access\") pod \"installer-9-crc\" (UID: \"6be593ec-c05d-4500-961d-74aa43fa5ff7\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 09 09:08:35 crc kubenswrapper[4861]: I0309 09:08:35.522486 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 09 09:08:35 crc kubenswrapper[4861]: I0309 09:08:35.580719 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-fnk25" Mar 09 09:08:35 crc kubenswrapper[4861]: I0309 09:08:35.580770 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-fnk25" Mar 09 09:08:35 crc kubenswrapper[4861]: I0309 09:08:35.644146 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-fnk25" Mar 09 09:08:35 crc kubenswrapper[4861]: I0309 09:08:35.912972 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zkngt" event={"ID":"6eac3eed-7721-4030-b1e3-9dd28fea2e49","Type":"ContainerStarted","Data":"22c2787ab7459d1deced22ba471cbda27f48a357d73310be38990de6082e6b54"} Mar 09 09:08:35 crc kubenswrapper[4861]: I0309 09:08:35.936304 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zkngt" podStartSLOduration=3.714373967 podStartE2EDuration="49.936284989s" podCreationTimestamp="2026-03-09 09:07:46 +0000 UTC" firstStartedPulling="2026-03-09 09:07:49.065195612 +0000 UTC m=+112.150235013" lastFinishedPulling="2026-03-09 09:08:35.287106634 +0000 UTC m=+158.372146035" observedRunningTime="2026-03-09 09:08:35.93234281 +0000 UTC m=+159.017382211" watchObservedRunningTime="2026-03-09 09:08:35.936284989 +0000 UTC m=+159.021324390" Mar 09 09:08:35 crc kubenswrapper[4861]: I0309 09:08:35.947290 4861 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 09 09:08:35 crc kubenswrapper[4861]: I0309 09:08:35.958309 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-fnk25" Mar 09 09:08:35 crc kubenswrapper[4861]: W0309 09:08:35.970155 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod6be593ec_c05d_4500_961d_74aa43fa5ff7.slice/crio-c95bf7c2f7a33395d15dae2d27d73eec14a36e4fb890c8e1d3199f83f49d3173 WatchSource:0}: Error finding container c95bf7c2f7a33395d15dae2d27d73eec14a36e4fb890c8e1d3199f83f49d3173: Status 404 returned error can't find the container with id c95bf7c2f7a33395d15dae2d27d73eec14a36e4fb890c8e1d3199f83f49d3173 Mar 09 09:08:35 crc kubenswrapper[4861]: I0309 09:08:35.972605 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-wcp7q" Mar 09 09:08:36 crc kubenswrapper[4861]: I0309 09:08:36.035092 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-wcp7q" Mar 09 09:08:36 crc kubenswrapper[4861]: I0309 09:08:36.366007 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mlqmj"] Mar 09 09:08:36 crc kubenswrapper[4861]: I0309 09:08:36.920590 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"6be593ec-c05d-4500-961d-74aa43fa5ff7","Type":"ContainerStarted","Data":"2ed2fe73d7fec7263224791214e79191dbf6b4ef7b1abf9585957c2c63d89b7c"} Mar 09 09:08:36 crc kubenswrapper[4861]: I0309 09:08:36.920635 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"6be593ec-c05d-4500-961d-74aa43fa5ff7","Type":"ContainerStarted","Data":"c95bf7c2f7a33395d15dae2d27d73eec14a36e4fb890c8e1d3199f83f49d3173"} Mar 09 09:08:36 crc kubenswrapper[4861]: I0309 
09:08:36.952674 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=1.952647652 podStartE2EDuration="1.952647652s" podCreationTimestamp="2026-03-09 09:08:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:08:36.951295864 +0000 UTC m=+160.036335285" watchObservedRunningTime="2026-03-09 09:08:36.952647652 +0000 UTC m=+160.037687093" Mar 09 09:08:36 crc kubenswrapper[4861]: I0309 09:08:36.980675 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-j6fcf" Mar 09 09:08:37 crc kubenswrapper[4861]: I0309 09:08:37.023046 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-j6fcf" Mar 09 09:08:37 crc kubenswrapper[4861]: I0309 09:08:37.316440 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zkngt" Mar 09 09:08:37 crc kubenswrapper[4861]: I0309 09:08:37.316513 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zkngt" Mar 09 09:08:37 crc kubenswrapper[4861]: I0309 09:08:37.930801 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-mlqmj" podUID="e8af01b5-95f0-43a2-b228-675b98c6203f" containerName="registry-server" containerID="cri-o://14bf210afb405bac1772e678cdca30ca2b20935e864c470c73a0d4d066443f3c" gracePeriod=2 Mar 09 09:08:38 crc kubenswrapper[4861]: I0309 09:08:38.186833 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-48m9h"] Mar 09 09:08:38 crc kubenswrapper[4861]: I0309 09:08:38.187229 4861 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/certified-operators-48m9h" podUID="56bad3ba-c8af-4b69-96ae-93311a9d6151" containerName="registry-server" containerID="cri-o://65f2365bad210a7f5548f172a7f7cd07840e4e141ce9925a653919999a46b898" gracePeriod=2 Mar 09 09:08:38 crc kubenswrapper[4861]: I0309 09:08:38.351710 4861 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-zkngt" podUID="6eac3eed-7721-4030-b1e3-9dd28fea2e49" containerName="registry-server" probeResult="failure" output=< Mar 09 09:08:38 crc kubenswrapper[4861]: timeout: failed to connect service ":50051" within 1s Mar 09 09:08:38 crc kubenswrapper[4861]: > Mar 09 09:08:38 crc kubenswrapper[4861]: I0309 09:08:38.424433 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mlqmj" Mar 09 09:08:38 crc kubenswrapper[4861]: I0309 09:08:38.500359 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8rq2j\" (UniqueName: \"kubernetes.io/projected/e8af01b5-95f0-43a2-b228-675b98c6203f-kube-api-access-8rq2j\") pod \"e8af01b5-95f0-43a2-b228-675b98c6203f\" (UID: \"e8af01b5-95f0-43a2-b228-675b98c6203f\") " Mar 09 09:08:38 crc kubenswrapper[4861]: I0309 09:08:38.500432 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8af01b5-95f0-43a2-b228-675b98c6203f-utilities\") pod \"e8af01b5-95f0-43a2-b228-675b98c6203f\" (UID: \"e8af01b5-95f0-43a2-b228-675b98c6203f\") " Mar 09 09:08:38 crc kubenswrapper[4861]: I0309 09:08:38.500490 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8af01b5-95f0-43a2-b228-675b98c6203f-catalog-content\") pod \"e8af01b5-95f0-43a2-b228-675b98c6203f\" (UID: \"e8af01b5-95f0-43a2-b228-675b98c6203f\") " Mar 09 09:08:38 crc kubenswrapper[4861]: I0309 09:08:38.501408 4861 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8af01b5-95f0-43a2-b228-675b98c6203f-utilities" (OuterVolumeSpecName: "utilities") pod "e8af01b5-95f0-43a2-b228-675b98c6203f" (UID: "e8af01b5-95f0-43a2-b228-675b98c6203f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:08:38 crc kubenswrapper[4861]: I0309 09:08:38.519652 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8af01b5-95f0-43a2-b228-675b98c6203f-kube-api-access-8rq2j" (OuterVolumeSpecName: "kube-api-access-8rq2j") pod "e8af01b5-95f0-43a2-b228-675b98c6203f" (UID: "e8af01b5-95f0-43a2-b228-675b98c6203f"). InnerVolumeSpecName "kube-api-access-8rq2j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:08:38 crc kubenswrapper[4861]: I0309 09:08:38.559586 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8af01b5-95f0-43a2-b228-675b98c6203f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e8af01b5-95f0-43a2-b228-675b98c6203f" (UID: "e8af01b5-95f0-43a2-b228-675b98c6203f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:08:38 crc kubenswrapper[4861]: I0309 09:08:38.601743 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8rq2j\" (UniqueName: \"kubernetes.io/projected/e8af01b5-95f0-43a2-b228-675b98c6203f-kube-api-access-8rq2j\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:38 crc kubenswrapper[4861]: I0309 09:08:38.601782 4861 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8af01b5-95f0-43a2-b228-675b98c6203f-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:38 crc kubenswrapper[4861]: I0309 09:08:38.601799 4861 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8af01b5-95f0-43a2-b228-675b98c6203f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 09:08:38 crc kubenswrapper[4861]: I0309 09:08:38.624430 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-48m9h" Mar 09 09:08:38 crc kubenswrapper[4861]: I0309 09:08:38.775285 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wcp7q"] Mar 09 09:08:38 crc kubenswrapper[4861]: I0309 09:08:38.775727 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-wcp7q" podUID="0398ae40-9658-4a9a-949c-4419bb1ca9bf" containerName="registry-server" containerID="cri-o://d2d42a3e5e633101270ecaec42e27e4983bcb1e91f98a021783ceb2ebb7abd0d" gracePeriod=2 Mar 09 09:08:38 crc kubenswrapper[4861]: I0309 09:08:38.804089 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56bad3ba-c8af-4b69-96ae-93311a9d6151-utilities\") pod \"56bad3ba-c8af-4b69-96ae-93311a9d6151\" (UID: \"56bad3ba-c8af-4b69-96ae-93311a9d6151\") " Mar 09 09:08:38 crc kubenswrapper[4861]: I0309 
09:08:38.804156 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56bad3ba-c8af-4b69-96ae-93311a9d6151-catalog-content\") pod \"56bad3ba-c8af-4b69-96ae-93311a9d6151\" (UID: \"56bad3ba-c8af-4b69-96ae-93311a9d6151\") "
Mar 09 09:08:38 crc kubenswrapper[4861]: I0309 09:08:38.804248 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6c8pq\" (UniqueName: \"kubernetes.io/projected/56bad3ba-c8af-4b69-96ae-93311a9d6151-kube-api-access-6c8pq\") pod \"56bad3ba-c8af-4b69-96ae-93311a9d6151\" (UID: \"56bad3ba-c8af-4b69-96ae-93311a9d6151\") "
Mar 09 09:08:38 crc kubenswrapper[4861]: I0309 09:08:38.805411 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56bad3ba-c8af-4b69-96ae-93311a9d6151-utilities" (OuterVolumeSpecName: "utilities") pod "56bad3ba-c8af-4b69-96ae-93311a9d6151" (UID: "56bad3ba-c8af-4b69-96ae-93311a9d6151"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 09:08:38 crc kubenswrapper[4861]: I0309 09:08:38.807052 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56bad3ba-c8af-4b69-96ae-93311a9d6151-kube-api-access-6c8pq" (OuterVolumeSpecName: "kube-api-access-6c8pq") pod "56bad3ba-c8af-4b69-96ae-93311a9d6151" (UID: "56bad3ba-c8af-4b69-96ae-93311a9d6151"). InnerVolumeSpecName "kube-api-access-6c8pq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:08:38 crc kubenswrapper[4861]: I0309 09:08:38.888311 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56bad3ba-c8af-4b69-96ae-93311a9d6151-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "56bad3ba-c8af-4b69-96ae-93311a9d6151" (UID: "56bad3ba-c8af-4b69-96ae-93311a9d6151"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 09:08:38 crc kubenswrapper[4861]: I0309 09:08:38.906123 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6c8pq\" (UniqueName: \"kubernetes.io/projected/56bad3ba-c8af-4b69-96ae-93311a9d6151-kube-api-access-6c8pq\") on node \"crc\" DevicePath \"\""
Mar 09 09:08:38 crc kubenswrapper[4861]: I0309 09:08:38.906198 4861 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56bad3ba-c8af-4b69-96ae-93311a9d6151-utilities\") on node \"crc\" DevicePath \"\""
Mar 09 09:08:38 crc kubenswrapper[4861]: I0309 09:08:38.906214 4861 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56bad3ba-c8af-4b69-96ae-93311a9d6151-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 09 09:08:38 crc kubenswrapper[4861]: I0309 09:08:38.939792 4861 generic.go:334] "Generic (PLEG): container finished" podID="56bad3ba-c8af-4b69-96ae-93311a9d6151" containerID="65f2365bad210a7f5548f172a7f7cd07840e4e141ce9925a653919999a46b898" exitCode=0
Mar 09 09:08:38 crc kubenswrapper[4861]: I0309 09:08:38.939849 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-48m9h" event={"ID":"56bad3ba-c8af-4b69-96ae-93311a9d6151","Type":"ContainerDied","Data":"65f2365bad210a7f5548f172a7f7cd07840e4e141ce9925a653919999a46b898"}
Mar 09 09:08:38 crc kubenswrapper[4861]: I0309 09:08:38.939873 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-48m9h" event={"ID":"56bad3ba-c8af-4b69-96ae-93311a9d6151","Type":"ContainerDied","Data":"da8faef02597c2965cc7577d9c3211e38a595d8ccb3e164bdcd2e162901057cd"}
Mar 09 09:08:38 crc kubenswrapper[4861]: I0309 09:08:38.939888 4861 scope.go:117] "RemoveContainer" containerID="65f2365bad210a7f5548f172a7f7cd07840e4e141ce9925a653919999a46b898"
Mar 09 09:08:38 crc kubenswrapper[4861]: I0309 09:08:38.939984 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-48m9h"
Mar 09 09:08:38 crc kubenswrapper[4861]: I0309 09:08:38.949759 4861 generic.go:334] "Generic (PLEG): container finished" podID="e8af01b5-95f0-43a2-b228-675b98c6203f" containerID="14bf210afb405bac1772e678cdca30ca2b20935e864c470c73a0d4d066443f3c" exitCode=0
Mar 09 09:08:38 crc kubenswrapper[4861]: I0309 09:08:38.949801 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mlqmj"
Mar 09 09:08:38 crc kubenswrapper[4861]: I0309 09:08:38.949814 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mlqmj" event={"ID":"e8af01b5-95f0-43a2-b228-675b98c6203f","Type":"ContainerDied","Data":"14bf210afb405bac1772e678cdca30ca2b20935e864c470c73a0d4d066443f3c"}
Mar 09 09:08:38 crc kubenswrapper[4861]: I0309 09:08:38.949896 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mlqmj" event={"ID":"e8af01b5-95f0-43a2-b228-675b98c6203f","Type":"ContainerDied","Data":"68989ef329cac07fd635ca2c012abc573f1bd9468a9fc65d744edfa3c6db3a54"}
Mar 09 09:08:38 crc kubenswrapper[4861]: I0309 09:08:38.974993 4861 scope.go:117] "RemoveContainer" containerID="6064ed91f9d379a74b24659a7000396365a464df9e2c7b043f72df218bede7b7"
Mar 09 09:08:38 crc kubenswrapper[4861]: I0309 09:08:38.982924 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-48m9h"]
Mar 09 09:08:38 crc kubenswrapper[4861]: I0309 09:08:38.985830 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-48m9h"]
Mar 09 09:08:38 crc kubenswrapper[4861]: I0309 09:08:38.998742 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mlqmj"]
Mar 09 09:08:38 crc kubenswrapper[4861]: I0309 09:08:38.998817 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-mlqmj"]
Mar 09 09:08:39 crc kubenswrapper[4861]: I0309 09:08:39.035255 4861 scope.go:117] "RemoveContainer" containerID="23b47bbc46d7c47f0edd12e592c4cd6b4b26f00019821676601739aeeb3778d9"
Mar 09 09:08:39 crc kubenswrapper[4861]: I0309 09:08:39.062814 4861 scope.go:117] "RemoveContainer" containerID="65f2365bad210a7f5548f172a7f7cd07840e4e141ce9925a653919999a46b898"
Mar 09 09:08:39 crc kubenswrapper[4861]: E0309 09:08:39.063300 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65f2365bad210a7f5548f172a7f7cd07840e4e141ce9925a653919999a46b898\": container with ID starting with 65f2365bad210a7f5548f172a7f7cd07840e4e141ce9925a653919999a46b898 not found: ID does not exist" containerID="65f2365bad210a7f5548f172a7f7cd07840e4e141ce9925a653919999a46b898"
Mar 09 09:08:39 crc kubenswrapper[4861]: I0309 09:08:39.063397 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65f2365bad210a7f5548f172a7f7cd07840e4e141ce9925a653919999a46b898"} err="failed to get container status \"65f2365bad210a7f5548f172a7f7cd07840e4e141ce9925a653919999a46b898\": rpc error: code = NotFound desc = could not find container \"65f2365bad210a7f5548f172a7f7cd07840e4e141ce9925a653919999a46b898\": container with ID starting with 65f2365bad210a7f5548f172a7f7cd07840e4e141ce9925a653919999a46b898 not found: ID does not exist"
Mar 09 09:08:39 crc kubenswrapper[4861]: I0309 09:08:39.063507 4861 scope.go:117] "RemoveContainer" containerID="6064ed91f9d379a74b24659a7000396365a464df9e2c7b043f72df218bede7b7"
Mar 09 09:08:39 crc kubenswrapper[4861]: E0309 09:08:39.063815 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6064ed91f9d379a74b24659a7000396365a464df9e2c7b043f72df218bede7b7\": container with ID starting with 6064ed91f9d379a74b24659a7000396365a464df9e2c7b043f72df218bede7b7 not found: ID does not exist" containerID="6064ed91f9d379a74b24659a7000396365a464df9e2c7b043f72df218bede7b7"
Mar 09 09:08:39 crc kubenswrapper[4861]: I0309 09:08:39.063891 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6064ed91f9d379a74b24659a7000396365a464df9e2c7b043f72df218bede7b7"} err="failed to get container status \"6064ed91f9d379a74b24659a7000396365a464df9e2c7b043f72df218bede7b7\": rpc error: code = NotFound desc = could not find container \"6064ed91f9d379a74b24659a7000396365a464df9e2c7b043f72df218bede7b7\": container with ID starting with 6064ed91f9d379a74b24659a7000396365a464df9e2c7b043f72df218bede7b7 not found: ID does not exist"
Mar 09 09:08:39 crc kubenswrapper[4861]: I0309 09:08:39.063953 4861 scope.go:117] "RemoveContainer" containerID="23b47bbc46d7c47f0edd12e592c4cd6b4b26f00019821676601739aeeb3778d9"
Mar 09 09:08:39 crc kubenswrapper[4861]: E0309 09:08:39.064235 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23b47bbc46d7c47f0edd12e592c4cd6b4b26f00019821676601739aeeb3778d9\": container with ID starting with 23b47bbc46d7c47f0edd12e592c4cd6b4b26f00019821676601739aeeb3778d9 not found: ID does not exist" containerID="23b47bbc46d7c47f0edd12e592c4cd6b4b26f00019821676601739aeeb3778d9"
Mar 09 09:08:39 crc kubenswrapper[4861]: I0309 09:08:39.064309 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23b47bbc46d7c47f0edd12e592c4cd6b4b26f00019821676601739aeeb3778d9"} err="failed to get container status \"23b47bbc46d7c47f0edd12e592c4cd6b4b26f00019821676601739aeeb3778d9\": rpc error: code = NotFound desc = could not find container \"23b47bbc46d7c47f0edd12e592c4cd6b4b26f00019821676601739aeeb3778d9\": container with ID starting with 23b47bbc46d7c47f0edd12e592c4cd6b4b26f00019821676601739aeeb3778d9 not found: ID does not exist"
Mar 09 09:08:39 crc kubenswrapper[4861]: I0309 09:08:39.064411 4861 scope.go:117] "RemoveContainer" containerID="14bf210afb405bac1772e678cdca30ca2b20935e864c470c73a0d4d066443f3c"
Mar 09 09:08:39 crc kubenswrapper[4861]: I0309 09:08:39.099515 4861 scope.go:117] "RemoveContainer" containerID="f414447491b9eefbf4159f870988b48e34e9a4679d32bb9fb74f8d47b48ebbd8"
Mar 09 09:08:39 crc kubenswrapper[4861]: I0309 09:08:39.119608 4861 scope.go:117] "RemoveContainer" containerID="d1ac58f66678ed0efa4f1a5b7ccefaea6deda777c9f9db670dce14c3a9eb20f7"
Mar 09 09:08:39 crc kubenswrapper[4861]: I0309 09:08:39.150091 4861 scope.go:117] "RemoveContainer" containerID="14bf210afb405bac1772e678cdca30ca2b20935e864c470c73a0d4d066443f3c"
Mar 09 09:08:39 crc kubenswrapper[4861]: E0309 09:08:39.150590 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14bf210afb405bac1772e678cdca30ca2b20935e864c470c73a0d4d066443f3c\": container with ID starting with 14bf210afb405bac1772e678cdca30ca2b20935e864c470c73a0d4d066443f3c not found: ID does not exist" containerID="14bf210afb405bac1772e678cdca30ca2b20935e864c470c73a0d4d066443f3c"
Mar 09 09:08:39 crc kubenswrapper[4861]: I0309 09:08:39.150691 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14bf210afb405bac1772e678cdca30ca2b20935e864c470c73a0d4d066443f3c"} err="failed to get container status \"14bf210afb405bac1772e678cdca30ca2b20935e864c470c73a0d4d066443f3c\": rpc error: code = NotFound desc = could not find container \"14bf210afb405bac1772e678cdca30ca2b20935e864c470c73a0d4d066443f3c\": container with ID starting with 14bf210afb405bac1772e678cdca30ca2b20935e864c470c73a0d4d066443f3c not found: ID does not exist"
Mar 09 09:08:39 crc kubenswrapper[4861]: I0309 09:08:39.150780 4861 scope.go:117] "RemoveContainer" containerID="f414447491b9eefbf4159f870988b48e34e9a4679d32bb9fb74f8d47b48ebbd8"
Mar 09 09:08:39 crc kubenswrapper[4861]: E0309 09:08:39.151217 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f414447491b9eefbf4159f870988b48e34e9a4679d32bb9fb74f8d47b48ebbd8\": container with ID starting with f414447491b9eefbf4159f870988b48e34e9a4679d32bb9fb74f8d47b48ebbd8 not found: ID does not exist" containerID="f414447491b9eefbf4159f870988b48e34e9a4679d32bb9fb74f8d47b48ebbd8"
Mar 09 09:08:39 crc kubenswrapper[4861]: I0309 09:08:39.151270 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f414447491b9eefbf4159f870988b48e34e9a4679d32bb9fb74f8d47b48ebbd8"} err="failed to get container status \"f414447491b9eefbf4159f870988b48e34e9a4679d32bb9fb74f8d47b48ebbd8\": rpc error: code = NotFound desc = could not find container \"f414447491b9eefbf4159f870988b48e34e9a4679d32bb9fb74f8d47b48ebbd8\": container with ID starting with f414447491b9eefbf4159f870988b48e34e9a4679d32bb9fb74f8d47b48ebbd8 not found: ID does not exist"
Mar 09 09:08:39 crc kubenswrapper[4861]: I0309 09:08:39.151313 4861 scope.go:117] "RemoveContainer" containerID="d1ac58f66678ed0efa4f1a5b7ccefaea6deda777c9f9db670dce14c3a9eb20f7"
Mar 09 09:08:39 crc kubenswrapper[4861]: E0309 09:08:39.151724 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1ac58f66678ed0efa4f1a5b7ccefaea6deda777c9f9db670dce14c3a9eb20f7\": container with ID starting with d1ac58f66678ed0efa4f1a5b7ccefaea6deda777c9f9db670dce14c3a9eb20f7 not found: ID does not exist" containerID="d1ac58f66678ed0efa4f1a5b7ccefaea6deda777c9f9db670dce14c3a9eb20f7"
Mar 09 09:08:39 crc kubenswrapper[4861]: I0309 09:08:39.151810 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1ac58f66678ed0efa4f1a5b7ccefaea6deda777c9f9db670dce14c3a9eb20f7"} err="failed to get container status \"d1ac58f66678ed0efa4f1a5b7ccefaea6deda777c9f9db670dce14c3a9eb20f7\": rpc error: code = NotFound desc = could not find container \"d1ac58f66678ed0efa4f1a5b7ccefaea6deda777c9f9db670dce14c3a9eb20f7\": container with ID starting with d1ac58f66678ed0efa4f1a5b7ccefaea6deda777c9f9db670dce14c3a9eb20f7 not found: ID does not exist"
Mar 09 09:08:39 crc kubenswrapper[4861]: I0309 09:08:39.670841 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56bad3ba-c8af-4b69-96ae-93311a9d6151" path="/var/lib/kubelet/pods/56bad3ba-c8af-4b69-96ae-93311a9d6151/volumes"
Mar 09 09:08:39 crc kubenswrapper[4861]: I0309 09:08:39.671947 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8af01b5-95f0-43a2-b228-675b98c6203f" path="/var/lib/kubelet/pods/e8af01b5-95f0-43a2-b228-675b98c6203f/volumes"
Mar 09 09:08:39 crc kubenswrapper[4861]: I0309 09:08:39.859470 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wcp7q"
Mar 09 09:08:39 crc kubenswrapper[4861]: I0309 09:08:39.960471 4861 generic.go:334] "Generic (PLEG): container finished" podID="0398ae40-9658-4a9a-949c-4419bb1ca9bf" containerID="d2d42a3e5e633101270ecaec42e27e4983bcb1e91f98a021783ceb2ebb7abd0d" exitCode=0
Mar 09 09:08:39 crc kubenswrapper[4861]: I0309 09:08:39.960557 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wcp7q" event={"ID":"0398ae40-9658-4a9a-949c-4419bb1ca9bf","Type":"ContainerDied","Data":"d2d42a3e5e633101270ecaec42e27e4983bcb1e91f98a021783ceb2ebb7abd0d"}
Mar 09 09:08:39 crc kubenswrapper[4861]: I0309 09:08:39.960585 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wcp7q" event={"ID":"0398ae40-9658-4a9a-949c-4419bb1ca9bf","Type":"ContainerDied","Data":"d3acd6bcc671bf38e672e530f0be65aba4844418fba614c03065a562388d4d84"}
Mar 09 09:08:39 crc kubenswrapper[4861]: I0309 09:08:39.960605 4861 scope.go:117] "RemoveContainer" containerID="d2d42a3e5e633101270ecaec42e27e4983bcb1e91f98a021783ceb2ebb7abd0d"
Mar 09 09:08:39 crc kubenswrapper[4861]: I0309 09:08:39.960701 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wcp7q"
Mar 09 09:08:39 crc kubenswrapper[4861]: I0309 09:08:39.985161 4861 scope.go:117] "RemoveContainer" containerID="8006f8540682bd81fefde850328666dfbfe111f5c351cd75b5c9586fd473951e"
Mar 09 09:08:40 crc kubenswrapper[4861]: I0309 09:08:40.017520 4861 scope.go:117] "RemoveContainer" containerID="db0199b19b12f610f5e637145782ea749a5d98fe1c2e92629063b7700da24cf1"
Mar 09 09:08:40 crc kubenswrapper[4861]: I0309 09:08:40.026304 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0398ae40-9658-4a9a-949c-4419bb1ca9bf-utilities\") pod \"0398ae40-9658-4a9a-949c-4419bb1ca9bf\" (UID: \"0398ae40-9658-4a9a-949c-4419bb1ca9bf\") "
Mar 09 09:08:40 crc kubenswrapper[4861]: I0309 09:08:40.026349 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0398ae40-9658-4a9a-949c-4419bb1ca9bf-catalog-content\") pod \"0398ae40-9658-4a9a-949c-4419bb1ca9bf\" (UID: \"0398ae40-9658-4a9a-949c-4419bb1ca9bf\") "
Mar 09 09:08:40 crc kubenswrapper[4861]: I0309 09:08:40.026419 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vfzmv\" (UniqueName: \"kubernetes.io/projected/0398ae40-9658-4a9a-949c-4419bb1ca9bf-kube-api-access-vfzmv\") pod \"0398ae40-9658-4a9a-949c-4419bb1ca9bf\" (UID: \"0398ae40-9658-4a9a-949c-4419bb1ca9bf\") "
Mar 09 09:08:40 crc kubenswrapper[4861]: I0309 09:08:40.028900 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0398ae40-9658-4a9a-949c-4419bb1ca9bf-utilities" (OuterVolumeSpecName: "utilities") pod "0398ae40-9658-4a9a-949c-4419bb1ca9bf" (UID: "0398ae40-9658-4a9a-949c-4419bb1ca9bf"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 09:08:40 crc kubenswrapper[4861]: I0309 09:08:40.033039 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0398ae40-9658-4a9a-949c-4419bb1ca9bf-kube-api-access-vfzmv" (OuterVolumeSpecName: "kube-api-access-vfzmv") pod "0398ae40-9658-4a9a-949c-4419bb1ca9bf" (UID: "0398ae40-9658-4a9a-949c-4419bb1ca9bf"). InnerVolumeSpecName "kube-api-access-vfzmv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:08:40 crc kubenswrapper[4861]: I0309 09:08:40.039566 4861 scope.go:117] "RemoveContainer" containerID="d2d42a3e5e633101270ecaec42e27e4983bcb1e91f98a021783ceb2ebb7abd0d"
Mar 09 09:08:40 crc kubenswrapper[4861]: E0309 09:08:40.040172 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2d42a3e5e633101270ecaec42e27e4983bcb1e91f98a021783ceb2ebb7abd0d\": container with ID starting with d2d42a3e5e633101270ecaec42e27e4983bcb1e91f98a021783ceb2ebb7abd0d not found: ID does not exist" containerID="d2d42a3e5e633101270ecaec42e27e4983bcb1e91f98a021783ceb2ebb7abd0d"
Mar 09 09:08:40 crc kubenswrapper[4861]: I0309 09:08:40.040216 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2d42a3e5e633101270ecaec42e27e4983bcb1e91f98a021783ceb2ebb7abd0d"} err="failed to get container status \"d2d42a3e5e633101270ecaec42e27e4983bcb1e91f98a021783ceb2ebb7abd0d\": rpc error: code = NotFound desc = could not find container \"d2d42a3e5e633101270ecaec42e27e4983bcb1e91f98a021783ceb2ebb7abd0d\": container with ID starting with d2d42a3e5e633101270ecaec42e27e4983bcb1e91f98a021783ceb2ebb7abd0d not found: ID does not exist"
Mar 09 09:08:40 crc kubenswrapper[4861]: I0309 09:08:40.040250 4861 scope.go:117] "RemoveContainer" containerID="8006f8540682bd81fefde850328666dfbfe111f5c351cd75b5c9586fd473951e"
Mar 09 09:08:40 crc kubenswrapper[4861]: E0309 09:08:40.040723 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8006f8540682bd81fefde850328666dfbfe111f5c351cd75b5c9586fd473951e\": container with ID starting with 8006f8540682bd81fefde850328666dfbfe111f5c351cd75b5c9586fd473951e not found: ID does not exist" containerID="8006f8540682bd81fefde850328666dfbfe111f5c351cd75b5c9586fd473951e"
Mar 09 09:08:40 crc kubenswrapper[4861]: I0309 09:08:40.040750 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8006f8540682bd81fefde850328666dfbfe111f5c351cd75b5c9586fd473951e"} err="failed to get container status \"8006f8540682bd81fefde850328666dfbfe111f5c351cd75b5c9586fd473951e\": rpc error: code = NotFound desc = could not find container \"8006f8540682bd81fefde850328666dfbfe111f5c351cd75b5c9586fd473951e\": container with ID starting with 8006f8540682bd81fefde850328666dfbfe111f5c351cd75b5c9586fd473951e not found: ID does not exist"
Mar 09 09:08:40 crc kubenswrapper[4861]: I0309 09:08:40.040770 4861 scope.go:117] "RemoveContainer" containerID="db0199b19b12f610f5e637145782ea749a5d98fe1c2e92629063b7700da24cf1"
Mar 09 09:08:40 crc kubenswrapper[4861]: E0309 09:08:40.040993 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db0199b19b12f610f5e637145782ea749a5d98fe1c2e92629063b7700da24cf1\": container with ID starting with db0199b19b12f610f5e637145782ea749a5d98fe1c2e92629063b7700da24cf1 not found: ID does not exist" containerID="db0199b19b12f610f5e637145782ea749a5d98fe1c2e92629063b7700da24cf1"
Mar 09 09:08:40 crc kubenswrapper[4861]: I0309 09:08:40.041026 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db0199b19b12f610f5e637145782ea749a5d98fe1c2e92629063b7700da24cf1"} err="failed to get container status \"db0199b19b12f610f5e637145782ea749a5d98fe1c2e92629063b7700da24cf1\": rpc error: code = NotFound desc = could not find container \"db0199b19b12f610f5e637145782ea749a5d98fe1c2e92629063b7700da24cf1\": container with ID starting with db0199b19b12f610f5e637145782ea749a5d98fe1c2e92629063b7700da24cf1 not found: ID does not exist"
Mar 09 09:08:40 crc kubenswrapper[4861]: I0309 09:08:40.072522 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0398ae40-9658-4a9a-949c-4419bb1ca9bf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0398ae40-9658-4a9a-949c-4419bb1ca9bf" (UID: "0398ae40-9658-4a9a-949c-4419bb1ca9bf"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 09:08:40 crc kubenswrapper[4861]: I0309 09:08:40.128102 4861 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0398ae40-9658-4a9a-949c-4419bb1ca9bf-utilities\") on node \"crc\" DevicePath \"\""
Mar 09 09:08:40 crc kubenswrapper[4861]: I0309 09:08:40.128545 4861 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0398ae40-9658-4a9a-949c-4419bb1ca9bf-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 09 09:08:40 crc kubenswrapper[4861]: I0309 09:08:40.128560 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vfzmv\" (UniqueName: \"kubernetes.io/projected/0398ae40-9658-4a9a-949c-4419bb1ca9bf-kube-api-access-vfzmv\") on node \"crc\" DevicePath \"\""
Mar 09 09:08:40 crc kubenswrapper[4861]: I0309 09:08:40.294327 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wcp7q"]
Mar 09 09:08:40 crc kubenswrapper[4861]: I0309 09:08:40.299612 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-wcp7q"]
Mar 09 09:08:41 crc kubenswrapper[4861]: I0309 09:08:41.665061 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0398ae40-9658-4a9a-949c-4419bb1ca9bf" path="/var/lib/kubelet/pods/0398ae40-9658-4a9a-949c-4419bb1ca9bf/volumes"
Mar 09 09:08:42 crc kubenswrapper[4861]: I0309 09:08:42.625973 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-769f74488b-4xx9h"]
Mar 09 09:08:42 crc kubenswrapper[4861]: I0309 09:08:42.626488 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-769f74488b-4xx9h" podUID="ebace06e-6730-48a4-bee8-ebdbcc3d8f78" containerName="controller-manager" containerID="cri-o://23ad2ab4a4dbb0d18edf7f03fe76242001ae46394013ca055e6890fcb7364153" gracePeriod=30
Mar 09 09:08:42 crc kubenswrapper[4861]: I0309 09:08:42.643451 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-59b5dbf48f-tbq65"]
Mar 09 09:08:42 crc kubenswrapper[4861]: I0309 09:08:42.643686 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-59b5dbf48f-tbq65" podUID="89f0faed-9686-4f8f-af5d-ef542aa0f161" containerName="route-controller-manager" containerID="cri-o://c6c9884d96bce2917b8e3ff927f81abdb311f213d05c592dbdbe190ae232f9c5" gracePeriod=30
Mar 09 09:08:43 crc kubenswrapper[4861]: I0309 09:08:42.992868 4861 generic.go:334] "Generic (PLEG): container finished" podID="89f0faed-9686-4f8f-af5d-ef542aa0f161" containerID="c6c9884d96bce2917b8e3ff927f81abdb311f213d05c592dbdbe190ae232f9c5" exitCode=0
Mar 09 09:08:43 crc kubenswrapper[4861]: I0309 09:08:42.992951 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-59b5dbf48f-tbq65" event={"ID":"89f0faed-9686-4f8f-af5d-ef542aa0f161","Type":"ContainerDied","Data":"c6c9884d96bce2917b8e3ff927f81abdb311f213d05c592dbdbe190ae232f9c5"}
Mar 09 09:08:43 crc kubenswrapper[4861]: I0309 09:08:42.994281 4861 generic.go:334] "Generic (PLEG): container finished" podID="ebace06e-6730-48a4-bee8-ebdbcc3d8f78" containerID="23ad2ab4a4dbb0d18edf7f03fe76242001ae46394013ca055e6890fcb7364153" exitCode=0
Mar 09 09:08:43 crc kubenswrapper[4861]: I0309 09:08:42.994306 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-769f74488b-4xx9h" event={"ID":"ebace06e-6730-48a4-bee8-ebdbcc3d8f78","Type":"ContainerDied","Data":"23ad2ab4a4dbb0d18edf7f03fe76242001ae46394013ca055e6890fcb7364153"}
Mar 09 09:08:43 crc kubenswrapper[4861]: I0309 09:08:43.214402 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-59b5dbf48f-tbq65"
Mar 09 09:08:43 crc kubenswrapper[4861]: I0309 09:08:43.367853 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-769f74488b-4xx9h"
Mar 09 09:08:43 crc kubenswrapper[4861]: I0309 09:08:43.370698 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9bvv2\" (UniqueName: \"kubernetes.io/projected/89f0faed-9686-4f8f-af5d-ef542aa0f161-kube-api-access-9bvv2\") pod \"89f0faed-9686-4f8f-af5d-ef542aa0f161\" (UID: \"89f0faed-9686-4f8f-af5d-ef542aa0f161\") "
Mar 09 09:08:43 crc kubenswrapper[4861]: I0309 09:08:43.370752 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89f0faed-9686-4f8f-af5d-ef542aa0f161-config\") pod \"89f0faed-9686-4f8f-af5d-ef542aa0f161\" (UID: \"89f0faed-9686-4f8f-af5d-ef542aa0f161\") "
Mar 09 09:08:43 crc kubenswrapper[4861]: I0309 09:08:43.370800 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/89f0faed-9686-4f8f-af5d-ef542aa0f161-client-ca\") pod \"89f0faed-9686-4f8f-af5d-ef542aa0f161\" (UID: \"89f0faed-9686-4f8f-af5d-ef542aa0f161\") "
Mar 09 09:08:43 crc kubenswrapper[4861]: I0309 09:08:43.370858 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/89f0faed-9686-4f8f-af5d-ef542aa0f161-serving-cert\") pod \"89f0faed-9686-4f8f-af5d-ef542aa0f161\" (UID: \"89f0faed-9686-4f8f-af5d-ef542aa0f161\") "
Mar 09 09:08:43 crc kubenswrapper[4861]: I0309 09:08:43.371527 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89f0faed-9686-4f8f-af5d-ef542aa0f161-config" (OuterVolumeSpecName: "config") pod "89f0faed-9686-4f8f-af5d-ef542aa0f161" (UID: "89f0faed-9686-4f8f-af5d-ef542aa0f161"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:08:43 crc kubenswrapper[4861]: I0309 09:08:43.371884 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89f0faed-9686-4f8f-af5d-ef542aa0f161-client-ca" (OuterVolumeSpecName: "client-ca") pod "89f0faed-9686-4f8f-af5d-ef542aa0f161" (UID: "89f0faed-9686-4f8f-af5d-ef542aa0f161"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:08:43 crc kubenswrapper[4861]: I0309 09:08:43.375708 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89f0faed-9686-4f8f-af5d-ef542aa0f161-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "89f0faed-9686-4f8f-af5d-ef542aa0f161" (UID: "89f0faed-9686-4f8f-af5d-ef542aa0f161"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:08:43 crc kubenswrapper[4861]: I0309 09:08:43.375768 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89f0faed-9686-4f8f-af5d-ef542aa0f161-kube-api-access-9bvv2" (OuterVolumeSpecName: "kube-api-access-9bvv2") pod "89f0faed-9686-4f8f-af5d-ef542aa0f161" (UID: "89f0faed-9686-4f8f-af5d-ef542aa0f161"). InnerVolumeSpecName "kube-api-access-9bvv2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:08:43 crc kubenswrapper[4861]: I0309 09:08:43.472268 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kbd4c\" (UniqueName: \"kubernetes.io/projected/ebace06e-6730-48a4-bee8-ebdbcc3d8f78-kube-api-access-kbd4c\") pod \"ebace06e-6730-48a4-bee8-ebdbcc3d8f78\" (UID: \"ebace06e-6730-48a4-bee8-ebdbcc3d8f78\") "
Mar 09 09:08:43 crc kubenswrapper[4861]: I0309 09:08:43.472360 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebace06e-6730-48a4-bee8-ebdbcc3d8f78-config\") pod \"ebace06e-6730-48a4-bee8-ebdbcc3d8f78\" (UID: \"ebace06e-6730-48a4-bee8-ebdbcc3d8f78\") "
Mar 09 09:08:43 crc kubenswrapper[4861]: I0309 09:08:43.472461 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ebace06e-6730-48a4-bee8-ebdbcc3d8f78-client-ca\") pod \"ebace06e-6730-48a4-bee8-ebdbcc3d8f78\" (UID: \"ebace06e-6730-48a4-bee8-ebdbcc3d8f78\") "
Mar 09 09:08:43 crc kubenswrapper[4861]: I0309 09:08:43.472527 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ebace06e-6730-48a4-bee8-ebdbcc3d8f78-proxy-ca-bundles\") pod \"ebace06e-6730-48a4-bee8-ebdbcc3d8f78\" (UID: \"ebace06e-6730-48a4-bee8-ebdbcc3d8f78\") "
Mar 09 09:08:43 crc kubenswrapper[4861]: I0309 09:08:43.472556 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ebace06e-6730-48a4-bee8-ebdbcc3d8f78-serving-cert\") pod \"ebace06e-6730-48a4-bee8-ebdbcc3d8f78\" (UID: \"ebace06e-6730-48a4-bee8-ebdbcc3d8f78\") "
Mar 09 09:08:43 crc kubenswrapper[4861]: I0309 09:08:43.472804 4861 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/89f0faed-9686-4f8f-af5d-ef542aa0f161-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 09 09:08:43 crc kubenswrapper[4861]: I0309 09:08:43.472828 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9bvv2\" (UniqueName: \"kubernetes.io/projected/89f0faed-9686-4f8f-af5d-ef542aa0f161-kube-api-access-9bvv2\") on node \"crc\" DevicePath \"\""
Mar 09 09:08:43 crc kubenswrapper[4861]: I0309 09:08:43.472844 4861 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89f0faed-9686-4f8f-af5d-ef542aa0f161-config\") on node \"crc\" DevicePath \"\""
Mar 09 09:08:43 crc kubenswrapper[4861]: I0309 09:08:43.472857 4861 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/89f0faed-9686-4f8f-af5d-ef542aa0f161-client-ca\") on node \"crc\" DevicePath \"\""
Mar 09 09:08:43 crc kubenswrapper[4861]: I0309 09:08:43.473350 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ebace06e-6730-48a4-bee8-ebdbcc3d8f78-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "ebace06e-6730-48a4-bee8-ebdbcc3d8f78" (UID: "ebace06e-6730-48a4-bee8-ebdbcc3d8f78"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:08:43 crc kubenswrapper[4861]: I0309 09:08:43.473486 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ebace06e-6730-48a4-bee8-ebdbcc3d8f78-client-ca" (OuterVolumeSpecName: "client-ca") pod "ebace06e-6730-48a4-bee8-ebdbcc3d8f78" (UID: "ebace06e-6730-48a4-bee8-ebdbcc3d8f78"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:08:43 crc kubenswrapper[4861]: I0309 09:08:43.473788 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ebace06e-6730-48a4-bee8-ebdbcc3d8f78-config" (OuterVolumeSpecName: "config") pod "ebace06e-6730-48a4-bee8-ebdbcc3d8f78" (UID: "ebace06e-6730-48a4-bee8-ebdbcc3d8f78"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:08:43 crc kubenswrapper[4861]: I0309 09:08:43.475399 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebace06e-6730-48a4-bee8-ebdbcc3d8f78-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ebace06e-6730-48a4-bee8-ebdbcc3d8f78" (UID: "ebace06e-6730-48a4-bee8-ebdbcc3d8f78"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:08:43 crc kubenswrapper[4861]: I0309 09:08:43.476580 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebace06e-6730-48a4-bee8-ebdbcc3d8f78-kube-api-access-kbd4c" (OuterVolumeSpecName: "kube-api-access-kbd4c") pod "ebace06e-6730-48a4-bee8-ebdbcc3d8f78" (UID: "ebace06e-6730-48a4-bee8-ebdbcc3d8f78"). InnerVolumeSpecName "kube-api-access-kbd4c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:08:43 crc kubenswrapper[4861]: I0309 09:08:43.577041 4861 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ebace06e-6730-48a4-bee8-ebdbcc3d8f78-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 09 09:08:43 crc kubenswrapper[4861]: I0309 09:08:43.577093 4861 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ebace06e-6730-48a4-bee8-ebdbcc3d8f78-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 09 09:08:43 crc kubenswrapper[4861]: I0309 09:08:43.577106 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kbd4c\" (UniqueName: \"kubernetes.io/projected/ebace06e-6730-48a4-bee8-ebdbcc3d8f78-kube-api-access-kbd4c\") on node \"crc\" DevicePath \"\""
Mar 09 09:08:43 crc kubenswrapper[4861]: I0309 09:08:43.577119 4861 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebace06e-6730-48a4-bee8-ebdbcc3d8f78-config\") on node \"crc\" DevicePath \"\""
Mar 09 09:08:43 crc kubenswrapper[4861]: I0309 09:08:43.577128 4861 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ebace06e-6730-48a4-bee8-ebdbcc3d8f78-client-ca\") on node \"crc\" DevicePath \"\""
Mar 09 09:08:44 crc kubenswrapper[4861]: I0309 09:08:44.003237 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-59b5dbf48f-tbq65" event={"ID":"89f0faed-9686-4f8f-af5d-ef542aa0f161","Type":"ContainerDied","Data":"ab97f5145d3626c811d0d7f531202ceb328413ba602688cc9ecff55291553b14"}
Mar 09 09:08:44 crc kubenswrapper[4861]: I0309 09:08:44.003280 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-59b5dbf48f-tbq65"
Mar 09 09:08:44 crc kubenswrapper[4861]: I0309 09:08:44.003324 4861 scope.go:117] "RemoveContainer" containerID="c6c9884d96bce2917b8e3ff927f81abdb311f213d05c592dbdbe190ae232f9c5"
Mar 09 09:08:44 crc kubenswrapper[4861]: I0309 09:08:44.007677 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-769f74488b-4xx9h" event={"ID":"ebace06e-6730-48a4-bee8-ebdbcc3d8f78","Type":"ContainerDied","Data":"cf125631c4d89bff3e97cb5b1a3eff16cd9782c05d2ea74da352977bc7b559ca"}
Mar 09 09:08:44 crc kubenswrapper[4861]: I0309 09:08:44.007741 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-769f74488b-4xx9h"
Mar 09 09:08:44 crc kubenswrapper[4861]: I0309 09:08:44.029312 4861 scope.go:117] "RemoveContainer" containerID="23ad2ab4a4dbb0d18edf7f03fe76242001ae46394013ca055e6890fcb7364153"
Mar 09 09:08:44 crc kubenswrapper[4861]: I0309 09:08:44.029770 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-59b5dbf48f-tbq65"]
Mar 09 09:08:44 crc kubenswrapper[4861]: I0309 09:08:44.032716 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-59b5dbf48f-tbq65"]
Mar 09 09:08:44 crc kubenswrapper[4861]: I0309 09:08:44.040562 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-769f74488b-4xx9h"]
Mar 09 09:08:44 crc kubenswrapper[4861]: I0309 09:08:44.045674 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-769f74488b-4xx9h"]
Mar 09 09:08:44 crc kubenswrapper[4861]: I0309 09:08:44.303422 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7b76458476-6twtf"]
Mar 09 09:08:44 crc kubenswrapper[4861]: E0309 09:08:44.303745 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89f0faed-9686-4f8f-af5d-ef542aa0f161" containerName="route-controller-manager"
Mar 09 09:08:44 crc kubenswrapper[4861]: I0309 09:08:44.303758 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="89f0faed-9686-4f8f-af5d-ef542aa0f161" containerName="route-controller-manager"
Mar 09 09:08:44 crc kubenswrapper[4861]: E0309 09:08:44.303771 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0398ae40-9658-4a9a-949c-4419bb1ca9bf" containerName="registry-server"
Mar 09 09:08:44 crc kubenswrapper[4861]: I0309 09:08:44.303780 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="0398ae40-9658-4a9a-949c-4419bb1ca9bf" containerName="registry-server"
Mar 09 09:08:44 crc kubenswrapper[4861]: E0309 09:08:44.303789 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0398ae40-9658-4a9a-949c-4419bb1ca9bf" containerName="extract-content"
Mar 09 09:08:44 crc kubenswrapper[4861]: I0309 09:08:44.303796 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="0398ae40-9658-4a9a-949c-4419bb1ca9bf" containerName="extract-content"
Mar 09 09:08:44 crc kubenswrapper[4861]: E0309 09:08:44.303804 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8af01b5-95f0-43a2-b228-675b98c6203f" containerName="extract-content"
Mar 09 09:08:44 crc kubenswrapper[4861]: I0309 09:08:44.303810 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8af01b5-95f0-43a2-b228-675b98c6203f" containerName="extract-content"
Mar 09 09:08:44 crc kubenswrapper[4861]: E0309 09:08:44.303818 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56bad3ba-c8af-4b69-96ae-93311a9d6151" containerName="extract-utilities"
Mar 09 09:08:44 crc kubenswrapper[4861]: I0309 09:08:44.303824 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="56bad3ba-c8af-4b69-96ae-93311a9d6151" containerName="extract-utilities"
Mar 09 09:08:44 crc kubenswrapper[4861]: E0309 09:08:44.303831 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56bad3ba-c8af-4b69-96ae-93311a9d6151" containerName="registry-server" Mar 09 09:08:44 crc kubenswrapper[4861]: I0309 09:08:44.303837 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="56bad3ba-c8af-4b69-96ae-93311a9d6151" containerName="registry-server" Mar 09 09:08:44 crc kubenswrapper[4861]: E0309 09:08:44.303849 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56bad3ba-c8af-4b69-96ae-93311a9d6151" containerName="extract-content" Mar 09 09:08:44 crc kubenswrapper[4861]: I0309 09:08:44.303855 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="56bad3ba-c8af-4b69-96ae-93311a9d6151" containerName="extract-content" Mar 09 09:08:44 crc kubenswrapper[4861]: E0309 09:08:44.303861 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8af01b5-95f0-43a2-b228-675b98c6203f" containerName="extract-utilities" Mar 09 09:08:44 crc kubenswrapper[4861]: I0309 09:08:44.303867 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8af01b5-95f0-43a2-b228-675b98c6203f" containerName="extract-utilities" Mar 09 09:08:44 crc kubenswrapper[4861]: E0309 09:08:44.303877 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0398ae40-9658-4a9a-949c-4419bb1ca9bf" containerName="extract-utilities" Mar 09 09:08:44 crc kubenswrapper[4861]: I0309 09:08:44.303884 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="0398ae40-9658-4a9a-949c-4419bb1ca9bf" containerName="extract-utilities" Mar 09 09:08:44 crc kubenswrapper[4861]: E0309 09:08:44.303890 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebace06e-6730-48a4-bee8-ebdbcc3d8f78" containerName="controller-manager" Mar 09 09:08:44 crc kubenswrapper[4861]: I0309 09:08:44.303896 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebace06e-6730-48a4-bee8-ebdbcc3d8f78" containerName="controller-manager" 
Mar 09 09:08:44 crc kubenswrapper[4861]: E0309 09:08:44.303904 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8af01b5-95f0-43a2-b228-675b98c6203f" containerName="registry-server" Mar 09 09:08:44 crc kubenswrapper[4861]: I0309 09:08:44.303910 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8af01b5-95f0-43a2-b228-675b98c6203f" containerName="registry-server" Mar 09 09:08:44 crc kubenswrapper[4861]: I0309 09:08:44.304032 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="56bad3ba-c8af-4b69-96ae-93311a9d6151" containerName="registry-server" Mar 09 09:08:44 crc kubenswrapper[4861]: I0309 09:08:44.304041 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="0398ae40-9658-4a9a-949c-4419bb1ca9bf" containerName="registry-server" Mar 09 09:08:44 crc kubenswrapper[4861]: I0309 09:08:44.304051 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="89f0faed-9686-4f8f-af5d-ef542aa0f161" containerName="route-controller-manager" Mar 09 09:08:44 crc kubenswrapper[4861]: I0309 09:08:44.304059 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebace06e-6730-48a4-bee8-ebdbcc3d8f78" containerName="controller-manager" Mar 09 09:08:44 crc kubenswrapper[4861]: I0309 09:08:44.304067 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8af01b5-95f0-43a2-b228-675b98c6203f" containerName="registry-server" Mar 09 09:08:44 crc kubenswrapper[4861]: I0309 09:08:44.304502 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7b76458476-6twtf" Mar 09 09:08:44 crc kubenswrapper[4861]: I0309 09:08:44.310543 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-698587484d-g2lfn"] Mar 09 09:08:44 crc kubenswrapper[4861]: I0309 09:08:44.311484 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 09 09:08:44 crc kubenswrapper[4861]: I0309 09:08:44.311972 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 09 09:08:44 crc kubenswrapper[4861]: I0309 09:08:44.312021 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-698587484d-g2lfn" Mar 09 09:08:44 crc kubenswrapper[4861]: I0309 09:08:44.312542 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 09 09:08:44 crc kubenswrapper[4861]: I0309 09:08:44.312760 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 09 09:08:44 crc kubenswrapper[4861]: I0309 09:08:44.315106 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 09 09:08:44 crc kubenswrapper[4861]: I0309 09:08:44.315184 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 09 09:08:44 crc kubenswrapper[4861]: I0309 09:08:44.320530 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 09 09:08:44 crc kubenswrapper[4861]: I0309 09:08:44.321255 4861 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-route-controller-manager"/"config" Mar 09 09:08:44 crc kubenswrapper[4861]: I0309 09:08:44.321470 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-698587484d-g2lfn"] Mar 09 09:08:44 crc kubenswrapper[4861]: I0309 09:08:44.323815 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 09 09:08:44 crc kubenswrapper[4861]: I0309 09:08:44.324090 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 09 09:08:44 crc kubenswrapper[4861]: I0309 09:08:44.325421 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 09 09:08:44 crc kubenswrapper[4861]: I0309 09:08:44.325997 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 09 09:08:44 crc kubenswrapper[4861]: I0309 09:08:44.331535 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 09 09:08:44 crc kubenswrapper[4861]: I0309 09:08:44.333255 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7b76458476-6twtf"] Mar 09 09:08:44 crc kubenswrapper[4861]: I0309 09:08:44.387467 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d49cdfb-1198-408e-a97f-a03243e5a46c-config\") pod \"route-controller-manager-698587484d-g2lfn\" (UID: \"2d49cdfb-1198-408e-a97f-a03243e5a46c\") " pod="openshift-route-controller-manager/route-controller-manager-698587484d-g2lfn" Mar 09 09:08:44 crc kubenswrapper[4861]: I0309 09:08:44.387521 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-5nxs5\" (UniqueName: \"kubernetes.io/projected/2d49cdfb-1198-408e-a97f-a03243e5a46c-kube-api-access-5nxs5\") pod \"route-controller-manager-698587484d-g2lfn\" (UID: \"2d49cdfb-1198-408e-a97f-a03243e5a46c\") " pod="openshift-route-controller-manager/route-controller-manager-698587484d-g2lfn" Mar 09 09:08:44 crc kubenswrapper[4861]: I0309 09:08:44.387567 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5867q\" (UniqueName: \"kubernetes.io/projected/17fc9163-cd18-41c0-abbc-2f3d15063703-kube-api-access-5867q\") pod \"controller-manager-7b76458476-6twtf\" (UID: \"17fc9163-cd18-41c0-abbc-2f3d15063703\") " pod="openshift-controller-manager/controller-manager-7b76458476-6twtf" Mar 09 09:08:44 crc kubenswrapper[4861]: I0309 09:08:44.387631 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/17fc9163-cd18-41c0-abbc-2f3d15063703-proxy-ca-bundles\") pod \"controller-manager-7b76458476-6twtf\" (UID: \"17fc9163-cd18-41c0-abbc-2f3d15063703\") " pod="openshift-controller-manager/controller-manager-7b76458476-6twtf" Mar 09 09:08:44 crc kubenswrapper[4861]: I0309 09:08:44.387659 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17fc9163-cd18-41c0-abbc-2f3d15063703-config\") pod \"controller-manager-7b76458476-6twtf\" (UID: \"17fc9163-cd18-41c0-abbc-2f3d15063703\") " pod="openshift-controller-manager/controller-manager-7b76458476-6twtf" Mar 09 09:08:44 crc kubenswrapper[4861]: I0309 09:08:44.387717 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2d49cdfb-1198-408e-a97f-a03243e5a46c-serving-cert\") pod \"route-controller-manager-698587484d-g2lfn\" (UID: 
\"2d49cdfb-1198-408e-a97f-a03243e5a46c\") " pod="openshift-route-controller-manager/route-controller-manager-698587484d-g2lfn" Mar 09 09:08:44 crc kubenswrapper[4861]: I0309 09:08:44.387743 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/17fc9163-cd18-41c0-abbc-2f3d15063703-client-ca\") pod \"controller-manager-7b76458476-6twtf\" (UID: \"17fc9163-cd18-41c0-abbc-2f3d15063703\") " pod="openshift-controller-manager/controller-manager-7b76458476-6twtf" Mar 09 09:08:44 crc kubenswrapper[4861]: I0309 09:08:44.387789 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2d49cdfb-1198-408e-a97f-a03243e5a46c-client-ca\") pod \"route-controller-manager-698587484d-g2lfn\" (UID: \"2d49cdfb-1198-408e-a97f-a03243e5a46c\") " pod="openshift-route-controller-manager/route-controller-manager-698587484d-g2lfn" Mar 09 09:08:44 crc kubenswrapper[4861]: I0309 09:08:44.387810 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/17fc9163-cd18-41c0-abbc-2f3d15063703-serving-cert\") pod \"controller-manager-7b76458476-6twtf\" (UID: \"17fc9163-cd18-41c0-abbc-2f3d15063703\") " pod="openshift-controller-manager/controller-manager-7b76458476-6twtf" Mar 09 09:08:44 crc kubenswrapper[4861]: I0309 09:08:44.488817 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/17fc9163-cd18-41c0-abbc-2f3d15063703-client-ca\") pod \"controller-manager-7b76458476-6twtf\" (UID: \"17fc9163-cd18-41c0-abbc-2f3d15063703\") " pod="openshift-controller-manager/controller-manager-7b76458476-6twtf" Mar 09 09:08:44 crc kubenswrapper[4861]: I0309 09:08:44.489164 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"client-ca\" (UniqueName: \"kubernetes.io/configmap/2d49cdfb-1198-408e-a97f-a03243e5a46c-client-ca\") pod \"route-controller-manager-698587484d-g2lfn\" (UID: \"2d49cdfb-1198-408e-a97f-a03243e5a46c\") " pod="openshift-route-controller-manager/route-controller-manager-698587484d-g2lfn" Mar 09 09:08:44 crc kubenswrapper[4861]: I0309 09:08:44.489225 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/17fc9163-cd18-41c0-abbc-2f3d15063703-serving-cert\") pod \"controller-manager-7b76458476-6twtf\" (UID: \"17fc9163-cd18-41c0-abbc-2f3d15063703\") " pod="openshift-controller-manager/controller-manager-7b76458476-6twtf" Mar 09 09:08:44 crc kubenswrapper[4861]: I0309 09:08:44.489320 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d49cdfb-1198-408e-a97f-a03243e5a46c-config\") pod \"route-controller-manager-698587484d-g2lfn\" (UID: \"2d49cdfb-1198-408e-a97f-a03243e5a46c\") " pod="openshift-route-controller-manager/route-controller-manager-698587484d-g2lfn" Mar 09 09:08:44 crc kubenswrapper[4861]: I0309 09:08:44.489402 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5nxs5\" (UniqueName: \"kubernetes.io/projected/2d49cdfb-1198-408e-a97f-a03243e5a46c-kube-api-access-5nxs5\") pod \"route-controller-manager-698587484d-g2lfn\" (UID: \"2d49cdfb-1198-408e-a97f-a03243e5a46c\") " pod="openshift-route-controller-manager/route-controller-manager-698587484d-g2lfn" Mar 09 09:08:44 crc kubenswrapper[4861]: I0309 09:08:44.489485 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5867q\" (UniqueName: \"kubernetes.io/projected/17fc9163-cd18-41c0-abbc-2f3d15063703-kube-api-access-5867q\") pod \"controller-manager-7b76458476-6twtf\" (UID: \"17fc9163-cd18-41c0-abbc-2f3d15063703\") " 
pod="openshift-controller-manager/controller-manager-7b76458476-6twtf" Mar 09 09:08:44 crc kubenswrapper[4861]: I0309 09:08:44.489583 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/17fc9163-cd18-41c0-abbc-2f3d15063703-proxy-ca-bundles\") pod \"controller-manager-7b76458476-6twtf\" (UID: \"17fc9163-cd18-41c0-abbc-2f3d15063703\") " pod="openshift-controller-manager/controller-manager-7b76458476-6twtf" Mar 09 09:08:44 crc kubenswrapper[4861]: I0309 09:08:44.489622 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17fc9163-cd18-41c0-abbc-2f3d15063703-config\") pod \"controller-manager-7b76458476-6twtf\" (UID: \"17fc9163-cd18-41c0-abbc-2f3d15063703\") " pod="openshift-controller-manager/controller-manager-7b76458476-6twtf" Mar 09 09:08:44 crc kubenswrapper[4861]: I0309 09:08:44.489699 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2d49cdfb-1198-408e-a97f-a03243e5a46c-serving-cert\") pod \"route-controller-manager-698587484d-g2lfn\" (UID: \"2d49cdfb-1198-408e-a97f-a03243e5a46c\") " pod="openshift-route-controller-manager/route-controller-manager-698587484d-g2lfn" Mar 09 09:08:44 crc kubenswrapper[4861]: I0309 09:08:44.490934 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2d49cdfb-1198-408e-a97f-a03243e5a46c-client-ca\") pod \"route-controller-manager-698587484d-g2lfn\" (UID: \"2d49cdfb-1198-408e-a97f-a03243e5a46c\") " pod="openshift-route-controller-manager/route-controller-manager-698587484d-g2lfn" Mar 09 09:08:44 crc kubenswrapper[4861]: I0309 09:08:44.490934 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/17fc9163-cd18-41c0-abbc-2f3d15063703-client-ca\") 
pod \"controller-manager-7b76458476-6twtf\" (UID: \"17fc9163-cd18-41c0-abbc-2f3d15063703\") " pod="openshift-controller-manager/controller-manager-7b76458476-6twtf" Mar 09 09:08:44 crc kubenswrapper[4861]: I0309 09:08:44.491068 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d49cdfb-1198-408e-a97f-a03243e5a46c-config\") pod \"route-controller-manager-698587484d-g2lfn\" (UID: \"2d49cdfb-1198-408e-a97f-a03243e5a46c\") " pod="openshift-route-controller-manager/route-controller-manager-698587484d-g2lfn" Mar 09 09:08:44 crc kubenswrapper[4861]: I0309 09:08:44.491074 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/17fc9163-cd18-41c0-abbc-2f3d15063703-proxy-ca-bundles\") pod \"controller-manager-7b76458476-6twtf\" (UID: \"17fc9163-cd18-41c0-abbc-2f3d15063703\") " pod="openshift-controller-manager/controller-manager-7b76458476-6twtf" Mar 09 09:08:44 crc kubenswrapper[4861]: I0309 09:08:44.491559 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17fc9163-cd18-41c0-abbc-2f3d15063703-config\") pod \"controller-manager-7b76458476-6twtf\" (UID: \"17fc9163-cd18-41c0-abbc-2f3d15063703\") " pod="openshift-controller-manager/controller-manager-7b76458476-6twtf" Mar 09 09:08:44 crc kubenswrapper[4861]: I0309 09:08:44.495524 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/17fc9163-cd18-41c0-abbc-2f3d15063703-serving-cert\") pod \"controller-manager-7b76458476-6twtf\" (UID: \"17fc9163-cd18-41c0-abbc-2f3d15063703\") " pod="openshift-controller-manager/controller-manager-7b76458476-6twtf" Mar 09 09:08:44 crc kubenswrapper[4861]: I0309 09:08:44.501209 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/2d49cdfb-1198-408e-a97f-a03243e5a46c-serving-cert\") pod \"route-controller-manager-698587484d-g2lfn\" (UID: \"2d49cdfb-1198-408e-a97f-a03243e5a46c\") " pod="openshift-route-controller-manager/route-controller-manager-698587484d-g2lfn" Mar 09 09:08:44 crc kubenswrapper[4861]: I0309 09:08:44.510904 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nxs5\" (UniqueName: \"kubernetes.io/projected/2d49cdfb-1198-408e-a97f-a03243e5a46c-kube-api-access-5nxs5\") pod \"route-controller-manager-698587484d-g2lfn\" (UID: \"2d49cdfb-1198-408e-a97f-a03243e5a46c\") " pod="openshift-route-controller-manager/route-controller-manager-698587484d-g2lfn" Mar 09 09:08:44 crc kubenswrapper[4861]: I0309 09:08:44.512971 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5867q\" (UniqueName: \"kubernetes.io/projected/17fc9163-cd18-41c0-abbc-2f3d15063703-kube-api-access-5867q\") pod \"controller-manager-7b76458476-6twtf\" (UID: \"17fc9163-cd18-41c0-abbc-2f3d15063703\") " pod="openshift-controller-manager/controller-manager-7b76458476-6twtf" Mar 09 09:08:44 crc kubenswrapper[4861]: I0309 09:08:44.653241 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7b76458476-6twtf" Mar 09 09:08:44 crc kubenswrapper[4861]: I0309 09:08:44.671116 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-698587484d-g2lfn" Mar 09 09:08:44 crc kubenswrapper[4861]: I0309 09:08:44.967284 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-698587484d-g2lfn"] Mar 09 09:08:45 crc kubenswrapper[4861]: I0309 09:08:45.014383 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-698587484d-g2lfn" event={"ID":"2d49cdfb-1198-408e-a97f-a03243e5a46c","Type":"ContainerStarted","Data":"9503657adacd1957167f343d02c0d293fc812ef2fe179133724d4b75098976fa"} Mar 09 09:08:45 crc kubenswrapper[4861]: I0309 09:08:45.093482 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7b76458476-6twtf"] Mar 09 09:08:45 crc kubenswrapper[4861]: I0309 09:08:45.667520 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89f0faed-9686-4f8f-af5d-ef542aa0f161" path="/var/lib/kubelet/pods/89f0faed-9686-4f8f-af5d-ef542aa0f161/volumes" Mar 09 09:08:45 crc kubenswrapper[4861]: I0309 09:08:45.668684 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ebace06e-6730-48a4-bee8-ebdbcc3d8f78" path="/var/lib/kubelet/pods/ebace06e-6730-48a4-bee8-ebdbcc3d8f78/volumes" Mar 09 09:08:46 crc kubenswrapper[4861]: I0309 09:08:46.020282 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7b76458476-6twtf" event={"ID":"17fc9163-cd18-41c0-abbc-2f3d15063703","Type":"ContainerStarted","Data":"559b8e24ae687e5a1b7e6f52bbb496c165639e1c743f12bf286c2273bb8ae09b"} Mar 09 09:08:46 crc kubenswrapper[4861]: I0309 09:08:46.020328 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7b76458476-6twtf" 
event={"ID":"17fc9163-cd18-41c0-abbc-2f3d15063703","Type":"ContainerStarted","Data":"9fc28cfabf6b742bf4eab906fa7dde41866db831dfa2b95bbc1a314aae2f6788"} Mar 09 09:08:46 crc kubenswrapper[4861]: I0309 09:08:46.020612 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7b76458476-6twtf" Mar 09 09:08:46 crc kubenswrapper[4861]: I0309 09:08:46.021740 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-698587484d-g2lfn" event={"ID":"2d49cdfb-1198-408e-a97f-a03243e5a46c","Type":"ContainerStarted","Data":"abf47dfe3aed929bdefcb4118d3105871b156bec1a89dac0e067eacb8a91a487"} Mar 09 09:08:46 crc kubenswrapper[4861]: I0309 09:08:46.022420 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-698587484d-g2lfn" Mar 09 09:08:46 crc kubenswrapper[4861]: I0309 09:08:46.024058 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7b76458476-6twtf" Mar 09 09:08:46 crc kubenswrapper[4861]: I0309 09:08:46.034251 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7b76458476-6twtf" podStartSLOduration=4.034236163 podStartE2EDuration="4.034236163s" podCreationTimestamp="2026-03-09 09:08:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:08:46.033265765 +0000 UTC m=+169.118305176" watchObservedRunningTime="2026-03-09 09:08:46.034236163 +0000 UTC m=+169.119275564" Mar 09 09:08:46 crc kubenswrapper[4861]: I0309 09:08:46.055248 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-698587484d-g2lfn" podStartSLOduration=4.055229652 
podStartE2EDuration="4.055229652s" podCreationTimestamp="2026-03-09 09:08:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:08:46.053205737 +0000 UTC m=+169.138245158" watchObservedRunningTime="2026-03-09 09:08:46.055229652 +0000 UTC m=+169.140269073" Mar 09 09:08:46 crc kubenswrapper[4861]: I0309 09:08:46.144308 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-698587484d-g2lfn" Mar 09 09:08:47 crc kubenswrapper[4861]: I0309 09:08:47.373075 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zkngt" Mar 09 09:08:47 crc kubenswrapper[4861]: I0309 09:08:47.426933 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zkngt" Mar 09 09:08:49 crc kubenswrapper[4861]: I0309 09:08:49.766145 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zkngt"] Mar 09 09:08:49 crc kubenswrapper[4861]: I0309 09:08:49.766786 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-zkngt" podUID="6eac3eed-7721-4030-b1e3-9dd28fea2e49" containerName="registry-server" containerID="cri-o://22c2787ab7459d1deced22ba471cbda27f48a357d73310be38990de6082e6b54" gracePeriod=2 Mar 09 09:08:50 crc kubenswrapper[4861]: I0309 09:08:50.053002 4861 generic.go:334] "Generic (PLEG): container finished" podID="6eac3eed-7721-4030-b1e3-9dd28fea2e49" containerID="22c2787ab7459d1deced22ba471cbda27f48a357d73310be38990de6082e6b54" exitCode=0 Mar 09 09:08:50 crc kubenswrapper[4861]: I0309 09:08:50.053267 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zkngt" 
event={"ID":"6eac3eed-7721-4030-b1e3-9dd28fea2e49","Type":"ContainerDied","Data":"22c2787ab7459d1deced22ba471cbda27f48a357d73310be38990de6082e6b54"} Mar 09 09:08:50 crc kubenswrapper[4861]: I0309 09:08:50.713737 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zkngt" Mar 09 09:08:50 crc kubenswrapper[4861]: I0309 09:08:50.781481 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6eac3eed-7721-4030-b1e3-9dd28fea2e49-utilities\") pod \"6eac3eed-7721-4030-b1e3-9dd28fea2e49\" (UID: \"6eac3eed-7721-4030-b1e3-9dd28fea2e49\") " Mar 09 09:08:50 crc kubenswrapper[4861]: I0309 09:08:50.781535 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kh8vm\" (UniqueName: \"kubernetes.io/projected/6eac3eed-7721-4030-b1e3-9dd28fea2e49-kube-api-access-kh8vm\") pod \"6eac3eed-7721-4030-b1e3-9dd28fea2e49\" (UID: \"6eac3eed-7721-4030-b1e3-9dd28fea2e49\") " Mar 09 09:08:50 crc kubenswrapper[4861]: I0309 09:08:50.781572 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6eac3eed-7721-4030-b1e3-9dd28fea2e49-catalog-content\") pod \"6eac3eed-7721-4030-b1e3-9dd28fea2e49\" (UID: \"6eac3eed-7721-4030-b1e3-9dd28fea2e49\") " Mar 09 09:08:50 crc kubenswrapper[4861]: I0309 09:08:50.783253 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6eac3eed-7721-4030-b1e3-9dd28fea2e49-utilities" (OuterVolumeSpecName: "utilities") pod "6eac3eed-7721-4030-b1e3-9dd28fea2e49" (UID: "6eac3eed-7721-4030-b1e3-9dd28fea2e49"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 09:08:50 crc kubenswrapper[4861]: I0309 09:08:50.788578 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6eac3eed-7721-4030-b1e3-9dd28fea2e49-kube-api-access-kh8vm" (OuterVolumeSpecName: "kube-api-access-kh8vm") pod "6eac3eed-7721-4030-b1e3-9dd28fea2e49" (UID: "6eac3eed-7721-4030-b1e3-9dd28fea2e49"). InnerVolumeSpecName "kube-api-access-kh8vm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:08:50 crc kubenswrapper[4861]: I0309 09:08:50.882516 4861 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6eac3eed-7721-4030-b1e3-9dd28fea2e49-utilities\") on node \"crc\" DevicePath \"\""
Mar 09 09:08:50 crc kubenswrapper[4861]: I0309 09:08:50.882559 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kh8vm\" (UniqueName: \"kubernetes.io/projected/6eac3eed-7721-4030-b1e3-9dd28fea2e49-kube-api-access-kh8vm\") on node \"crc\" DevicePath \"\""
Mar 09 09:08:50 crc kubenswrapper[4861]: I0309 09:08:50.911975 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6eac3eed-7721-4030-b1e3-9dd28fea2e49-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6eac3eed-7721-4030-b1e3-9dd28fea2e49" (UID: "6eac3eed-7721-4030-b1e3-9dd28fea2e49"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 09:08:50 crc kubenswrapper[4861]: I0309 09:08:50.983182 4861 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6eac3eed-7721-4030-b1e3-9dd28fea2e49-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 09 09:08:51 crc kubenswrapper[4861]: I0309 09:08:51.060185 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zkngt" event={"ID":"6eac3eed-7721-4030-b1e3-9dd28fea2e49","Type":"ContainerDied","Data":"1695d96a7543fbd3121585572761ea892924ee4533e8b4563882d1bd9d24c289"}
Mar 09 09:08:51 crc kubenswrapper[4861]: I0309 09:08:51.060238 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zkngt"
Mar 09 09:08:51 crc kubenswrapper[4861]: I0309 09:08:51.060425 4861 scope.go:117] "RemoveContainer" containerID="22c2787ab7459d1deced22ba471cbda27f48a357d73310be38990de6082e6b54"
Mar 09 09:08:51 crc kubenswrapper[4861]: I0309 09:08:51.079220 4861 scope.go:117] "RemoveContainer" containerID="ac828fd1aa0f1b74d8fba5f0eb43f48c9ce9a8f13dbc669fd6e18a8fb33507cd"
Mar 09 09:08:51 crc kubenswrapper[4861]: I0309 09:08:51.087454 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zkngt"]
Mar 09 09:08:51 crc kubenswrapper[4861]: I0309 09:08:51.094780 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-zkngt"]
Mar 09 09:08:51 crc kubenswrapper[4861]: I0309 09:08:51.109855 4861 scope.go:117] "RemoveContainer" containerID="40bf8b5e100c32cd1129b993c6b7bc18d00a45daea9d9338afae72575f43af1a"
Mar 09 09:08:51 crc kubenswrapper[4861]: I0309 09:08:51.670171 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6eac3eed-7721-4030-b1e3-9dd28fea2e49" path="/var/lib/kubelet/pods/6eac3eed-7721-4030-b1e3-9dd28fea2e49/volumes"
Mar 09 09:08:54 crc kubenswrapper[4861]: I0309 09:08:54.904540 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-97dkg" podUID="eaee5667-e42c-4ef1-8c6b-279ee6fc171a" containerName="oauth-openshift" containerID="cri-o://1cc70892b0dd7216c8ade7ec994511e85d17018d17079894b1284beeef727892" gracePeriod=15
Mar 09 09:08:55 crc kubenswrapper[4861]: I0309 09:08:55.950452 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-97dkg"
Mar 09 09:08:56 crc kubenswrapper[4861]: I0309 09:08:56.094141 4861 generic.go:334] "Generic (PLEG): container finished" podID="eaee5667-e42c-4ef1-8c6b-279ee6fc171a" containerID="1cc70892b0dd7216c8ade7ec994511e85d17018d17079894b1284beeef727892" exitCode=0
Mar 09 09:08:56 crc kubenswrapper[4861]: I0309 09:08:56.094205 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-97dkg" event={"ID":"eaee5667-e42c-4ef1-8c6b-279ee6fc171a","Type":"ContainerDied","Data":"1cc70892b0dd7216c8ade7ec994511e85d17018d17079894b1284beeef727892"}
Mar 09 09:08:56 crc kubenswrapper[4861]: I0309 09:08:56.094256 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-97dkg" event={"ID":"eaee5667-e42c-4ef1-8c6b-279ee6fc171a","Type":"ContainerDied","Data":"824adca6296b893cfd963278ecdc1fb80b2fa49869c278a275449de57a9ee084"}
Mar 09 09:08:56 crc kubenswrapper[4861]: I0309 09:08:56.094287 4861 scope.go:117] "RemoveContainer" containerID="1cc70892b0dd7216c8ade7ec994511e85d17018d17079894b1284beeef727892"
Mar 09 09:08:56 crc kubenswrapper[4861]: I0309 09:08:56.094262 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-97dkg"
Mar 09 09:08:56 crc kubenswrapper[4861]: I0309 09:08:56.121175 4861 scope.go:117] "RemoveContainer" containerID="1cc70892b0dd7216c8ade7ec994511e85d17018d17079894b1284beeef727892"
Mar 09 09:08:56 crc kubenswrapper[4861]: E0309 09:08:56.121660 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1cc70892b0dd7216c8ade7ec994511e85d17018d17079894b1284beeef727892\": container with ID starting with 1cc70892b0dd7216c8ade7ec994511e85d17018d17079894b1284beeef727892 not found: ID does not exist" containerID="1cc70892b0dd7216c8ade7ec994511e85d17018d17079894b1284beeef727892"
Mar 09 09:08:56 crc kubenswrapper[4861]: I0309 09:08:56.121697 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1cc70892b0dd7216c8ade7ec994511e85d17018d17079894b1284beeef727892"} err="failed to get container status \"1cc70892b0dd7216c8ade7ec994511e85d17018d17079894b1284beeef727892\": rpc error: code = NotFound desc = could not find container \"1cc70892b0dd7216c8ade7ec994511e85d17018d17079894b1284beeef727892\": container with ID starting with 1cc70892b0dd7216c8ade7ec994511e85d17018d17079894b1284beeef727892 not found: ID does not exist"
Mar 09 09:08:56 crc kubenswrapper[4861]: I0309 09:08:56.147417 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/eaee5667-e42c-4ef1-8c6b-279ee6fc171a-v4-0-config-system-router-certs\") pod \"eaee5667-e42c-4ef1-8c6b-279ee6fc171a\" (UID: \"eaee5667-e42c-4ef1-8c6b-279ee6fc171a\") "
Mar 09 09:08:56 crc kubenswrapper[4861]: I0309 09:08:56.147464 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/eaee5667-e42c-4ef1-8c6b-279ee6fc171a-v4-0-config-system-serving-cert\") pod \"eaee5667-e42c-4ef1-8c6b-279ee6fc171a\" (UID: \"eaee5667-e42c-4ef1-8c6b-279ee6fc171a\") "
Mar 09 09:08:56 crc kubenswrapper[4861]: I0309 09:08:56.147493 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/eaee5667-e42c-4ef1-8c6b-279ee6fc171a-v4-0-config-user-idp-0-file-data\") pod \"eaee5667-e42c-4ef1-8c6b-279ee6fc171a\" (UID: \"eaee5667-e42c-4ef1-8c6b-279ee6fc171a\") "
Mar 09 09:08:56 crc kubenswrapper[4861]: I0309 09:08:56.147518 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/eaee5667-e42c-4ef1-8c6b-279ee6fc171a-v4-0-config-system-session\") pod \"eaee5667-e42c-4ef1-8c6b-279ee6fc171a\" (UID: \"eaee5667-e42c-4ef1-8c6b-279ee6fc171a\") "
Mar 09 09:08:56 crc kubenswrapper[4861]: I0309 09:08:56.147551 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p9x2n\" (UniqueName: \"kubernetes.io/projected/eaee5667-e42c-4ef1-8c6b-279ee6fc171a-kube-api-access-p9x2n\") pod \"eaee5667-e42c-4ef1-8c6b-279ee6fc171a\" (UID: \"eaee5667-e42c-4ef1-8c6b-279ee6fc171a\") "
Mar 09 09:08:56 crc kubenswrapper[4861]: I0309 09:08:56.147596 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/eaee5667-e42c-4ef1-8c6b-279ee6fc171a-v4-0-config-user-template-provider-selection\") pod \"eaee5667-e42c-4ef1-8c6b-279ee6fc171a\" (UID: \"eaee5667-e42c-4ef1-8c6b-279ee6fc171a\") "
Mar 09 09:08:56 crc kubenswrapper[4861]: I0309 09:08:56.147639 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/eaee5667-e42c-4ef1-8c6b-279ee6fc171a-v4-0-config-user-template-login\") pod \"eaee5667-e42c-4ef1-8c6b-279ee6fc171a\" (UID: \"eaee5667-e42c-4ef1-8c6b-279ee6fc171a\") "
Mar 09 09:08:56 crc kubenswrapper[4861]: I0309 09:08:56.147658 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/eaee5667-e42c-4ef1-8c6b-279ee6fc171a-audit-dir\") pod \"eaee5667-e42c-4ef1-8c6b-279ee6fc171a\" (UID: \"eaee5667-e42c-4ef1-8c6b-279ee6fc171a\") "
Mar 09 09:08:56 crc kubenswrapper[4861]: I0309 09:08:56.147693 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/eaee5667-e42c-4ef1-8c6b-279ee6fc171a-v4-0-config-system-service-ca\") pod \"eaee5667-e42c-4ef1-8c6b-279ee6fc171a\" (UID: \"eaee5667-e42c-4ef1-8c6b-279ee6fc171a\") "
Mar 09 09:08:56 crc kubenswrapper[4861]: I0309 09:08:56.147816 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/eaee5667-e42c-4ef1-8c6b-279ee6fc171a-v4-0-config-system-cliconfig\") pod \"eaee5667-e42c-4ef1-8c6b-279ee6fc171a\" (UID: \"eaee5667-e42c-4ef1-8c6b-279ee6fc171a\") "
Mar 09 09:08:56 crc kubenswrapper[4861]: I0309 09:08:56.147841 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/eaee5667-e42c-4ef1-8c6b-279ee6fc171a-audit-policies\") pod \"eaee5667-e42c-4ef1-8c6b-279ee6fc171a\" (UID: \"eaee5667-e42c-4ef1-8c6b-279ee6fc171a\") "
Mar 09 09:08:56 crc kubenswrapper[4861]: I0309 09:08:56.147833 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eaee5667-e42c-4ef1-8c6b-279ee6fc171a-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "eaee5667-e42c-4ef1-8c6b-279ee6fc171a" (UID: "eaee5667-e42c-4ef1-8c6b-279ee6fc171a"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 09 09:08:56 crc kubenswrapper[4861]: I0309 09:08:56.147872 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/eaee5667-e42c-4ef1-8c6b-279ee6fc171a-v4-0-config-user-template-error\") pod \"eaee5667-e42c-4ef1-8c6b-279ee6fc171a\" (UID: \"eaee5667-e42c-4ef1-8c6b-279ee6fc171a\") "
Mar 09 09:08:56 crc kubenswrapper[4861]: I0309 09:08:56.148035 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/eaee5667-e42c-4ef1-8c6b-279ee6fc171a-v4-0-config-system-ocp-branding-template\") pod \"eaee5667-e42c-4ef1-8c6b-279ee6fc171a\" (UID: \"eaee5667-e42c-4ef1-8c6b-279ee6fc171a\") "
Mar 09 09:08:56 crc kubenswrapper[4861]: I0309 09:08:56.148086 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eaee5667-e42c-4ef1-8c6b-279ee6fc171a-v4-0-config-system-trusted-ca-bundle\") pod \"eaee5667-e42c-4ef1-8c6b-279ee6fc171a\" (UID: \"eaee5667-e42c-4ef1-8c6b-279ee6fc171a\") "
Mar 09 09:08:56 crc kubenswrapper[4861]: I0309 09:08:56.148314 4861 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/eaee5667-e42c-4ef1-8c6b-279ee6fc171a-audit-dir\") on node \"crc\" DevicePath \"\""
Mar 09 09:08:56 crc kubenswrapper[4861]: I0309 09:08:56.149842 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eaee5667-e42c-4ef1-8c6b-279ee6fc171a-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "eaee5667-e42c-4ef1-8c6b-279ee6fc171a" (UID: "eaee5667-e42c-4ef1-8c6b-279ee6fc171a"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:08:56 crc kubenswrapper[4861]: I0309 09:08:56.151081 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eaee5667-e42c-4ef1-8c6b-279ee6fc171a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "eaee5667-e42c-4ef1-8c6b-279ee6fc171a" (UID: "eaee5667-e42c-4ef1-8c6b-279ee6fc171a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:08:56 crc kubenswrapper[4861]: I0309 09:08:56.151162 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eaee5667-e42c-4ef1-8c6b-279ee6fc171a-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "eaee5667-e42c-4ef1-8c6b-279ee6fc171a" (UID: "eaee5667-e42c-4ef1-8c6b-279ee6fc171a"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:08:56 crc kubenswrapper[4861]: I0309 09:08:56.151702 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eaee5667-e42c-4ef1-8c6b-279ee6fc171a-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "eaee5667-e42c-4ef1-8c6b-279ee6fc171a" (UID: "eaee5667-e42c-4ef1-8c6b-279ee6fc171a"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:08:56 crc kubenswrapper[4861]: I0309 09:08:56.154534 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eaee5667-e42c-4ef1-8c6b-279ee6fc171a-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "eaee5667-e42c-4ef1-8c6b-279ee6fc171a" (UID: "eaee5667-e42c-4ef1-8c6b-279ee6fc171a"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:08:56 crc kubenswrapper[4861]: I0309 09:08:56.154934 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eaee5667-e42c-4ef1-8c6b-279ee6fc171a-kube-api-access-p9x2n" (OuterVolumeSpecName: "kube-api-access-p9x2n") pod "eaee5667-e42c-4ef1-8c6b-279ee6fc171a" (UID: "eaee5667-e42c-4ef1-8c6b-279ee6fc171a"). InnerVolumeSpecName "kube-api-access-p9x2n". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:08:56 crc kubenswrapper[4861]: I0309 09:08:56.155600 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eaee5667-e42c-4ef1-8c6b-279ee6fc171a-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "eaee5667-e42c-4ef1-8c6b-279ee6fc171a" (UID: "eaee5667-e42c-4ef1-8c6b-279ee6fc171a"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:08:56 crc kubenswrapper[4861]: I0309 09:08:56.156477 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eaee5667-e42c-4ef1-8c6b-279ee6fc171a-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "eaee5667-e42c-4ef1-8c6b-279ee6fc171a" (UID: "eaee5667-e42c-4ef1-8c6b-279ee6fc171a"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:08:56 crc kubenswrapper[4861]: I0309 09:08:56.156707 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eaee5667-e42c-4ef1-8c6b-279ee6fc171a-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "eaee5667-e42c-4ef1-8c6b-279ee6fc171a" (UID: "eaee5667-e42c-4ef1-8c6b-279ee6fc171a"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:08:56 crc kubenswrapper[4861]: I0309 09:08:56.157100 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eaee5667-e42c-4ef1-8c6b-279ee6fc171a-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "eaee5667-e42c-4ef1-8c6b-279ee6fc171a" (UID: "eaee5667-e42c-4ef1-8c6b-279ee6fc171a"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:08:56 crc kubenswrapper[4861]: I0309 09:08:56.158289 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eaee5667-e42c-4ef1-8c6b-279ee6fc171a-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "eaee5667-e42c-4ef1-8c6b-279ee6fc171a" (UID: "eaee5667-e42c-4ef1-8c6b-279ee6fc171a"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:08:56 crc kubenswrapper[4861]: I0309 09:08:56.158737 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eaee5667-e42c-4ef1-8c6b-279ee6fc171a-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "eaee5667-e42c-4ef1-8c6b-279ee6fc171a" (UID: "eaee5667-e42c-4ef1-8c6b-279ee6fc171a"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:08:56 crc kubenswrapper[4861]: I0309 09:08:56.160182 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eaee5667-e42c-4ef1-8c6b-279ee6fc171a-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "eaee5667-e42c-4ef1-8c6b-279ee6fc171a" (UID: "eaee5667-e42c-4ef1-8c6b-279ee6fc171a"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:08:56 crc kubenswrapper[4861]: I0309 09:08:56.249192 4861 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/eaee5667-e42c-4ef1-8c6b-279ee6fc171a-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\""
Mar 09 09:08:56 crc kubenswrapper[4861]: I0309 09:08:56.249264 4861 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/eaee5667-e42c-4ef1-8c6b-279ee6fc171a-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 09 09:08:56 crc kubenswrapper[4861]: I0309 09:08:56.249279 4861 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/eaee5667-e42c-4ef1-8c6b-279ee6fc171a-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\""
Mar 09 09:08:56 crc kubenswrapper[4861]: I0309 09:08:56.249292 4861 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/eaee5667-e42c-4ef1-8c6b-279ee6fc171a-v4-0-config-system-session\") on node \"crc\" DevicePath \"\""
Mar 09 09:08:56 crc kubenswrapper[4861]: I0309 09:08:56.249305 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p9x2n\" (UniqueName: \"kubernetes.io/projected/eaee5667-e42c-4ef1-8c6b-279ee6fc171a-kube-api-access-p9x2n\") on node \"crc\" DevicePath \"\""
Mar 09 09:08:56 crc kubenswrapper[4861]: I0309 09:08:56.249319 4861 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/eaee5667-e42c-4ef1-8c6b-279ee6fc171a-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\""
Mar 09 09:08:56 crc kubenswrapper[4861]: I0309 09:08:56.249336 4861 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/eaee5667-e42c-4ef1-8c6b-279ee6fc171a-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\""
Mar 09 09:08:56 crc kubenswrapper[4861]: I0309 09:08:56.249349 4861 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/eaee5667-e42c-4ef1-8c6b-279ee6fc171a-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\""
Mar 09 09:08:56 crc kubenswrapper[4861]: I0309 09:08:56.249360 4861 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/eaee5667-e42c-4ef1-8c6b-279ee6fc171a-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\""
Mar 09 09:08:56 crc kubenswrapper[4861]: I0309 09:08:56.249394 4861 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/eaee5667-e42c-4ef1-8c6b-279ee6fc171a-audit-policies\") on node \"crc\" DevicePath \"\""
Mar 09 09:08:56 crc kubenswrapper[4861]: I0309 09:08:56.249408 4861 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/eaee5667-e42c-4ef1-8c6b-279ee6fc171a-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\""
Mar 09 09:08:56 crc kubenswrapper[4861]: I0309 09:08:56.249427 4861 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/eaee5667-e42c-4ef1-8c6b-279ee6fc171a-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\""
Mar 09 09:08:56 crc kubenswrapper[4861]: I0309 09:08:56.249445 4861 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eaee5667-e42c-4ef1-8c6b-279ee6fc171a-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 09 09:08:56 crc kubenswrapper[4861]: I0309 09:08:56.433458 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-97dkg"]
Mar 09 09:08:56 crc kubenswrapper[4861]: I0309 09:08:56.433919 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-97dkg"]
Mar 09 09:08:57 crc kubenswrapper[4861]: I0309 09:08:57.669531 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eaee5667-e42c-4ef1-8c6b-279ee6fc171a" path="/var/lib/kubelet/pods/eaee5667-e42c-4ef1-8c6b-279ee6fc171a/volumes"
Mar 09 09:09:02 crc kubenswrapper[4861]: I0309 09:09:02.603040 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7b76458476-6twtf"]
Mar 09 09:09:02 crc kubenswrapper[4861]: I0309 09:09:02.603753 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-7b76458476-6twtf" podUID="17fc9163-cd18-41c0-abbc-2f3d15063703" containerName="controller-manager" containerID="cri-o://559b8e24ae687e5a1b7e6f52bbb496c165639e1c743f12bf286c2273bb8ae09b" gracePeriod=30
Mar 09 09:09:02 crc kubenswrapper[4861]: I0309 09:09:02.701445 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-698587484d-g2lfn"]
Mar 09 09:09:02 crc kubenswrapper[4861]: I0309 09:09:02.701669 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-698587484d-g2lfn" podUID="2d49cdfb-1198-408e-a97f-a03243e5a46c" containerName="route-controller-manager" containerID="cri-o://abf47dfe3aed929bdefcb4118d3105871b156bec1a89dac0e067eacb8a91a487" gracePeriod=30
Mar 09 09:09:03 crc kubenswrapper[4861]: I0309 09:09:03.150674 4861 generic.go:334] "Generic (PLEG): container finished" podID="17fc9163-cd18-41c0-abbc-2f3d15063703" containerID="559b8e24ae687e5a1b7e6f52bbb496c165639e1c743f12bf286c2273bb8ae09b" exitCode=0
Mar 09 09:09:03 crc kubenswrapper[4861]: I0309 09:09:03.150765 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7b76458476-6twtf" event={"ID":"17fc9163-cd18-41c0-abbc-2f3d15063703","Type":"ContainerDied","Data":"559b8e24ae687e5a1b7e6f52bbb496c165639e1c743f12bf286c2273bb8ae09b"}
Mar 09 09:09:03 crc kubenswrapper[4861]: I0309 09:09:03.152410 4861 generic.go:334] "Generic (PLEG): container finished" podID="2d49cdfb-1198-408e-a97f-a03243e5a46c" containerID="abf47dfe3aed929bdefcb4118d3105871b156bec1a89dac0e067eacb8a91a487" exitCode=0
Mar 09 09:09:03 crc kubenswrapper[4861]: I0309 09:09:03.152602 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-698587484d-g2lfn" event={"ID":"2d49cdfb-1198-408e-a97f-a03243e5a46c","Type":"ContainerDied","Data":"abf47dfe3aed929bdefcb4118d3105871b156bec1a89dac0e067eacb8a91a487"}
Mar 09 09:09:03 crc kubenswrapper[4861]: I0309 09:09:03.152829 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-698587484d-g2lfn" event={"ID":"2d49cdfb-1198-408e-a97f-a03243e5a46c","Type":"ContainerDied","Data":"9503657adacd1957167f343d02c0d293fc812ef2fe179133724d4b75098976fa"}
Mar 09 09:09:03 crc kubenswrapper[4861]: I0309 09:09:03.152846 4861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9503657adacd1957167f343d02c0d293fc812ef2fe179133724d4b75098976fa"
Mar 09 09:09:03 crc kubenswrapper[4861]: I0309 09:09:03.184398 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-698587484d-g2lfn"
Mar 09 09:09:03 crc kubenswrapper[4861]: I0309 09:09:03.191178 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7b76458476-6twtf"
Mar 09 09:09:03 crc kubenswrapper[4861]: I0309 09:09:03.248959 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17fc9163-cd18-41c0-abbc-2f3d15063703-config\") pod \"17fc9163-cd18-41c0-abbc-2f3d15063703\" (UID: \"17fc9163-cd18-41c0-abbc-2f3d15063703\") "
Mar 09 09:09:03 crc kubenswrapper[4861]: I0309 09:09:03.249022 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5nxs5\" (UniqueName: \"kubernetes.io/projected/2d49cdfb-1198-408e-a97f-a03243e5a46c-kube-api-access-5nxs5\") pod \"2d49cdfb-1198-408e-a97f-a03243e5a46c\" (UID: \"2d49cdfb-1198-408e-a97f-a03243e5a46c\") "
Mar 09 09:09:03 crc kubenswrapper[4861]: I0309 09:09:03.249060 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/17fc9163-cd18-41c0-abbc-2f3d15063703-serving-cert\") pod \"17fc9163-cd18-41c0-abbc-2f3d15063703\" (UID: \"17fc9163-cd18-41c0-abbc-2f3d15063703\") "
Mar 09 09:09:03 crc kubenswrapper[4861]: I0309 09:09:03.249082 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d49cdfb-1198-408e-a97f-a03243e5a46c-config\") pod \"2d49cdfb-1198-408e-a97f-a03243e5a46c\" (UID: \"2d49cdfb-1198-408e-a97f-a03243e5a46c\") "
Mar 09 09:09:03 crc kubenswrapper[4861]: I0309 09:09:03.249104 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2d49cdfb-1198-408e-a97f-a03243e5a46c-client-ca\") pod \"2d49cdfb-1198-408e-a97f-a03243e5a46c\" (UID: \"2d49cdfb-1198-408e-a97f-a03243e5a46c\") "
Mar 09 09:09:03 crc kubenswrapper[4861]: I0309 09:09:03.249155 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5867q\" (UniqueName: \"kubernetes.io/projected/17fc9163-cd18-41c0-abbc-2f3d15063703-kube-api-access-5867q\") pod \"17fc9163-cd18-41c0-abbc-2f3d15063703\" (UID: \"17fc9163-cd18-41c0-abbc-2f3d15063703\") "
Mar 09 09:09:03 crc kubenswrapper[4861]: I0309 09:09:03.249181 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2d49cdfb-1198-408e-a97f-a03243e5a46c-serving-cert\") pod \"2d49cdfb-1198-408e-a97f-a03243e5a46c\" (UID: \"2d49cdfb-1198-408e-a97f-a03243e5a46c\") "
Mar 09 09:09:03 crc kubenswrapper[4861]: I0309 09:09:03.249198 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/17fc9163-cd18-41c0-abbc-2f3d15063703-proxy-ca-bundles\") pod \"17fc9163-cd18-41c0-abbc-2f3d15063703\" (UID: \"17fc9163-cd18-41c0-abbc-2f3d15063703\") "
Mar 09 09:09:03 crc kubenswrapper[4861]: I0309 09:09:03.249220 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/17fc9163-cd18-41c0-abbc-2f3d15063703-client-ca\") pod \"17fc9163-cd18-41c0-abbc-2f3d15063703\" (UID: \"17fc9163-cd18-41c0-abbc-2f3d15063703\") "
Mar 09 09:09:03 crc kubenswrapper[4861]: I0309 09:09:03.249865 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17fc9163-cd18-41c0-abbc-2f3d15063703-client-ca" (OuterVolumeSpecName: "client-ca") pod "17fc9163-cd18-41c0-abbc-2f3d15063703" (UID: "17fc9163-cd18-41c0-abbc-2f3d15063703"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:09:03 crc kubenswrapper[4861]: I0309 09:09:03.249883 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d49cdfb-1198-408e-a97f-a03243e5a46c-client-ca" (OuterVolumeSpecName: "client-ca") pod "2d49cdfb-1198-408e-a97f-a03243e5a46c" (UID: "2d49cdfb-1198-408e-a97f-a03243e5a46c"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:09:03 crc kubenswrapper[4861]: I0309 09:09:03.249912 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17fc9163-cd18-41c0-abbc-2f3d15063703-config" (OuterVolumeSpecName: "config") pod "17fc9163-cd18-41c0-abbc-2f3d15063703" (UID: "17fc9163-cd18-41c0-abbc-2f3d15063703"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:09:03 crc kubenswrapper[4861]: I0309 09:09:03.249945 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d49cdfb-1198-408e-a97f-a03243e5a46c-config" (OuterVolumeSpecName: "config") pod "2d49cdfb-1198-408e-a97f-a03243e5a46c" (UID: "2d49cdfb-1198-408e-a97f-a03243e5a46c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:09:03 crc kubenswrapper[4861]: I0309 09:09:03.250343 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17fc9163-cd18-41c0-abbc-2f3d15063703-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "17fc9163-cd18-41c0-abbc-2f3d15063703" (UID: "17fc9163-cd18-41c0-abbc-2f3d15063703"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:09:03 crc kubenswrapper[4861]: I0309 09:09:03.256176 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17fc9163-cd18-41c0-abbc-2f3d15063703-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "17fc9163-cd18-41c0-abbc-2f3d15063703" (UID: "17fc9163-cd18-41c0-abbc-2f3d15063703"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:09:03 crc kubenswrapper[4861]: I0309 09:09:03.256293 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d49cdfb-1198-408e-a97f-a03243e5a46c-kube-api-access-5nxs5" (OuterVolumeSpecName: "kube-api-access-5nxs5") pod "2d49cdfb-1198-408e-a97f-a03243e5a46c" (UID: "2d49cdfb-1198-408e-a97f-a03243e5a46c"). InnerVolumeSpecName "kube-api-access-5nxs5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:09:03 crc kubenswrapper[4861]: I0309 09:09:03.258893 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17fc9163-cd18-41c0-abbc-2f3d15063703-kube-api-access-5867q" (OuterVolumeSpecName: "kube-api-access-5867q") pod "17fc9163-cd18-41c0-abbc-2f3d15063703" (UID: "17fc9163-cd18-41c0-abbc-2f3d15063703"). InnerVolumeSpecName "kube-api-access-5867q". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:09:03 crc kubenswrapper[4861]: I0309 09:09:03.259489 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d49cdfb-1198-408e-a97f-a03243e5a46c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "2d49cdfb-1198-408e-a97f-a03243e5a46c" (UID: "2d49cdfb-1198-408e-a97f-a03243e5a46c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:09:03 crc kubenswrapper[4861]: I0309 09:09:03.318776 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-7d9c768c99-njfhg"]
Mar 09 09:09:03 crc kubenswrapper[4861]: E0309 09:09:03.318996 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17fc9163-cd18-41c0-abbc-2f3d15063703" containerName="controller-manager"
Mar 09 09:09:03 crc kubenswrapper[4861]: I0309 09:09:03.319008 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="17fc9163-cd18-41c0-abbc-2f3d15063703" containerName="controller-manager"
Mar 09 09:09:03 crc kubenswrapper[4861]: E0309 09:09:03.319021 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d49cdfb-1198-408e-a97f-a03243e5a46c" containerName="route-controller-manager"
Mar 09 09:09:03 crc kubenswrapper[4861]: I0309 09:09:03.319028 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d49cdfb-1198-408e-a97f-a03243e5a46c" containerName="route-controller-manager"
Mar 09 09:09:03 crc kubenswrapper[4861]: E0309 09:09:03.319039 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6eac3eed-7721-4030-b1e3-9dd28fea2e49" containerName="extract-content"
Mar 09 09:09:03 crc kubenswrapper[4861]: I0309 09:09:03.319047 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="6eac3eed-7721-4030-b1e3-9dd28fea2e49" containerName="extract-content"
Mar 09 09:09:03 crc kubenswrapper[4861]: E0309 09:09:03.319056 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eaee5667-e42c-4ef1-8c6b-279ee6fc171a" containerName="oauth-openshift"
Mar 09 09:09:03 crc kubenswrapper[4861]: I0309 09:09:03.319062 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="eaee5667-e42c-4ef1-8c6b-279ee6fc171a" containerName="oauth-openshift"
Mar 09 09:09:03 crc kubenswrapper[4861]: E0309 09:09:03.319074 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6eac3eed-7721-4030-b1e3-9dd28fea2e49" containerName="extract-utilities"
Mar 09 09:09:03 crc kubenswrapper[4861]: I0309 09:09:03.319079 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="6eac3eed-7721-4030-b1e3-9dd28fea2e49" containerName="extract-utilities"
Mar 09 09:09:03 crc kubenswrapper[4861]: E0309 09:09:03.319088 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6eac3eed-7721-4030-b1e3-9dd28fea2e49" containerName="registry-server"
Mar 09 09:09:03 crc kubenswrapper[4861]: I0309 09:09:03.319093 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="6eac3eed-7721-4030-b1e3-9dd28fea2e49" containerName="registry-server"
Mar 09 09:09:03 crc kubenswrapper[4861]: I0309 09:09:03.319170 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="6eac3eed-7721-4030-b1e3-9dd28fea2e49" containerName="registry-server"
Mar 09 09:09:03 crc kubenswrapper[4861]: I0309 09:09:03.319180 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="eaee5667-e42c-4ef1-8c6b-279ee6fc171a" containerName="oauth-openshift"
Mar 09 09:09:03 crc kubenswrapper[4861]: I0309 09:09:03.319189 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d49cdfb-1198-408e-a97f-a03243e5a46c" containerName="route-controller-manager"
Mar 09 09:09:03 crc kubenswrapper[4861]: I0309 09:09:03.319198 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="17fc9163-cd18-41c0-abbc-2f3d15063703" containerName="controller-manager"
Mar 09 09:09:03 crc kubenswrapper[4861]: I0309 09:09:03.319551 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-7d9c768c99-njfhg"
Mar 09 09:09:03 crc kubenswrapper[4861]: I0309 09:09:03.332614 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Mar 09 09:09:03 crc kubenswrapper[4861]: I0309 09:09:03.334161 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Mar 09 09:09:03 crc kubenswrapper[4861]: I0309 09:09:03.335755 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Mar 09 09:09:03 crc kubenswrapper[4861]: I0309 09:09:03.335967 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Mar 09 09:09:03 crc kubenswrapper[4861]: I0309 09:09:03.336027 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Mar 09 09:09:03 crc kubenswrapper[4861]: I0309 09:09:03.336136 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Mar 09 09:09:03 crc kubenswrapper[4861]: I0309 09:09:03.338550 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Mar 09 09:09:03 crc kubenswrapper[4861]: I0309 09:09:03.339476 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Mar 09 09:09:03 crc kubenswrapper[4861]: I0309 09:09:03.339668 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Mar 09 09:09:03 crc kubenswrapper[4861]: I0309 09:09:03.339785 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Mar 09 09:09:03 crc
kubenswrapper[4861]: I0309 09:09:03.339856 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 09 09:09:03 crc kubenswrapper[4861]: I0309 09:09:03.340329 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 09 09:09:03 crc kubenswrapper[4861]: I0309 09:09:03.351195 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-7d9c768c99-njfhg"] Mar 09 09:09:03 crc kubenswrapper[4861]: I0309 09:09:03.351269 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 09 09:09:03 crc kubenswrapper[4861]: I0309 09:09:03.351741 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/88b572cc-2fbf-42f5-8e0e-61b1b7c34ccb-v4-0-config-user-template-error\") pod \"oauth-openshift-7d9c768c99-njfhg\" (UID: \"88b572cc-2fbf-42f5-8e0e-61b1b7c34ccb\") " pod="openshift-authentication/oauth-openshift-7d9c768c99-njfhg" Mar 09 09:09:03 crc kubenswrapper[4861]: I0309 09:09:03.351816 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/88b572cc-2fbf-42f5-8e0e-61b1b7c34ccb-audit-dir\") pod \"oauth-openshift-7d9c768c99-njfhg\" (UID: \"88b572cc-2fbf-42f5-8e0e-61b1b7c34ccb\") " pod="openshift-authentication/oauth-openshift-7d9c768c99-njfhg" Mar 09 09:09:03 crc kubenswrapper[4861]: I0309 09:09:03.351864 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/88b572cc-2fbf-42f5-8e0e-61b1b7c34ccb-v4-0-config-user-template-login\") pod \"oauth-openshift-7d9c768c99-njfhg\" (UID: 
\"88b572cc-2fbf-42f5-8e0e-61b1b7c34ccb\") " pod="openshift-authentication/oauth-openshift-7d9c768c99-njfhg" Mar 09 09:09:03 crc kubenswrapper[4861]: I0309 09:09:03.351893 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/88b572cc-2fbf-42f5-8e0e-61b1b7c34ccb-v4-0-config-system-service-ca\") pod \"oauth-openshift-7d9c768c99-njfhg\" (UID: \"88b572cc-2fbf-42f5-8e0e-61b1b7c34ccb\") " pod="openshift-authentication/oauth-openshift-7d9c768c99-njfhg" Mar 09 09:09:03 crc kubenswrapper[4861]: I0309 09:09:03.351939 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/88b572cc-2fbf-42f5-8e0e-61b1b7c34ccb-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7d9c768c99-njfhg\" (UID: \"88b572cc-2fbf-42f5-8e0e-61b1b7c34ccb\") " pod="openshift-authentication/oauth-openshift-7d9c768c99-njfhg" Mar 09 09:09:03 crc kubenswrapper[4861]: I0309 09:09:03.351969 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/88b572cc-2fbf-42f5-8e0e-61b1b7c34ccb-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7d9c768c99-njfhg\" (UID: \"88b572cc-2fbf-42f5-8e0e-61b1b7c34ccb\") " pod="openshift-authentication/oauth-openshift-7d9c768c99-njfhg" Mar 09 09:09:03 crc kubenswrapper[4861]: I0309 09:09:03.351993 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtn9l\" (UniqueName: \"kubernetes.io/projected/88b572cc-2fbf-42f5-8e0e-61b1b7c34ccb-kube-api-access-rtn9l\") pod \"oauth-openshift-7d9c768c99-njfhg\" (UID: \"88b572cc-2fbf-42f5-8e0e-61b1b7c34ccb\") " pod="openshift-authentication/oauth-openshift-7d9c768c99-njfhg" Mar 09 09:09:03 crc 
kubenswrapper[4861]: I0309 09:09:03.352031 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/88b572cc-2fbf-42f5-8e0e-61b1b7c34ccb-v4-0-config-system-session\") pod \"oauth-openshift-7d9c768c99-njfhg\" (UID: \"88b572cc-2fbf-42f5-8e0e-61b1b7c34ccb\") " pod="openshift-authentication/oauth-openshift-7d9c768c99-njfhg" Mar 09 09:09:03 crc kubenswrapper[4861]: I0309 09:09:03.352056 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/88b572cc-2fbf-42f5-8e0e-61b1b7c34ccb-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7d9c768c99-njfhg\" (UID: \"88b572cc-2fbf-42f5-8e0e-61b1b7c34ccb\") " pod="openshift-authentication/oauth-openshift-7d9c768c99-njfhg" Mar 09 09:09:03 crc kubenswrapper[4861]: I0309 09:09:03.352089 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/88b572cc-2fbf-42f5-8e0e-61b1b7c34ccb-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7d9c768c99-njfhg\" (UID: \"88b572cc-2fbf-42f5-8e0e-61b1b7c34ccb\") " pod="openshift-authentication/oauth-openshift-7d9c768c99-njfhg" Mar 09 09:09:03 crc kubenswrapper[4861]: I0309 09:09:03.352134 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/88b572cc-2fbf-42f5-8e0e-61b1b7c34ccb-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7d9c768c99-njfhg\" (UID: \"88b572cc-2fbf-42f5-8e0e-61b1b7c34ccb\") " pod="openshift-authentication/oauth-openshift-7d9c768c99-njfhg" Mar 09 09:09:03 crc kubenswrapper[4861]: I0309 09:09:03.352158 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/88b572cc-2fbf-42f5-8e0e-61b1b7c34ccb-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7d9c768c99-njfhg\" (UID: \"88b572cc-2fbf-42f5-8e0e-61b1b7c34ccb\") " pod="openshift-authentication/oauth-openshift-7d9c768c99-njfhg" Mar 09 09:09:03 crc kubenswrapper[4861]: I0309 09:09:03.352197 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/88b572cc-2fbf-42f5-8e0e-61b1b7c34ccb-audit-policies\") pod \"oauth-openshift-7d9c768c99-njfhg\" (UID: \"88b572cc-2fbf-42f5-8e0e-61b1b7c34ccb\") " pod="openshift-authentication/oauth-openshift-7d9c768c99-njfhg" Mar 09 09:09:03 crc kubenswrapper[4861]: I0309 09:09:03.352232 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/88b572cc-2fbf-42f5-8e0e-61b1b7c34ccb-v4-0-config-system-router-certs\") pod \"oauth-openshift-7d9c768c99-njfhg\" (UID: \"88b572cc-2fbf-42f5-8e0e-61b1b7c34ccb\") " pod="openshift-authentication/oauth-openshift-7d9c768c99-njfhg" Mar 09 09:09:03 crc kubenswrapper[4861]: I0309 09:09:03.352290 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5867q\" (UniqueName: \"kubernetes.io/projected/17fc9163-cd18-41c0-abbc-2f3d15063703-kube-api-access-5867q\") on node \"crc\" DevicePath \"\"" Mar 09 09:09:03 crc kubenswrapper[4861]: I0309 09:09:03.352306 4861 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/17fc9163-cd18-41c0-abbc-2f3d15063703-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 09 09:09:03 crc kubenswrapper[4861]: I0309 09:09:03.352324 4861 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2d49cdfb-1198-408e-a97f-a03243e5a46c-serving-cert\") on node \"crc\" 
DevicePath \"\"" Mar 09 09:09:03 crc kubenswrapper[4861]: I0309 09:09:03.352338 4861 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/17fc9163-cd18-41c0-abbc-2f3d15063703-client-ca\") on node \"crc\" DevicePath \"\"" Mar 09 09:09:03 crc kubenswrapper[4861]: I0309 09:09:03.352353 4861 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17fc9163-cd18-41c0-abbc-2f3d15063703-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:09:03 crc kubenswrapper[4861]: I0309 09:09:03.352389 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5nxs5\" (UniqueName: \"kubernetes.io/projected/2d49cdfb-1198-408e-a97f-a03243e5a46c-kube-api-access-5nxs5\") on node \"crc\" DevicePath \"\"" Mar 09 09:09:03 crc kubenswrapper[4861]: I0309 09:09:03.352402 4861 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/17fc9163-cd18-41c0-abbc-2f3d15063703-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 09:09:03 crc kubenswrapper[4861]: I0309 09:09:03.352415 4861 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d49cdfb-1198-408e-a97f-a03243e5a46c-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:09:03 crc kubenswrapper[4861]: I0309 09:09:03.352427 4861 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2d49cdfb-1198-408e-a97f-a03243e5a46c-client-ca\") on node \"crc\" DevicePath \"\"" Mar 09 09:09:03 crc kubenswrapper[4861]: I0309 09:09:03.356093 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 09 09:09:03 crc kubenswrapper[4861]: I0309 09:09:03.359401 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 09 09:09:03 crc 
kubenswrapper[4861]: I0309 09:09:03.453510 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/88b572cc-2fbf-42f5-8e0e-61b1b7c34ccb-v4-0-config-user-template-error\") pod \"oauth-openshift-7d9c768c99-njfhg\" (UID: \"88b572cc-2fbf-42f5-8e0e-61b1b7c34ccb\") " pod="openshift-authentication/oauth-openshift-7d9c768c99-njfhg" Mar 09 09:09:03 crc kubenswrapper[4861]: I0309 09:09:03.453566 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/88b572cc-2fbf-42f5-8e0e-61b1b7c34ccb-audit-dir\") pod \"oauth-openshift-7d9c768c99-njfhg\" (UID: \"88b572cc-2fbf-42f5-8e0e-61b1b7c34ccb\") " pod="openshift-authentication/oauth-openshift-7d9c768c99-njfhg" Mar 09 09:09:03 crc kubenswrapper[4861]: I0309 09:09:03.453600 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/88b572cc-2fbf-42f5-8e0e-61b1b7c34ccb-v4-0-config-user-template-login\") pod \"oauth-openshift-7d9c768c99-njfhg\" (UID: \"88b572cc-2fbf-42f5-8e0e-61b1b7c34ccb\") " pod="openshift-authentication/oauth-openshift-7d9c768c99-njfhg" Mar 09 09:09:03 crc kubenswrapper[4861]: I0309 09:09:03.453628 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/88b572cc-2fbf-42f5-8e0e-61b1b7c34ccb-v4-0-config-system-service-ca\") pod \"oauth-openshift-7d9c768c99-njfhg\" (UID: \"88b572cc-2fbf-42f5-8e0e-61b1b7c34ccb\") " pod="openshift-authentication/oauth-openshift-7d9c768c99-njfhg" Mar 09 09:09:03 crc kubenswrapper[4861]: I0309 09:09:03.453678 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/88b572cc-2fbf-42f5-8e0e-61b1b7c34ccb-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7d9c768c99-njfhg\" (UID: \"88b572cc-2fbf-42f5-8e0e-61b1b7c34ccb\") " pod="openshift-authentication/oauth-openshift-7d9c768c99-njfhg" Mar 09 09:09:03 crc kubenswrapper[4861]: I0309 09:09:03.453703 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/88b572cc-2fbf-42f5-8e0e-61b1b7c34ccb-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7d9c768c99-njfhg\" (UID: \"88b572cc-2fbf-42f5-8e0e-61b1b7c34ccb\") " pod="openshift-authentication/oauth-openshift-7d9c768c99-njfhg" Mar 09 09:09:03 crc kubenswrapper[4861]: I0309 09:09:03.453724 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rtn9l\" (UniqueName: \"kubernetes.io/projected/88b572cc-2fbf-42f5-8e0e-61b1b7c34ccb-kube-api-access-rtn9l\") pod \"oauth-openshift-7d9c768c99-njfhg\" (UID: \"88b572cc-2fbf-42f5-8e0e-61b1b7c34ccb\") " pod="openshift-authentication/oauth-openshift-7d9c768c99-njfhg" Mar 09 09:09:03 crc kubenswrapper[4861]: I0309 09:09:03.453752 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/88b572cc-2fbf-42f5-8e0e-61b1b7c34ccb-v4-0-config-system-session\") pod \"oauth-openshift-7d9c768c99-njfhg\" (UID: \"88b572cc-2fbf-42f5-8e0e-61b1b7c34ccb\") " pod="openshift-authentication/oauth-openshift-7d9c768c99-njfhg" Mar 09 09:09:03 crc kubenswrapper[4861]: I0309 09:09:03.453790 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/88b572cc-2fbf-42f5-8e0e-61b1b7c34ccb-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7d9c768c99-njfhg\" (UID: \"88b572cc-2fbf-42f5-8e0e-61b1b7c34ccb\") " 
pod="openshift-authentication/oauth-openshift-7d9c768c99-njfhg" Mar 09 09:09:03 crc kubenswrapper[4861]: I0309 09:09:03.453818 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/88b572cc-2fbf-42f5-8e0e-61b1b7c34ccb-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7d9c768c99-njfhg\" (UID: \"88b572cc-2fbf-42f5-8e0e-61b1b7c34ccb\") " pod="openshift-authentication/oauth-openshift-7d9c768c99-njfhg" Mar 09 09:09:03 crc kubenswrapper[4861]: I0309 09:09:03.453807 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/88b572cc-2fbf-42f5-8e0e-61b1b7c34ccb-audit-dir\") pod \"oauth-openshift-7d9c768c99-njfhg\" (UID: \"88b572cc-2fbf-42f5-8e0e-61b1b7c34ccb\") " pod="openshift-authentication/oauth-openshift-7d9c768c99-njfhg" Mar 09 09:09:03 crc kubenswrapper[4861]: I0309 09:09:03.453856 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/88b572cc-2fbf-42f5-8e0e-61b1b7c34ccb-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7d9c768c99-njfhg\" (UID: \"88b572cc-2fbf-42f5-8e0e-61b1b7c34ccb\") " pod="openshift-authentication/oauth-openshift-7d9c768c99-njfhg" Mar 09 09:09:03 crc kubenswrapper[4861]: I0309 09:09:03.453925 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/88b572cc-2fbf-42f5-8e0e-61b1b7c34ccb-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7d9c768c99-njfhg\" (UID: \"88b572cc-2fbf-42f5-8e0e-61b1b7c34ccb\") " pod="openshift-authentication/oauth-openshift-7d9c768c99-njfhg" Mar 09 09:09:03 crc kubenswrapper[4861]: I0309 09:09:03.454024 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/88b572cc-2fbf-42f5-8e0e-61b1b7c34ccb-audit-policies\") pod \"oauth-openshift-7d9c768c99-njfhg\" (UID: \"88b572cc-2fbf-42f5-8e0e-61b1b7c34ccb\") " pod="openshift-authentication/oauth-openshift-7d9c768c99-njfhg" Mar 09 09:09:03 crc kubenswrapper[4861]: I0309 09:09:03.454077 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/88b572cc-2fbf-42f5-8e0e-61b1b7c34ccb-v4-0-config-system-router-certs\") pod \"oauth-openshift-7d9c768c99-njfhg\" (UID: \"88b572cc-2fbf-42f5-8e0e-61b1b7c34ccb\") " pod="openshift-authentication/oauth-openshift-7d9c768c99-njfhg" Mar 09 09:09:03 crc kubenswrapper[4861]: I0309 09:09:03.454638 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/88b572cc-2fbf-42f5-8e0e-61b1b7c34ccb-v4-0-config-system-service-ca\") pod \"oauth-openshift-7d9c768c99-njfhg\" (UID: \"88b572cc-2fbf-42f5-8e0e-61b1b7c34ccb\") " pod="openshift-authentication/oauth-openshift-7d9c768c99-njfhg" Mar 09 09:09:03 crc kubenswrapper[4861]: I0309 09:09:03.454933 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/88b572cc-2fbf-42f5-8e0e-61b1b7c34ccb-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7d9c768c99-njfhg\" (UID: \"88b572cc-2fbf-42f5-8e0e-61b1b7c34ccb\") " pod="openshift-authentication/oauth-openshift-7d9c768c99-njfhg" Mar 09 09:09:03 crc kubenswrapper[4861]: I0309 09:09:03.455096 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/88b572cc-2fbf-42f5-8e0e-61b1b7c34ccb-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7d9c768c99-njfhg\" (UID: \"88b572cc-2fbf-42f5-8e0e-61b1b7c34ccb\") " pod="openshift-authentication/oauth-openshift-7d9c768c99-njfhg" Mar 09 
09:09:03 crc kubenswrapper[4861]: I0309 09:09:03.455510 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/88b572cc-2fbf-42f5-8e0e-61b1b7c34ccb-audit-policies\") pod \"oauth-openshift-7d9c768c99-njfhg\" (UID: \"88b572cc-2fbf-42f5-8e0e-61b1b7c34ccb\") " pod="openshift-authentication/oauth-openshift-7d9c768c99-njfhg" Mar 09 09:09:03 crc kubenswrapper[4861]: I0309 09:09:03.456995 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/88b572cc-2fbf-42f5-8e0e-61b1b7c34ccb-v4-0-config-user-template-error\") pod \"oauth-openshift-7d9c768c99-njfhg\" (UID: \"88b572cc-2fbf-42f5-8e0e-61b1b7c34ccb\") " pod="openshift-authentication/oauth-openshift-7d9c768c99-njfhg" Mar 09 09:09:03 crc kubenswrapper[4861]: I0309 09:09:03.457098 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/88b572cc-2fbf-42f5-8e0e-61b1b7c34ccb-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7d9c768c99-njfhg\" (UID: \"88b572cc-2fbf-42f5-8e0e-61b1b7c34ccb\") " pod="openshift-authentication/oauth-openshift-7d9c768c99-njfhg" Mar 09 09:09:03 crc kubenswrapper[4861]: I0309 09:09:03.458758 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/88b572cc-2fbf-42f5-8e0e-61b1b7c34ccb-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7d9c768c99-njfhg\" (UID: \"88b572cc-2fbf-42f5-8e0e-61b1b7c34ccb\") " pod="openshift-authentication/oauth-openshift-7d9c768c99-njfhg" Mar 09 09:09:03 crc kubenswrapper[4861]: I0309 09:09:03.459750 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/88b572cc-2fbf-42f5-8e0e-61b1b7c34ccb-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7d9c768c99-njfhg\" (UID: \"88b572cc-2fbf-42f5-8e0e-61b1b7c34ccb\") " pod="openshift-authentication/oauth-openshift-7d9c768c99-njfhg" Mar 09 09:09:03 crc kubenswrapper[4861]: I0309 09:09:03.460335 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/88b572cc-2fbf-42f5-8e0e-61b1b7c34ccb-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7d9c768c99-njfhg\" (UID: \"88b572cc-2fbf-42f5-8e0e-61b1b7c34ccb\") " pod="openshift-authentication/oauth-openshift-7d9c768c99-njfhg" Mar 09 09:09:03 crc kubenswrapper[4861]: I0309 09:09:03.461450 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/88b572cc-2fbf-42f5-8e0e-61b1b7c34ccb-v4-0-config-user-template-login\") pod \"oauth-openshift-7d9c768c99-njfhg\" (UID: \"88b572cc-2fbf-42f5-8e0e-61b1b7c34ccb\") " pod="openshift-authentication/oauth-openshift-7d9c768c99-njfhg" Mar 09 09:09:03 crc kubenswrapper[4861]: I0309 09:09:03.462829 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/88b572cc-2fbf-42f5-8e0e-61b1b7c34ccb-v4-0-config-system-router-certs\") pod \"oauth-openshift-7d9c768c99-njfhg\" (UID: \"88b572cc-2fbf-42f5-8e0e-61b1b7c34ccb\") " pod="openshift-authentication/oauth-openshift-7d9c768c99-njfhg" Mar 09 09:09:03 crc kubenswrapper[4861]: I0309 09:09:03.463894 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/88b572cc-2fbf-42f5-8e0e-61b1b7c34ccb-v4-0-config-system-session\") pod \"oauth-openshift-7d9c768c99-njfhg\" (UID: \"88b572cc-2fbf-42f5-8e0e-61b1b7c34ccb\") " pod="openshift-authentication/oauth-openshift-7d9c768c99-njfhg" 
Mar 09 09:09:03 crc kubenswrapper[4861]: I0309 09:09:03.474351 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtn9l\" (UniqueName: \"kubernetes.io/projected/88b572cc-2fbf-42f5-8e0e-61b1b7c34ccb-kube-api-access-rtn9l\") pod \"oauth-openshift-7d9c768c99-njfhg\" (UID: \"88b572cc-2fbf-42f5-8e0e-61b1b7c34ccb\") " pod="openshift-authentication/oauth-openshift-7d9c768c99-njfhg" Mar 09 09:09:03 crc kubenswrapper[4861]: I0309 09:09:03.658923 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-7d9c768c99-njfhg" Mar 09 09:09:03 crc kubenswrapper[4861]: I0309 09:09:03.927355 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-7d9c768c99-njfhg"] Mar 09 09:09:04 crc kubenswrapper[4861]: I0309 09:09:04.163218 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7b76458476-6twtf" event={"ID":"17fc9163-cd18-41c0-abbc-2f3d15063703","Type":"ContainerDied","Data":"9fc28cfabf6b742bf4eab906fa7dde41866db831dfa2b95bbc1a314aae2f6788"} Mar 09 09:09:04 crc kubenswrapper[4861]: I0309 09:09:04.163299 4861 scope.go:117] "RemoveContainer" containerID="559b8e24ae687e5a1b7e6f52bbb496c165639e1c743f12bf286c2273bb8ae09b" Mar 09 09:09:04 crc kubenswrapper[4861]: I0309 09:09:04.163783 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7b76458476-6twtf" Mar 09 09:09:04 crc kubenswrapper[4861]: I0309 09:09:04.164788 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-698587484d-g2lfn" Mar 09 09:09:04 crc kubenswrapper[4861]: I0309 09:09:04.165047 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7d9c768c99-njfhg" event={"ID":"88b572cc-2fbf-42f5-8e0e-61b1b7c34ccb","Type":"ContainerStarted","Data":"5bd4decd7bb172edab7b8b01b308d6fdea966d030974299a36d1f45e33503a9d"} Mar 09 09:09:04 crc kubenswrapper[4861]: I0309 09:09:04.197743 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7b76458476-6twtf"] Mar 09 09:09:04 crc kubenswrapper[4861]: I0309 09:09:04.210284 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7b76458476-6twtf"] Mar 09 09:09:04 crc kubenswrapper[4861]: I0309 09:09:04.217422 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-698587484d-g2lfn"] Mar 09 09:09:04 crc kubenswrapper[4861]: I0309 09:09:04.221970 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-698587484d-g2lfn"] Mar 09 09:09:04 crc kubenswrapper[4861]: I0309 09:09:04.317836 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-bbc7557c8-pw74s"] Mar 09 09:09:04 crc kubenswrapper[4861]: I0309 09:09:04.318681 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-bbc7557c8-pw74s"
Mar 09 09:09:04 crc kubenswrapper[4861]: I0309 09:09:04.320302 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 09 09:09:04 crc kubenswrapper[4861]: I0309 09:09:04.320832 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 09 09:09:04 crc kubenswrapper[4861]: I0309 09:09:04.320913 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 09 09:09:04 crc kubenswrapper[4861]: I0309 09:09:04.321306 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-56f4f7678b-gvf92"]
Mar 09 09:09:04 crc kubenswrapper[4861]: I0309 09:09:04.320976 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 09 09:09:04 crc kubenswrapper[4861]: I0309 09:09:04.321022 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 09 09:09:04 crc kubenswrapper[4861]: I0309 09:09:04.321926 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Mar 09 09:09:04 crc kubenswrapper[4861]: I0309 09:09:04.322288 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-56f4f7678b-gvf92"
Mar 09 09:09:04 crc kubenswrapper[4861]: I0309 09:09:04.324500 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 09 09:09:04 crc kubenswrapper[4861]: I0309 09:09:04.325110 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 09 09:09:04 crc kubenswrapper[4861]: I0309 09:09:04.325761 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 09 09:09:04 crc kubenswrapper[4861]: I0309 09:09:04.326247 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 09 09:09:04 crc kubenswrapper[4861]: I0309 09:09:04.330335 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 09 09:09:04 crc kubenswrapper[4861]: I0309 09:09:04.332164 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-bbc7557c8-pw74s"]
Mar 09 09:09:04 crc kubenswrapper[4861]: I0309 09:09:04.335157 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 09 09:09:04 crc kubenswrapper[4861]: I0309 09:09:04.335835 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 09 09:09:04 crc kubenswrapper[4861]: I0309 09:09:04.338297 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-56f4f7678b-gvf92"]
Mar 09 09:09:04 crc kubenswrapper[4861]: I0309 09:09:04.374281 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/12f9feff-47d4-471e-a979-7c37b3922248-client-ca\") pod \"controller-manager-56f4f7678b-gvf92\" (UID: \"12f9feff-47d4-471e-a979-7c37b3922248\") " pod="openshift-controller-manager/controller-manager-56f4f7678b-gvf92"
Mar 09 09:09:04 crc kubenswrapper[4861]: I0309 09:09:04.374347 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12f9feff-47d4-471e-a979-7c37b3922248-config\") pod \"controller-manager-56f4f7678b-gvf92\" (UID: \"12f9feff-47d4-471e-a979-7c37b3922248\") " pod="openshift-controller-manager/controller-manager-56f4f7678b-gvf92"
Mar 09 09:09:04 crc kubenswrapper[4861]: I0309 09:09:04.374407 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqkkf\" (UniqueName: \"kubernetes.io/projected/12f9feff-47d4-471e-a979-7c37b3922248-kube-api-access-tqkkf\") pod \"controller-manager-56f4f7678b-gvf92\" (UID: \"12f9feff-47d4-471e-a979-7c37b3922248\") " pod="openshift-controller-manager/controller-manager-56f4f7678b-gvf92"
Mar 09 09:09:04 crc kubenswrapper[4861]: I0309 09:09:04.374437 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/12f9feff-47d4-471e-a979-7c37b3922248-serving-cert\") pod \"controller-manager-56f4f7678b-gvf92\" (UID: \"12f9feff-47d4-471e-a979-7c37b3922248\") " pod="openshift-controller-manager/controller-manager-56f4f7678b-gvf92"
Mar 09 09:09:04 crc kubenswrapper[4861]: I0309 09:09:04.374466 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qn9mk\" (UniqueName: \"kubernetes.io/projected/f881fd60-0955-4706-8e9e-bdbc5259248a-kube-api-access-qn9mk\") pod \"route-controller-manager-bbc7557c8-pw74s\" (UID: \"f881fd60-0955-4706-8e9e-bdbc5259248a\") " pod="openshift-route-controller-manager/route-controller-manager-bbc7557c8-pw74s"
Mar 09 09:09:04 crc kubenswrapper[4861]: I0309 09:09:04.374498 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f881fd60-0955-4706-8e9e-bdbc5259248a-serving-cert\") pod \"route-controller-manager-bbc7557c8-pw74s\" (UID: \"f881fd60-0955-4706-8e9e-bdbc5259248a\") " pod="openshift-route-controller-manager/route-controller-manager-bbc7557c8-pw74s"
Mar 09 09:09:04 crc kubenswrapper[4861]: I0309 09:09:04.374530 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f881fd60-0955-4706-8e9e-bdbc5259248a-config\") pod \"route-controller-manager-bbc7557c8-pw74s\" (UID: \"f881fd60-0955-4706-8e9e-bdbc5259248a\") " pod="openshift-route-controller-manager/route-controller-manager-bbc7557c8-pw74s"
Mar 09 09:09:04 crc kubenswrapper[4861]: I0309 09:09:04.374559 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/12f9feff-47d4-471e-a979-7c37b3922248-proxy-ca-bundles\") pod \"controller-manager-56f4f7678b-gvf92\" (UID: \"12f9feff-47d4-471e-a979-7c37b3922248\") " pod="openshift-controller-manager/controller-manager-56f4f7678b-gvf92"
Mar 09 09:09:04 crc kubenswrapper[4861]: I0309 09:09:04.374795 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f881fd60-0955-4706-8e9e-bdbc5259248a-client-ca\") pod \"route-controller-manager-bbc7557c8-pw74s\" (UID: \"f881fd60-0955-4706-8e9e-bdbc5259248a\") " pod="openshift-route-controller-manager/route-controller-manager-bbc7557c8-pw74s"
Mar 09 09:09:04 crc kubenswrapper[4861]: I0309 09:09:04.475982 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qn9mk\" (UniqueName: \"kubernetes.io/projected/f881fd60-0955-4706-8e9e-bdbc5259248a-kube-api-access-qn9mk\") pod \"route-controller-manager-bbc7557c8-pw74s\" (UID: \"f881fd60-0955-4706-8e9e-bdbc5259248a\") " pod="openshift-route-controller-manager/route-controller-manager-bbc7557c8-pw74s"
Mar 09 09:09:04 crc kubenswrapper[4861]: I0309 09:09:04.476039 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f881fd60-0955-4706-8e9e-bdbc5259248a-serving-cert\") pod \"route-controller-manager-bbc7557c8-pw74s\" (UID: \"f881fd60-0955-4706-8e9e-bdbc5259248a\") " pod="openshift-route-controller-manager/route-controller-manager-bbc7557c8-pw74s"
Mar 09 09:09:04 crc kubenswrapper[4861]: I0309 09:09:04.476073 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f881fd60-0955-4706-8e9e-bdbc5259248a-config\") pod \"route-controller-manager-bbc7557c8-pw74s\" (UID: \"f881fd60-0955-4706-8e9e-bdbc5259248a\") " pod="openshift-route-controller-manager/route-controller-manager-bbc7557c8-pw74s"
Mar 09 09:09:04 crc kubenswrapper[4861]: I0309 09:09:04.476093 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/12f9feff-47d4-471e-a979-7c37b3922248-proxy-ca-bundles\") pod \"controller-manager-56f4f7678b-gvf92\" (UID: \"12f9feff-47d4-471e-a979-7c37b3922248\") " pod="openshift-controller-manager/controller-manager-56f4f7678b-gvf92"
Mar 09 09:09:04 crc kubenswrapper[4861]: I0309 09:09:04.476122 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f881fd60-0955-4706-8e9e-bdbc5259248a-client-ca\") pod \"route-controller-manager-bbc7557c8-pw74s\" (UID: \"f881fd60-0955-4706-8e9e-bdbc5259248a\") " pod="openshift-route-controller-manager/route-controller-manager-bbc7557c8-pw74s"
Mar 09 09:09:04 crc kubenswrapper[4861]: I0309 09:09:04.476147 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/12f9feff-47d4-471e-a979-7c37b3922248-client-ca\") pod \"controller-manager-56f4f7678b-gvf92\" (UID: \"12f9feff-47d4-471e-a979-7c37b3922248\") " pod="openshift-controller-manager/controller-manager-56f4f7678b-gvf92"
Mar 09 09:09:04 crc kubenswrapper[4861]: I0309 09:09:04.476170 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12f9feff-47d4-471e-a979-7c37b3922248-config\") pod \"controller-manager-56f4f7678b-gvf92\" (UID: \"12f9feff-47d4-471e-a979-7c37b3922248\") " pod="openshift-controller-manager/controller-manager-56f4f7678b-gvf92"
Mar 09 09:09:04 crc kubenswrapper[4861]: I0309 09:09:04.476208 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqkkf\" (UniqueName: \"kubernetes.io/projected/12f9feff-47d4-471e-a979-7c37b3922248-kube-api-access-tqkkf\") pod \"controller-manager-56f4f7678b-gvf92\" (UID: \"12f9feff-47d4-471e-a979-7c37b3922248\") " pod="openshift-controller-manager/controller-manager-56f4f7678b-gvf92"
Mar 09 09:09:04 crc kubenswrapper[4861]: I0309 09:09:04.476232 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/12f9feff-47d4-471e-a979-7c37b3922248-serving-cert\") pod \"controller-manager-56f4f7678b-gvf92\" (UID: \"12f9feff-47d4-471e-a979-7c37b3922248\") " pod="openshift-controller-manager/controller-manager-56f4f7678b-gvf92"
Mar 09 09:09:04 crc kubenswrapper[4861]: I0309 09:09:04.477224 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f881fd60-0955-4706-8e9e-bdbc5259248a-client-ca\") pod \"route-controller-manager-bbc7557c8-pw74s\" (UID: \"f881fd60-0955-4706-8e9e-bdbc5259248a\") " pod="openshift-route-controller-manager/route-controller-manager-bbc7557c8-pw74s"
Mar 09 09:09:04 crc kubenswrapper[4861]: I0309 09:09:04.477959 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f881fd60-0955-4706-8e9e-bdbc5259248a-config\") pod \"route-controller-manager-bbc7557c8-pw74s\" (UID: \"f881fd60-0955-4706-8e9e-bdbc5259248a\") " pod="openshift-route-controller-manager/route-controller-manager-bbc7557c8-pw74s"
Mar 09 09:09:04 crc kubenswrapper[4861]: I0309 09:09:04.478286 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/12f9feff-47d4-471e-a979-7c37b3922248-proxy-ca-bundles\") pod \"controller-manager-56f4f7678b-gvf92\" (UID: \"12f9feff-47d4-471e-a979-7c37b3922248\") " pod="openshift-controller-manager/controller-manager-56f4f7678b-gvf92"
Mar 09 09:09:04 crc kubenswrapper[4861]: I0309 09:09:04.478990 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/12f9feff-47d4-471e-a979-7c37b3922248-client-ca\") pod \"controller-manager-56f4f7678b-gvf92\" (UID: \"12f9feff-47d4-471e-a979-7c37b3922248\") " pod="openshift-controller-manager/controller-manager-56f4f7678b-gvf92"
Mar 09 09:09:04 crc kubenswrapper[4861]: I0309 09:09:04.479445 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12f9feff-47d4-471e-a979-7c37b3922248-config\") pod \"controller-manager-56f4f7678b-gvf92\" (UID: \"12f9feff-47d4-471e-a979-7c37b3922248\") " pod="openshift-controller-manager/controller-manager-56f4f7678b-gvf92"
Mar 09 09:09:04 crc kubenswrapper[4861]: I0309 09:09:04.481224 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/12f9feff-47d4-471e-a979-7c37b3922248-serving-cert\") pod \"controller-manager-56f4f7678b-gvf92\" (UID: \"12f9feff-47d4-471e-a979-7c37b3922248\") " pod="openshift-controller-manager/controller-manager-56f4f7678b-gvf92"
Mar 09 09:09:04 crc kubenswrapper[4861]: I0309 09:09:04.481620 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f881fd60-0955-4706-8e9e-bdbc5259248a-serving-cert\") pod \"route-controller-manager-bbc7557c8-pw74s\" (UID: \"f881fd60-0955-4706-8e9e-bdbc5259248a\") " pod="openshift-route-controller-manager/route-controller-manager-bbc7557c8-pw74s"
Mar 09 09:09:04 crc kubenswrapper[4861]: I0309 09:09:04.499418 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qn9mk\" (UniqueName: \"kubernetes.io/projected/f881fd60-0955-4706-8e9e-bdbc5259248a-kube-api-access-qn9mk\") pod \"route-controller-manager-bbc7557c8-pw74s\" (UID: \"f881fd60-0955-4706-8e9e-bdbc5259248a\") " pod="openshift-route-controller-manager/route-controller-manager-bbc7557c8-pw74s"
Mar 09 09:09:04 crc kubenswrapper[4861]: I0309 09:09:04.503154 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqkkf\" (UniqueName: \"kubernetes.io/projected/12f9feff-47d4-471e-a979-7c37b3922248-kube-api-access-tqkkf\") pod \"controller-manager-56f4f7678b-gvf92\" (UID: \"12f9feff-47d4-471e-a979-7c37b3922248\") " pod="openshift-controller-manager/controller-manager-56f4f7678b-gvf92"
Mar 09 09:09:04 crc kubenswrapper[4861]: I0309 09:09:04.634334 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-bbc7557c8-pw74s"
Mar 09 09:09:04 crc kubenswrapper[4861]: I0309 09:09:04.641995 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-56f4f7678b-gvf92"
Mar 09 09:09:05 crc kubenswrapper[4861]: I0309 09:09:05.084324 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-bbc7557c8-pw74s"]
Mar 09 09:09:05 crc kubenswrapper[4861]: W0309 09:09:05.129820 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf881fd60_0955_4706_8e9e_bdbc5259248a.slice/crio-15ac6a5fe35159a01277c7a83652230cdde3527c2bde1c16167a6b01f3b796a8 WatchSource:0}: Error finding container 15ac6a5fe35159a01277c7a83652230cdde3527c2bde1c16167a6b01f3b796a8: Status 404 returned error can't find the container with id 15ac6a5fe35159a01277c7a83652230cdde3527c2bde1c16167a6b01f3b796a8
Mar 09 09:09:05 crc kubenswrapper[4861]: I0309 09:09:05.184465 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-56f4f7678b-gvf92"]
Mar 09 09:09:05 crc kubenswrapper[4861]: I0309 09:09:05.190000 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-bbc7557c8-pw74s" event={"ID":"f881fd60-0955-4706-8e9e-bdbc5259248a","Type":"ContainerStarted","Data":"15ac6a5fe35159a01277c7a83652230cdde3527c2bde1c16167a6b01f3b796a8"}
Mar 09 09:09:05 crc kubenswrapper[4861]: W0309 09:09:05.192249 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod12f9feff_47d4_471e_a979_7c37b3922248.slice/crio-e455e9215f78f23b78cfea665a95c1cb5c920f8b26321f8cdeafee622ba54e66 WatchSource:0}: Error finding container e455e9215f78f23b78cfea665a95c1cb5c920f8b26321f8cdeafee622ba54e66: Status 404 returned error can't find the container with id e455e9215f78f23b78cfea665a95c1cb5c920f8b26321f8cdeafee622ba54e66
Mar 09 09:09:05 crc kubenswrapper[4861]: I0309 09:09:05.192876 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7d9c768c99-njfhg" event={"ID":"88b572cc-2fbf-42f5-8e0e-61b1b7c34ccb","Type":"ContainerStarted","Data":"b863cffd9ea5461189e05abfc2d73319cf57730660cc209aadab709dd0a97657"}
Mar 09 09:09:05 crc kubenswrapper[4861]: I0309 09:09:05.193494 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-7d9c768c99-njfhg"
Mar 09 09:09:05 crc kubenswrapper[4861]: I0309 09:09:05.200824 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-7d9c768c99-njfhg"
Mar 09 09:09:05 crc kubenswrapper[4861]: I0309 09:09:05.222149 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-7d9c768c99-njfhg" podStartSLOduration=36.222125773 podStartE2EDuration="36.222125773s" podCreationTimestamp="2026-03-09 09:08:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:09:05.221464545 +0000 UTC m=+188.306504036" watchObservedRunningTime="2026-03-09 09:09:05.222125773 +0000 UTC m=+188.307165214"
Mar 09 09:09:05 crc kubenswrapper[4861]: I0309 09:09:05.663990 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17fc9163-cd18-41c0-abbc-2f3d15063703" path="/var/lib/kubelet/pods/17fc9163-cd18-41c0-abbc-2f3d15063703/volumes"
Mar 09 09:09:05 crc kubenswrapper[4861]: I0309 09:09:05.664911 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d49cdfb-1198-408e-a97f-a03243e5a46c" path="/var/lib/kubelet/pods/2d49cdfb-1198-408e-a97f-a03243e5a46c/volumes"
Mar 09 09:09:06 crc kubenswrapper[4861]: I0309 09:09:06.212827 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-bbc7557c8-pw74s" event={"ID":"f881fd60-0955-4706-8e9e-bdbc5259248a","Type":"ContainerStarted","Data":"2a135fb881cf0d925cb65fe93fd4f74cb25f415e3353fe96d1640c011e4fb831"}
Mar 09 09:09:06 crc kubenswrapper[4861]: I0309 09:09:06.213196 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-bbc7557c8-pw74s"
Mar 09 09:09:06 crc kubenswrapper[4861]: I0309 09:09:06.215741 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-56f4f7678b-gvf92" event={"ID":"12f9feff-47d4-471e-a979-7c37b3922248","Type":"ContainerStarted","Data":"14ec663e66363fe8d020fc1dc17483d826281950522d554197785faa5b8b1b4b"}
Mar 09 09:09:06 crc kubenswrapper[4861]: I0309 09:09:06.215802 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-56f4f7678b-gvf92" event={"ID":"12f9feff-47d4-471e-a979-7c37b3922248","Type":"ContainerStarted","Data":"e455e9215f78f23b78cfea665a95c1cb5c920f8b26321f8cdeafee622ba54e66"}
Mar 09 09:09:06 crc kubenswrapper[4861]: I0309 09:09:06.220752 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-bbc7557c8-pw74s"
Mar 09 09:09:06 crc kubenswrapper[4861]: I0309 09:09:06.263975 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-bbc7557c8-pw74s" podStartSLOduration=4.263916268 podStartE2EDuration="4.263916268s" podCreationTimestamp="2026-03-09 09:09:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:09:06.241698604 +0000 UTC m=+189.326738005" watchObservedRunningTime="2026-03-09 09:09:06.263916268 +0000 UTC m=+189.348955699"
Mar 09 09:09:06 crc kubenswrapper[4861]: I0309 09:09:06.265055 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-56f4f7678b-gvf92" podStartSLOduration=4.265042509 podStartE2EDuration="4.265042509s" podCreationTimestamp="2026-03-09 09:09:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:09:06.261218623 +0000 UTC m=+189.346258044" watchObservedRunningTime="2026-03-09 09:09:06.265042509 +0000 UTC m=+189.350081940"
Mar 09 09:09:07 crc kubenswrapper[4861]: I0309 09:09:07.223544 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-56f4f7678b-gvf92"
Mar 09 09:09:07 crc kubenswrapper[4861]: I0309 09:09:07.232045 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-56f4f7678b-gvf92"
Mar 09 09:09:14 crc kubenswrapper[4861]: I0309 09:09:14.028355 4861 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Mar 09 09:09:14 crc kubenswrapper[4861]: I0309 09:09:14.029673 4861 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Mar 09 09:09:14 crc kubenswrapper[4861]: I0309 09:09:14.029949 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://7b02ebcfd63db720474d3acbc618d87e9b419be0847a9f96152b5037ca454a36" gracePeriod=15
Mar 09 09:09:14 crc kubenswrapper[4861]: I0309 09:09:14.030030 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://7c14650b94b6bde491a2375ca5766d14fc93947f5b993395965a00a13a953489" gracePeriod=15
Mar 09 crc kubenswrapper[4861]: I0309 09:09:14.030066 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://dadb8247e79ce5a6d77850c3b946b50622b8aa99369385e0d303b7e232e97b54" gracePeriod=15
Mar 09 09:09:14 crc kubenswrapper[4861]: I0309 09:09:14.030129 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://c2f4b30f196b7900b7eed61144f32f48056374dc7e92c28cd5bffbf774945702" gracePeriod=15
Mar 09 09:09:14 crc kubenswrapper[4861]: I0309 09:09:14.030148 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 09 09:09:14 crc kubenswrapper[4861]: I0309 09:09:14.030154 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://f9d12c95c652ca0a136cf2124f76dce6920bef3013aa0ed2eabd6a1deaf707ed" gracePeriod=15
Mar 09 09:09:14 crc kubenswrapper[4861]: I0309 09:09:14.031697 4861 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Mar 09 09:09:14 crc kubenswrapper[4861]: E0309 09:09:14.031986 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 09 09:09:14 crc kubenswrapper[4861]: I0309 09:09:14.032008 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 09 09:09:14 crc kubenswrapper[4861]: E0309 09:09:14.032027 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 09 09:09:14 crc kubenswrapper[4861]: I0309 09:09:14.032041 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 09 09:09:14 crc kubenswrapper[4861]: E0309 09:09:14.032059 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 09 09:09:14 crc kubenswrapper[4861]: I0309 09:09:14.032071 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 09 09:09:14 crc kubenswrapper[4861]: E0309 09:09:14.032088 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 09 09:09:14 crc kubenswrapper[4861]: I0309 09:09:14.032100 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 09 09:09:14 crc kubenswrapper[4861]: E0309 09:09:14.032118 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver"
Mar 09 09:09:14 crc kubenswrapper[4861]: I0309 09:09:14.032129 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver"
Mar 09 09:09:14 crc kubenswrapper[4861]: E0309 09:09:14.032149 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer"
Mar 09 09:09:14 crc kubenswrapper[4861]: I0309 09:09:14.032162 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer"
Mar 09 09:09:14 crc kubenswrapper[4861]: E0309 09:09:14.032181 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz"
Mar 09 09:09:14 crc kubenswrapper[4861]: I0309 09:09:14.032193 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz"
Mar 09 09:09:14 crc kubenswrapper[4861]: E0309 09:09:14.032208 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup"
Mar 09 09:09:14 crc kubenswrapper[4861]: I0309 09:09:14.032219 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup"
Mar 09 09:09:14 crc kubenswrapper[4861]: E0309 09:09:14.032241 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller"
Mar 09 09:09:14 crc kubenswrapper[4861]: I0309 09:09:14.032254 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller"
Mar 09 09:09:14 crc kubenswrapper[4861]: I0309 09:09:14.032472 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 09 09:09:14 crc kubenswrapper[4861]: I0309 09:09:14.032490 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 09 09:09:14 crc kubenswrapper[4861]: I0309 09:09:14.032511 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer"
Mar 09 09:09:14 crc kubenswrapper[4861]: I0309 09:09:14.032532 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver"
Mar 09 09:09:14 crc kubenswrapper[4861]: I0309 09:09:14.032550 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz"
Mar 09 09:09:14 crc kubenswrapper[4861]: I0309 09:09:14.032564 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 09 09:09:14 crc kubenswrapper[4861]: I0309 09:09:14.032579 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller"
Mar 09 09:09:14 crc kubenswrapper[4861]: E0309 09:09:14.032740 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 09 09:09:14 crc kubenswrapper[4861]: I0309 09:09:14.032754 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 09 09:09:14 crc kubenswrapper[4861]: I0309 09:09:14.032963 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 09 09:09:14 crc kubenswrapper[4861]: I0309 09:09:14.032983 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 09 09:09:14 crc kubenswrapper[4861]: I0309 09:09:14.116609 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 09 09:09:14 crc kubenswrapper[4861]: I0309 09:09:14.116693 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 09 09:09:14 crc kubenswrapper[4861]: I0309 09:09:14.116768 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 09 09:09:14 crc kubenswrapper[4861]: I0309 09:09:14.116791 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 09 09:09:14 crc kubenswrapper[4861]: I0309 09:09:14.116813 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 09 09:09:14 crc kubenswrapper[4861]: I0309 09:09:14.116835 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 09 09:09:14 crc kubenswrapper[4861]: I0309 09:09:14.116869 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 09 09:09:14 crc kubenswrapper[4861]: I0309 09:09:14.116915 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 09 09:09:14 crc kubenswrapper[4861]: I0309 09:09:14.217720 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 09 09:09:14 crc kubenswrapper[4861]: I0309 09:09:14.217765 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 09 09:09:14 crc kubenswrapper[4861]: I0309 09:09:14.217783 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 09 09:09:14 crc kubenswrapper[4861]: I0309 09:09:14.217804 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 09 09:09:14 crc kubenswrapper[4861]: I0309 09:09:14.217838 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 09 09:09:14 crc kubenswrapper[4861]: I0309 09:09:14.217854 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 09 09:09:14 crc kubenswrapper[4861]: I0309 09:09:14.217871 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 09 09:09:14 crc kubenswrapper[4861]: I0309 09:09:14.217901 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 09 09:09:14 crc kubenswrapper[4861]: I0309 09:09:14.217910 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 09 09:09:14 crc kubenswrapper[4861]: I0309 09:09:14.217930 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 09 09:09:14 crc kubenswrapper[4861]: I0309 09:09:14.217863 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 09 09:09:14 crc kubenswrapper[4861]: I0309 09:09:14.217904 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 09 09:09:14 crc kubenswrapper[4861]: I0309 09:09:14.217995 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 09 09:09:14 crc kubenswrapper[4861]: I0309 09:09:14.218040 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 09 09:09:14 crc kubenswrapper[4861]: I0309 09:09:14.218056 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 09 09:09:14 crc kubenswrapper[4861]: I0309 09:09:14.218058 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 09 09:09:14 crc kubenswrapper[4861]: I0309 09:09:14.273247 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log"
Mar 09 09:09:14 crc kubenswrapper[4861]: I0309 09:09:14.274724 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Mar 09 09:09:14 crc kubenswrapper[4861]: I0309 09:09:14.275301 4861 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="f9d12c95c652ca0a136cf2124f76dce6920bef3013aa0ed2eabd6a1deaf707ed" exitCode=0
Mar 09 09:09:14 crc kubenswrapper[4861]: I0309 09:09:14.275326 4861 generic.go:334] "Generic (PLEG): container finished"
podID="f4b27818a5e8e43d0dc095d08835c792" containerID="7c14650b94b6bde491a2375ca5766d14fc93947f5b993395965a00a13a953489" exitCode=0 Mar 09 09:09:14 crc kubenswrapper[4861]: I0309 09:09:14.275334 4861 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="dadb8247e79ce5a6d77850c3b946b50622b8aa99369385e0d303b7e232e97b54" exitCode=0 Mar 09 09:09:14 crc kubenswrapper[4861]: I0309 09:09:14.275343 4861 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c2f4b30f196b7900b7eed61144f32f48056374dc7e92c28cd5bffbf774945702" exitCode=2 Mar 09 09:09:14 crc kubenswrapper[4861]: I0309 09:09:14.275468 4861 scope.go:117] "RemoveContainer" containerID="f88f2e129d4bfc17b0ac4416da5c5096bf314e097dfa40a48858a83425ca91e1" Mar 09 09:09:14 crc kubenswrapper[4861]: I0309 09:09:14.277625 4861 generic.go:334] "Generic (PLEG): container finished" podID="6be593ec-c05d-4500-961d-74aa43fa5ff7" containerID="2ed2fe73d7fec7263224791214e79191dbf6b4ef7b1abf9585957c2c63d89b7c" exitCode=0 Mar 09 09:09:14 crc kubenswrapper[4861]: I0309 09:09:14.277661 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"6be593ec-c05d-4500-961d-74aa43fa5ff7","Type":"ContainerDied","Data":"2ed2fe73d7fec7263224791214e79191dbf6b4ef7b1abf9585957c2c63d89b7c"} Mar 09 09:09:14 crc kubenswrapper[4861]: I0309 09:09:14.278787 4861 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.163:6443: connect: connection refused" Mar 09 09:09:14 crc kubenswrapper[4861]: I0309 09:09:14.279031 4861 status_manager.go:851] "Failed to get status for pod" podUID="6be593ec-c05d-4500-961d-74aa43fa5ff7" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.163:6443: connect: connection refused" Mar 09 09:09:15 crc kubenswrapper[4861]: I0309 09:09:15.291019 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 09 09:09:15 crc kubenswrapper[4861]: I0309 09:09:15.731006 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 09 09:09:15 crc kubenswrapper[4861]: I0309 09:09:15.731567 4861 status_manager.go:851] "Failed to get status for pod" podUID="6be593ec-c05d-4500-961d-74aa43fa5ff7" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.163:6443: connect: connection refused" Mar 09 09:09:15 crc kubenswrapper[4861]: I0309 09:09:15.842306 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6be593ec-c05d-4500-961d-74aa43fa5ff7-kubelet-dir\") pod \"6be593ec-c05d-4500-961d-74aa43fa5ff7\" (UID: \"6be593ec-c05d-4500-961d-74aa43fa5ff7\") " Mar 09 09:09:15 crc kubenswrapper[4861]: I0309 09:09:15.842484 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6be593ec-c05d-4500-961d-74aa43fa5ff7-var-lock\") pod \"6be593ec-c05d-4500-961d-74aa43fa5ff7\" (UID: \"6be593ec-c05d-4500-961d-74aa43fa5ff7\") " Mar 09 09:09:15 crc kubenswrapper[4861]: I0309 09:09:15.842473 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6be593ec-c05d-4500-961d-74aa43fa5ff7-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "6be593ec-c05d-4500-961d-74aa43fa5ff7" (UID: 
"6be593ec-c05d-4500-961d-74aa43fa5ff7"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 09:09:15 crc kubenswrapper[4861]: I0309 09:09:15.842594 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6be593ec-c05d-4500-961d-74aa43fa5ff7-kube-api-access\") pod \"6be593ec-c05d-4500-961d-74aa43fa5ff7\" (UID: \"6be593ec-c05d-4500-961d-74aa43fa5ff7\") " Mar 09 09:09:15 crc kubenswrapper[4861]: I0309 09:09:15.842636 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6be593ec-c05d-4500-961d-74aa43fa5ff7-var-lock" (OuterVolumeSpecName: "var-lock") pod "6be593ec-c05d-4500-961d-74aa43fa5ff7" (UID: "6be593ec-c05d-4500-961d-74aa43fa5ff7"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 09:09:15 crc kubenswrapper[4861]: I0309 09:09:15.842816 4861 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6be593ec-c05d-4500-961d-74aa43fa5ff7-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 09 09:09:15 crc kubenswrapper[4861]: I0309 09:09:15.842830 4861 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6be593ec-c05d-4500-961d-74aa43fa5ff7-var-lock\") on node \"crc\" DevicePath \"\"" Mar 09 09:09:15 crc kubenswrapper[4861]: I0309 09:09:15.852021 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6be593ec-c05d-4500-961d-74aa43fa5ff7-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "6be593ec-c05d-4500-961d-74aa43fa5ff7" (UID: "6be593ec-c05d-4500-961d-74aa43fa5ff7"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:09:15 crc kubenswrapper[4861]: I0309 09:09:15.944916 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6be593ec-c05d-4500-961d-74aa43fa5ff7-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 09 09:09:16 crc kubenswrapper[4861]: I0309 09:09:16.300805 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"6be593ec-c05d-4500-961d-74aa43fa5ff7","Type":"ContainerDied","Data":"c95bf7c2f7a33395d15dae2d27d73eec14a36e4fb890c8e1d3199f83f49d3173"} Mar 09 09:09:16 crc kubenswrapper[4861]: I0309 09:09:16.301044 4861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c95bf7c2f7a33395d15dae2d27d73eec14a36e4fb890c8e1d3199f83f49d3173" Mar 09 09:09:16 crc kubenswrapper[4861]: I0309 09:09:16.300878 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 09 09:09:16 crc kubenswrapper[4861]: I0309 09:09:16.312919 4861 status_manager.go:851] "Failed to get status for pod" podUID="6be593ec-c05d-4500-961d-74aa43fa5ff7" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.163:6443: connect: connection refused" Mar 09 09:09:16 crc kubenswrapper[4861]: I0309 09:09:16.409330 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 09 09:09:16 crc kubenswrapper[4861]: I0309 09:09:16.410306 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 09:09:16 crc kubenswrapper[4861]: I0309 09:09:16.411168 4861 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.163:6443: connect: connection refused" Mar 09 09:09:16 crc kubenswrapper[4861]: I0309 09:09:16.411736 4861 status_manager.go:851] "Failed to get status for pod" podUID="6be593ec-c05d-4500-961d-74aa43fa5ff7" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.163:6443: connect: connection refused" Mar 09 09:09:16 crc kubenswrapper[4861]: I0309 09:09:16.451826 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 09 09:09:16 crc kubenswrapper[4861]: I0309 09:09:16.451893 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 09:09:16 crc kubenswrapper[4861]: I0309 09:09:16.451936 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 09 09:09:16 crc kubenswrapper[4861]: I0309 09:09:16.451963 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 09:09:16 crc kubenswrapper[4861]: I0309 09:09:16.452026 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 09 09:09:16 crc kubenswrapper[4861]: I0309 09:09:16.452112 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 09:09:16 crc kubenswrapper[4861]: I0309 09:09:16.452318 4861 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 09 09:09:16 crc kubenswrapper[4861]: I0309 09:09:16.452350 4861 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Mar 09 09:09:16 crc kubenswrapper[4861]: I0309 09:09:16.452428 4861 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 09 09:09:17 crc kubenswrapper[4861]: I0309 09:09:17.311552 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 09 09:09:17 crc kubenswrapper[4861]: I0309 09:09:17.314095 4861 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="7b02ebcfd63db720474d3acbc618d87e9b419be0847a9f96152b5037ca454a36" exitCode=0 Mar 09 09:09:17 crc kubenswrapper[4861]: I0309 09:09:17.314169 4861 scope.go:117] "RemoveContainer" containerID="f9d12c95c652ca0a136cf2124f76dce6920bef3013aa0ed2eabd6a1deaf707ed" Mar 09 09:09:17 crc kubenswrapper[4861]: I0309 09:09:17.314228 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 09:09:17 crc kubenswrapper[4861]: I0309 09:09:17.334670 4861 scope.go:117] "RemoveContainer" containerID="7c14650b94b6bde491a2375ca5766d14fc93947f5b993395965a00a13a953489" Mar 09 09:09:17 crc kubenswrapper[4861]: I0309 09:09:17.338324 4861 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.163:6443: connect: connection refused" Mar 09 09:09:17 crc kubenswrapper[4861]: I0309 09:09:17.338769 4861 status_manager.go:851] "Failed to get status for pod" podUID="6be593ec-c05d-4500-961d-74aa43fa5ff7" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.163:6443: connect: connection refused" Mar 09 09:09:17 crc kubenswrapper[4861]: I0309 09:09:17.356501 4861 scope.go:117] "RemoveContainer" containerID="dadb8247e79ce5a6d77850c3b946b50622b8aa99369385e0d303b7e232e97b54" Mar 09 09:09:17 crc kubenswrapper[4861]: I0309 09:09:17.374341 4861 scope.go:117] "RemoveContainer" containerID="c2f4b30f196b7900b7eed61144f32f48056374dc7e92c28cd5bffbf774945702" Mar 09 09:09:17 crc kubenswrapper[4861]: I0309 09:09:17.392097 4861 scope.go:117] "RemoveContainer" containerID="7b02ebcfd63db720474d3acbc618d87e9b419be0847a9f96152b5037ca454a36" Mar 09 09:09:17 crc kubenswrapper[4861]: I0309 09:09:17.408573 4861 scope.go:117] "RemoveContainer" containerID="591835de84a2d61d44c0723a30b71b9fae2661bb7db61e8938a9e3470dba5c25" Mar 09 09:09:17 crc kubenswrapper[4861]: I0309 09:09:17.433188 4861 scope.go:117] "RemoveContainer" containerID="f9d12c95c652ca0a136cf2124f76dce6920bef3013aa0ed2eabd6a1deaf707ed" Mar 09 09:09:17 crc kubenswrapper[4861]: E0309 09:09:17.434002 4861 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9d12c95c652ca0a136cf2124f76dce6920bef3013aa0ed2eabd6a1deaf707ed\": container with ID starting with f9d12c95c652ca0a136cf2124f76dce6920bef3013aa0ed2eabd6a1deaf707ed not found: ID does not exist" containerID="f9d12c95c652ca0a136cf2124f76dce6920bef3013aa0ed2eabd6a1deaf707ed" Mar 09 09:09:17 crc kubenswrapper[4861]: I0309 09:09:17.434047 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9d12c95c652ca0a136cf2124f76dce6920bef3013aa0ed2eabd6a1deaf707ed"} err="failed to get container status \"f9d12c95c652ca0a136cf2124f76dce6920bef3013aa0ed2eabd6a1deaf707ed\": rpc error: code = NotFound desc = could not find container \"f9d12c95c652ca0a136cf2124f76dce6920bef3013aa0ed2eabd6a1deaf707ed\": container with ID starting with f9d12c95c652ca0a136cf2124f76dce6920bef3013aa0ed2eabd6a1deaf707ed not found: ID does not exist" Mar 09 09:09:17 crc kubenswrapper[4861]: I0309 09:09:17.434078 4861 scope.go:117] "RemoveContainer" containerID="7c14650b94b6bde491a2375ca5766d14fc93947f5b993395965a00a13a953489" Mar 09 09:09:17 crc kubenswrapper[4861]: E0309 09:09:17.434820 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c14650b94b6bde491a2375ca5766d14fc93947f5b993395965a00a13a953489\": container with ID starting with 7c14650b94b6bde491a2375ca5766d14fc93947f5b993395965a00a13a953489 not found: ID does not exist" containerID="7c14650b94b6bde491a2375ca5766d14fc93947f5b993395965a00a13a953489" Mar 09 09:09:17 crc kubenswrapper[4861]: I0309 09:09:17.434874 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c14650b94b6bde491a2375ca5766d14fc93947f5b993395965a00a13a953489"} err="failed to get container status \"7c14650b94b6bde491a2375ca5766d14fc93947f5b993395965a00a13a953489\": rpc error: code = NotFound desc = could 
not find container \"7c14650b94b6bde491a2375ca5766d14fc93947f5b993395965a00a13a953489\": container with ID starting with 7c14650b94b6bde491a2375ca5766d14fc93947f5b993395965a00a13a953489 not found: ID does not exist" Mar 09 09:09:17 crc kubenswrapper[4861]: I0309 09:09:17.434906 4861 scope.go:117] "RemoveContainer" containerID="dadb8247e79ce5a6d77850c3b946b50622b8aa99369385e0d303b7e232e97b54" Mar 09 09:09:17 crc kubenswrapper[4861]: E0309 09:09:17.435341 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dadb8247e79ce5a6d77850c3b946b50622b8aa99369385e0d303b7e232e97b54\": container with ID starting with dadb8247e79ce5a6d77850c3b946b50622b8aa99369385e0d303b7e232e97b54 not found: ID does not exist" containerID="dadb8247e79ce5a6d77850c3b946b50622b8aa99369385e0d303b7e232e97b54" Mar 09 09:09:17 crc kubenswrapper[4861]: I0309 09:09:17.435398 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dadb8247e79ce5a6d77850c3b946b50622b8aa99369385e0d303b7e232e97b54"} err="failed to get container status \"dadb8247e79ce5a6d77850c3b946b50622b8aa99369385e0d303b7e232e97b54\": rpc error: code = NotFound desc = could not find container \"dadb8247e79ce5a6d77850c3b946b50622b8aa99369385e0d303b7e232e97b54\": container with ID starting with dadb8247e79ce5a6d77850c3b946b50622b8aa99369385e0d303b7e232e97b54 not found: ID does not exist" Mar 09 09:09:17 crc kubenswrapper[4861]: I0309 09:09:17.435423 4861 scope.go:117] "RemoveContainer" containerID="c2f4b30f196b7900b7eed61144f32f48056374dc7e92c28cd5bffbf774945702" Mar 09 09:09:17 crc kubenswrapper[4861]: E0309 09:09:17.435912 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c2f4b30f196b7900b7eed61144f32f48056374dc7e92c28cd5bffbf774945702\": container with ID starting with c2f4b30f196b7900b7eed61144f32f48056374dc7e92c28cd5bffbf774945702 not found: 
ID does not exist" containerID="c2f4b30f196b7900b7eed61144f32f48056374dc7e92c28cd5bffbf774945702" Mar 09 09:09:17 crc kubenswrapper[4861]: I0309 09:09:17.435933 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c2f4b30f196b7900b7eed61144f32f48056374dc7e92c28cd5bffbf774945702"} err="failed to get container status \"c2f4b30f196b7900b7eed61144f32f48056374dc7e92c28cd5bffbf774945702\": rpc error: code = NotFound desc = could not find container \"c2f4b30f196b7900b7eed61144f32f48056374dc7e92c28cd5bffbf774945702\": container with ID starting with c2f4b30f196b7900b7eed61144f32f48056374dc7e92c28cd5bffbf774945702 not found: ID does not exist" Mar 09 09:09:17 crc kubenswrapper[4861]: I0309 09:09:17.435951 4861 scope.go:117] "RemoveContainer" containerID="7b02ebcfd63db720474d3acbc618d87e9b419be0847a9f96152b5037ca454a36" Mar 09 09:09:17 crc kubenswrapper[4861]: E0309 09:09:17.436260 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b02ebcfd63db720474d3acbc618d87e9b419be0847a9f96152b5037ca454a36\": container with ID starting with 7b02ebcfd63db720474d3acbc618d87e9b419be0847a9f96152b5037ca454a36 not found: ID does not exist" containerID="7b02ebcfd63db720474d3acbc618d87e9b419be0847a9f96152b5037ca454a36" Mar 09 09:09:17 crc kubenswrapper[4861]: I0309 09:09:17.436292 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b02ebcfd63db720474d3acbc618d87e9b419be0847a9f96152b5037ca454a36"} err="failed to get container status \"7b02ebcfd63db720474d3acbc618d87e9b419be0847a9f96152b5037ca454a36\": rpc error: code = NotFound desc = could not find container \"7b02ebcfd63db720474d3acbc618d87e9b419be0847a9f96152b5037ca454a36\": container with ID starting with 7b02ebcfd63db720474d3acbc618d87e9b419be0847a9f96152b5037ca454a36 not found: ID does not exist" Mar 09 09:09:17 crc kubenswrapper[4861]: I0309 09:09:17.436326 4861 
scope.go:117] "RemoveContainer" containerID="591835de84a2d61d44c0723a30b71b9fae2661bb7db61e8938a9e3470dba5c25" Mar 09 09:09:17 crc kubenswrapper[4861]: E0309 09:09:17.436599 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"591835de84a2d61d44c0723a30b71b9fae2661bb7db61e8938a9e3470dba5c25\": container with ID starting with 591835de84a2d61d44c0723a30b71b9fae2661bb7db61e8938a9e3470dba5c25 not found: ID does not exist" containerID="591835de84a2d61d44c0723a30b71b9fae2661bb7db61e8938a9e3470dba5c25" Mar 09 09:09:17 crc kubenswrapper[4861]: I0309 09:09:17.436619 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"591835de84a2d61d44c0723a30b71b9fae2661bb7db61e8938a9e3470dba5c25"} err="failed to get container status \"591835de84a2d61d44c0723a30b71b9fae2661bb7db61e8938a9e3470dba5c25\": rpc error: code = NotFound desc = could not find container \"591835de84a2d61d44c0723a30b71b9fae2661bb7db61e8938a9e3470dba5c25\": container with ID starting with 591835de84a2d61d44c0723a30b71b9fae2661bb7db61e8938a9e3470dba5c25 not found: ID does not exist" Mar 09 09:09:17 crc kubenswrapper[4861]: I0309 09:09:17.662009 4861 status_manager.go:851] "Failed to get status for pod" podUID="6be593ec-c05d-4500-961d-74aa43fa5ff7" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.163:6443: connect: connection refused" Mar 09 09:09:17 crc kubenswrapper[4861]: I0309 09:09:17.662623 4861 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.163:6443: connect: connection refused" Mar 09 09:09:17 crc kubenswrapper[4861]: I0309 
09:09:17.667649 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Mar 09 09:09:17 crc kubenswrapper[4861]: E0309 09:09:17.756267 4861 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.163:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-cjkj6" volumeName="registry-storage" Mar 09 09:09:19 crc kubenswrapper[4861]: E0309 09:09:19.077976 4861 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.163:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 09:09:19 crc kubenswrapper[4861]: I0309 09:09:19.079735 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 09:09:19 crc kubenswrapper[4861]: E0309 09:09:19.125621 4861 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.163:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189b212dda897fde openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:09:19.124111326 +0000 UTC m=+202.209150767,LastTimestamp:2026-03-09 09:09:19.124111326 +0000 UTC m=+202.209150767,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:09:19 crc kubenswrapper[4861]: I0309 09:09:19.332221 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"2b85db2f4ff031745a4db49a1be94b76e84c71f50f76a15517e6589cd020ffd7"} Mar 09 09:09:20 crc kubenswrapper[4861]: E0309 09:09:20.303360 4861 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.163:6443: connect: connection refused" Mar 09 09:09:20 crc kubenswrapper[4861]: E0309 09:09:20.304661 4861 
controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.163:6443: connect: connection refused" Mar 09 09:09:20 crc kubenswrapper[4861]: E0309 09:09:20.305220 4861 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.163:6443: connect: connection refused" Mar 09 09:09:20 crc kubenswrapper[4861]: E0309 09:09:20.305752 4861 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.163:6443: connect: connection refused" Mar 09 09:09:20 crc kubenswrapper[4861]: E0309 09:09:20.306178 4861 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.163:6443: connect: connection refused" Mar 09 09:09:20 crc kubenswrapper[4861]: I0309 09:09:20.306245 4861 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 09 09:09:20 crc kubenswrapper[4861]: E0309 09:09:20.306780 4861 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.163:6443: connect: connection refused" interval="200ms" Mar 09 09:09:20 crc kubenswrapper[4861]: I0309 09:09:20.341223 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"cdf9a28bbaea69794beeb97b690b764813f5b952e0a3bf72856c6ce7db2fba36"} Mar 09 09:09:20 crc 
kubenswrapper[4861]: I0309 09:09:20.341856 4861 status_manager.go:851] "Failed to get status for pod" podUID="6be593ec-c05d-4500-961d-74aa43fa5ff7" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.163:6443: connect: connection refused" Mar 09 09:09:20 crc kubenswrapper[4861]: E0309 09:09:20.341936 4861 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.163:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 09:09:20 crc kubenswrapper[4861]: E0309 09:09:20.507828 4861 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.163:6443: connect: connection refused" interval="400ms" Mar 09 09:09:20 crc kubenswrapper[4861]: E0309 09:09:20.909044 4861 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.163:6443: connect: connection refused" interval="800ms" Mar 09 09:09:20 crc kubenswrapper[4861]: E0309 09:09:20.962175 4861 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:09:20Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:09:20Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:09:20Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T09:09:20Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.163:6443: connect: connection refused" Mar 09 09:09:20 crc kubenswrapper[4861]: E0309 09:09:20.962909 4861 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.163:6443: connect: connection refused" Mar 09 09:09:20 crc kubenswrapper[4861]: E0309 09:09:20.963571 4861 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.163:6443: connect: connection refused" Mar 09 09:09:20 crc kubenswrapper[4861]: E0309 09:09:20.964019 4861 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.163:6443: connect: connection refused" Mar 09 
09:09:20 crc kubenswrapper[4861]: E0309 09:09:20.964510 4861 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.163:6443: connect: connection refused" Mar 09 09:09:20 crc kubenswrapper[4861]: E0309 09:09:20.964561 4861 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 09 09:09:21 crc kubenswrapper[4861]: E0309 09:09:21.346048 4861 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.163:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 09:09:21 crc kubenswrapper[4861]: E0309 09:09:21.710720 4861 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.163:6443: connect: connection refused" interval="1.6s" Mar 09 09:09:23 crc kubenswrapper[4861]: E0309 09:09:23.311855 4861 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.163:6443: connect: connection refused" interval="3.2s" Mar 09 09:09:24 crc kubenswrapper[4861]: I0309 09:09:24.606533 4861 patch_prober.go:28] interesting pod/machine-config-daemon-5g7gc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 09:09:24 crc kubenswrapper[4861]: I0309 09:09:24.606627 4861 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 09:09:25 crc kubenswrapper[4861]: E0309 09:09:25.464143 4861 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.163:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189b212dda897fde openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 09:09:19.124111326 +0000 UTC m=+202.209150767,LastTimestamp:2026-03-09 09:09:19.124111326 +0000 UTC m=+202.209150767,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 09:09:26 crc kubenswrapper[4861]: E0309 09:09:26.513136 4861 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.163:6443: connect: connection refused" interval="6.4s" Mar 09 09:09:27 crc kubenswrapper[4861]: I0309 09:09:27.662657 4861 status_manager.go:851] "Failed to get status for pod" podUID="6be593ec-c05d-4500-961d-74aa43fa5ff7" 
pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.163:6443: connect: connection refused" Mar 09 09:09:28 crc kubenswrapper[4861]: I0309 09:09:28.657912 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 09:09:28 crc kubenswrapper[4861]: I0309 09:09:28.659436 4861 status_manager.go:851] "Failed to get status for pod" podUID="6be593ec-c05d-4500-961d-74aa43fa5ff7" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.163:6443: connect: connection refused" Mar 09 09:09:28 crc kubenswrapper[4861]: I0309 09:09:28.681811 4861 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="84205522-d7de-4553-9297-e7cf1a5e0336" Mar 09 09:09:28 crc kubenswrapper[4861]: I0309 09:09:28.681865 4861 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="84205522-d7de-4553-9297-e7cf1a5e0336" Mar 09 09:09:28 crc kubenswrapper[4861]: E0309 09:09:28.682563 4861 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.163:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 09:09:28 crc kubenswrapper[4861]: I0309 09:09:28.683208 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 09:09:29 crc kubenswrapper[4861]: I0309 09:09:29.401448 4861 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="153698773f794ff5d357ceb5089af1c57cd409d5f6904b99f15bf4bf6d81c20c" exitCode=0 Mar 09 09:09:29 crc kubenswrapper[4861]: I0309 09:09:29.401568 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"153698773f794ff5d357ceb5089af1c57cd409d5f6904b99f15bf4bf6d81c20c"} Mar 09 09:09:29 crc kubenswrapper[4861]: I0309 09:09:29.401872 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"1cfe4ea66e284e7e8a4d82cab98d73023263345b23bbff11d07092b9110e2fec"} Mar 09 09:09:29 crc kubenswrapper[4861]: I0309 09:09:29.402510 4861 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="84205522-d7de-4553-9297-e7cf1a5e0336" Mar 09 09:09:29 crc kubenswrapper[4861]: I0309 09:09:29.402548 4861 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="84205522-d7de-4553-9297-e7cf1a5e0336" Mar 09 09:09:29 crc kubenswrapper[4861]: I0309 09:09:29.402900 4861 status_manager.go:851] "Failed to get status for pod" podUID="6be593ec-c05d-4500-961d-74aa43fa5ff7" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.163:6443: connect: connection refused" Mar 09 09:09:29 crc kubenswrapper[4861]: E0309 09:09:29.403236 4861 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial 
tcp 38.102.83.163:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 09:09:29 crc kubenswrapper[4861]: I0309 09:09:29.407175 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 09 09:09:29 crc kubenswrapper[4861]: I0309 09:09:29.408332 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 09 09:09:29 crc kubenswrapper[4861]: I0309 09:09:29.408445 4861 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="35f0274e9da4b30cfc1e2d7706951fd7501832f5c6f5c4e0eb2696b1179cd3ad" exitCode=1 Mar 09 09:09:29 crc kubenswrapper[4861]: I0309 09:09:29.408486 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"35f0274e9da4b30cfc1e2d7706951fd7501832f5c6f5c4e0eb2696b1179cd3ad"} Mar 09 09:09:29 crc kubenswrapper[4861]: I0309 09:09:29.409006 4861 scope.go:117] "RemoveContainer" containerID="35f0274e9da4b30cfc1e2d7706951fd7501832f5c6f5c4e0eb2696b1179cd3ad" Mar 09 09:09:29 crc kubenswrapper[4861]: I0309 09:09:29.409721 4861 status_manager.go:851] "Failed to get status for pod" podUID="6be593ec-c05d-4500-961d-74aa43fa5ff7" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.163:6443: connect: connection refused" Mar 09 09:09:29 crc kubenswrapper[4861]: I0309 09:09:29.410151 4861 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.163:6443: connect: connection refused" Mar 09 09:09:29 crc kubenswrapper[4861]: I0309 09:09:29.568890 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 09:09:30 crc kubenswrapper[4861]: I0309 09:09:30.420743 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"149834183c82ef27b5d6a64c24c638fdf0842744f09179d420d18c14abbf3e7d"} Mar 09 09:09:30 crc kubenswrapper[4861]: I0309 09:09:30.421134 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"6a79dfabeca2b992acbd37e62dbd6e34f0e23052a110079d86e09b67e763173d"} Mar 09 09:09:30 crc kubenswrapper[4861]: I0309 09:09:30.421155 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"11adf3d5130c06738b89d9a24fd8061ed54292fbbc82f8892365e10125c1aae5"} Mar 09 09:09:30 crc kubenswrapper[4861]: I0309 09:09:30.423840 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 09 09:09:30 crc kubenswrapper[4861]: I0309 09:09:30.424359 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 09 09:09:30 crc kubenswrapper[4861]: I0309 09:09:30.424440 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"bb246fb742e93038c7b699f443604b07951f298b66ebba6bd2a8b70e7ec7e340"} Mar 09 09:09:31 crc kubenswrapper[4861]: I0309 09:09:31.435099 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"85759335969dc6d0d4a3b054f705958bef776e6d5761efc9050e470db4fc1270"} Mar 09 09:09:31 crc kubenswrapper[4861]: I0309 09:09:31.435480 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"0ce3285d77cf79ef2a154a1c75ba91c21719d949cdd19d551ef29cebaf6361a3"} Mar 09 09:09:31 crc kubenswrapper[4861]: I0309 09:09:31.435578 4861 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="84205522-d7de-4553-9297-e7cf1a5e0336" Mar 09 09:09:31 crc kubenswrapper[4861]: I0309 09:09:31.435634 4861 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="84205522-d7de-4553-9297-e7cf1a5e0336" Mar 09 09:09:33 crc kubenswrapper[4861]: I0309 09:09:33.684259 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 09:09:33 crc kubenswrapper[4861]: I0309 09:09:33.685001 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 09:09:33 crc kubenswrapper[4861]: I0309 09:09:33.692526 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 09:09:36 crc kubenswrapper[4861]: I0309 09:09:36.446262 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 09:09:36 crc kubenswrapper[4861]: I0309 09:09:36.450015 4861 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 09:09:36 crc kubenswrapper[4861]: I0309 09:09:36.459988 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 09:09:36 crc kubenswrapper[4861]: I0309 09:09:36.466348 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 09:09:37 crc kubenswrapper[4861]: I0309 09:09:37.472640 4861 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="84205522-d7de-4553-9297-e7cf1a5e0336" Mar 09 09:09:37 crc kubenswrapper[4861]: I0309 09:09:37.473025 4861 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="84205522-d7de-4553-9297-e7cf1a5e0336" Mar 09 09:09:37 crc kubenswrapper[4861]: I0309 09:09:37.472909 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 09:09:37 crc kubenswrapper[4861]: I0309 09:09:37.478263 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 09:09:37 crc kubenswrapper[4861]: I0309 09:09:37.690171 4861 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="28460e7c-8600-4458-9ec9-882a778ae012" Mar 09 09:09:38 crc kubenswrapper[4861]: I0309 09:09:38.479618 4861 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="84205522-d7de-4553-9297-e7cf1a5e0336" Mar 09 09:09:38 crc kubenswrapper[4861]: I0309 09:09:38.479659 4861 
mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="84205522-d7de-4553-9297-e7cf1a5e0336" Mar 09 09:09:38 crc kubenswrapper[4861]: I0309 09:09:38.483706 4861 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="28460e7c-8600-4458-9ec9-882a778ae012" Mar 09 09:09:39 crc kubenswrapper[4861]: I0309 09:09:39.575121 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 09:09:46 crc kubenswrapper[4861]: I0309 09:09:46.887818 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 09 09:09:47 crc kubenswrapper[4861]: I0309 09:09:47.201600 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 09 09:09:47 crc kubenswrapper[4861]: I0309 09:09:47.513152 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 09 09:09:47 crc kubenswrapper[4861]: I0309 09:09:47.989280 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 09 09:09:48 crc kubenswrapper[4861]: I0309 09:09:48.159252 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 09 09:09:48 crc kubenswrapper[4861]: I0309 09:09:48.320064 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 09 09:09:48 crc kubenswrapper[4861]: I0309 09:09:48.347108 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 09 09:09:48 crc kubenswrapper[4861]: 
I0309 09:09:48.670076 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 09 09:09:48 crc kubenswrapper[4861]: I0309 09:09:48.783924 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 09 09:09:48 crc kubenswrapper[4861]: I0309 09:09:48.800254 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 09 09:09:49 crc kubenswrapper[4861]: I0309 09:09:49.013764 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 09 09:09:49 crc kubenswrapper[4861]: I0309 09:09:49.237155 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 09 09:09:49 crc kubenswrapper[4861]: I0309 09:09:49.253865 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 09 09:09:49 crc kubenswrapper[4861]: I0309 09:09:49.266685 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 09 09:09:49 crc kubenswrapper[4861]: I0309 09:09:49.385507 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 09 09:09:49 crc kubenswrapper[4861]: I0309 09:09:49.438308 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 09 09:09:49 crc kubenswrapper[4861]: I0309 09:09:49.449331 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 09 09:09:49 crc kubenswrapper[4861]: I0309 09:09:49.480889 4861 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-controller-manager"/"serving-cert" Mar 09 09:09:49 crc kubenswrapper[4861]: I0309 09:09:49.490759 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 09 09:09:49 crc kubenswrapper[4861]: I0309 09:09:49.541290 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 09 09:09:49 crc kubenswrapper[4861]: I0309 09:09:49.691029 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 09 09:09:49 crc kubenswrapper[4861]: I0309 09:09:49.739992 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 09 09:09:49 crc kubenswrapper[4861]: I0309 09:09:49.765091 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 09 09:09:49 crc kubenswrapper[4861]: I0309 09:09:49.822426 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 09 09:09:49 crc kubenswrapper[4861]: I0309 09:09:49.911404 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 09 09:09:50 crc kubenswrapper[4861]: I0309 09:09:50.089420 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 09 09:09:50 crc kubenswrapper[4861]: I0309 09:09:50.097473 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 09 09:09:50 crc kubenswrapper[4861]: I0309 09:09:50.159265 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 09 
09:09:50 crc kubenswrapper[4861]: I0309 09:09:50.181763 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 09 09:09:50 crc kubenswrapper[4861]: I0309 09:09:50.275677 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 09 09:09:50 crc kubenswrapper[4861]: I0309 09:09:50.503465 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 09 09:09:50 crc kubenswrapper[4861]: I0309 09:09:50.524418 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 09 09:09:50 crc kubenswrapper[4861]: I0309 09:09:50.540175 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 09 09:09:50 crc kubenswrapper[4861]: I0309 09:09:50.642111 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 09 09:09:50 crc kubenswrapper[4861]: I0309 09:09:50.692912 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 09 09:09:50 crc kubenswrapper[4861]: I0309 09:09:50.705273 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 09 09:09:50 crc kubenswrapper[4861]: I0309 09:09:50.755693 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 09 09:09:50 crc kubenswrapper[4861]: I0309 09:09:50.784148 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 09 09:09:50 crc kubenswrapper[4861]: I0309 09:09:50.793960 4861 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-dns-operator"/"metrics-tls" Mar 09 09:09:50 crc kubenswrapper[4861]: I0309 09:09:50.985651 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 09 09:09:50 crc kubenswrapper[4861]: I0309 09:09:50.988727 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 09 09:09:50 crc kubenswrapper[4861]: I0309 09:09:50.994209 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 09 09:09:51 crc kubenswrapper[4861]: I0309 09:09:51.008240 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 09 09:09:51 crc kubenswrapper[4861]: I0309 09:09:51.026641 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 09 09:09:51 crc kubenswrapper[4861]: I0309 09:09:51.140064 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 09 09:09:51 crc kubenswrapper[4861]: I0309 09:09:51.149013 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 09 09:09:51 crc kubenswrapper[4861]: I0309 09:09:51.244036 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 09 09:09:51 crc kubenswrapper[4861]: I0309 09:09:51.292735 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 09 09:09:51 crc kubenswrapper[4861]: I0309 09:09:51.329394 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 09 09:09:51 crc kubenswrapper[4861]: I0309 09:09:51.338640 4861 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 09 09:09:51 crc kubenswrapper[4861]: I0309 09:09:51.355773 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 09 09:09:51 crc kubenswrapper[4861]: I0309 09:09:51.376082 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 09 09:09:51 crc kubenswrapper[4861]: I0309 09:09:51.383050 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 09 09:09:51 crc kubenswrapper[4861]: I0309 09:09:51.385134 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 09 09:09:51 crc kubenswrapper[4861]: I0309 09:09:51.393225 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 09 09:09:51 crc kubenswrapper[4861]: I0309 09:09:51.402685 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 09 09:09:51 crc kubenswrapper[4861]: I0309 09:09:51.445602 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 09 09:09:51 crc kubenswrapper[4861]: I0309 09:09:51.466130 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 09 09:09:51 crc kubenswrapper[4861]: I0309 09:09:51.673638 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 09 09:09:51 crc kubenswrapper[4861]: I0309 09:09:51.734662 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 09 09:09:51 crc kubenswrapper[4861]: I0309 
09:09:51.741828 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Mar 09 09:09:52 crc kubenswrapper[4861]: I0309 09:09:52.018538 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Mar 09 09:09:52 crc kubenswrapper[4861]: I0309 09:09:52.036903 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Mar 09 09:09:52 crc kubenswrapper[4861]: I0309 09:09:52.063986 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Mar 09 09:09:52 crc kubenswrapper[4861]: I0309 09:09:52.093113 4861 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Mar 09 09:09:52 crc kubenswrapper[4861]: I0309 09:09:52.100583 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Mar 09 09:09:52 crc kubenswrapper[4861]: I0309 09:09:52.128677 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Mar 09 09:09:52 crc kubenswrapper[4861]: I0309 09:09:52.130109 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Mar 09 09:09:52 crc kubenswrapper[4861]: I0309 09:09:52.153959 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Mar 09 09:09:52 crc kubenswrapper[4861]: I0309 09:09:52.215193 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Mar 09 09:09:52 crc kubenswrapper[4861]: I0309 09:09:52.442406 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Mar 09 09:09:52 crc kubenswrapper[4861]: I0309 09:09:52.570556 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Mar 09 09:09:52 crc kubenswrapper[4861]: I0309 09:09:52.616545 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Mar 09 09:09:52 crc kubenswrapper[4861]: I0309 09:09:52.653969 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Mar 09 09:09:52 crc kubenswrapper[4861]: I0309 09:09:52.733155 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Mar 09 09:09:52 crc kubenswrapper[4861]: I0309 09:09:52.757649 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Mar 09 09:09:52 crc kubenswrapper[4861]: I0309 09:09:52.767634 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Mar 09 09:09:52 crc kubenswrapper[4861]: I0309 09:09:52.826776 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Mar 09 09:09:52 crc kubenswrapper[4861]: I0309 09:09:52.891511 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq"
Mar 09 09:09:52 crc kubenswrapper[4861]: I0309 09:09:52.965306 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Mar 09 09:09:53 crc kubenswrapper[4861]: I0309 09:09:53.065457 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7"
Mar 09 09:09:53 crc kubenswrapper[4861]: I0309 09:09:53.090445 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Mar 09 09:09:53 crc kubenswrapper[4861]: I0309 09:09:53.120520 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Mar 09 09:09:53 crc kubenswrapper[4861]: I0309 09:09:53.246697 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Mar 09 09:09:53 crc kubenswrapper[4861]: I0309 09:09:53.381157 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 09 09:09:53 crc kubenswrapper[4861]: I0309 09:09:53.395303 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Mar 09 09:09:53 crc kubenswrapper[4861]: I0309 09:09:53.449085 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Mar 09 09:09:53 crc kubenswrapper[4861]: I0309 09:09:53.467918 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Mar 09 09:09:53 crc kubenswrapper[4861]: I0309 09:09:53.570530 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Mar 09 09:09:53 crc kubenswrapper[4861]: I0309 09:09:53.584501 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Mar 09 09:09:53 crc kubenswrapper[4861]: I0309 09:09:53.610251 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Mar 09 09:09:53 crc kubenswrapper[4861]: I0309 09:09:53.650745 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Mar 09 09:09:53 crc kubenswrapper[4861]: I0309 09:09:53.685400 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Mar 09 09:09:53 crc kubenswrapper[4861]: I0309 09:09:53.708942 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Mar 09 09:09:53 crc kubenswrapper[4861]: I0309 09:09:53.766654 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Mar 09 09:09:53 crc kubenswrapper[4861]: I0309 09:09:53.828653 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Mar 09 09:09:53 crc kubenswrapper[4861]: I0309 09:09:53.918366 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Mar 09 09:09:53 crc kubenswrapper[4861]: I0309 09:09:53.928111 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Mar 09 09:09:53 crc kubenswrapper[4861]: I0309 09:09:53.991995 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Mar 09 09:09:54 crc kubenswrapper[4861]: I0309 09:09:54.061788 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Mar 09 09:09:54 crc kubenswrapper[4861]: I0309 09:09:54.201079 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Mar 09 09:09:54 crc kubenswrapper[4861]: I0309 09:09:54.206920 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz"
Mar 09 09:09:54 crc kubenswrapper[4861]: I0309 09:09:54.296323 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 09 09:09:54 crc kubenswrapper[4861]: I0309 09:09:54.348964 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Mar 09 09:09:54 crc kubenswrapper[4861]: I0309 09:09:54.433639 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Mar 09 09:09:54 crc kubenswrapper[4861]: I0309 09:09:54.436238 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 09 09:09:54 crc kubenswrapper[4861]: I0309 09:09:54.507487 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Mar 09 09:09:54 crc kubenswrapper[4861]: I0309 09:09:54.606394 4861 patch_prober.go:28] interesting pod/machine-config-daemon-5g7gc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 09 09:09:54 crc kubenswrapper[4861]: I0309 09:09:54.606834 4861 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 09 09:09:54 crc kubenswrapper[4861]: I0309 09:09:54.713560 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Mar 09 09:09:54 crc kubenswrapper[4861]: I0309 09:09:54.736418 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Mar 09 09:09:54 crc kubenswrapper[4861]: I0309 09:09:54.853361 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 09 09:09:55 crc kubenswrapper[4861]: I0309 09:09:55.010995 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Mar 09 09:09:55 crc kubenswrapper[4861]: I0309 09:09:55.011880 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Mar 09 09:09:55 crc kubenswrapper[4861]: I0309 09:09:55.016620 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Mar 09 09:09:55 crc kubenswrapper[4861]: I0309 09:09:55.079735 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Mar 09 09:09:55 crc kubenswrapper[4861]: I0309 09:09:55.138701 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 09 09:09:55 crc kubenswrapper[4861]: I0309 09:09:55.187516 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Mar 09 09:09:55 crc kubenswrapper[4861]: I0309 09:09:55.345211 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Mar 09 09:09:55 crc kubenswrapper[4861]: I0309 09:09:55.346984 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Mar 09 09:09:55 crc kubenswrapper[4861]: I0309 09:09:55.360022 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Mar 09 09:09:55 crc kubenswrapper[4861]: I0309 09:09:55.378185 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Mar 09 09:09:55 crc kubenswrapper[4861]: I0309 09:09:55.428934 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Mar 09 09:09:55 crc kubenswrapper[4861]: I0309 09:09:55.481282 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates"
Mar 09 09:09:55 crc kubenswrapper[4861]: I0309 09:09:55.729392 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Mar 09 09:09:55 crc kubenswrapper[4861]: I0309 09:09:55.745958 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Mar 09 09:09:55 crc kubenswrapper[4861]: I0309 09:09:55.849138 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Mar 09 09:09:56 crc kubenswrapper[4861]: I0309 09:09:56.020152 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Mar 09 09:09:56 crc kubenswrapper[4861]: I0309 09:09:56.028063 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl"
Mar 09 09:09:56 crc kubenswrapper[4861]: I0309 09:09:56.134780 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Mar 09 09:09:56 crc kubenswrapper[4861]: I0309 09:09:56.143083 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Mar 09 09:09:56 crc kubenswrapper[4861]: I0309 09:09:56.271518 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Mar 09 09:09:56 crc kubenswrapper[4861]: I0309 09:09:56.310871 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Mar 09 09:09:56 crc kubenswrapper[4861]: I0309 09:09:56.320866 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Mar 09 09:09:56 crc kubenswrapper[4861]: I0309 09:09:56.323046 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Mar 09 09:09:56 crc kubenswrapper[4861]: I0309 09:09:56.338217 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Mar 09 09:09:56 crc kubenswrapper[4861]: I0309 09:09:56.370451 4861 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Mar 09 09:09:56 crc kubenswrapper[4861]: I0309 09:09:56.374104 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Mar 09 09:09:56 crc kubenswrapper[4861]: I0309 09:09:56.374156 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Mar 09 09:09:56 crc kubenswrapper[4861]: I0309 09:09:56.379478 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 09 09:09:56 crc kubenswrapper[4861]: I0309 09:09:56.393824 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=20.393804803 podStartE2EDuration="20.393804803s" podCreationTimestamp="2026-03-09 09:09:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:09:56.389432612 +0000 UTC m=+239.474472013" watchObservedRunningTime="2026-03-09 09:09:56.393804803 +0000 UTC m=+239.478844204"
Mar 09 09:09:56 crc kubenswrapper[4861]: I0309 09:09:56.435719 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Mar 09 09:09:56 crc kubenswrapper[4861]: I0309 09:09:56.529224 4861 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Mar 09 09:09:56 crc kubenswrapper[4861]: I0309 09:09:56.529223 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Mar 09 09:09:56 crc kubenswrapper[4861]: I0309 09:09:56.592758 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Mar 09 09:09:56 crc kubenswrapper[4861]: I0309 09:09:56.770901 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Mar 09 09:09:56 crc kubenswrapper[4861]: I0309 09:09:56.838151 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Mar 09 09:09:56 crc kubenswrapper[4861]: I0309 09:09:56.850969 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Mar 09 09:09:56 crc kubenswrapper[4861]: I0309 09:09:56.851175 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 09 09:09:56 crc kubenswrapper[4861]: I0309 09:09:56.858897 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Mar 09 09:09:56 crc kubenswrapper[4861]: I0309 09:09:56.866944 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Mar 09 09:09:56 crc kubenswrapper[4861]: I0309 09:09:56.923984 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Mar 09 09:09:56 crc kubenswrapper[4861]: I0309 09:09:56.931247 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Mar 09 09:09:57 crc kubenswrapper[4861]: I0309 09:09:57.062554 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Mar 09 09:09:57 crc kubenswrapper[4861]: I0309 09:09:57.081077 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Mar 09 09:09:57 crc kubenswrapper[4861]: I0309 09:09:57.083489 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Mar 09 09:09:57 crc kubenswrapper[4861]: I0309 09:09:57.125455 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Mar 09 09:09:57 crc kubenswrapper[4861]: I0309 09:09:57.134067 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Mar 09 09:09:57 crc kubenswrapper[4861]: I0309 09:09:57.159953 4861 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Mar 09 09:09:57 crc kubenswrapper[4861]: I0309 09:09:57.171792 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Mar 09 09:09:57 crc kubenswrapper[4861]: I0309 09:09:57.192019 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Mar 09 09:09:57 crc kubenswrapper[4861]: I0309 09:09:57.297627 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Mar 09 09:09:57 crc kubenswrapper[4861]: I0309 09:09:57.298427 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Mar 09 09:09:57 crc kubenswrapper[4861]: I0309 09:09:57.342957 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Mar 09 09:09:57 crc kubenswrapper[4861]: I0309 09:09:57.399156 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Mar 09 09:09:57 crc kubenswrapper[4861]: I0309 09:09:57.407127 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Mar 09 09:09:57 crc kubenswrapper[4861]: I0309 09:09:57.415639 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Mar 09 09:09:57 crc kubenswrapper[4861]: I0309 09:09:57.454986 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Mar 09 09:09:57 crc kubenswrapper[4861]: I0309 09:09:57.465473 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Mar 09 09:09:57 crc kubenswrapper[4861]: I0309 09:09:57.504157 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Mar 09 09:09:57 crc kubenswrapper[4861]: I0309 09:09:57.540728 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Mar 09 09:09:57 crc kubenswrapper[4861]: I0309 09:09:57.549669 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Mar 09 09:09:57 crc kubenswrapper[4861]: I0309 09:09:57.741294 4861 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Mar 09 09:09:57 crc kubenswrapper[4861]: I0309 09:09:57.776966 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Mar 09 09:09:57 crc kubenswrapper[4861]: I0309 09:09:57.866149 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Mar 09 09:09:57 crc kubenswrapper[4861]: I0309 09:09:57.964806 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Mar 09 09:09:57 crc kubenswrapper[4861]: I0309 09:09:57.992727 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Mar 09 09:09:58 crc kubenswrapper[4861]: I0309 09:09:58.155327 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Mar 09 09:09:58 crc kubenswrapper[4861]: I0309 09:09:58.173749 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Mar 09 09:09:58 crc kubenswrapper[4861]: I0309 09:09:58.175722 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Mar 09 09:09:58 crc kubenswrapper[4861]: I0309 09:09:58.185238 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Mar 09 09:09:58 crc kubenswrapper[4861]: I0309 09:09:58.186282 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Mar 09 09:09:58 crc kubenswrapper[4861]: I0309 09:09:58.208964 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Mar 09 09:09:58 crc kubenswrapper[4861]: I0309 09:09:58.290951 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Mar 09 09:09:58 crc kubenswrapper[4861]: I0309 09:09:58.312434 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Mar 09 09:09:58 crc kubenswrapper[4861]: I0309 09:09:58.352248 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Mar 09 09:09:58 crc kubenswrapper[4861]: I0309 09:09:58.403154 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Mar 09 09:09:58 crc kubenswrapper[4861]: I0309 09:09:58.513558 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Mar 09 09:09:58 crc kubenswrapper[4861]: I0309 09:09:58.601552 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Mar 09 09:09:58 crc kubenswrapper[4861]: I0309 09:09:58.703330 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Mar 09 09:09:58 crc kubenswrapper[4861]: I0309 09:09:58.714535 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Mar 09 09:09:58 crc kubenswrapper[4861]: I0309 09:09:58.784427 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Mar 09 09:09:58 crc kubenswrapper[4861]: I0309 09:09:58.787466 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Mar 09 09:09:58 crc kubenswrapper[4861]: I0309 09:09:58.801118 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Mar 09 09:09:58 crc kubenswrapper[4861]: I0309 09:09:58.815895 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Mar 09 09:09:58 crc kubenswrapper[4861]: I0309 09:09:58.924465 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Mar 09 09:09:58 crc kubenswrapper[4861]: I0309 09:09:58.924502 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Mar 09 09:09:59 crc kubenswrapper[4861]: I0309 09:09:59.052967 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Mar 09 09:09:59 crc kubenswrapper[4861]: I0309 09:09:59.082140 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Mar 09 09:09:59 crc kubenswrapper[4861]: I0309 09:09:59.095338 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Mar 09 09:09:59 crc kubenswrapper[4861]: I0309 09:09:59.191696 4861 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Mar 09 09:09:59 crc kubenswrapper[4861]: I0309 09:09:59.191934 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://cdf9a28bbaea69794beeb97b690b764813f5b952e0a3bf72856c6ce7db2fba36" gracePeriod=5
Mar 09 09:09:59 crc kubenswrapper[4861]: I0309 09:09:59.196815 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Mar 09 09:09:59 crc kubenswrapper[4861]: I0309 09:09:59.215691 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Mar 09 09:09:59 crc kubenswrapper[4861]: I0309 09:09:59.239689 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Mar 09 09:09:59 crc kubenswrapper[4861]: I0309 09:09:59.256288 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Mar 09 09:09:59 crc kubenswrapper[4861]: I0309 09:09:59.262578 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Mar 09 09:09:59 crc kubenswrapper[4861]: I0309 09:09:59.265545 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Mar 09 09:09:59 crc kubenswrapper[4861]: I0309 09:09:59.277993 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Mar 09 09:09:59 crc kubenswrapper[4861]: I0309 09:09:59.391958 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Mar 09 09:09:59 crc kubenswrapper[4861]: I0309 09:09:59.410296 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Mar 09 09:09:59 crc kubenswrapper[4861]: I0309 09:09:59.464664 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 09 09:09:59 crc kubenswrapper[4861]: I0309 09:09:59.587848 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Mar 09 09:09:59 crc kubenswrapper[4861]: I0309 09:09:59.624771 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Mar 09 09:09:59 crc kubenswrapper[4861]: I0309 09:09:59.687911 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Mar 09 09:09:59 crc kubenswrapper[4861]: I0309 09:09:59.909212 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Mar 09 09:10:00 crc kubenswrapper[4861]: I0309 09:10:00.069887 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Mar 09 09:10:00 crc kubenswrapper[4861]: I0309 09:10:00.128237 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Mar 09 09:10:00 crc kubenswrapper[4861]: I0309 09:10:00.253803 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Mar 09 09:10:00 crc kubenswrapper[4861]: I0309 09:10:00.257763 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Mar 09 09:10:00 crc kubenswrapper[4861]: I0309 09:10:00.268821 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Mar 09 09:10:00 crc kubenswrapper[4861]: I0309 09:10:00.346782 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Mar 09 09:10:00 crc kubenswrapper[4861]: I0309 09:10:00.413898 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Mar 09 09:10:00 crc kubenswrapper[4861]: I0309 09:10:00.444673 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Mar 09 09:10:00 crc kubenswrapper[4861]: I0309 09:10:00.448176 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Mar 09 09:10:00 crc kubenswrapper[4861]: I0309 09:10:00.511967 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Mar 09 09:10:00 crc kubenswrapper[4861]: I0309 09:10:00.513850 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Mar 09 09:10:00 crc kubenswrapper[4861]: I0309 09:10:00.518429 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Mar 09 09:10:00 crc kubenswrapper[4861]: I0309 09:10:00.595135 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Mar 09 09:10:00 crc kubenswrapper[4861]: I0309 09:10:00.643092 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Mar 09 09:10:00 crc kubenswrapper[4861]: I0309 09:10:00.703111 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config"
Mar 09 09:10:00 crc kubenswrapper[4861]: I0309 09:10:00.919569 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Mar 09 09:10:01 crc kubenswrapper[4861]: I0309 09:10:01.059120 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Mar 09 09:10:01 crc kubenswrapper[4861]: I0309 09:10:01.093767 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 09 09:10:01 crc kubenswrapper[4861]: I0309 09:10:01.160101 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Mar 09 09:10:01 crc kubenswrapper[4861]: I0309 09:10:01.195522 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Mar 09 09:10:01 crc kubenswrapper[4861]: I0309 09:10:01.346492 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Mar 09 09:10:01 crc kubenswrapper[4861]: I0309 09:10:01.383828 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Mar 09 09:10:01 crc kubenswrapper[4861]: I0309 09:10:01.521870 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Mar 09 09:10:01 crc kubenswrapper[4861]: I0309 09:10:01.558278 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Mar 09 09:10:01 crc kubenswrapper[4861]: I0309 09:10:01.603083 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Mar 09 09:10:01 crc kubenswrapper[4861]: I0309 09:10:01.706662 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Mar 09 09:10:01 crc kubenswrapper[4861]: I0309 09:10:01.747791 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Mar 09 09:10:01 crc kubenswrapper[4861]: I0309 09:10:01.811840 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Mar 09 09:10:01 crc kubenswrapper[4861]: I0309 09:10:01.951147 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4"
Mar 09 09:10:01 crc kubenswrapper[4861]: I0309 09:10:01.993142 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 09 09:10:02 crc kubenswrapper[4861]: I0309 09:10:02.100112 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Mar 09 09:10:02 crc kubenswrapper[4861]: I0309 09:10:02.251664 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Mar 09 09:10:02 crc kubenswrapper[4861]: I0309 09:10:02.290766 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Mar 09 09:10:02 crc kubenswrapper[4861]: I0309 09:10:02.415434 4861 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Mar 09 09:10:02 crc kubenswrapper[4861]: I0309 09:10:02.969093 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Mar 09 09:10:02 crc kubenswrapper[4861]: I0309 09:10:02.984927 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Mar 09 09:10:03 crc kubenswrapper[4861]: I0309 09:10:03.179462 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Mar 09 09:10:03 crc kubenswrapper[4861]: I0309 09:10:03.323682 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550790-dmhn9"]
Mar 09 09:10:03 crc kubenswrapper[4861]: E0309 09:10:03.323921 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Mar 09 09:10:03 crc kubenswrapper[4861]: I0309 09:10:03.323935 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Mar 09 09:10:03 crc kubenswrapper[4861]: E0309 09:10:03.323948 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6be593ec-c05d-4500-961d-74aa43fa5ff7" containerName="installer"
Mar 09 09:10:03 crc kubenswrapper[4861]: I0309 09:10:03.323953 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="6be593ec-c05d-4500-961d-74aa43fa5ff7" containerName="installer"
Mar 09 09:10:03 crc kubenswrapper[4861]: I0309 09:10:03.324042 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Mar 09 09:10:03 crc kubenswrapper[4861]: I0309 09:10:03.324055 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="6be593ec-c05d-4500-961d-74aa43fa5ff7" containerName="installer"
Mar 09 09:10:03 crc kubenswrapper[4861]: I0309 09:10:03.324431 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550790-dmhn9"
Mar 09 09:10:03 crc kubenswrapper[4861]: I0309 09:10:03.326570 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k86l8"
Mar 09 09:10:03 crc kubenswrapper[4861]: I0309 09:10:03.326695 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 09 09:10:03 crc kubenswrapper[4861]: I0309 09:10:03.327006 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 09 09:10:03 crc kubenswrapper[4861]: I0309 09:10:03.336045 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550790-dmhn9"]
Mar 09 09:10:03 crc kubenswrapper[4861]: I0309 09:10:03.390937 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65bzb\" (UniqueName: \"kubernetes.io/projected/ccc4b52c-e5fd-4766-8b3f-674190065ed0-kube-api-access-65bzb\") pod \"auto-csr-approver-29550790-dmhn9\" (UID: \"ccc4b52c-e5fd-4766-8b3f-674190065ed0\") " pod="openshift-infra/auto-csr-approver-29550790-dmhn9"
Mar 09 09:10:03 crc kubenswrapper[4861]: I0309 09:10:03.492129 4861 reconciler_common.go:218] "operationExecutor.MountVolume
started for volume \"kube-api-access-65bzb\" (UniqueName: \"kubernetes.io/projected/ccc4b52c-e5fd-4766-8b3f-674190065ed0-kube-api-access-65bzb\") pod \"auto-csr-approver-29550790-dmhn9\" (UID: \"ccc4b52c-e5fd-4766-8b3f-674190065ed0\") " pod="openshift-infra/auto-csr-approver-29550790-dmhn9" Mar 09 09:10:03 crc kubenswrapper[4861]: I0309 09:10:03.518259 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65bzb\" (UniqueName: \"kubernetes.io/projected/ccc4b52c-e5fd-4766-8b3f-674190065ed0-kube-api-access-65bzb\") pod \"auto-csr-approver-29550790-dmhn9\" (UID: \"ccc4b52c-e5fd-4766-8b3f-674190065ed0\") " pod="openshift-infra/auto-csr-approver-29550790-dmhn9" Mar 09 09:10:03 crc kubenswrapper[4861]: I0309 09:10:03.638927 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550790-dmhn9" Mar 09 09:10:04 crc kubenswrapper[4861]: I0309 09:10:04.067036 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550790-dmhn9"] Mar 09 09:10:04 crc kubenswrapper[4861]: I0309 09:10:04.654635 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 09 09:10:04 crc kubenswrapper[4861]: I0309 09:10:04.654697 4861 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="cdf9a28bbaea69794beeb97b690b764813f5b952e0a3bf72856c6ce7db2fba36" exitCode=137 Mar 09 09:10:04 crc kubenswrapper[4861]: I0309 09:10:04.657360 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550790-dmhn9" event={"ID":"ccc4b52c-e5fd-4766-8b3f-674190065ed0","Type":"ContainerStarted","Data":"9db9160f2cb02f82cef149c33f7a37356e989aed65d57048bda02a146426e0c0"} Mar 09 09:10:04 crc kubenswrapper[4861]: I0309 09:10:04.772648 4861 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 09 09:10:04 crc kubenswrapper[4861]: I0309 09:10:04.773016 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 09:10:04 crc kubenswrapper[4861]: I0309 09:10:04.914469 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 09 09:10:04 crc kubenswrapper[4861]: I0309 09:10:04.914637 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 09 09:10:04 crc kubenswrapper[4861]: I0309 09:10:04.914672 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 09 09:10:04 crc kubenswrapper[4861]: I0309 09:10:04.914694 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 09:10:04 crc kubenswrapper[4861]: I0309 09:10:04.914742 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 09 09:10:04 crc kubenswrapper[4861]: I0309 09:10:04.914774 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 09 09:10:04 crc kubenswrapper[4861]: I0309 09:10:04.914819 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 09:10:04 crc kubenswrapper[4861]: I0309 09:10:04.914819 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 09:10:04 crc kubenswrapper[4861]: I0309 09:10:04.914949 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 09:10:04 crc kubenswrapper[4861]: I0309 09:10:04.915074 4861 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Mar 09 09:10:04 crc kubenswrapper[4861]: I0309 09:10:04.915087 4861 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Mar 09 09:10:04 crc kubenswrapper[4861]: I0309 09:10:04.915097 4861 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 09 09:10:04 crc kubenswrapper[4861]: I0309 09:10:04.915104 4861 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Mar 09 09:10:04 crc kubenswrapper[4861]: I0309 09:10:04.923103 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 09:10:05 crc kubenswrapper[4861]: I0309 09:10:05.016647 4861 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 09 09:10:05 crc kubenswrapper[4861]: I0309 09:10:05.662502 4861 generic.go:334] "Generic (PLEG): container finished" podID="ccc4b52c-e5fd-4766-8b3f-674190065ed0" containerID="0bc369a221bc8d05ce04836901195db6e7e34c2e26724a0c1eff037b8c1ee848" exitCode=0 Mar 09 09:10:05 crc kubenswrapper[4861]: I0309 09:10:05.664039 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 09 09:10:05 crc kubenswrapper[4861]: I0309 09:10:05.664155 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 09:10:05 crc kubenswrapper[4861]: I0309 09:10:05.666254 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Mar 09 09:10:05 crc kubenswrapper[4861]: I0309 09:10:05.666630 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550790-dmhn9" event={"ID":"ccc4b52c-e5fd-4766-8b3f-674190065ed0","Type":"ContainerDied","Data":"0bc369a221bc8d05ce04836901195db6e7e34c2e26724a0c1eff037b8c1ee848"} Mar 09 09:10:05 crc kubenswrapper[4861]: I0309 09:10:05.666664 4861 scope.go:117] "RemoveContainer" containerID="cdf9a28bbaea69794beeb97b690b764813f5b952e0a3bf72856c6ce7db2fba36" Mar 09 09:10:07 crc kubenswrapper[4861]: I0309 09:10:07.006717 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550790-dmhn9" Mar 09 09:10:07 crc kubenswrapper[4861]: I0309 09:10:07.142231 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-65bzb\" (UniqueName: \"kubernetes.io/projected/ccc4b52c-e5fd-4766-8b3f-674190065ed0-kube-api-access-65bzb\") pod \"ccc4b52c-e5fd-4766-8b3f-674190065ed0\" (UID: \"ccc4b52c-e5fd-4766-8b3f-674190065ed0\") " Mar 09 09:10:07 crc kubenswrapper[4861]: I0309 09:10:07.146820 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ccc4b52c-e5fd-4766-8b3f-674190065ed0-kube-api-access-65bzb" (OuterVolumeSpecName: "kube-api-access-65bzb") pod "ccc4b52c-e5fd-4766-8b3f-674190065ed0" (UID: "ccc4b52c-e5fd-4766-8b3f-674190065ed0"). InnerVolumeSpecName "kube-api-access-65bzb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:10:07 crc kubenswrapper[4861]: I0309 09:10:07.243994 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-65bzb\" (UniqueName: \"kubernetes.io/projected/ccc4b52c-e5fd-4766-8b3f-674190065ed0-kube-api-access-65bzb\") on node \"crc\" DevicePath \"\"" Mar 09 09:10:07 crc kubenswrapper[4861]: I0309 09:10:07.677078 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550790-dmhn9" event={"ID":"ccc4b52c-e5fd-4766-8b3f-674190065ed0","Type":"ContainerDied","Data":"9db9160f2cb02f82cef149c33f7a37356e989aed65d57048bda02a146426e0c0"} Mar 09 09:10:07 crc kubenswrapper[4861]: I0309 09:10:07.677142 4861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9db9160f2cb02f82cef149c33f7a37356e989aed65d57048bda02a146426e0c0" Mar 09 09:10:07 crc kubenswrapper[4861]: I0309 09:10:07.677230 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550790-dmhn9" Mar 09 09:10:10 crc kubenswrapper[4861]: I0309 09:10:10.294639 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 09 09:10:15 crc kubenswrapper[4861]: I0309 09:10:15.723530 4861 generic.go:334] "Generic (PLEG): container finished" podID="3d2a0407-66d1-4d05-9623-fe968aa3b516" containerID="33d40b96522231571b97073f146c87cba1bd5191afe63d1eae02ecdf7d93b9ea" exitCode=0 Mar 09 09:10:15 crc kubenswrapper[4861]: I0309 09:10:15.723601 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-k2z5n" event={"ID":"3d2a0407-66d1-4d05-9623-fe968aa3b516","Type":"ContainerDied","Data":"33d40b96522231571b97073f146c87cba1bd5191afe63d1eae02ecdf7d93b9ea"} Mar 09 09:10:15 crc kubenswrapper[4861]: I0309 09:10:15.724420 4861 scope.go:117] "RemoveContainer" containerID="33d40b96522231571b97073f146c87cba1bd5191afe63d1eae02ecdf7d93b9ea" Mar 09 09:10:16 crc kubenswrapper[4861]: I0309 09:10:16.733260 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-k2z5n" event={"ID":"3d2a0407-66d1-4d05-9623-fe968aa3b516","Type":"ContainerStarted","Data":"116417a542abd8e40ae5d0e121e85d6655b957a9eaf43f73903fc187fd061560"} Mar 09 09:10:16 crc kubenswrapper[4861]: I0309 09:10:16.733684 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-k2z5n" Mar 09 09:10:16 crc kubenswrapper[4861]: I0309 09:10:16.736763 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-k2z5n" Mar 09 09:10:20 crc kubenswrapper[4861]: I0309 09:10:20.681762 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 09 09:10:21 crc kubenswrapper[4861]: I0309 09:10:21.472702 
4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 09 09:10:23 crc kubenswrapper[4861]: I0309 09:10:23.498979 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 09 09:10:24 crc kubenswrapper[4861]: I0309 09:10:24.606961 4861 patch_prober.go:28] interesting pod/machine-config-daemon-5g7gc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 09:10:24 crc kubenswrapper[4861]: I0309 09:10:24.607045 4861 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 09:10:24 crc kubenswrapper[4861]: I0309 09:10:24.607112 4861 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" Mar 09 09:10:24 crc kubenswrapper[4861]: I0309 09:10:24.608025 4861 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c970ace96d4c918f6e61a749abffe084d175df04b5393bf6029d502cdda837af"} pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 09 09:10:24 crc kubenswrapper[4861]: I0309 09:10:24.608110 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" 
containerName="machine-config-daemon" containerID="cri-o://c970ace96d4c918f6e61a749abffe084d175df04b5393bf6029d502cdda837af" gracePeriod=600 Mar 09 09:10:24 crc kubenswrapper[4861]: I0309 09:10:24.789642 4861 generic.go:334] "Generic (PLEG): container finished" podID="6f7875e3-174f-4c67-8675-d878de74aa4f" containerID="c970ace96d4c918f6e61a749abffe084d175df04b5393bf6029d502cdda837af" exitCode=0 Mar 09 09:10:24 crc kubenswrapper[4861]: I0309 09:10:24.789705 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" event={"ID":"6f7875e3-174f-4c67-8675-d878de74aa4f","Type":"ContainerDied","Data":"c970ace96d4c918f6e61a749abffe084d175df04b5393bf6029d502cdda837af"} Mar 09 09:10:25 crc kubenswrapper[4861]: I0309 09:10:25.799978 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" event={"ID":"6f7875e3-174f-4c67-8675-d878de74aa4f","Type":"ContainerStarted","Data":"cc00f8c91adc84713b416ee4eb89ff4342a32c6c79a3ecec7efa8bfd9fbb3202"} Mar 09 09:10:28 crc kubenswrapper[4861]: I0309 09:10:28.051614 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 09 09:10:40 crc kubenswrapper[4861]: I0309 09:10:40.449487 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 09 09:10:40 crc kubenswrapper[4861]: I0309 09:10:40.545156 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 09 09:11:37 crc kubenswrapper[4861]: I0309 09:11:37.287389 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-n4xz4"] Mar 09 09:11:37 crc kubenswrapper[4861]: E0309 09:11:37.288162 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccc4b52c-e5fd-4766-8b3f-674190065ed0" containerName="oc" Mar 09 09:11:37 
crc kubenswrapper[4861]: I0309 09:11:37.288181 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccc4b52c-e5fd-4766-8b3f-674190065ed0" containerName="oc" Mar 09 09:11:37 crc kubenswrapper[4861]: I0309 09:11:37.288305 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="ccc4b52c-e5fd-4766-8b3f-674190065ed0" containerName="oc" Mar 09 09:11:37 crc kubenswrapper[4861]: I0309 09:11:37.288799 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-n4xz4" Mar 09 09:11:37 crc kubenswrapper[4861]: I0309 09:11:37.306265 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-n4xz4"] Mar 09 09:11:37 crc kubenswrapper[4861]: I0309 09:11:37.446202 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a911d424-aa5a-4204-b384-7e8c34b88fd1-bound-sa-token\") pod \"image-registry-66df7c8f76-n4xz4\" (UID: \"a911d424-aa5a-4204-b384-7e8c34b88fd1\") " pod="openshift-image-registry/image-registry-66df7c8f76-n4xz4" Mar 09 09:11:37 crc kubenswrapper[4861]: I0309 09:11:37.446365 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a911d424-aa5a-4204-b384-7e8c34b88fd1-registry-tls\") pod \"image-registry-66df7c8f76-n4xz4\" (UID: \"a911d424-aa5a-4204-b384-7e8c34b88fd1\") " pod="openshift-image-registry/image-registry-66df7c8f76-n4xz4" Mar 09 09:11:37 crc kubenswrapper[4861]: I0309 09:11:37.446531 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a911d424-aa5a-4204-b384-7e8c34b88fd1-trusted-ca\") pod \"image-registry-66df7c8f76-n4xz4\" (UID: \"a911d424-aa5a-4204-b384-7e8c34b88fd1\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-n4xz4" Mar 09 09:11:37 crc kubenswrapper[4861]: I0309 09:11:37.446629 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a911d424-aa5a-4204-b384-7e8c34b88fd1-installation-pull-secrets\") pod \"image-registry-66df7c8f76-n4xz4\" (UID: \"a911d424-aa5a-4204-b384-7e8c34b88fd1\") " pod="openshift-image-registry/image-registry-66df7c8f76-n4xz4" Mar 09 09:11:37 crc kubenswrapper[4861]: I0309 09:11:37.446725 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a911d424-aa5a-4204-b384-7e8c34b88fd1-registry-certificates\") pod \"image-registry-66df7c8f76-n4xz4\" (UID: \"a911d424-aa5a-4204-b384-7e8c34b88fd1\") " pod="openshift-image-registry/image-registry-66df7c8f76-n4xz4" Mar 09 09:11:37 crc kubenswrapper[4861]: I0309 09:11:37.446774 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a911d424-aa5a-4204-b384-7e8c34b88fd1-ca-trust-extracted\") pod \"image-registry-66df7c8f76-n4xz4\" (UID: \"a911d424-aa5a-4204-b384-7e8c34b88fd1\") " pod="openshift-image-registry/image-registry-66df7c8f76-n4xz4" Mar 09 09:11:37 crc kubenswrapper[4861]: I0309 09:11:37.446947 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-n4xz4\" (UID: \"a911d424-aa5a-4204-b384-7e8c34b88fd1\") " pod="openshift-image-registry/image-registry-66df7c8f76-n4xz4" Mar 09 09:11:37 crc kubenswrapper[4861]: I0309 09:11:37.447077 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-4k8hh\" (UniqueName: \"kubernetes.io/projected/a911d424-aa5a-4204-b384-7e8c34b88fd1-kube-api-access-4k8hh\") pod \"image-registry-66df7c8f76-n4xz4\" (UID: \"a911d424-aa5a-4204-b384-7e8c34b88fd1\") " pod="openshift-image-registry/image-registry-66df7c8f76-n4xz4" Mar 09 09:11:37 crc kubenswrapper[4861]: I0309 09:11:37.477939 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-n4xz4\" (UID: \"a911d424-aa5a-4204-b384-7e8c34b88fd1\") " pod="openshift-image-registry/image-registry-66df7c8f76-n4xz4" Mar 09 09:11:37 crc kubenswrapper[4861]: I0309 09:11:37.547917 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4k8hh\" (UniqueName: \"kubernetes.io/projected/a911d424-aa5a-4204-b384-7e8c34b88fd1-kube-api-access-4k8hh\") pod \"image-registry-66df7c8f76-n4xz4\" (UID: \"a911d424-aa5a-4204-b384-7e8c34b88fd1\") " pod="openshift-image-registry/image-registry-66df7c8f76-n4xz4" Mar 09 09:11:37 crc kubenswrapper[4861]: I0309 09:11:37.547982 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a911d424-aa5a-4204-b384-7e8c34b88fd1-bound-sa-token\") pod \"image-registry-66df7c8f76-n4xz4\" (UID: \"a911d424-aa5a-4204-b384-7e8c34b88fd1\") " pod="openshift-image-registry/image-registry-66df7c8f76-n4xz4" Mar 09 09:11:37 crc kubenswrapper[4861]: I0309 09:11:37.548012 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a911d424-aa5a-4204-b384-7e8c34b88fd1-registry-tls\") pod \"image-registry-66df7c8f76-n4xz4\" (UID: \"a911d424-aa5a-4204-b384-7e8c34b88fd1\") " pod="openshift-image-registry/image-registry-66df7c8f76-n4xz4" Mar 09 09:11:37 crc 
kubenswrapper[4861]: I0309 09:11:37.548082 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a911d424-aa5a-4204-b384-7e8c34b88fd1-trusted-ca\") pod \"image-registry-66df7c8f76-n4xz4\" (UID: \"a911d424-aa5a-4204-b384-7e8c34b88fd1\") " pod="openshift-image-registry/image-registry-66df7c8f76-n4xz4" Mar 09 09:11:37 crc kubenswrapper[4861]: I0309 09:11:37.548140 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a911d424-aa5a-4204-b384-7e8c34b88fd1-installation-pull-secrets\") pod \"image-registry-66df7c8f76-n4xz4\" (UID: \"a911d424-aa5a-4204-b384-7e8c34b88fd1\") " pod="openshift-image-registry/image-registry-66df7c8f76-n4xz4" Mar 09 09:11:37 crc kubenswrapper[4861]: I0309 09:11:37.548183 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a911d424-aa5a-4204-b384-7e8c34b88fd1-registry-certificates\") pod \"image-registry-66df7c8f76-n4xz4\" (UID: \"a911d424-aa5a-4204-b384-7e8c34b88fd1\") " pod="openshift-image-registry/image-registry-66df7c8f76-n4xz4" Mar 09 09:11:37 crc kubenswrapper[4861]: I0309 09:11:37.548239 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a911d424-aa5a-4204-b384-7e8c34b88fd1-ca-trust-extracted\") pod \"image-registry-66df7c8f76-n4xz4\" (UID: \"a911d424-aa5a-4204-b384-7e8c34b88fd1\") " pod="openshift-image-registry/image-registry-66df7c8f76-n4xz4" Mar 09 09:11:37 crc kubenswrapper[4861]: I0309 09:11:37.549070 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a911d424-aa5a-4204-b384-7e8c34b88fd1-ca-trust-extracted\") pod \"image-registry-66df7c8f76-n4xz4\" (UID: \"a911d424-aa5a-4204-b384-7e8c34b88fd1\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-n4xz4" Mar 09 09:11:37 crc kubenswrapper[4861]: I0309 09:11:37.550261 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a911d424-aa5a-4204-b384-7e8c34b88fd1-registry-certificates\") pod \"image-registry-66df7c8f76-n4xz4\" (UID: \"a911d424-aa5a-4204-b384-7e8c34b88fd1\") " pod="openshift-image-registry/image-registry-66df7c8f76-n4xz4" Mar 09 09:11:37 crc kubenswrapper[4861]: I0309 09:11:37.551294 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a911d424-aa5a-4204-b384-7e8c34b88fd1-trusted-ca\") pod \"image-registry-66df7c8f76-n4xz4\" (UID: \"a911d424-aa5a-4204-b384-7e8c34b88fd1\") " pod="openshift-image-registry/image-registry-66df7c8f76-n4xz4" Mar 09 09:11:37 crc kubenswrapper[4861]: I0309 09:11:37.557207 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a911d424-aa5a-4204-b384-7e8c34b88fd1-registry-tls\") pod \"image-registry-66df7c8f76-n4xz4\" (UID: \"a911d424-aa5a-4204-b384-7e8c34b88fd1\") " pod="openshift-image-registry/image-registry-66df7c8f76-n4xz4" Mar 09 09:11:37 crc kubenswrapper[4861]: I0309 09:11:37.557359 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a911d424-aa5a-4204-b384-7e8c34b88fd1-installation-pull-secrets\") pod \"image-registry-66df7c8f76-n4xz4\" (UID: \"a911d424-aa5a-4204-b384-7e8c34b88fd1\") " pod="openshift-image-registry/image-registry-66df7c8f76-n4xz4" Mar 09 09:11:37 crc kubenswrapper[4861]: I0309 09:11:37.572978 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4k8hh\" (UniqueName: \"kubernetes.io/projected/a911d424-aa5a-4204-b384-7e8c34b88fd1-kube-api-access-4k8hh\") pod \"image-registry-66df7c8f76-n4xz4\" 
(UID: \"a911d424-aa5a-4204-b384-7e8c34b88fd1\") " pod="openshift-image-registry/image-registry-66df7c8f76-n4xz4" Mar 09 09:11:37 crc kubenswrapper[4861]: I0309 09:11:37.577263 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a911d424-aa5a-4204-b384-7e8c34b88fd1-bound-sa-token\") pod \"image-registry-66df7c8f76-n4xz4\" (UID: \"a911d424-aa5a-4204-b384-7e8c34b88fd1\") " pod="openshift-image-registry/image-registry-66df7c8f76-n4xz4" Mar 09 09:11:37 crc kubenswrapper[4861]: I0309 09:11:37.614078 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-n4xz4" Mar 09 09:11:37 crc kubenswrapper[4861]: I0309 09:11:37.792898 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-n4xz4"] Mar 09 09:11:38 crc kubenswrapper[4861]: I0309 09:11:38.240664 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-n4xz4" event={"ID":"a911d424-aa5a-4204-b384-7e8c34b88fd1","Type":"ContainerStarted","Data":"d14948c5b2294a7de31cf55f6697c77075675cbed060d21084951a9752d76b05"} Mar 09 09:11:38 crc kubenswrapper[4861]: I0309 09:11:38.240720 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-n4xz4" event={"ID":"a911d424-aa5a-4204-b384-7e8c34b88fd1","Type":"ContainerStarted","Data":"92dc7c7759a1cad4833ae4799bf9e071764ca1ec143385411cbe1e0847674f89"} Mar 09 09:11:38 crc kubenswrapper[4861]: I0309 09:11:38.241640 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-n4xz4" Mar 09 09:11:38 crc kubenswrapper[4861]: I0309 09:11:38.265898 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-n4xz4" podStartSLOduration=1.265873238 
podStartE2EDuration="1.265873238s" podCreationTimestamp="2026-03-09 09:11:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:11:38.259203173 +0000 UTC m=+341.344242614" watchObservedRunningTime="2026-03-09 09:11:38.265873238 +0000 UTC m=+341.350912699" Mar 09 09:11:49 crc kubenswrapper[4861]: I0309 09:11:49.952672 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gcb6x"] Mar 09 09:11:49 crc kubenswrapper[4861]: I0309 09:11:49.954123 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-gcb6x" podUID="245d74cf-545f-43d3-ad40-5260aef18260" containerName="registry-server" containerID="cri-o://5cc27015cd8e7901d288e5fdcef535feb4e96147fb697254c6ae19cea8c1f72e" gracePeriod=30 Mar 09 09:11:49 crc kubenswrapper[4861]: I0309 09:11:49.986330 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cjwtv"] Mar 09 09:11:49 crc kubenswrapper[4861]: I0309 09:11:49.986860 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-cjwtv" podUID="3ad8332f-9ca2-4dd0-903f-2bf5723aa51e" containerName="registry-server" containerID="cri-o://5fc761449bed87a2d5eb929cd50f3ddd1ddbaa199788d60e479ae409c70f1227" gracePeriod=30 Mar 09 09:11:50 crc kubenswrapper[4861]: I0309 09:11:50.002975 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-k2z5n"] Mar 09 09:11:50 crc kubenswrapper[4861]: I0309 09:11:50.003232 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-k2z5n" podUID="3d2a0407-66d1-4d05-9623-fe968aa3b516" containerName="marketplace-operator" 
containerID="cri-o://116417a542abd8e40ae5d0e121e85d6655b957a9eaf43f73903fc187fd061560" gracePeriod=30 Mar 09 09:11:50 crc kubenswrapper[4861]: I0309 09:11:50.012872 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fnk25"] Mar 09 09:11:50 crc kubenswrapper[4861]: I0309 09:11:50.013256 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-fnk25" podUID="0eaf35dc-b198-4fbb-9e43-ddac97f1f62b" containerName="registry-server" containerID="cri-o://525d4211dcbfe48d20481224873f6a6e2f939f304d71679efe94cf1c25471299" gracePeriod=30 Mar 09 09:11:50 crc kubenswrapper[4861]: I0309 09:11:50.028931 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-j6fcf"] Mar 09 09:11:50 crc kubenswrapper[4861]: I0309 09:11:50.042115 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-9l4hx"] Mar 09 09:11:50 crc kubenswrapper[4861]: I0309 09:11:50.050259 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-j6fcf" podUID="33158a78-8c1c-4aa1-9c51-66e21d0e8ae6" containerName="registry-server" containerID="cri-o://146a4ed3edae2c1ad1b1d9047b6e1bdc80e7e837c091f12a65496e962099ee40" gracePeriod=30 Mar 09 09:11:50 crc kubenswrapper[4861]: I0309 09:11:50.051025 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-9l4hx" Mar 09 09:11:50 crc kubenswrapper[4861]: I0309 09:11:50.055931 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-9l4hx"] Mar 09 09:11:50 crc kubenswrapper[4861]: I0309 09:11:50.141631 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8qb6\" (UniqueName: \"kubernetes.io/projected/5c398209-0537-461f-a2a8-b626abd10525-kube-api-access-x8qb6\") pod \"marketplace-operator-79b997595-9l4hx\" (UID: \"5c398209-0537-461f-a2a8-b626abd10525\") " pod="openshift-marketplace/marketplace-operator-79b997595-9l4hx" Mar 09 09:11:50 crc kubenswrapper[4861]: I0309 09:11:50.141727 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/5c398209-0537-461f-a2a8-b626abd10525-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-9l4hx\" (UID: \"5c398209-0537-461f-a2a8-b626abd10525\") " pod="openshift-marketplace/marketplace-operator-79b997595-9l4hx" Mar 09 09:11:50 crc kubenswrapper[4861]: I0309 09:11:50.141966 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5c398209-0537-461f-a2a8-b626abd10525-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-9l4hx\" (UID: \"5c398209-0537-461f-a2a8-b626abd10525\") " pod="openshift-marketplace/marketplace-operator-79b997595-9l4hx" Mar 09 09:11:50 crc kubenswrapper[4861]: I0309 09:11:50.242934 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/5c398209-0537-461f-a2a8-b626abd10525-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-9l4hx\" (UID: 
\"5c398209-0537-461f-a2a8-b626abd10525\") " pod="openshift-marketplace/marketplace-operator-79b997595-9l4hx" Mar 09 09:11:50 crc kubenswrapper[4861]: I0309 09:11:50.243335 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5c398209-0537-461f-a2a8-b626abd10525-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-9l4hx\" (UID: \"5c398209-0537-461f-a2a8-b626abd10525\") " pod="openshift-marketplace/marketplace-operator-79b997595-9l4hx" Mar 09 09:11:50 crc kubenswrapper[4861]: I0309 09:11:50.243379 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8qb6\" (UniqueName: \"kubernetes.io/projected/5c398209-0537-461f-a2a8-b626abd10525-kube-api-access-x8qb6\") pod \"marketplace-operator-79b997595-9l4hx\" (UID: \"5c398209-0537-461f-a2a8-b626abd10525\") " pod="openshift-marketplace/marketplace-operator-79b997595-9l4hx" Mar 09 09:11:50 crc kubenswrapper[4861]: I0309 09:11:50.245189 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5c398209-0537-461f-a2a8-b626abd10525-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-9l4hx\" (UID: \"5c398209-0537-461f-a2a8-b626abd10525\") " pod="openshift-marketplace/marketplace-operator-79b997595-9l4hx" Mar 09 09:11:50 crc kubenswrapper[4861]: I0309 09:11:50.252231 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/5c398209-0537-461f-a2a8-b626abd10525-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-9l4hx\" (UID: \"5c398209-0537-461f-a2a8-b626abd10525\") " pod="openshift-marketplace/marketplace-operator-79b997595-9l4hx" Mar 09 09:11:50 crc kubenswrapper[4861]: I0309 09:11:50.260761 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8qb6\" 
(UniqueName: \"kubernetes.io/projected/5c398209-0537-461f-a2a8-b626abd10525-kube-api-access-x8qb6\") pod \"marketplace-operator-79b997595-9l4hx\" (UID: \"5c398209-0537-461f-a2a8-b626abd10525\") " pod="openshift-marketplace/marketplace-operator-79b997595-9l4hx" Mar 09 09:11:50 crc kubenswrapper[4861]: I0309 09:11:50.315670 4861 generic.go:334] "Generic (PLEG): container finished" podID="3ad8332f-9ca2-4dd0-903f-2bf5723aa51e" containerID="5fc761449bed87a2d5eb929cd50f3ddd1ddbaa199788d60e479ae409c70f1227" exitCode=0 Mar 09 09:11:50 crc kubenswrapper[4861]: I0309 09:11:50.315740 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cjwtv" event={"ID":"3ad8332f-9ca2-4dd0-903f-2bf5723aa51e","Type":"ContainerDied","Data":"5fc761449bed87a2d5eb929cd50f3ddd1ddbaa199788d60e479ae409c70f1227"} Mar 09 09:11:50 crc kubenswrapper[4861]: I0309 09:11:50.323052 4861 generic.go:334] "Generic (PLEG): container finished" podID="33158a78-8c1c-4aa1-9c51-66e21d0e8ae6" containerID="146a4ed3edae2c1ad1b1d9047b6e1bdc80e7e837c091f12a65496e962099ee40" exitCode=0 Mar 09 09:11:50 crc kubenswrapper[4861]: I0309 09:11:50.323120 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j6fcf" event={"ID":"33158a78-8c1c-4aa1-9c51-66e21d0e8ae6","Type":"ContainerDied","Data":"146a4ed3edae2c1ad1b1d9047b6e1bdc80e7e837c091f12a65496e962099ee40"} Mar 09 09:11:50 crc kubenswrapper[4861]: I0309 09:11:50.325488 4861 generic.go:334] "Generic (PLEG): container finished" podID="3d2a0407-66d1-4d05-9623-fe968aa3b516" containerID="116417a542abd8e40ae5d0e121e85d6655b957a9eaf43f73903fc187fd061560" exitCode=0 Mar 09 09:11:50 crc kubenswrapper[4861]: I0309 09:11:50.325524 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-k2z5n" event={"ID":"3d2a0407-66d1-4d05-9623-fe968aa3b516","Type":"ContainerDied","Data":"116417a542abd8e40ae5d0e121e85d6655b957a9eaf43f73903fc187fd061560"} 
Mar 09 09:11:50 crc kubenswrapper[4861]: I0309 09:11:50.325547 4861 scope.go:117] "RemoveContainer" containerID="33d40b96522231571b97073f146c87cba1bd5191afe63d1eae02ecdf7d93b9ea" Mar 09 09:11:50 crc kubenswrapper[4861]: I0309 09:11:50.328898 4861 generic.go:334] "Generic (PLEG): container finished" podID="245d74cf-545f-43d3-ad40-5260aef18260" containerID="5cc27015cd8e7901d288e5fdcef535feb4e96147fb697254c6ae19cea8c1f72e" exitCode=0 Mar 09 09:11:50 crc kubenswrapper[4861]: I0309 09:11:50.328932 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gcb6x" event={"ID":"245d74cf-545f-43d3-ad40-5260aef18260","Type":"ContainerDied","Data":"5cc27015cd8e7901d288e5fdcef535feb4e96147fb697254c6ae19cea8c1f72e"} Mar 09 09:11:50 crc kubenswrapper[4861]: I0309 09:11:50.328949 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gcb6x" event={"ID":"245d74cf-545f-43d3-ad40-5260aef18260","Type":"ContainerDied","Data":"5abe4c2768b598d5b864e92b23d9817f646b967b24e4afc434a15d4ea8ba4db2"} Mar 09 09:11:50 crc kubenswrapper[4861]: I0309 09:11:50.328959 4861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5abe4c2768b598d5b864e92b23d9817f646b967b24e4afc434a15d4ea8ba4db2" Mar 09 09:11:50 crc kubenswrapper[4861]: I0309 09:11:50.330676 4861 generic.go:334] "Generic (PLEG): container finished" podID="0eaf35dc-b198-4fbb-9e43-ddac97f1f62b" containerID="525d4211dcbfe48d20481224873f6a6e2f939f304d71679efe94cf1c25471299" exitCode=0 Mar 09 09:11:50 crc kubenswrapper[4861]: I0309 09:11:50.330702 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fnk25" event={"ID":"0eaf35dc-b198-4fbb-9e43-ddac97f1f62b","Type":"ContainerDied","Data":"525d4211dcbfe48d20481224873f6a6e2f939f304d71679efe94cf1c25471299"} Mar 09 09:11:50 crc kubenswrapper[4861]: I0309 09:11:50.434599 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-9l4hx" Mar 09 09:11:50 crc kubenswrapper[4861]: I0309 09:11:50.444187 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gcb6x" Mar 09 09:11:50 crc kubenswrapper[4861]: I0309 09:11:50.457875 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fnk25" Mar 09 09:11:50 crc kubenswrapper[4861]: I0309 09:11:50.462977 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cjwtv" Mar 09 09:11:50 crc kubenswrapper[4861]: I0309 09:11:50.480351 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-j6fcf" Mar 09 09:11:50 crc kubenswrapper[4861]: I0309 09:11:50.487162 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-k2z5n" Mar 09 09:11:50 crc kubenswrapper[4861]: I0309 09:11:50.546812 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zsg92\" (UniqueName: \"kubernetes.io/projected/3ad8332f-9ca2-4dd0-903f-2bf5723aa51e-kube-api-access-zsg92\") pod \"3ad8332f-9ca2-4dd0-903f-2bf5723aa51e\" (UID: \"3ad8332f-9ca2-4dd0-903f-2bf5723aa51e\") " Mar 09 09:11:50 crc kubenswrapper[4861]: I0309 09:11:50.547059 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/245d74cf-545f-43d3-ad40-5260aef18260-catalog-content\") pod \"245d74cf-545f-43d3-ad40-5260aef18260\" (UID: \"245d74cf-545f-43d3-ad40-5260aef18260\") " Mar 09 09:11:50 crc kubenswrapper[4861]: I0309 09:11:50.547086 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fhs5c\" (UniqueName: \"kubernetes.io/projected/0eaf35dc-b198-4fbb-9e43-ddac97f1f62b-kube-api-access-fhs5c\") pod \"0eaf35dc-b198-4fbb-9e43-ddac97f1f62b\" (UID: \"0eaf35dc-b198-4fbb-9e43-ddac97f1f62b\") " Mar 09 09:11:50 crc kubenswrapper[4861]: I0309 09:11:50.551777 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ad8332f-9ca2-4dd0-903f-2bf5723aa51e-kube-api-access-zsg92" (OuterVolumeSpecName: "kube-api-access-zsg92") pod "3ad8332f-9ca2-4dd0-903f-2bf5723aa51e" (UID: "3ad8332f-9ca2-4dd0-903f-2bf5723aa51e"). InnerVolumeSpecName "kube-api-access-zsg92". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:11:50 crc kubenswrapper[4861]: I0309 09:11:50.552360 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0eaf35dc-b198-4fbb-9e43-ddac97f1f62b-kube-api-access-fhs5c" (OuterVolumeSpecName: "kube-api-access-fhs5c") pod "0eaf35dc-b198-4fbb-9e43-ddac97f1f62b" (UID: "0eaf35dc-b198-4fbb-9e43-ddac97f1f62b"). InnerVolumeSpecName "kube-api-access-fhs5c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:11:50 crc kubenswrapper[4861]: I0309 09:11:50.559608 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0eaf35dc-b198-4fbb-9e43-ddac97f1f62b-catalog-content\") pod \"0eaf35dc-b198-4fbb-9e43-ddac97f1f62b\" (UID: \"0eaf35dc-b198-4fbb-9e43-ddac97f1f62b\") " Mar 09 09:11:50 crc kubenswrapper[4861]: I0309 09:11:50.559706 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dpbm2\" (UniqueName: \"kubernetes.io/projected/3d2a0407-66d1-4d05-9623-fe968aa3b516-kube-api-access-dpbm2\") pod \"3d2a0407-66d1-4d05-9623-fe968aa3b516\" (UID: \"3d2a0407-66d1-4d05-9623-fe968aa3b516\") " Mar 09 09:11:50 crc kubenswrapper[4861]: I0309 09:11:50.559738 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33158a78-8c1c-4aa1-9c51-66e21d0e8ae6-utilities\") pod \"33158a78-8c1c-4aa1-9c51-66e21d0e8ae6\" (UID: \"33158a78-8c1c-4aa1-9c51-66e21d0e8ae6\") " Mar 09 09:11:50 crc kubenswrapper[4861]: I0309 09:11:50.560153 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/3d2a0407-66d1-4d05-9623-fe968aa3b516-marketplace-operator-metrics\") pod \"3d2a0407-66d1-4d05-9623-fe968aa3b516\" (UID: \"3d2a0407-66d1-4d05-9623-fe968aa3b516\") " Mar 09 09:11:50 crc 
kubenswrapper[4861]: I0309 09:11:50.560183 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3d2a0407-66d1-4d05-9623-fe968aa3b516-marketplace-trusted-ca\") pod \"3d2a0407-66d1-4d05-9623-fe968aa3b516\" (UID: \"3d2a0407-66d1-4d05-9623-fe968aa3b516\") " Mar 09 09:11:50 crc kubenswrapper[4861]: I0309 09:11:50.560200 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t7595\" (UniqueName: \"kubernetes.io/projected/245d74cf-545f-43d3-ad40-5260aef18260-kube-api-access-t7595\") pod \"245d74cf-545f-43d3-ad40-5260aef18260\" (UID: \"245d74cf-545f-43d3-ad40-5260aef18260\") " Mar 09 09:11:50 crc kubenswrapper[4861]: I0309 09:11:50.560216 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0eaf35dc-b198-4fbb-9e43-ddac97f1f62b-utilities\") pod \"0eaf35dc-b198-4fbb-9e43-ddac97f1f62b\" (UID: \"0eaf35dc-b198-4fbb-9e43-ddac97f1f62b\") " Mar 09 09:11:50 crc kubenswrapper[4861]: I0309 09:11:50.560236 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ad8332f-9ca2-4dd0-903f-2bf5723aa51e-utilities\") pod \"3ad8332f-9ca2-4dd0-903f-2bf5723aa51e\" (UID: \"3ad8332f-9ca2-4dd0-903f-2bf5723aa51e\") " Mar 09 09:11:50 crc kubenswrapper[4861]: I0309 09:11:50.560261 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-md8cx\" (UniqueName: \"kubernetes.io/projected/33158a78-8c1c-4aa1-9c51-66e21d0e8ae6-kube-api-access-md8cx\") pod \"33158a78-8c1c-4aa1-9c51-66e21d0e8ae6\" (UID: \"33158a78-8c1c-4aa1-9c51-66e21d0e8ae6\") " Mar 09 09:11:50 crc kubenswrapper[4861]: I0309 09:11:50.560322 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/3ad8332f-9ca2-4dd0-903f-2bf5723aa51e-catalog-content\") pod \"3ad8332f-9ca2-4dd0-903f-2bf5723aa51e\" (UID: \"3ad8332f-9ca2-4dd0-903f-2bf5723aa51e\") " Mar 09 09:11:50 crc kubenswrapper[4861]: I0309 09:11:50.560337 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33158a78-8c1c-4aa1-9c51-66e21d0e8ae6-catalog-content\") pod \"33158a78-8c1c-4aa1-9c51-66e21d0e8ae6\" (UID: \"33158a78-8c1c-4aa1-9c51-66e21d0e8ae6\") " Mar 09 09:11:50 crc kubenswrapper[4861]: I0309 09:11:50.560353 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/245d74cf-545f-43d3-ad40-5260aef18260-utilities\") pod \"245d74cf-545f-43d3-ad40-5260aef18260\" (UID: \"245d74cf-545f-43d3-ad40-5260aef18260\") " Mar 09 09:11:50 crc kubenswrapper[4861]: I0309 09:11:50.561064 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ad8332f-9ca2-4dd0-903f-2bf5723aa51e-utilities" (OuterVolumeSpecName: "utilities") pod "3ad8332f-9ca2-4dd0-903f-2bf5723aa51e" (UID: "3ad8332f-9ca2-4dd0-903f-2bf5723aa51e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:11:50 crc kubenswrapper[4861]: I0309 09:11:50.561107 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d2a0407-66d1-4d05-9623-fe968aa3b516-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "3d2a0407-66d1-4d05-9623-fe968aa3b516" (UID: "3d2a0407-66d1-4d05-9623-fe968aa3b516"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:11:50 crc kubenswrapper[4861]: I0309 09:11:50.561237 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0eaf35dc-b198-4fbb-9e43-ddac97f1f62b-utilities" (OuterVolumeSpecName: "utilities") pod "0eaf35dc-b198-4fbb-9e43-ddac97f1f62b" (UID: "0eaf35dc-b198-4fbb-9e43-ddac97f1f62b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:11:50 crc kubenswrapper[4861]: I0309 09:11:50.561736 4861 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3d2a0407-66d1-4d05-9623-fe968aa3b516-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 09 09:11:50 crc kubenswrapper[4861]: I0309 09:11:50.561760 4861 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0eaf35dc-b198-4fbb-9e43-ddac97f1f62b-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 09:11:50 crc kubenswrapper[4861]: I0309 09:11:50.561772 4861 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ad8332f-9ca2-4dd0-903f-2bf5723aa51e-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 09:11:50 crc kubenswrapper[4861]: I0309 09:11:50.561781 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zsg92\" (UniqueName: \"kubernetes.io/projected/3ad8332f-9ca2-4dd0-903f-2bf5723aa51e-kube-api-access-zsg92\") on node \"crc\" DevicePath \"\"" Mar 09 09:11:50 crc kubenswrapper[4861]: I0309 09:11:50.561790 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fhs5c\" (UniqueName: \"kubernetes.io/projected/0eaf35dc-b198-4fbb-9e43-ddac97f1f62b-kube-api-access-fhs5c\") on node \"crc\" DevicePath \"\"" Mar 09 09:11:50 crc kubenswrapper[4861]: I0309 09:11:50.562574 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/245d74cf-545f-43d3-ad40-5260aef18260-utilities" (OuterVolumeSpecName: "utilities") pod "245d74cf-545f-43d3-ad40-5260aef18260" (UID: "245d74cf-545f-43d3-ad40-5260aef18260"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:11:50 crc kubenswrapper[4861]: I0309 09:11:50.562960 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33158a78-8c1c-4aa1-9c51-66e21d0e8ae6-utilities" (OuterVolumeSpecName: "utilities") pod "33158a78-8c1c-4aa1-9c51-66e21d0e8ae6" (UID: "33158a78-8c1c-4aa1-9c51-66e21d0e8ae6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:11:50 crc kubenswrapper[4861]: I0309 09:11:50.565070 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/245d74cf-545f-43d3-ad40-5260aef18260-kube-api-access-t7595" (OuterVolumeSpecName: "kube-api-access-t7595") pod "245d74cf-545f-43d3-ad40-5260aef18260" (UID: "245d74cf-545f-43d3-ad40-5260aef18260"). InnerVolumeSpecName "kube-api-access-t7595". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:11:50 crc kubenswrapper[4861]: I0309 09:11:50.565167 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d2a0407-66d1-4d05-9623-fe968aa3b516-kube-api-access-dpbm2" (OuterVolumeSpecName: "kube-api-access-dpbm2") pod "3d2a0407-66d1-4d05-9623-fe968aa3b516" (UID: "3d2a0407-66d1-4d05-9623-fe968aa3b516"). InnerVolumeSpecName "kube-api-access-dpbm2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:11:50 crc kubenswrapper[4861]: I0309 09:11:50.566894 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33158a78-8c1c-4aa1-9c51-66e21d0e8ae6-kube-api-access-md8cx" (OuterVolumeSpecName: "kube-api-access-md8cx") pod "33158a78-8c1c-4aa1-9c51-66e21d0e8ae6" (UID: "33158a78-8c1c-4aa1-9c51-66e21d0e8ae6"). InnerVolumeSpecName "kube-api-access-md8cx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:11:50 crc kubenswrapper[4861]: I0309 09:11:50.567207 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d2a0407-66d1-4d05-9623-fe968aa3b516-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "3d2a0407-66d1-4d05-9623-fe968aa3b516" (UID: "3d2a0407-66d1-4d05-9623-fe968aa3b516"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:11:50 crc kubenswrapper[4861]: I0309 09:11:50.587398 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0eaf35dc-b198-4fbb-9e43-ddac97f1f62b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0eaf35dc-b198-4fbb-9e43-ddac97f1f62b" (UID: "0eaf35dc-b198-4fbb-9e43-ddac97f1f62b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:11:50 crc kubenswrapper[4861]: I0309 09:11:50.601570 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/245d74cf-545f-43d3-ad40-5260aef18260-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "245d74cf-545f-43d3-ad40-5260aef18260" (UID: "245d74cf-545f-43d3-ad40-5260aef18260"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:11:50 crc kubenswrapper[4861]: I0309 09:11:50.660214 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ad8332f-9ca2-4dd0-903f-2bf5723aa51e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3ad8332f-9ca2-4dd0-903f-2bf5723aa51e" (UID: "3ad8332f-9ca2-4dd0-903f-2bf5723aa51e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:11:50 crc kubenswrapper[4861]: I0309 09:11:50.663244 4861 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/245d74cf-545f-43d3-ad40-5260aef18260-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 09:11:50 crc kubenswrapper[4861]: I0309 09:11:50.663291 4861 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0eaf35dc-b198-4fbb-9e43-ddac97f1f62b-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 09:11:50 crc kubenswrapper[4861]: I0309 09:11:50.663312 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dpbm2\" (UniqueName: \"kubernetes.io/projected/3d2a0407-66d1-4d05-9623-fe968aa3b516-kube-api-access-dpbm2\") on node \"crc\" DevicePath \"\"" Mar 09 09:11:50 crc kubenswrapper[4861]: I0309 09:11:50.663333 4861 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33158a78-8c1c-4aa1-9c51-66e21d0e8ae6-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 09:11:50 crc kubenswrapper[4861]: I0309 09:11:50.663352 4861 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/3d2a0407-66d1-4d05-9623-fe968aa3b516-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 09 09:11:50 crc kubenswrapper[4861]: I0309 09:11:50.663395 4861 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-t7595\" (UniqueName: \"kubernetes.io/projected/245d74cf-545f-43d3-ad40-5260aef18260-kube-api-access-t7595\") on node \"crc\" DevicePath \"\"" Mar 09 09:11:50 crc kubenswrapper[4861]: I0309 09:11:50.663413 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-md8cx\" (UniqueName: \"kubernetes.io/projected/33158a78-8c1c-4aa1-9c51-66e21d0e8ae6-kube-api-access-md8cx\") on node \"crc\" DevicePath \"\"" Mar 09 09:11:50 crc kubenswrapper[4861]: I0309 09:11:50.663429 4861 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ad8332f-9ca2-4dd0-903f-2bf5723aa51e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 09:11:50 crc kubenswrapper[4861]: I0309 09:11:50.663446 4861 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/245d74cf-545f-43d3-ad40-5260aef18260-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 09:11:50 crc kubenswrapper[4861]: I0309 09:11:50.718606 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33158a78-8c1c-4aa1-9c51-66e21d0e8ae6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "33158a78-8c1c-4aa1-9c51-66e21d0e8ae6" (UID: "33158a78-8c1c-4aa1-9c51-66e21d0e8ae6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:11:50 crc kubenswrapper[4861]: I0309 09:11:50.764310 4861 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33158a78-8c1c-4aa1-9c51-66e21d0e8ae6-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 09:11:50 crc kubenswrapper[4861]: I0309 09:11:50.847190 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-9l4hx"] Mar 09 09:11:51 crc kubenswrapper[4861]: I0309 09:11:51.337972 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-9l4hx" event={"ID":"5c398209-0537-461f-a2a8-b626abd10525","Type":"ContainerStarted","Data":"9a947d28ef58673f339c583876bd4bf32eee00216e2dd2fa33e5e2aec3a51e34"} Mar 09 09:11:51 crc kubenswrapper[4861]: I0309 09:11:51.338047 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-9l4hx" event={"ID":"5c398209-0537-461f-a2a8-b626abd10525","Type":"ContainerStarted","Data":"ad89ad548e3098afbcb843f8689ce8140d0fd4fcbf32358fe598eb984a4eefbd"} Mar 09 09:11:51 crc kubenswrapper[4861]: I0309 09:11:51.338224 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-9l4hx" Mar 09 09:11:51 crc kubenswrapper[4861]: I0309 09:11:51.341181 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-9l4hx" Mar 09 09:11:51 crc kubenswrapper[4861]: I0309 09:11:51.341224 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-j6fcf" event={"ID":"33158a78-8c1c-4aa1-9c51-66e21d0e8ae6","Type":"ContainerDied","Data":"836787c6e1446ba45b92a6a440d102a09879f66c9140757741215da862ee2b2f"} Mar 09 09:11:51 crc kubenswrapper[4861]: I0309 09:11:51.341264 4861 scope.go:117] "RemoveContainer" 
containerID="146a4ed3edae2c1ad1b1d9047b6e1bdc80e7e837c091f12a65496e962099ee40"
Mar 09 09:11:51 crc kubenswrapper[4861]: I0309 09:11:51.341308 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-j6fcf"
Mar 09 09:11:51 crc kubenswrapper[4861]: I0309 09:11:51.345055 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-k2z5n" event={"ID":"3d2a0407-66d1-4d05-9623-fe968aa3b516","Type":"ContainerDied","Data":"16e36988113813a7e5ac0aeca14109571e1840ea060a7df6bbed04c2a67dde39"}
Mar 09 09:11:51 crc kubenswrapper[4861]: I0309 09:11:51.345072 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-k2z5n"
Mar 09 09:11:51 crc kubenswrapper[4861]: I0309 09:11:51.348068 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fnk25" event={"ID":"0eaf35dc-b198-4fbb-9e43-ddac97f1f62b","Type":"ContainerDied","Data":"e0f982a3110fccbe81488a0747b1400ad370f4b47fa76b07317c97e98cc950e5"}
Mar 09 09:11:51 crc kubenswrapper[4861]: I0309 09:11:51.348199 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fnk25"
Mar 09 09:11:51 crc kubenswrapper[4861]: I0309 09:11:51.357961 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gcb6x"
Mar 09 09:11:51 crc kubenswrapper[4861]: I0309 09:11:51.358290 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cjwtv" event={"ID":"3ad8332f-9ca2-4dd0-903f-2bf5723aa51e","Type":"ContainerDied","Data":"a27e8c3afe1afaa886e4e2322205334f8246ea013e7a8dec44b7b6f5eeb2e3e9"}
Mar 09 09:11:51 crc kubenswrapper[4861]: I0309 09:11:51.358313 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cjwtv"
Mar 09 09:11:51 crc kubenswrapper[4861]: I0309 09:11:51.374427 4861 scope.go:117] "RemoveContainer" containerID="d054bf6ca4163e20e2b1e243eb355bf50d30f6de8f2c9cdf71ca57b9d18f600c"
Mar 09 09:11:51 crc kubenswrapper[4861]: I0309 09:11:51.377127 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-9l4hx" podStartSLOduration=2.377098658 podStartE2EDuration="2.377098658s" podCreationTimestamp="2026-03-09 09:11:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:11:51.369981392 +0000 UTC m=+354.455020803" watchObservedRunningTime="2026-03-09 09:11:51.377098658 +0000 UTC m=+354.462138099"
Mar 09 09:11:51 crc kubenswrapper[4861]: I0309 09:11:51.405122 4861 scope.go:117] "RemoveContainer" containerID="7e1bb6d2bee4e18b8b866e5e2b2fcf1b9f4a74322d6ea9aa7af13b82e7b69b3c"
Mar 09 09:11:51 crc kubenswrapper[4861]: I0309 09:11:51.426934 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-j6fcf"]
Mar 09 09:11:51 crc kubenswrapper[4861]: I0309 09:11:51.428906 4861 scope.go:117] "RemoveContainer" containerID="116417a542abd8e40ae5d0e121e85d6655b957a9eaf43f73903fc187fd061560"
Mar 09 09:11:51 crc kubenswrapper[4861]: I0309 09:11:51.448043 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-j6fcf"]
Mar 09 09:11:51 crc kubenswrapper[4861]: I0309 09:11:51.453102 4861 scope.go:117] "RemoveContainer" containerID="525d4211dcbfe48d20481224873f6a6e2f939f304d71679efe94cf1c25471299"
Mar 09 09:11:51 crc kubenswrapper[4861]: I0309 09:11:51.455379 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-k2z5n"]
Mar 09 09:11:51 crc kubenswrapper[4861]: I0309 09:11:51.462584 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-k2z5n"]
Mar 09 09:11:51 crc kubenswrapper[4861]: I0309 09:11:51.468139 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fnk25"]
Mar 09 09:11:51 crc kubenswrapper[4861]: I0309 09:11:51.471464 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-fnk25"]
Mar 09 09:11:51 crc kubenswrapper[4861]: I0309 09:11:51.473716 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cjwtv"]
Mar 09 09:11:51 crc kubenswrapper[4861]: I0309 09:11:51.477577 4861 scope.go:117] "RemoveContainer" containerID="116063dd19dc24cf5aa601ea2099ce41cc034ee16f5034f138594a61ea756abe"
Mar 09 09:11:51 crc kubenswrapper[4861]: I0309 09:11:51.482701 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-cjwtv"]
Mar 09 09:11:51 crc kubenswrapper[4861]: I0309 09:11:51.497469 4861 scope.go:117] "RemoveContainer" containerID="4aeb6ff17984f2b9890f04f5eda9b1b52a4da37cd6841d01f3aae50d87d0ac2e"
Mar 09 09:11:51 crc kubenswrapper[4861]: I0309 09:11:51.498551 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gcb6x"]
Mar 09 09:11:51 crc kubenswrapper[4861]: I0309 09:11:51.503708 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-gcb6x"]
Mar 09 09:11:51 crc kubenswrapper[4861]: I0309 09:11:51.510242 4861 scope.go:117] "RemoveContainer" containerID="5fc761449bed87a2d5eb929cd50f3ddd1ddbaa199788d60e479ae409c70f1227"
Mar 09 09:11:51 crc kubenswrapper[4861]: I0309 09:11:51.524446 4861 scope.go:117] "RemoveContainer" containerID="0a3af934f28ecbd012dfa212f71d8fbd8229c69c129928c202abb78cdeb3bfef"
Mar 09 09:11:51 crc kubenswrapper[4861]: I0309 09:11:51.535337 4861 scope.go:117] "RemoveContainer" containerID="c405bdf8a6e7ed5d4413daa15ca151b9128a666746fe5186517f5506ec79d414"
Mar 09 09:11:51 crc kubenswrapper[4861]: I0309 09:11:51.666577 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0eaf35dc-b198-4fbb-9e43-ddac97f1f62b" path="/var/lib/kubelet/pods/0eaf35dc-b198-4fbb-9e43-ddac97f1f62b/volumes"
Mar 09 09:11:51 crc kubenswrapper[4861]: I0309 09:11:51.667310 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="245d74cf-545f-43d3-ad40-5260aef18260" path="/var/lib/kubelet/pods/245d74cf-545f-43d3-ad40-5260aef18260/volumes"
Mar 09 09:11:51 crc kubenswrapper[4861]: I0309 09:11:51.668011 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33158a78-8c1c-4aa1-9c51-66e21d0e8ae6" path="/var/lib/kubelet/pods/33158a78-8c1c-4aa1-9c51-66e21d0e8ae6/volumes"
Mar 09 09:11:51 crc kubenswrapper[4861]: I0309 09:11:51.669156 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ad8332f-9ca2-4dd0-903f-2bf5723aa51e" path="/var/lib/kubelet/pods/3ad8332f-9ca2-4dd0-903f-2bf5723aa51e/volumes"
Mar 09 09:11:51 crc kubenswrapper[4861]: I0309 09:11:51.669862 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d2a0407-66d1-4d05-9623-fe968aa3b516" path="/var/lib/kubelet/pods/3d2a0407-66d1-4d05-9623-fe968aa3b516/volumes"
Mar 09 09:11:52 crc kubenswrapper[4861]: I0309 09:11:52.167783 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-f2cbr"]
Mar 09 09:11:52 crc kubenswrapper[4861]: E0309 09:11:52.168091 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d2a0407-66d1-4d05-9623-fe968aa3b516" containerName="marketplace-operator"
Mar 09 09:11:52 crc kubenswrapper[4861]: I0309 09:11:52.168112 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d2a0407-66d1-4d05-9623-fe968aa3b516" containerName="marketplace-operator"
Mar 09 09:11:52 crc kubenswrapper[4861]: E0309 09:11:52.168132 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="245d74cf-545f-43d3-ad40-5260aef18260" containerName="registry-server"
Mar 09 09:11:52 crc kubenswrapper[4861]: I0309 09:11:52.168144 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="245d74cf-545f-43d3-ad40-5260aef18260" containerName="registry-server"
Mar 09 09:11:52 crc kubenswrapper[4861]: E0309 09:11:52.168166 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ad8332f-9ca2-4dd0-903f-2bf5723aa51e" containerName="extract-content"
Mar 09 09:11:52 crc kubenswrapper[4861]: I0309 09:11:52.168180 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ad8332f-9ca2-4dd0-903f-2bf5723aa51e" containerName="extract-content"
Mar 09 09:11:52 crc kubenswrapper[4861]: E0309 09:11:52.168198 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0eaf35dc-b198-4fbb-9e43-ddac97f1f62b" containerName="extract-utilities"
Mar 09 09:11:52 crc kubenswrapper[4861]: I0309 09:11:52.168209 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="0eaf35dc-b198-4fbb-9e43-ddac97f1f62b" containerName="extract-utilities"
Mar 09 09:11:52 crc kubenswrapper[4861]: E0309 09:11:52.168224 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0eaf35dc-b198-4fbb-9e43-ddac97f1f62b" containerName="extract-content"
Mar 09 09:11:52 crc kubenswrapper[4861]: I0309 09:11:52.168236 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="0eaf35dc-b198-4fbb-9e43-ddac97f1f62b" containerName="extract-content"
Mar 09 09:11:52 crc kubenswrapper[4861]: E0309 09:11:52.168251 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ad8332f-9ca2-4dd0-903f-2bf5723aa51e" containerName="extract-utilities"
Mar 09 09:11:52 crc kubenswrapper[4861]: I0309 09:11:52.168263 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ad8332f-9ca2-4dd0-903f-2bf5723aa51e" containerName="extract-utilities"
Mar 09 09:11:52 crc kubenswrapper[4861]: E0309 09:11:52.168277 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="245d74cf-545f-43d3-ad40-5260aef18260" containerName="extract-utilities"
Mar 09 09:11:52 crc kubenswrapper[4861]: I0309 09:11:52.168290 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="245d74cf-545f-43d3-ad40-5260aef18260" containerName="extract-utilities"
Mar 09 09:11:52 crc kubenswrapper[4861]: E0309 09:11:52.168305 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ad8332f-9ca2-4dd0-903f-2bf5723aa51e" containerName="registry-server"
Mar 09 09:11:52 crc kubenswrapper[4861]: I0309 09:11:52.168317 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ad8332f-9ca2-4dd0-903f-2bf5723aa51e" containerName="registry-server"
Mar 09 09:11:52 crc kubenswrapper[4861]: E0309 09:11:52.168338 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33158a78-8c1c-4aa1-9c51-66e21d0e8ae6" containerName="extract-utilities"
Mar 09 09:11:52 crc kubenswrapper[4861]: I0309 09:11:52.168352 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="33158a78-8c1c-4aa1-9c51-66e21d0e8ae6" containerName="extract-utilities"
Mar 09 09:11:52 crc kubenswrapper[4861]: E0309 09:11:52.168365 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0eaf35dc-b198-4fbb-9e43-ddac97f1f62b" containerName="registry-server"
Mar 09 09:11:52 crc kubenswrapper[4861]: I0309 09:11:52.168411 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="0eaf35dc-b198-4fbb-9e43-ddac97f1f62b" containerName="registry-server"
Mar 09 09:11:52 crc kubenswrapper[4861]: E0309 09:11:52.168428 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="245d74cf-545f-43d3-ad40-5260aef18260" containerName="extract-content"
Mar 09 09:11:52 crc kubenswrapper[4861]: I0309 09:11:52.168440 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="245d74cf-545f-43d3-ad40-5260aef18260" containerName="extract-content"
Mar 09 09:11:52 crc kubenswrapper[4861]: E0309 09:11:52.168456 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33158a78-8c1c-4aa1-9c51-66e21d0e8ae6" containerName="registry-server"
Mar 09 09:11:52 crc kubenswrapper[4861]: I0309 09:11:52.168467 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="33158a78-8c1c-4aa1-9c51-66e21d0e8ae6" containerName="registry-server"
Mar 09 09:11:52 crc kubenswrapper[4861]: E0309 09:11:52.168488 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33158a78-8c1c-4aa1-9c51-66e21d0e8ae6" containerName="extract-content"
Mar 09 09:11:52 crc kubenswrapper[4861]: I0309 09:11:52.168500 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="33158a78-8c1c-4aa1-9c51-66e21d0e8ae6" containerName="extract-content"
Mar 09 09:11:52 crc kubenswrapper[4861]: I0309 09:11:52.168660 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d2a0407-66d1-4d05-9623-fe968aa3b516" containerName="marketplace-operator"
Mar 09 09:11:52 crc kubenswrapper[4861]: I0309 09:11:52.168678 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="33158a78-8c1c-4aa1-9c51-66e21d0e8ae6" containerName="registry-server"
Mar 09 09:11:52 crc kubenswrapper[4861]: I0309 09:11:52.168696 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="0eaf35dc-b198-4fbb-9e43-ddac97f1f62b" containerName="registry-server"
Mar 09 09:11:52 crc kubenswrapper[4861]: I0309 09:11:52.168712 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ad8332f-9ca2-4dd0-903f-2bf5723aa51e" containerName="registry-server"
Mar 09 09:11:52 crc kubenswrapper[4861]: I0309 09:11:52.168736 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="245d74cf-545f-43d3-ad40-5260aef18260" containerName="registry-server"
Mar 09 09:11:52 crc kubenswrapper[4861]: E0309 09:11:52.168915 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d2a0407-66d1-4d05-9623-fe968aa3b516" containerName="marketplace-operator"
Mar 09 09:11:52 crc kubenswrapper[4861]: I0309 09:11:52.168932 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d2a0407-66d1-4d05-9623-fe968aa3b516" containerName="marketplace-operator"
Mar 09 09:11:52 crc kubenswrapper[4861]: I0309 09:11:52.169091 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d2a0407-66d1-4d05-9623-fe968aa3b516" containerName="marketplace-operator"
Mar 09 09:11:52 crc kubenswrapper[4861]: I0309 09:11:52.170013 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f2cbr"
Mar 09 09:11:52 crc kubenswrapper[4861]: I0309 09:11:52.172564 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Mar 09 09:11:52 crc kubenswrapper[4861]: I0309 09:11:52.189744 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-f2cbr"]
Mar 09 09:11:52 crc kubenswrapper[4861]: I0309 09:11:52.285598 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5118375-51c4-460f-a7e4-4b0dc454daa2-catalog-content\") pod \"redhat-marketplace-f2cbr\" (UID: \"a5118375-51c4-460f-a7e4-4b0dc454daa2\") " pod="openshift-marketplace/redhat-marketplace-f2cbr"
Mar 09 09:11:52 crc kubenswrapper[4861]: I0309 09:11:52.285908 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5118375-51c4-460f-a7e4-4b0dc454daa2-utilities\") pod \"redhat-marketplace-f2cbr\" (UID: \"a5118375-51c4-460f-a7e4-4b0dc454daa2\") " pod="openshift-marketplace/redhat-marketplace-f2cbr"
Mar 09 09:11:52 crc kubenswrapper[4861]: I0309 09:11:52.286221 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cc8cr\" (UniqueName: \"kubernetes.io/projected/a5118375-51c4-460f-a7e4-4b0dc454daa2-kube-api-access-cc8cr\") pod \"redhat-marketplace-f2cbr\" (UID: \"a5118375-51c4-460f-a7e4-4b0dc454daa2\") " pod="openshift-marketplace/redhat-marketplace-f2cbr"
Mar 09 09:11:52 crc kubenswrapper[4861]: I0309 09:11:52.371538 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-gcl57"]
Mar 09 09:11:52 crc kubenswrapper[4861]: I0309 09:11:52.374527 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gcl57"
Mar 09 09:11:52 crc kubenswrapper[4861]: I0309 09:11:52.378413 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Mar 09 09:11:52 crc kubenswrapper[4861]: I0309 09:11:52.394359 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5118375-51c4-460f-a7e4-4b0dc454daa2-catalog-content\") pod \"redhat-marketplace-f2cbr\" (UID: \"a5118375-51c4-460f-a7e4-4b0dc454daa2\") " pod="openshift-marketplace/redhat-marketplace-f2cbr"
Mar 09 09:11:52 crc kubenswrapper[4861]: I0309 09:11:52.394734 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5118375-51c4-460f-a7e4-4b0dc454daa2-utilities\") pod \"redhat-marketplace-f2cbr\" (UID: \"a5118375-51c4-460f-a7e4-4b0dc454daa2\") " pod="openshift-marketplace/redhat-marketplace-f2cbr"
Mar 09 09:11:52 crc kubenswrapper[4861]: I0309 09:11:52.395130 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cc8cr\" (UniqueName: \"kubernetes.io/projected/a5118375-51c4-460f-a7e4-4b0dc454daa2-kube-api-access-cc8cr\") pod \"redhat-marketplace-f2cbr\" (UID: \"a5118375-51c4-460f-a7e4-4b0dc454daa2\") " pod="openshift-marketplace/redhat-marketplace-f2cbr"
Mar 09 09:11:52 crc kubenswrapper[4861]: I0309 09:11:52.395142 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a5118375-51c4-460f-a7e4-4b0dc454daa2-catalog-content\") pod \"redhat-marketplace-f2cbr\" (UID: \"a5118375-51c4-460f-a7e4-4b0dc454daa2\") " pod="openshift-marketplace/redhat-marketplace-f2cbr"
Mar 09 09:11:52 crc kubenswrapper[4861]: I0309 09:11:52.395506 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a5118375-51c4-460f-a7e4-4b0dc454daa2-utilities\") pod \"redhat-marketplace-f2cbr\" (UID: \"a5118375-51c4-460f-a7e4-4b0dc454daa2\") " pod="openshift-marketplace/redhat-marketplace-f2cbr"
Mar 09 09:11:52 crc kubenswrapper[4861]: I0309 09:11:52.405860 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gcl57"]
Mar 09 09:11:52 crc kubenswrapper[4861]: I0309 09:11:52.419312 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cc8cr\" (UniqueName: \"kubernetes.io/projected/a5118375-51c4-460f-a7e4-4b0dc454daa2-kube-api-access-cc8cr\") pod \"redhat-marketplace-f2cbr\" (UID: \"a5118375-51c4-460f-a7e4-4b0dc454daa2\") " pod="openshift-marketplace/redhat-marketplace-f2cbr"
Mar 09 09:11:52 crc kubenswrapper[4861]: I0309 09:11:52.496695 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/87e64062-b33e-43aa-b264-7a26f9a3e0a0-catalog-content\") pod \"certified-operators-gcl57\" (UID: \"87e64062-b33e-43aa-b264-7a26f9a3e0a0\") " pod="openshift-marketplace/certified-operators-gcl57"
Mar 09 09:11:52 crc kubenswrapper[4861]: I0309 09:11:52.496757 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/87e64062-b33e-43aa-b264-7a26f9a3e0a0-utilities\") pod \"certified-operators-gcl57\" (UID: \"87e64062-b33e-43aa-b264-7a26f9a3e0a0\") " pod="openshift-marketplace/certified-operators-gcl57"
Mar 09 09:11:52 crc kubenswrapper[4861]: I0309 09:11:52.496793 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dth4\" (UniqueName: \"kubernetes.io/projected/87e64062-b33e-43aa-b264-7a26f9a3e0a0-kube-api-access-2dth4\") pod \"certified-operators-gcl57\" (UID: \"87e64062-b33e-43aa-b264-7a26f9a3e0a0\") " pod="openshift-marketplace/certified-operators-gcl57"
Mar 09 09:11:52 crc kubenswrapper[4861]: I0309 09:11:52.504321 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f2cbr"
Mar 09 09:11:52 crc kubenswrapper[4861]: I0309 09:11:52.598226 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/87e64062-b33e-43aa-b264-7a26f9a3e0a0-catalog-content\") pod \"certified-operators-gcl57\" (UID: \"87e64062-b33e-43aa-b264-7a26f9a3e0a0\") " pod="openshift-marketplace/certified-operators-gcl57"
Mar 09 09:11:52 crc kubenswrapper[4861]: I0309 09:11:52.598274 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/87e64062-b33e-43aa-b264-7a26f9a3e0a0-utilities\") pod \"certified-operators-gcl57\" (UID: \"87e64062-b33e-43aa-b264-7a26f9a3e0a0\") " pod="openshift-marketplace/certified-operators-gcl57"
Mar 09 09:11:52 crc kubenswrapper[4861]: I0309 09:11:52.598312 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dth4\" (UniqueName: \"kubernetes.io/projected/87e64062-b33e-43aa-b264-7a26f9a3e0a0-kube-api-access-2dth4\") pod \"certified-operators-gcl57\" (UID: \"87e64062-b33e-43aa-b264-7a26f9a3e0a0\") " pod="openshift-marketplace/certified-operators-gcl57"
Mar 09 09:11:52 crc kubenswrapper[4861]: I0309 09:11:52.598914 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/87e64062-b33e-43aa-b264-7a26f9a3e0a0-catalog-content\") pod \"certified-operators-gcl57\" (UID: \"87e64062-b33e-43aa-b264-7a26f9a3e0a0\") " pod="openshift-marketplace/certified-operators-gcl57"
Mar 09 09:11:52 crc kubenswrapper[4861]: I0309 09:11:52.598914 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/87e64062-b33e-43aa-b264-7a26f9a3e0a0-utilities\") pod \"certified-operators-gcl57\" (UID: \"87e64062-b33e-43aa-b264-7a26f9a3e0a0\") " pod="openshift-marketplace/certified-operators-gcl57"
Mar 09 09:11:52 crc kubenswrapper[4861]: I0309 09:11:52.614524 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dth4\" (UniqueName: \"kubernetes.io/projected/87e64062-b33e-43aa-b264-7a26f9a3e0a0-kube-api-access-2dth4\") pod \"certified-operators-gcl57\" (UID: \"87e64062-b33e-43aa-b264-7a26f9a3e0a0\") " pod="openshift-marketplace/certified-operators-gcl57"
Mar 09 09:11:52 crc kubenswrapper[4861]: I0309 09:11:52.706941 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gcl57"
Mar 09 09:11:52 crc kubenswrapper[4861]: I0309 09:11:52.885270 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-f2cbr"]
Mar 09 09:11:52 crc kubenswrapper[4861]: I0309 09:11:52.905934 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gcl57"]
Mar 09 09:11:52 crc kubenswrapper[4861]: W0309 09:11:52.915026 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod87e64062_b33e_43aa_b264_7a26f9a3e0a0.slice/crio-a60d6382f432b017d1d52131c2ee6b6396be4f3beae6f988a2304868d72a2868 WatchSource:0}: Error finding container a60d6382f432b017d1d52131c2ee6b6396be4f3beae6f988a2304868d72a2868: Status 404 returned error can't find the container with id a60d6382f432b017d1d52131c2ee6b6396be4f3beae6f988a2304868d72a2868
Mar 09 09:11:53 crc kubenswrapper[4861]: I0309 09:11:53.384226 4861 generic.go:334] "Generic (PLEG): container finished" podID="87e64062-b33e-43aa-b264-7a26f9a3e0a0" containerID="bf5fd894ea5dbe01cbc0f22437f2bde19f6966780d70e368cb8ba137aa43c337" exitCode=0
Mar 09 09:11:53 crc kubenswrapper[4861]: I0309 09:11:53.384388 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gcl57" event={"ID":"87e64062-b33e-43aa-b264-7a26f9a3e0a0","Type":"ContainerDied","Data":"bf5fd894ea5dbe01cbc0f22437f2bde19f6966780d70e368cb8ba137aa43c337"}
Mar 09 09:11:53 crc kubenswrapper[4861]: I0309 09:11:53.384428 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gcl57" event={"ID":"87e64062-b33e-43aa-b264-7a26f9a3e0a0","Type":"ContainerStarted","Data":"a60d6382f432b017d1d52131c2ee6b6396be4f3beae6f988a2304868d72a2868"}
Mar 09 09:11:53 crc kubenswrapper[4861]: I0309 09:11:53.386112 4861 generic.go:334] "Generic (PLEG): container finished" podID="a5118375-51c4-460f-a7e4-4b0dc454daa2" containerID="f638cf1548aaf4f7fa8ac3d707726097ded507fc3072ba4480240b27854b4288" exitCode=0
Mar 09 09:11:53 crc kubenswrapper[4861]: I0309 09:11:53.387013 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f2cbr" event={"ID":"a5118375-51c4-460f-a7e4-4b0dc454daa2","Type":"ContainerDied","Data":"f638cf1548aaf4f7fa8ac3d707726097ded507fc3072ba4480240b27854b4288"}
Mar 09 09:11:53 crc kubenswrapper[4861]: I0309 09:11:53.387046 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f2cbr" event={"ID":"a5118375-51c4-460f-a7e4-4b0dc454daa2","Type":"ContainerStarted","Data":"22d898a5d3e46fcd5227a8cb5d0e5ca0ec88d022fbe90be10837c601e6f3cb6c"}
Mar 09 09:11:54 crc kubenswrapper[4861]: I0309 09:11:54.394258 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gcl57" event={"ID":"87e64062-b33e-43aa-b264-7a26f9a3e0a0","Type":"ContainerStarted","Data":"f140b4fde4f93b3da6c9b0200dbd8160dbe46c4cff59a849c0f0517ae92a34cb"}
Mar 09 09:11:54 crc kubenswrapper[4861]: I0309 09:11:54.568109 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-ktkhz"]
Mar 09 09:11:54 crc kubenswrapper[4861]: I0309 09:11:54.569085 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ktkhz"
Mar 09 09:11:54 crc kubenswrapper[4861]: I0309 09:11:54.571269 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Mar 09 09:11:54 crc kubenswrapper[4861]: I0309 09:11:54.587740 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ktkhz"]
Mar 09 09:11:54 crc kubenswrapper[4861]: I0309 09:11:54.736468 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f16d7e47-0abb-4811-9e99-3f68d1fd64ab-utilities\") pod \"redhat-operators-ktkhz\" (UID: \"f16d7e47-0abb-4811-9e99-3f68d1fd64ab\") " pod="openshift-marketplace/redhat-operators-ktkhz"
Mar 09 09:11:54 crc kubenswrapper[4861]: I0309 09:11:54.737324 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwhq6\" (UniqueName: \"kubernetes.io/projected/f16d7e47-0abb-4811-9e99-3f68d1fd64ab-kube-api-access-bwhq6\") pod \"redhat-operators-ktkhz\" (UID: \"f16d7e47-0abb-4811-9e99-3f68d1fd64ab\") " pod="openshift-marketplace/redhat-operators-ktkhz"
Mar 09 09:11:54 crc kubenswrapper[4861]: I0309 09:11:54.737433 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f16d7e47-0abb-4811-9e99-3f68d1fd64ab-catalog-content\") pod \"redhat-operators-ktkhz\" (UID: \"f16d7e47-0abb-4811-9e99-3f68d1fd64ab\") " pod="openshift-marketplace/redhat-operators-ktkhz"
Mar 09 09:11:54 crc kubenswrapper[4861]: I0309 09:11:54.765669 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-c7tjq"]
Mar 09 09:11:54 crc kubenswrapper[4861]: I0309 09:11:54.766849 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-c7tjq"
Mar 09 09:11:54 crc kubenswrapper[4861]: I0309 09:11:54.769585 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Mar 09 09:11:54 crc kubenswrapper[4861]: I0309 09:11:54.772408 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-c7tjq"]
Mar 09 09:11:54 crc kubenswrapper[4861]: I0309 09:11:54.838843 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f16d7e47-0abb-4811-9e99-3f68d1fd64ab-utilities\") pod \"redhat-operators-ktkhz\" (UID: \"f16d7e47-0abb-4811-9e99-3f68d1fd64ab\") " pod="openshift-marketplace/redhat-operators-ktkhz"
Mar 09 09:11:54 crc kubenswrapper[4861]: I0309 09:11:54.838909 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bwhq6\" (UniqueName: \"kubernetes.io/projected/f16d7e47-0abb-4811-9e99-3f68d1fd64ab-kube-api-access-bwhq6\") pod \"redhat-operators-ktkhz\" (UID: \"f16d7e47-0abb-4811-9e99-3f68d1fd64ab\") " pod="openshift-marketplace/redhat-operators-ktkhz"
Mar 09 09:11:54 crc kubenswrapper[4861]: I0309 09:11:54.838952 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f16d7e47-0abb-4811-9e99-3f68d1fd64ab-catalog-content\") pod \"redhat-operators-ktkhz\" (UID: \"f16d7e47-0abb-4811-9e99-3f68d1fd64ab\") " pod="openshift-marketplace/redhat-operators-ktkhz"
Mar 09 09:11:54 crc kubenswrapper[4861]: I0309 09:11:54.839422 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f16d7e47-0abb-4811-9e99-3f68d1fd64ab-catalog-content\") pod \"redhat-operators-ktkhz\" (UID: \"f16d7e47-0abb-4811-9e99-3f68d1fd64ab\") " pod="openshift-marketplace/redhat-operators-ktkhz"
Mar 09 09:11:54 crc kubenswrapper[4861]: I0309 09:11:54.839512 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f16d7e47-0abb-4811-9e99-3f68d1fd64ab-utilities\") pod \"redhat-operators-ktkhz\" (UID: \"f16d7e47-0abb-4811-9e99-3f68d1fd64ab\") " pod="openshift-marketplace/redhat-operators-ktkhz"
Mar 09 09:11:54 crc kubenswrapper[4861]: I0309 09:11:54.858289 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwhq6\" (UniqueName: \"kubernetes.io/projected/f16d7e47-0abb-4811-9e99-3f68d1fd64ab-kube-api-access-bwhq6\") pod \"redhat-operators-ktkhz\" (UID: \"f16d7e47-0abb-4811-9e99-3f68d1fd64ab\") " pod="openshift-marketplace/redhat-operators-ktkhz"
Mar 09 09:11:54 crc kubenswrapper[4861]: I0309 09:11:54.940971 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28409ae5-743b-4e9a-a432-7527ad656038-utilities\") pod \"community-operators-c7tjq\" (UID: \"28409ae5-743b-4e9a-a432-7527ad656038\") " pod="openshift-marketplace/community-operators-c7tjq"
Mar 09 09:11:54 crc kubenswrapper[4861]: I0309 09:11:54.941078 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nsg94\" (UniqueName: \"kubernetes.io/projected/28409ae5-743b-4e9a-a432-7527ad656038-kube-api-access-nsg94\") pod \"community-operators-c7tjq\" (UID: \"28409ae5-743b-4e9a-a432-7527ad656038\") " pod="openshift-marketplace/community-operators-c7tjq"
Mar 09 09:11:54 crc kubenswrapper[4861]: I0309 09:11:54.941285 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28409ae5-743b-4e9a-a432-7527ad656038-catalog-content\") pod \"community-operators-c7tjq\" (UID: \"28409ae5-743b-4e9a-a432-7527ad656038\") " pod="openshift-marketplace/community-operators-c7tjq"
Mar 09 09:11:54 crc kubenswrapper[4861]: I0309 09:11:54.961509 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ktkhz"
Mar 09 09:11:55 crc kubenswrapper[4861]: I0309 09:11:55.042830 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28409ae5-743b-4e9a-a432-7527ad656038-utilities\") pod \"community-operators-c7tjq\" (UID: \"28409ae5-743b-4e9a-a432-7527ad656038\") " pod="openshift-marketplace/community-operators-c7tjq"
Mar 09 09:11:55 crc kubenswrapper[4861]: I0309 09:11:55.042890 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nsg94\" (UniqueName: \"kubernetes.io/projected/28409ae5-743b-4e9a-a432-7527ad656038-kube-api-access-nsg94\") pod \"community-operators-c7tjq\" (UID: \"28409ae5-743b-4e9a-a432-7527ad656038\") " pod="openshift-marketplace/community-operators-c7tjq"
Mar 09 09:11:55 crc kubenswrapper[4861]: I0309 09:11:55.042943 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28409ae5-743b-4e9a-a432-7527ad656038-catalog-content\") pod \"community-operators-c7tjq\" (UID: \"28409ae5-743b-4e9a-a432-7527ad656038\") " pod="openshift-marketplace/community-operators-c7tjq"
Mar 09 09:11:55 crc kubenswrapper[4861]: I0309 09:11:55.043629 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28409ae5-743b-4e9a-a432-7527ad656038-catalog-content\") pod \"community-operators-c7tjq\" (UID: \"28409ae5-743b-4e9a-a432-7527ad656038\") " pod="openshift-marketplace/community-operators-c7tjq"
Mar 09 09:11:55 crc kubenswrapper[4861]: I0309 09:11:55.043632 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28409ae5-743b-4e9a-a432-7527ad656038-utilities\") pod \"community-operators-c7tjq\" (UID: \"28409ae5-743b-4e9a-a432-7527ad656038\") " pod="openshift-marketplace/community-operators-c7tjq"
Mar 09 09:11:55 crc kubenswrapper[4861]: I0309 09:11:55.072556 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nsg94\" (UniqueName: \"kubernetes.io/projected/28409ae5-743b-4e9a-a432-7527ad656038-kube-api-access-nsg94\") pod \"community-operators-c7tjq\" (UID: \"28409ae5-743b-4e9a-a432-7527ad656038\") " pod="openshift-marketplace/community-operators-c7tjq"
Mar 09 09:11:55 crc kubenswrapper[4861]: I0309 09:11:55.098103 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-c7tjq"
Mar 09 09:11:55 crc kubenswrapper[4861]: I0309 09:11:55.290569 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-c7tjq"]
Mar 09 09:11:55 crc kubenswrapper[4861]: W0309 09:11:55.299041 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod28409ae5_743b_4e9a_a432_7527ad656038.slice/crio-abab6ab16dedab94a71262e85f2e109e9dfc1f0909d09c1369d17858a96c60bf WatchSource:0}: Error finding container abab6ab16dedab94a71262e85f2e109e9dfc1f0909d09c1369d17858a96c60bf: Status 404 returned error can't find the container with id abab6ab16dedab94a71262e85f2e109e9dfc1f0909d09c1369d17858a96c60bf
Mar 09 09:11:55 crc kubenswrapper[4861]: I0309 09:11:55.370484 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ktkhz"]
Mar 09 09:11:55 crc kubenswrapper[4861]: W0309 09:11:55.385449 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf16d7e47_0abb_4811_9e99_3f68d1fd64ab.slice/crio-cf7a1fd8edb90385039d4efe4868772a78f7030eb0ac7fc07d6aacebc3cff5da WatchSource:0}: Error finding container cf7a1fd8edb90385039d4efe4868772a78f7030eb0ac7fc07d6aacebc3cff5da: Status 404 returned error can't find the container with id cf7a1fd8edb90385039d4efe4868772a78f7030eb0ac7fc07d6aacebc3cff5da
Mar 09 09:11:55 crc kubenswrapper[4861]: I0309 09:11:55.402541 4861 generic.go:334] "Generic (PLEG): container finished" podID="87e64062-b33e-43aa-b264-7a26f9a3e0a0" containerID="f140b4fde4f93b3da6c9b0200dbd8160dbe46c4cff59a849c0f0517ae92a34cb" exitCode=0
Mar 09 09:11:55 crc kubenswrapper[4861]: I0309 09:11:55.402616 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gcl57" event={"ID":"87e64062-b33e-43aa-b264-7a26f9a3e0a0","Type":"ContainerDied","Data":"f140b4fde4f93b3da6c9b0200dbd8160dbe46c4cff59a849c0f0517ae92a34cb"}
Mar 09 09:11:55 crc kubenswrapper[4861]: I0309 09:11:55.404404 4861 generic.go:334] "Generic (PLEG): container finished" podID="a5118375-51c4-460f-a7e4-4b0dc454daa2" containerID="5a60e95da13de1b3366b7eaa922fa5f235b34dd68eddf3b5bee34e7675b2d872" exitCode=0
Mar 09 09:11:55 crc kubenswrapper[4861]: I0309 09:11:55.404492 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f2cbr" event={"ID":"a5118375-51c4-460f-a7e4-4b0dc454daa2","Type":"ContainerDied","Data":"5a60e95da13de1b3366b7eaa922fa5f235b34dd68eddf3b5bee34e7675b2d872"}
Mar 09 09:11:55 crc kubenswrapper[4861]: I0309 09:11:55.405483 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ktkhz" event={"ID":"f16d7e47-0abb-4811-9e99-3f68d1fd64ab","Type":"ContainerStarted","Data":"cf7a1fd8edb90385039d4efe4868772a78f7030eb0ac7fc07d6aacebc3cff5da"}
Mar 09 09:11:55 crc kubenswrapper[4861]: I0309 09:11:55.409738 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c7tjq" event={"ID":"28409ae5-743b-4e9a-a432-7527ad656038","Type":"ContainerStarted","Data":"abab6ab16dedab94a71262e85f2e109e9dfc1f0909d09c1369d17858a96c60bf"}
Mar 09 09:11:56 crc kubenswrapper[4861]: I0309 09:11:56.416179 4861 generic.go:334] "Generic (PLEG): container finished" podID="28409ae5-743b-4e9a-a432-7527ad656038" containerID="49201029376bd3de9273d910d2d3937761ed9b1ade2b44032a47f919695bde9a" exitCode=0
Mar 09 09:11:56 crc kubenswrapper[4861]: I0309 09:11:56.416253 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c7tjq" event={"ID":"28409ae5-743b-4e9a-a432-7527ad656038","Type":"ContainerDied","Data":"49201029376bd3de9273d910d2d3937761ed9b1ade2b44032a47f919695bde9a"}
Mar 09 09:11:56 crc kubenswrapper[4861]: I0309 09:11:56.418286 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gcl57" event={"ID":"87e64062-b33e-43aa-b264-7a26f9a3e0a0","Type":"ContainerStarted","Data":"2e282372bb9b49de7a0d1abbba728cb99d734ea818eada10092d1c6cdeccccb3"}
Mar 09 09:11:56 crc kubenswrapper[4861]: I0309 09:11:56.421268 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f2cbr" event={"ID":"a5118375-51c4-460f-a7e4-4b0dc454daa2","Type":"ContainerStarted","Data":"5c75257241d4052e01ddae482536290fc2c573af70946cd0c9eeb961d456b97c"}
Mar 09 09:11:56 crc kubenswrapper[4861]: I0309 09:11:56.422670 4861 generic.go:334] "Generic (PLEG): container finished" podID="f16d7e47-0abb-4811-9e99-3f68d1fd64ab" containerID="163625b5a8dba99e49b66b66b719c7e98a9a1e2af150588c6d30c172c9c19601" exitCode=0
Mar 09 09:11:56 crc kubenswrapper[4861]: I0309 09:11:56.422708 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ktkhz" event={"ID":"f16d7e47-0abb-4811-9e99-3f68d1fd64ab","Type":"ContainerDied","Data":"163625b5a8dba99e49b66b66b719c7e98a9a1e2af150588c6d30c172c9c19601"}
Mar 09 09:11:56 crc kubenswrapper[4861]: I0309
09:11:56.459989 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-f2cbr" podStartSLOduration=2.015983702 podStartE2EDuration="4.459963407s" podCreationTimestamp="2026-03-09 09:11:52 +0000 UTC" firstStartedPulling="2026-03-09 09:11:53.387678878 +0000 UTC m=+356.472718279" lastFinishedPulling="2026-03-09 09:11:55.831658553 +0000 UTC m=+358.916697984" observedRunningTime="2026-03-09 09:11:56.457343468 +0000 UTC m=+359.542382869" watchObservedRunningTime="2026-03-09 09:11:56.459963407 +0000 UTC m=+359.545002818" Mar 09 09:11:56 crc kubenswrapper[4861]: I0309 09:11:56.489176 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-gcl57" podStartSLOduration=2.07083636 podStartE2EDuration="4.489152133s" podCreationTimestamp="2026-03-09 09:11:52 +0000 UTC" firstStartedPulling="2026-03-09 09:11:53.387109182 +0000 UTC m=+356.472148583" lastFinishedPulling="2026-03-09 09:11:55.805424915 +0000 UTC m=+358.890464356" observedRunningTime="2026-03-09 09:11:56.485316642 +0000 UTC m=+359.570356043" watchObservedRunningTime="2026-03-09 09:11:56.489152133 +0000 UTC m=+359.574191564" Mar 09 09:11:57 crc kubenswrapper[4861]: I0309 09:11:57.430518 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ktkhz" event={"ID":"f16d7e47-0abb-4811-9e99-3f68d1fd64ab","Type":"ContainerStarted","Data":"3dc416d81781c1526759c3a67a1004b0eee17b0d3d2dcd80976e905730342ee7"} Mar 09 09:11:57 crc kubenswrapper[4861]: I0309 09:11:57.432168 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c7tjq" event={"ID":"28409ae5-743b-4e9a-a432-7527ad656038","Type":"ContainerStarted","Data":"8167fe13a58e1a39990999833cbb0a4dace9670a61d2856f419c2e3301e4f4b0"} Mar 09 09:11:57 crc kubenswrapper[4861]: I0309 09:11:57.621948 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-image-registry/image-registry-66df7c8f76-n4xz4" Mar 09 09:11:57 crc kubenswrapper[4861]: I0309 09:11:57.695014 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-cjkj6"] Mar 09 09:11:58 crc kubenswrapper[4861]: I0309 09:11:58.442643 4861 generic.go:334] "Generic (PLEG): container finished" podID="f16d7e47-0abb-4811-9e99-3f68d1fd64ab" containerID="3dc416d81781c1526759c3a67a1004b0eee17b0d3d2dcd80976e905730342ee7" exitCode=0 Mar 09 09:11:58 crc kubenswrapper[4861]: I0309 09:11:58.442764 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ktkhz" event={"ID":"f16d7e47-0abb-4811-9e99-3f68d1fd64ab","Type":"ContainerDied","Data":"3dc416d81781c1526759c3a67a1004b0eee17b0d3d2dcd80976e905730342ee7"} Mar 09 09:11:58 crc kubenswrapper[4861]: I0309 09:11:58.447200 4861 generic.go:334] "Generic (PLEG): container finished" podID="28409ae5-743b-4e9a-a432-7527ad656038" containerID="8167fe13a58e1a39990999833cbb0a4dace9670a61d2856f419c2e3301e4f4b0" exitCode=0 Mar 09 09:11:58 crc kubenswrapper[4861]: I0309 09:11:58.447240 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c7tjq" event={"ID":"28409ae5-743b-4e9a-a432-7527ad656038","Type":"ContainerDied","Data":"8167fe13a58e1a39990999833cbb0a4dace9670a61d2856f419c2e3301e4f4b0"} Mar 09 09:11:59 crc kubenswrapper[4861]: I0309 09:11:59.455467 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ktkhz" event={"ID":"f16d7e47-0abb-4811-9e99-3f68d1fd64ab","Type":"ContainerStarted","Data":"6d3197590150674db4ee7dc73429404f0f6dfd41dc568c4134d92708ea9ca1d4"} Mar 09 09:11:59 crc kubenswrapper[4861]: I0309 09:11:59.470010 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c7tjq" 
event={"ID":"28409ae5-743b-4e9a-a432-7527ad656038","Type":"ContainerStarted","Data":"2476df60bfe8edebf5fbd478fc535b995acada9081305854ca705fbd49d8e73b"} Mar 09 09:11:59 crc kubenswrapper[4861]: I0309 09:11:59.481593 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-ktkhz" podStartSLOduration=3.039291004 podStartE2EDuration="5.481566896s" podCreationTimestamp="2026-03-09 09:11:54 +0000 UTC" firstStartedPulling="2026-03-09 09:11:56.424018843 +0000 UTC m=+359.509058264" lastFinishedPulling="2026-03-09 09:11:58.866294715 +0000 UTC m=+361.951334156" observedRunningTime="2026-03-09 09:11:59.480636591 +0000 UTC m=+362.565676012" watchObservedRunningTime="2026-03-09 09:11:59.481566896 +0000 UTC m=+362.566606337" Mar 09 09:11:59 crc kubenswrapper[4861]: I0309 09:11:59.503871 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-c7tjq" podStartSLOduration=2.836291926 podStartE2EDuration="5.503839731s" podCreationTimestamp="2026-03-09 09:11:54 +0000 UTC" firstStartedPulling="2026-03-09 09:11:56.417766209 +0000 UTC m=+359.502805650" lastFinishedPulling="2026-03-09 09:11:59.085314054 +0000 UTC m=+362.170353455" observedRunningTime="2026-03-09 09:11:59.4988794 +0000 UTC m=+362.583918811" watchObservedRunningTime="2026-03-09 09:11:59.503839731 +0000 UTC m=+362.588879172" Mar 09 09:12:00 crc kubenswrapper[4861]: I0309 09:12:00.150471 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550792-9bqqs"] Mar 09 09:12:00 crc kubenswrapper[4861]: I0309 09:12:00.151323 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550792-9bqqs" Mar 09 09:12:00 crc kubenswrapper[4861]: I0309 09:12:00.153430 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 09:12:00 crc kubenswrapper[4861]: I0309 09:12:00.153563 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 09:12:00 crc kubenswrapper[4861]: I0309 09:12:00.153723 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k86l8" Mar 09 09:12:00 crc kubenswrapper[4861]: I0309 09:12:00.164001 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550792-9bqqs"] Mar 09 09:12:00 crc kubenswrapper[4861]: I0309 09:12:00.308631 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zsr4\" (UniqueName: \"kubernetes.io/projected/0326dab3-8e2c-4b7c-8bf0-6a7686493119-kube-api-access-2zsr4\") pod \"auto-csr-approver-29550792-9bqqs\" (UID: \"0326dab3-8e2c-4b7c-8bf0-6a7686493119\") " pod="openshift-infra/auto-csr-approver-29550792-9bqqs" Mar 09 09:12:00 crc kubenswrapper[4861]: I0309 09:12:00.409964 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2zsr4\" (UniqueName: \"kubernetes.io/projected/0326dab3-8e2c-4b7c-8bf0-6a7686493119-kube-api-access-2zsr4\") pod \"auto-csr-approver-29550792-9bqqs\" (UID: \"0326dab3-8e2c-4b7c-8bf0-6a7686493119\") " pod="openshift-infra/auto-csr-approver-29550792-9bqqs" Mar 09 09:12:00 crc kubenswrapper[4861]: I0309 09:12:00.444889 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zsr4\" (UniqueName: \"kubernetes.io/projected/0326dab3-8e2c-4b7c-8bf0-6a7686493119-kube-api-access-2zsr4\") pod \"auto-csr-approver-29550792-9bqqs\" (UID: \"0326dab3-8e2c-4b7c-8bf0-6a7686493119\") " 
pod="openshift-infra/auto-csr-approver-29550792-9bqqs" Mar 09 09:12:00 crc kubenswrapper[4861]: I0309 09:12:00.483853 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550792-9bqqs" Mar 09 09:12:00 crc kubenswrapper[4861]: I0309 09:12:00.750812 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550792-9bqqs"] Mar 09 09:12:01 crc kubenswrapper[4861]: I0309 09:12:01.482017 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550792-9bqqs" event={"ID":"0326dab3-8e2c-4b7c-8bf0-6a7686493119","Type":"ContainerStarted","Data":"55c64fec9f0828e6d0d1deba3e854165077b61276ef16b4ec9df6d5aa2d935fe"} Mar 09 09:12:02 crc kubenswrapper[4861]: I0309 09:12:02.490577 4861 generic.go:334] "Generic (PLEG): container finished" podID="0326dab3-8e2c-4b7c-8bf0-6a7686493119" containerID="32762a937646933984976064473504d2b6624d005ad9b09d571b815645b18e11" exitCode=0 Mar 09 09:12:02 crc kubenswrapper[4861]: I0309 09:12:02.490665 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550792-9bqqs" event={"ID":"0326dab3-8e2c-4b7c-8bf0-6a7686493119","Type":"ContainerDied","Data":"32762a937646933984976064473504d2b6624d005ad9b09d571b815645b18e11"} Mar 09 09:12:02 crc kubenswrapper[4861]: I0309 09:12:02.505919 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-f2cbr" Mar 09 09:12:02 crc kubenswrapper[4861]: I0309 09:12:02.506652 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-f2cbr" Mar 09 09:12:02 crc kubenswrapper[4861]: I0309 09:12:02.565901 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-f2cbr" Mar 09 09:12:02 crc kubenswrapper[4861]: I0309 09:12:02.708582 4861 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/certified-operators-gcl57" Mar 09 09:12:02 crc kubenswrapper[4861]: I0309 09:12:02.708653 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-gcl57" Mar 09 09:12:02 crc kubenswrapper[4861]: I0309 09:12:02.755282 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-gcl57" Mar 09 09:12:03 crc kubenswrapper[4861]: I0309 09:12:03.539100 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-gcl57" Mar 09 09:12:03 crc kubenswrapper[4861]: I0309 09:12:03.545506 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-f2cbr" Mar 09 09:12:04 crc kubenswrapper[4861]: I0309 09:12:03.754911 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550792-9bqqs" Mar 09 09:12:04 crc kubenswrapper[4861]: I0309 09:12:03.871254 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2zsr4\" (UniqueName: \"kubernetes.io/projected/0326dab3-8e2c-4b7c-8bf0-6a7686493119-kube-api-access-2zsr4\") pod \"0326dab3-8e2c-4b7c-8bf0-6a7686493119\" (UID: \"0326dab3-8e2c-4b7c-8bf0-6a7686493119\") " Mar 09 09:12:04 crc kubenswrapper[4861]: I0309 09:12:03.877342 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0326dab3-8e2c-4b7c-8bf0-6a7686493119-kube-api-access-2zsr4" (OuterVolumeSpecName: "kube-api-access-2zsr4") pod "0326dab3-8e2c-4b7c-8bf0-6a7686493119" (UID: "0326dab3-8e2c-4b7c-8bf0-6a7686493119"). InnerVolumeSpecName "kube-api-access-2zsr4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:12:04 crc kubenswrapper[4861]: I0309 09:12:03.973148 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2zsr4\" (UniqueName: \"kubernetes.io/projected/0326dab3-8e2c-4b7c-8bf0-6a7686493119-kube-api-access-2zsr4\") on node \"crc\" DevicePath \"\"" Mar 09 09:12:04 crc kubenswrapper[4861]: I0309 09:12:04.509404 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550792-9bqqs" event={"ID":"0326dab3-8e2c-4b7c-8bf0-6a7686493119","Type":"ContainerDied","Data":"55c64fec9f0828e6d0d1deba3e854165077b61276ef16b4ec9df6d5aa2d935fe"} Mar 09 09:12:04 crc kubenswrapper[4861]: I0309 09:12:04.509472 4861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="55c64fec9f0828e6d0d1deba3e854165077b61276ef16b4ec9df6d5aa2d935fe" Mar 09 09:12:04 crc kubenswrapper[4861]: I0309 09:12:04.509558 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550792-9bqqs" Mar 09 09:12:04 crc kubenswrapper[4861]: I0309 09:12:04.962390 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-ktkhz" Mar 09 09:12:04 crc kubenswrapper[4861]: I0309 09:12:04.963346 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-ktkhz" Mar 09 09:12:05 crc kubenswrapper[4861]: I0309 09:12:05.099153 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-c7tjq" Mar 09 09:12:05 crc kubenswrapper[4861]: I0309 09:12:05.099465 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-c7tjq" Mar 09 09:12:05 crc kubenswrapper[4861]: I0309 09:12:05.165706 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/community-operators-c7tjq" Mar 09 09:12:05 crc kubenswrapper[4861]: I0309 09:12:05.585996 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-c7tjq" Mar 09 09:12:06 crc kubenswrapper[4861]: I0309 09:12:06.016508 4861 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-ktkhz" podUID="f16d7e47-0abb-4811-9e99-3f68d1fd64ab" containerName="registry-server" probeResult="failure" output=< Mar 09 09:12:06 crc kubenswrapper[4861]: timeout: failed to connect service ":50051" within 1s Mar 09 09:12:06 crc kubenswrapper[4861]: > Mar 09 09:12:15 crc kubenswrapper[4861]: I0309 09:12:15.032617 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-ktkhz" Mar 09 09:12:15 crc kubenswrapper[4861]: I0309 09:12:15.096489 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-ktkhz" Mar 09 09:12:22 crc kubenswrapper[4861]: I0309 09:12:22.726554 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-cjkj6" podUID="de2fac75-67e1-47c9-9507-b8b5e5857c32" containerName="registry" containerID="cri-o://177973148e294570ba55cc2ef874b58295e992029884438b97035e4bcf3d4111" gracePeriod=30 Mar 09 09:12:23 crc kubenswrapper[4861]: I0309 09:12:23.090587 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-cjkj6" Mar 09 09:12:23 crc kubenswrapper[4861]: I0309 09:12:23.254174 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/de2fac75-67e1-47c9-9507-b8b5e5857c32-registry-certificates\") pod \"de2fac75-67e1-47c9-9507-b8b5e5857c32\" (UID: \"de2fac75-67e1-47c9-9507-b8b5e5857c32\") " Mar 09 09:12:23 crc kubenswrapper[4861]: I0309 09:12:23.254796 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"de2fac75-67e1-47c9-9507-b8b5e5857c32\" (UID: \"de2fac75-67e1-47c9-9507-b8b5e5857c32\") " Mar 09 09:12:23 crc kubenswrapper[4861]: I0309 09:12:23.254895 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/de2fac75-67e1-47c9-9507-b8b5e5857c32-ca-trust-extracted\") pod \"de2fac75-67e1-47c9-9507-b8b5e5857c32\" (UID: \"de2fac75-67e1-47c9-9507-b8b5e5857c32\") " Mar 09 09:12:23 crc kubenswrapper[4861]: I0309 09:12:23.254968 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/de2fac75-67e1-47c9-9507-b8b5e5857c32-installation-pull-secrets\") pod \"de2fac75-67e1-47c9-9507-b8b5e5857c32\" (UID: \"de2fac75-67e1-47c9-9507-b8b5e5857c32\") " Mar 09 09:12:23 crc kubenswrapper[4861]: I0309 09:12:23.255953 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de2fac75-67e1-47c9-9507-b8b5e5857c32-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "de2fac75-67e1-47c9-9507-b8b5e5857c32" (UID: "de2fac75-67e1-47c9-9507-b8b5e5857c32"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:12:23 crc kubenswrapper[4861]: I0309 09:12:23.256005 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/de2fac75-67e1-47c9-9507-b8b5e5857c32-bound-sa-token\") pod \"de2fac75-67e1-47c9-9507-b8b5e5857c32\" (UID: \"de2fac75-67e1-47c9-9507-b8b5e5857c32\") " Mar 09 09:12:23 crc kubenswrapper[4861]: I0309 09:12:23.256157 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6jw22\" (UniqueName: \"kubernetes.io/projected/de2fac75-67e1-47c9-9507-b8b5e5857c32-kube-api-access-6jw22\") pod \"de2fac75-67e1-47c9-9507-b8b5e5857c32\" (UID: \"de2fac75-67e1-47c9-9507-b8b5e5857c32\") " Mar 09 09:12:23 crc kubenswrapper[4861]: I0309 09:12:23.256233 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/de2fac75-67e1-47c9-9507-b8b5e5857c32-trusted-ca\") pod \"de2fac75-67e1-47c9-9507-b8b5e5857c32\" (UID: \"de2fac75-67e1-47c9-9507-b8b5e5857c32\") " Mar 09 09:12:23 crc kubenswrapper[4861]: I0309 09:12:23.256294 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/de2fac75-67e1-47c9-9507-b8b5e5857c32-registry-tls\") pod \"de2fac75-67e1-47c9-9507-b8b5e5857c32\" (UID: \"de2fac75-67e1-47c9-9507-b8b5e5857c32\") " Mar 09 09:12:23 crc kubenswrapper[4861]: I0309 09:12:23.256993 4861 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/de2fac75-67e1-47c9-9507-b8b5e5857c32-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 09 09:12:23 crc kubenswrapper[4861]: I0309 09:12:23.258053 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de2fac75-67e1-47c9-9507-b8b5e5857c32-trusted-ca" (OuterVolumeSpecName: 
"trusted-ca") pod "de2fac75-67e1-47c9-9507-b8b5e5857c32" (UID: "de2fac75-67e1-47c9-9507-b8b5e5857c32"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:12:23 crc kubenswrapper[4861]: I0309 09:12:23.261048 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de2fac75-67e1-47c9-9507-b8b5e5857c32-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "de2fac75-67e1-47c9-9507-b8b5e5857c32" (UID: "de2fac75-67e1-47c9-9507-b8b5e5857c32"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:12:23 crc kubenswrapper[4861]: I0309 09:12:23.261331 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de2fac75-67e1-47c9-9507-b8b5e5857c32-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "de2fac75-67e1-47c9-9507-b8b5e5857c32" (UID: "de2fac75-67e1-47c9-9507-b8b5e5857c32"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:12:23 crc kubenswrapper[4861]: I0309 09:12:23.262087 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de2fac75-67e1-47c9-9507-b8b5e5857c32-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "de2fac75-67e1-47c9-9507-b8b5e5857c32" (UID: "de2fac75-67e1-47c9-9507-b8b5e5857c32"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:12:23 crc kubenswrapper[4861]: I0309 09:12:23.263861 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de2fac75-67e1-47c9-9507-b8b5e5857c32-kube-api-access-6jw22" (OuterVolumeSpecName: "kube-api-access-6jw22") pod "de2fac75-67e1-47c9-9507-b8b5e5857c32" (UID: "de2fac75-67e1-47c9-9507-b8b5e5857c32"). InnerVolumeSpecName "kube-api-access-6jw22". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:12:23 crc kubenswrapper[4861]: I0309 09:12:23.270912 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "de2fac75-67e1-47c9-9507-b8b5e5857c32" (UID: "de2fac75-67e1-47c9-9507-b8b5e5857c32"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 09 09:12:23 crc kubenswrapper[4861]: I0309 09:12:23.273998 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de2fac75-67e1-47c9-9507-b8b5e5857c32-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "de2fac75-67e1-47c9-9507-b8b5e5857c32" (UID: "de2fac75-67e1-47c9-9507-b8b5e5857c32"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:12:23 crc kubenswrapper[4861]: I0309 09:12:23.358036 4861 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/de2fac75-67e1-47c9-9507-b8b5e5857c32-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 09 09:12:23 crc kubenswrapper[4861]: I0309 09:12:23.358076 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6jw22\" (UniqueName: \"kubernetes.io/projected/de2fac75-67e1-47c9-9507-b8b5e5857c32-kube-api-access-6jw22\") on node \"crc\" DevicePath \"\"" Mar 09 09:12:23 crc kubenswrapper[4861]: I0309 09:12:23.358092 4861 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/de2fac75-67e1-47c9-9507-b8b5e5857c32-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 09 09:12:23 crc kubenswrapper[4861]: I0309 09:12:23.358106 4861 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/de2fac75-67e1-47c9-9507-b8b5e5857c32-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 09 09:12:23 crc kubenswrapper[4861]: I0309 09:12:23.358119 4861 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/de2fac75-67e1-47c9-9507-b8b5e5857c32-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 09 09:12:23 crc kubenswrapper[4861]: I0309 09:12:23.358130 4861 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/de2fac75-67e1-47c9-9507-b8b5e5857c32-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 09 09:12:23 crc kubenswrapper[4861]: I0309 09:12:23.631005 4861 generic.go:334] "Generic (PLEG): container finished" podID="de2fac75-67e1-47c9-9507-b8b5e5857c32" containerID="177973148e294570ba55cc2ef874b58295e992029884438b97035e4bcf3d4111" exitCode=0 Mar 09 09:12:23 crc kubenswrapper[4861]: I0309 09:12:23.631059 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-cjkj6" event={"ID":"de2fac75-67e1-47c9-9507-b8b5e5857c32","Type":"ContainerDied","Data":"177973148e294570ba55cc2ef874b58295e992029884438b97035e4bcf3d4111"} Mar 09 09:12:23 crc kubenswrapper[4861]: I0309 09:12:23.631094 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-cjkj6" event={"ID":"de2fac75-67e1-47c9-9507-b8b5e5857c32","Type":"ContainerDied","Data":"b621a4388d578ca3799f69b1114c535158e30133e975e54e5ce33728b658a1e2"} Mar 09 09:12:23 crc kubenswrapper[4861]: I0309 09:12:23.631114 4861 scope.go:117] "RemoveContainer" containerID="177973148e294570ba55cc2ef874b58295e992029884438b97035e4bcf3d4111" Mar 09 09:12:23 crc kubenswrapper[4861]: I0309 09:12:23.631150 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-cjkj6"
Mar 09 09:12:24 crc kubenswrapper[4861]: I0309 09:12:24.606583 4861 patch_prober.go:28] interesting pod/machine-config-daemon-5g7gc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 09 09:12:24 crc kubenswrapper[4861]: I0309 09:12:24.606882 4861 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 09 09:12:24 crc kubenswrapper[4861]: I0309 09:12:24.688907 4861 scope.go:117] "RemoveContainer" containerID="177973148e294570ba55cc2ef874b58295e992029884438b97035e4bcf3d4111"
Mar 09 09:12:24 crc kubenswrapper[4861]: E0309 09:12:24.689635 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"177973148e294570ba55cc2ef874b58295e992029884438b97035e4bcf3d4111\": container with ID starting with 177973148e294570ba55cc2ef874b58295e992029884438b97035e4bcf3d4111 not found: ID does not exist" containerID="177973148e294570ba55cc2ef874b58295e992029884438b97035e4bcf3d4111"
Mar 09 09:12:24 crc kubenswrapper[4861]: I0309 09:12:24.689768 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"177973148e294570ba55cc2ef874b58295e992029884438b97035e4bcf3d4111"} err="failed to get container status \"177973148e294570ba55cc2ef874b58295e992029884438b97035e4bcf3d4111\": rpc error: code = NotFound desc = could not find container \"177973148e294570ba55cc2ef874b58295e992029884438b97035e4bcf3d4111\": container with ID starting with 177973148e294570ba55cc2ef874b58295e992029884438b97035e4bcf3d4111 not found: ID does not exist"
Mar 09 09:12:54 crc kubenswrapper[4861]: I0309 09:12:54.606549 4861 patch_prober.go:28] interesting pod/machine-config-daemon-5g7gc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 09 09:12:54 crc kubenswrapper[4861]: I0309 09:12:54.607212 4861 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 09 09:12:54 crc kubenswrapper[4861]: I0309 09:12:54.677842 4861 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","burstable","podde2fac75-67e1-47c9-9507-b8b5e5857c32"] err="unable to destroy cgroup paths for cgroup [kubepods burstable podde2fac75-67e1-47c9-9507-b8b5e5857c32] : Timed out while waiting for systemd to remove kubepods-burstable-podde2fac75_67e1_47c9_9507_b8b5e5857c32.slice"
Mar 09 09:12:54 crc kubenswrapper[4861]: E0309 09:12:54.677927 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods burstable podde2fac75-67e1-47c9-9507-b8b5e5857c32] : unable to destroy cgroup paths for cgroup [kubepods burstable podde2fac75-67e1-47c9-9507-b8b5e5857c32] : Timed out while waiting for systemd to remove kubepods-burstable-podde2fac75_67e1_47c9_9507_b8b5e5857c32.slice" pod="openshift-image-registry/image-registry-697d97f7c8-cjkj6" podUID="de2fac75-67e1-47c9-9507-b8b5e5857c32"
Mar 09 09:12:54 crc kubenswrapper[4861]: I0309 09:12:54.839943 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-cjkj6"
Mar 09 09:12:54 crc kubenswrapper[4861]: I0309 09:12:54.873559 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-cjkj6"]
Mar 09 09:12:54 crc kubenswrapper[4861]: I0309 09:12:54.887505 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-cjkj6"]
Mar 09 09:12:55 crc kubenswrapper[4861]: I0309 09:12:55.670237 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de2fac75-67e1-47c9-9507-b8b5e5857c32" path="/var/lib/kubelet/pods/de2fac75-67e1-47c9-9507-b8b5e5857c32/volumes"
Mar 09 09:13:24 crc kubenswrapper[4861]: I0309 09:13:24.605998 4861 patch_prober.go:28] interesting pod/machine-config-daemon-5g7gc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 09 09:13:24 crc kubenswrapper[4861]: I0309 09:13:24.606510 4861 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 09 09:13:24 crc kubenswrapper[4861]: I0309 09:13:24.606558 4861 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc"
Mar 09 09:13:24 crc kubenswrapper[4861]: I0309 09:13:24.607088 4861 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"cc00f8c91adc84713b416ee4eb89ff4342a32c6c79a3ecec7efa8bfd9fbb3202"} pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 09 09:13:24 crc kubenswrapper[4861]: I0309 09:13:24.607147 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" containerName="machine-config-daemon" containerID="cri-o://cc00f8c91adc84713b416ee4eb89ff4342a32c6c79a3ecec7efa8bfd9fbb3202" gracePeriod=600
Mar 09 09:13:25 crc kubenswrapper[4861]: I0309 09:13:25.037593 4861 generic.go:334] "Generic (PLEG): container finished" podID="6f7875e3-174f-4c67-8675-d878de74aa4f" containerID="cc00f8c91adc84713b416ee4eb89ff4342a32c6c79a3ecec7efa8bfd9fbb3202" exitCode=0
Mar 09 09:13:25 crc kubenswrapper[4861]: I0309 09:13:25.037705 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" event={"ID":"6f7875e3-174f-4c67-8675-d878de74aa4f","Type":"ContainerDied","Data":"cc00f8c91adc84713b416ee4eb89ff4342a32c6c79a3ecec7efa8bfd9fbb3202"}
Mar 09 09:13:25 crc kubenswrapper[4861]: I0309 09:13:25.038296 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" event={"ID":"6f7875e3-174f-4c67-8675-d878de74aa4f","Type":"ContainerStarted","Data":"647e93428b4f8c4a06ed5cd023dd7ae9e5817d85b57d9999c6b9891ba5cdb78e"}
Mar 09 09:13:25 crc kubenswrapper[4861]: I0309 09:13:25.038323 4861 scope.go:117] "RemoveContainer" containerID="c970ace96d4c918f6e61a749abffe084d175df04b5393bf6029d502cdda837af"
Mar 09 09:14:00 crc kubenswrapper[4861]: I0309 09:14:00.142024 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550794-4fbmp"]
Mar 09 09:14:00 crc kubenswrapper[4861]: E0309 09:14:00.143851 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de2fac75-67e1-47c9-9507-b8b5e5857c32" containerName="registry"
Mar 09 09:14:00 crc kubenswrapper[4861]: I0309 09:14:00.143957 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="de2fac75-67e1-47c9-9507-b8b5e5857c32" containerName="registry"
Mar 09 09:14:00 crc kubenswrapper[4861]: E0309 09:14:00.144017 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0326dab3-8e2c-4b7c-8bf0-6a7686493119" containerName="oc"
Mar 09 09:14:00 crc kubenswrapper[4861]: I0309 09:14:00.144065 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="0326dab3-8e2c-4b7c-8bf0-6a7686493119" containerName="oc"
Mar 09 09:14:00 crc kubenswrapper[4861]: I0309 09:14:00.144194 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="0326dab3-8e2c-4b7c-8bf0-6a7686493119" containerName="oc"
Mar 09 09:14:00 crc kubenswrapper[4861]: I0309 09:14:00.144249 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="de2fac75-67e1-47c9-9507-b8b5e5857c32" containerName="registry"
Mar 09 09:14:00 crc kubenswrapper[4861]: I0309 09:14:00.145183 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550794-4fbmp"
Mar 09 09:14:00 crc kubenswrapper[4861]: I0309 09:14:00.149478 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k86l8"
Mar 09 09:14:00 crc kubenswrapper[4861]: I0309 09:14:00.149544 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 09 09:14:00 crc kubenswrapper[4861]: I0309 09:14:00.149576 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 09 09:14:00 crc kubenswrapper[4861]: I0309 09:14:00.152882 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550794-4fbmp"]
Mar 09 09:14:00 crc kubenswrapper[4861]: I0309 09:14:00.280643 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5phgl\" (UniqueName: \"kubernetes.io/projected/78d6ee67-9e4d-4e04-a77d-e8d6fb552bc2-kube-api-access-5phgl\") pod \"auto-csr-approver-29550794-4fbmp\" (UID: \"78d6ee67-9e4d-4e04-a77d-e8d6fb552bc2\") " pod="openshift-infra/auto-csr-approver-29550794-4fbmp"
Mar 09 09:14:00 crc kubenswrapper[4861]: I0309 09:14:00.382090 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5phgl\" (UniqueName: \"kubernetes.io/projected/78d6ee67-9e4d-4e04-a77d-e8d6fb552bc2-kube-api-access-5phgl\") pod \"auto-csr-approver-29550794-4fbmp\" (UID: \"78d6ee67-9e4d-4e04-a77d-e8d6fb552bc2\") " pod="openshift-infra/auto-csr-approver-29550794-4fbmp"
Mar 09 09:14:00 crc kubenswrapper[4861]: I0309 09:14:00.416189 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5phgl\" (UniqueName: \"kubernetes.io/projected/78d6ee67-9e4d-4e04-a77d-e8d6fb552bc2-kube-api-access-5phgl\") pod \"auto-csr-approver-29550794-4fbmp\" (UID: \"78d6ee67-9e4d-4e04-a77d-e8d6fb552bc2\") " pod="openshift-infra/auto-csr-approver-29550794-4fbmp"
Mar 09 09:14:00 crc kubenswrapper[4861]: I0309 09:14:00.472530 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550794-4fbmp"
Mar 09 09:14:00 crc kubenswrapper[4861]: I0309 09:14:00.704879 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550794-4fbmp"]
Mar 09 09:14:00 crc kubenswrapper[4861]: I0309 09:14:00.722401 4861 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 09 09:14:01 crc kubenswrapper[4861]: I0309 09:14:01.258922 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550794-4fbmp" event={"ID":"78d6ee67-9e4d-4e04-a77d-e8d6fb552bc2","Type":"ContainerStarted","Data":"23ca036e7ae5f456447aaa79b4f8f330ce6315d148c66ac9891cccf3d1a5e1d4"}
Mar 09 09:14:02 crc kubenswrapper[4861]: I0309 09:14:02.266568 4861 generic.go:334] "Generic (PLEG): container finished" podID="78d6ee67-9e4d-4e04-a77d-e8d6fb552bc2" containerID="359b847262f54581c00f715fa1d567f5271b3130f3dc14a31f8df8eb7ab9860c" exitCode=0
Mar 09 09:14:02 crc kubenswrapper[4861]: I0309 09:14:02.266627 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550794-4fbmp" event={"ID":"78d6ee67-9e4d-4e04-a77d-e8d6fb552bc2","Type":"ContainerDied","Data":"359b847262f54581c00f715fa1d567f5271b3130f3dc14a31f8df8eb7ab9860c"}
Mar 09 09:14:02 crc kubenswrapper[4861]: I0309 09:14:02.400333 4861 scope.go:117] "RemoveContainer" containerID="dfbeac2019ba0e86c10d28619a8d9daf60a74580546b76ca5ed8562bcbb84c7f"
Mar 09 09:14:03 crc kubenswrapper[4861]: I0309 09:14:03.591898 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550794-4fbmp"
Mar 09 09:14:03 crc kubenswrapper[4861]: I0309 09:14:03.732764 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5phgl\" (UniqueName: \"kubernetes.io/projected/78d6ee67-9e4d-4e04-a77d-e8d6fb552bc2-kube-api-access-5phgl\") pod \"78d6ee67-9e4d-4e04-a77d-e8d6fb552bc2\" (UID: \"78d6ee67-9e4d-4e04-a77d-e8d6fb552bc2\") "
Mar 09 09:14:03 crc kubenswrapper[4861]: I0309 09:14:03.738671 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78d6ee67-9e4d-4e04-a77d-e8d6fb552bc2-kube-api-access-5phgl" (OuterVolumeSpecName: "kube-api-access-5phgl") pod "78d6ee67-9e4d-4e04-a77d-e8d6fb552bc2" (UID: "78d6ee67-9e4d-4e04-a77d-e8d6fb552bc2"). InnerVolumeSpecName "kube-api-access-5phgl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:14:03 crc kubenswrapper[4861]: I0309 09:14:03.834168 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5phgl\" (UniqueName: \"kubernetes.io/projected/78d6ee67-9e4d-4e04-a77d-e8d6fb552bc2-kube-api-access-5phgl\") on node \"crc\" DevicePath \"\""
Mar 09 09:14:04 crc kubenswrapper[4861]: I0309 09:14:04.283342 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550794-4fbmp" event={"ID":"78d6ee67-9e4d-4e04-a77d-e8d6fb552bc2","Type":"ContainerDied","Data":"23ca036e7ae5f456447aaa79b4f8f330ce6315d148c66ac9891cccf3d1a5e1d4"}
Mar 09 09:14:04 crc kubenswrapper[4861]: I0309 09:14:04.283409 4861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="23ca036e7ae5f456447aaa79b4f8f330ce6315d148c66ac9891cccf3d1a5e1d4"
Mar 09 09:14:04 crc kubenswrapper[4861]: I0309 09:14:04.283455 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550794-4fbmp"
Mar 09 09:14:04 crc kubenswrapper[4861]: I0309 09:14:04.679073 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550788-wz6bg"]
Mar 09 09:14:04 crc kubenswrapper[4861]: I0309 09:14:04.688505 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550788-wz6bg"]
Mar 09 09:14:05 crc kubenswrapper[4861]: I0309 09:14:05.665559 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6effa8f1-34f2-4a9e-b5cb-71a02695603e" path="/var/lib/kubelet/pods/6effa8f1-34f2-4a9e-b5cb-71a02695603e/volumes"
Mar 09 09:15:00 crc kubenswrapper[4861]: I0309 09:15:00.160950 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550795-k4z5f"]
Mar 09 09:15:00 crc kubenswrapper[4861]: E0309 09:15:00.164258 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78d6ee67-9e4d-4e04-a77d-e8d6fb552bc2" containerName="oc"
Mar 09 09:15:00 crc kubenswrapper[4861]: I0309 09:15:00.164324 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="78d6ee67-9e4d-4e04-a77d-e8d6fb552bc2" containerName="oc"
Mar 09 09:15:00 crc kubenswrapper[4861]: I0309 09:15:00.164797 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="78d6ee67-9e4d-4e04-a77d-e8d6fb552bc2" containerName="oc"
Mar 09 09:15:00 crc kubenswrapper[4861]: I0309 09:15:00.166019 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550795-k4z5f"
Mar 09 09:15:00 crc kubenswrapper[4861]: I0309 09:15:00.173740 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 09 09:15:00 crc kubenswrapper[4861]: I0309 09:15:00.173939 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 09 09:15:00 crc kubenswrapper[4861]: I0309 09:15:00.180811 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550795-k4z5f"]
Mar 09 09:15:00 crc kubenswrapper[4861]: I0309 09:15:00.217023 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cb0f3303-62a4-48b9-859e-0abd577ad301-secret-volume\") pod \"collect-profiles-29550795-k4z5f\" (UID: \"cb0f3303-62a4-48b9-859e-0abd577ad301\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550795-k4z5f"
Mar 09 09:15:00 crc kubenswrapper[4861]: I0309 09:15:00.217198 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpr4l\" (UniqueName: \"kubernetes.io/projected/cb0f3303-62a4-48b9-859e-0abd577ad301-kube-api-access-kpr4l\") pod \"collect-profiles-29550795-k4z5f\" (UID: \"cb0f3303-62a4-48b9-859e-0abd577ad301\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550795-k4z5f"
Mar 09 09:15:00 crc kubenswrapper[4861]: I0309 09:15:00.217240 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cb0f3303-62a4-48b9-859e-0abd577ad301-config-volume\") pod \"collect-profiles-29550795-k4z5f\" (UID: \"cb0f3303-62a4-48b9-859e-0abd577ad301\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550795-k4z5f"
Mar 09 09:15:00 crc kubenswrapper[4861]: I0309 09:15:00.318036 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cb0f3303-62a4-48b9-859e-0abd577ad301-secret-volume\") pod \"collect-profiles-29550795-k4z5f\" (UID: \"cb0f3303-62a4-48b9-859e-0abd577ad301\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550795-k4z5f"
Mar 09 09:15:00 crc kubenswrapper[4861]: I0309 09:15:00.318147 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kpr4l\" (UniqueName: \"kubernetes.io/projected/cb0f3303-62a4-48b9-859e-0abd577ad301-kube-api-access-kpr4l\") pod \"collect-profiles-29550795-k4z5f\" (UID: \"cb0f3303-62a4-48b9-859e-0abd577ad301\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550795-k4z5f"
Mar 09 09:15:00 crc kubenswrapper[4861]: I0309 09:15:00.318185 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cb0f3303-62a4-48b9-859e-0abd577ad301-config-volume\") pod \"collect-profiles-29550795-k4z5f\" (UID: \"cb0f3303-62a4-48b9-859e-0abd577ad301\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550795-k4z5f"
Mar 09 09:15:00 crc kubenswrapper[4861]: I0309 09:15:00.319502 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cb0f3303-62a4-48b9-859e-0abd577ad301-config-volume\") pod \"collect-profiles-29550795-k4z5f\" (UID: \"cb0f3303-62a4-48b9-859e-0abd577ad301\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550795-k4z5f"
Mar 09 09:15:00 crc kubenswrapper[4861]: I0309 09:15:00.329528 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cb0f3303-62a4-48b9-859e-0abd577ad301-secret-volume\") pod \"collect-profiles-29550795-k4z5f\" (UID: \"cb0f3303-62a4-48b9-859e-0abd577ad301\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550795-k4z5f"
Mar 09 09:15:00 crc kubenswrapper[4861]: I0309 09:15:00.341769 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpr4l\" (UniqueName: \"kubernetes.io/projected/cb0f3303-62a4-48b9-859e-0abd577ad301-kube-api-access-kpr4l\") pod \"collect-profiles-29550795-k4z5f\" (UID: \"cb0f3303-62a4-48b9-859e-0abd577ad301\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550795-k4z5f"
Mar 09 09:15:00 crc kubenswrapper[4861]: I0309 09:15:00.500242 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550795-k4z5f"
Mar 09 09:15:00 crc kubenswrapper[4861]: I0309 09:15:00.946643 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550795-k4z5f"]
Mar 09 09:15:01 crc kubenswrapper[4861]: I0309 09:15:01.712039 4861 generic.go:334] "Generic (PLEG): container finished" podID="cb0f3303-62a4-48b9-859e-0abd577ad301" containerID="21979918b0a41c070adc11f01f1153fc0640c1ad3848685946ff780bc76726d7" exitCode=0
Mar 09 09:15:01 crc kubenswrapper[4861]: I0309 09:15:01.712129 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550795-k4z5f" event={"ID":"cb0f3303-62a4-48b9-859e-0abd577ad301","Type":"ContainerDied","Data":"21979918b0a41c070adc11f01f1153fc0640c1ad3848685946ff780bc76726d7"}
Mar 09 09:15:01 crc kubenswrapper[4861]: I0309 09:15:01.712427 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550795-k4z5f" event={"ID":"cb0f3303-62a4-48b9-859e-0abd577ad301","Type":"ContainerStarted","Data":"0c49cc35ba38fbb115ef57450983b2051d53f7bc2be3618501e1f208023098b7"}
Mar 09 09:15:02 crc kubenswrapper[4861]: I0309 09:15:02.439480 4861 scope.go:117] "RemoveContainer" containerID="35bf0fee9e9c674042ed4c4fbdb5bad9779dc06cb0cae1aac4678ac51c9c8ed8"
Mar 09 09:15:02 crc kubenswrapper[4861]: I0309 09:15:02.484569 4861 scope.go:117] "RemoveContainer" containerID="ab8a12c5254f416fb5a2a4a147b4cc56c5c02f1759f3c4b4d624ea331c9cf149"
Mar 09 09:15:02 crc kubenswrapper[4861]: I0309 09:15:02.500843 4861 scope.go:117] "RemoveContainer" containerID="abf47dfe3aed929bdefcb4118d3105871b156bec1a89dac0e067eacb8a91a487"
Mar 09 09:15:02 crc kubenswrapper[4861]: I0309 09:15:02.517007 4861 scope.go:117] "RemoveContainer" containerID="5cc27015cd8e7901d288e5fdcef535feb4e96147fb697254c6ae19cea8c1f72e"
Mar 09 09:15:02 crc kubenswrapper[4861]: I0309 09:15:02.949101 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550795-k4z5f"
Mar 09 09:15:03 crc kubenswrapper[4861]: I0309 09:15:03.052634 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cb0f3303-62a4-48b9-859e-0abd577ad301-config-volume\") pod \"cb0f3303-62a4-48b9-859e-0abd577ad301\" (UID: \"cb0f3303-62a4-48b9-859e-0abd577ad301\") "
Mar 09 09:15:03 crc kubenswrapper[4861]: I0309 09:15:03.052734 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cb0f3303-62a4-48b9-859e-0abd577ad301-secret-volume\") pod \"cb0f3303-62a4-48b9-859e-0abd577ad301\" (UID: \"cb0f3303-62a4-48b9-859e-0abd577ad301\") "
Mar 09 09:15:03 crc kubenswrapper[4861]: I0309 09:15:03.052822 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kpr4l\" (UniqueName: \"kubernetes.io/projected/cb0f3303-62a4-48b9-859e-0abd577ad301-kube-api-access-kpr4l\") pod \"cb0f3303-62a4-48b9-859e-0abd577ad301\" (UID: \"cb0f3303-62a4-48b9-859e-0abd577ad301\") "
Mar 09 09:15:03 crc kubenswrapper[4861]: I0309 09:15:03.053536 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb0f3303-62a4-48b9-859e-0abd577ad301-config-volume" (OuterVolumeSpecName: "config-volume") pod "cb0f3303-62a4-48b9-859e-0abd577ad301" (UID: "cb0f3303-62a4-48b9-859e-0abd577ad301"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:15:03 crc kubenswrapper[4861]: I0309 09:15:03.057742 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb0f3303-62a4-48b9-859e-0abd577ad301-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "cb0f3303-62a4-48b9-859e-0abd577ad301" (UID: "cb0f3303-62a4-48b9-859e-0abd577ad301"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:15:03 crc kubenswrapper[4861]: I0309 09:15:03.057931 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb0f3303-62a4-48b9-859e-0abd577ad301-kube-api-access-kpr4l" (OuterVolumeSpecName: "kube-api-access-kpr4l") pod "cb0f3303-62a4-48b9-859e-0abd577ad301" (UID: "cb0f3303-62a4-48b9-859e-0abd577ad301"). InnerVolumeSpecName "kube-api-access-kpr4l". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:15:03 crc kubenswrapper[4861]: I0309 09:15:03.154669 4861 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cb0f3303-62a4-48b9-859e-0abd577ad301-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 09 09:15:03 crc kubenswrapper[4861]: I0309 09:15:03.154714 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kpr4l\" (UniqueName: \"kubernetes.io/projected/cb0f3303-62a4-48b9-859e-0abd577ad301-kube-api-access-kpr4l\") on node \"crc\" DevicePath \"\""
Mar 09 09:15:03 crc kubenswrapper[4861]: I0309 09:15:03.154729 4861 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cb0f3303-62a4-48b9-859e-0abd577ad301-config-volume\") on node \"crc\" DevicePath \"\""
Mar 09 09:15:03 crc kubenswrapper[4861]: I0309 09:15:03.728254 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550795-k4z5f" event={"ID":"cb0f3303-62a4-48b9-859e-0abd577ad301","Type":"ContainerDied","Data":"0c49cc35ba38fbb115ef57450983b2051d53f7bc2be3618501e1f208023098b7"}
Mar 09 09:15:03 crc kubenswrapper[4861]: I0309 09:15:03.728294 4861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0c49cc35ba38fbb115ef57450983b2051d53f7bc2be3618501e1f208023098b7"
Mar 09 09:15:03 crc kubenswrapper[4861]: I0309 09:15:03.728325 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550795-k4z5f"
Mar 09 09:15:24 crc kubenswrapper[4861]: I0309 09:15:24.606298 4861 patch_prober.go:28] interesting pod/machine-config-daemon-5g7gc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 09 09:15:24 crc kubenswrapper[4861]: I0309 09:15:24.606786 4861 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 09 09:15:54 crc kubenswrapper[4861]: I0309 09:15:54.606155 4861 patch_prober.go:28] interesting pod/machine-config-daemon-5g7gc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 09 09:15:54 crc kubenswrapper[4861]: I0309 09:15:54.606991 4861 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 09 09:16:00 crc kubenswrapper[4861]: I0309 09:16:00.147015 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550796-fhd9r"]
Mar 09 09:16:00 crc kubenswrapper[4861]: E0309 09:16:00.148000 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb0f3303-62a4-48b9-859e-0abd577ad301" containerName="collect-profiles"
Mar 09 09:16:00 crc kubenswrapper[4861]: I0309 09:16:00.148055 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb0f3303-62a4-48b9-859e-0abd577ad301" containerName="collect-profiles"
Mar 09 09:16:00 crc kubenswrapper[4861]: I0309 09:16:00.148245 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb0f3303-62a4-48b9-859e-0abd577ad301" containerName="collect-profiles"
Mar 09 09:16:00 crc kubenswrapper[4861]: I0309 09:16:00.148959 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550796-fhd9r"
Mar 09 09:16:00 crc kubenswrapper[4861]: I0309 09:16:00.151668 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 09 09:16:00 crc kubenswrapper[4861]: I0309 09:16:00.152653 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k86l8"
Mar 09 09:16:00 crc kubenswrapper[4861]: I0309 09:16:00.157663 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 09 09:16:00 crc kubenswrapper[4861]: I0309 09:16:00.161998 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550796-fhd9r"]
Mar 09 09:16:00 crc kubenswrapper[4861]: I0309 09:16:00.347030 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nf2cw\" (UniqueName: \"kubernetes.io/projected/6640fac9-5944-4811-9b15-15741cc9d35c-kube-api-access-nf2cw\") pod \"auto-csr-approver-29550796-fhd9r\" (UID: \"6640fac9-5944-4811-9b15-15741cc9d35c\") " pod="openshift-infra/auto-csr-approver-29550796-fhd9r"
Mar 09 09:16:00 crc kubenswrapper[4861]: I0309 09:16:00.448042 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nf2cw\" (UniqueName: \"kubernetes.io/projected/6640fac9-5944-4811-9b15-15741cc9d35c-kube-api-access-nf2cw\") pod \"auto-csr-approver-29550796-fhd9r\" (UID: \"6640fac9-5944-4811-9b15-15741cc9d35c\") " pod="openshift-infra/auto-csr-approver-29550796-fhd9r"
Mar 09 09:16:00 crc kubenswrapper[4861]: I0309 09:16:00.485465 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nf2cw\" (UniqueName: \"kubernetes.io/projected/6640fac9-5944-4811-9b15-15741cc9d35c-kube-api-access-nf2cw\") pod \"auto-csr-approver-29550796-fhd9r\" (UID: \"6640fac9-5944-4811-9b15-15741cc9d35c\") " pod="openshift-infra/auto-csr-approver-29550796-fhd9r"
Mar 09 09:16:00 crc kubenswrapper[4861]: I0309 09:16:00.492743 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550796-fhd9r"
Mar 09 09:16:00 crc kubenswrapper[4861]: I0309 09:16:00.687528 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550796-fhd9r"]
Mar 09 09:16:01 crc kubenswrapper[4861]: I0309 09:16:01.111082 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550796-fhd9r" event={"ID":"6640fac9-5944-4811-9b15-15741cc9d35c","Type":"ContainerStarted","Data":"507515aefb680e951fb4e1e0797c01cd61bddb0e65a4e08f072fbd9eb7286b2e"}
Mar 09 09:16:03 crc kubenswrapper[4861]: I0309 09:16:03.123915 4861 generic.go:334] "Generic (PLEG): container finished" podID="6640fac9-5944-4811-9b15-15741cc9d35c" containerID="4c0c9c8b6e4389c0650f8bf6aa40f2831f9be11b143f5e71b3c423f8298e4e6e" exitCode=0
Mar 09 09:16:03 crc kubenswrapper[4861]: I0309 09:16:03.124012 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550796-fhd9r" event={"ID":"6640fac9-5944-4811-9b15-15741cc9d35c","Type":"ContainerDied","Data":"4c0c9c8b6e4389c0650f8bf6aa40f2831f9be11b143f5e71b3c423f8298e4e6e"}
Mar 09 09:16:04 crc kubenswrapper[4861]: I0309 09:16:04.408629 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550796-fhd9r"
Mar 09 09:16:04 crc kubenswrapper[4861]: I0309 09:16:04.531727 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nf2cw\" (UniqueName: \"kubernetes.io/projected/6640fac9-5944-4811-9b15-15741cc9d35c-kube-api-access-nf2cw\") pod \"6640fac9-5944-4811-9b15-15741cc9d35c\" (UID: \"6640fac9-5944-4811-9b15-15741cc9d35c\") "
Mar 09 09:16:04 crc kubenswrapper[4861]: I0309 09:16:04.537528 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6640fac9-5944-4811-9b15-15741cc9d35c-kube-api-access-nf2cw" (OuterVolumeSpecName: "kube-api-access-nf2cw") pod "6640fac9-5944-4811-9b15-15741cc9d35c" (UID: "6640fac9-5944-4811-9b15-15741cc9d35c"). InnerVolumeSpecName "kube-api-access-nf2cw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:16:04 crc kubenswrapper[4861]: I0309 09:16:04.632662 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nf2cw\" (UniqueName: \"kubernetes.io/projected/6640fac9-5944-4811-9b15-15741cc9d35c-kube-api-access-nf2cw\") on node \"crc\" DevicePath \"\""
Mar 09 09:16:05 crc kubenswrapper[4861]: I0309 09:16:05.137496 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550796-fhd9r" event={"ID":"6640fac9-5944-4811-9b15-15741cc9d35c","Type":"ContainerDied","Data":"507515aefb680e951fb4e1e0797c01cd61bddb0e65a4e08f072fbd9eb7286b2e"}
Mar 09 09:16:05 crc kubenswrapper[4861]: I0309 09:16:05.137540 4861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="507515aefb680e951fb4e1e0797c01cd61bddb0e65a4e08f072fbd9eb7286b2e"
Mar 09 09:16:05 crc kubenswrapper[4861]: I0309 09:16:05.137597 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550796-fhd9r"
Mar 09 09:16:05 crc kubenswrapper[4861]: I0309 09:16:05.461911 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550790-dmhn9"]
Mar 09 09:16:05 crc kubenswrapper[4861]: I0309 09:16:05.465025 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550790-dmhn9"]
Mar 09 09:16:05 crc kubenswrapper[4861]: I0309 09:16:05.668764 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ccc4b52c-e5fd-4766-8b3f-674190065ed0" path="/var/lib/kubelet/pods/ccc4b52c-e5fd-4766-8b3f-674190065ed0/volumes"
Mar 09 09:16:24 crc kubenswrapper[4861]: I0309 09:16:24.605990 4861 patch_prober.go:28] interesting pod/machine-config-daemon-5g7gc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 09 09:16:24 crc kubenswrapper[4861]: I0309 09:16:24.606735 4861 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 09 09:16:24 crc kubenswrapper[4861]: I0309 09:16:24.606790 4861 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc"
Mar 09 09:16:24 crc kubenswrapper[4861]: I0309 09:16:24.607436 4861 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"647e93428b4f8c4a06ed5cd023dd7ae9e5817d85b57d9999c6b9891ba5cdb78e"} pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 09 09:16:24 crc kubenswrapper[4861]: I0309 09:16:24.607506 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" containerName="machine-config-daemon" containerID="cri-o://647e93428b4f8c4a06ed5cd023dd7ae9e5817d85b57d9999c6b9891ba5cdb78e" gracePeriod=600
Mar 09 09:16:25 crc kubenswrapper[4861]: I0309 09:16:25.306425 4861 generic.go:334] "Generic (PLEG): container finished" podID="6f7875e3-174f-4c67-8675-d878de74aa4f" containerID="647e93428b4f8c4a06ed5cd023dd7ae9e5817d85b57d9999c6b9891ba5cdb78e" exitCode=0
Mar 09 09:16:25 crc kubenswrapper[4861]: I0309 09:16:25.306555 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" event={"ID":"6f7875e3-174f-4c67-8675-d878de74aa4f","Type":"ContainerDied","Data":"647e93428b4f8c4a06ed5cd023dd7ae9e5817d85b57d9999c6b9891ba5cdb78e"}
Mar 09 09:16:25 crc kubenswrapper[4861]: I0309 09:16:25.306898 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" event={"ID":"6f7875e3-174f-4c67-8675-d878de74aa4f","Type":"ContainerStarted","Data":"e0df4d92a9184d4707aae8be303bba70127fc6e2155c0877c558c19c847ce33b"}
Mar 09 09:16:25 crc kubenswrapper[4861]: I0309 09:16:25.306931 4861 scope.go:117] "RemoveContainer" containerID="cc00f8c91adc84713b416ee4eb89ff4342a32c6c79a3ecec7efa8bfd9fbb3202"
Mar 09 09:17:02 crc kubenswrapper[4861]: I0309 09:17:02.603466 4861 scope.go:117] "RemoveContainer" containerID="0bc369a221bc8d05ce04836901195db6e7e34c2e26724a0c1eff037b8c1ee848"
Mar 09 09:17:12 crc kubenswrapper[4861]: I0309 09:17:12.629152 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-6wl44"]
Mar 09 09:17:12 crc kubenswrapper[4861]: E0309 09:17:12.629931 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6640fac9-5944-4811-9b15-15741cc9d35c" containerName="oc"
Mar 09 09:17:12 crc kubenswrapper[4861]: I0309 09:17:12.629946 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="6640fac9-5944-4811-9b15-15741cc9d35c" containerName="oc"
Mar 09 09:17:12 crc kubenswrapper[4861]: I0309 09:17:12.630081 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="6640fac9-5944-4811-9b15-15741cc9d35c" containerName="oc"
Mar 09 09:17:12 crc kubenswrapper[4861]: I0309 09:17:12.630586 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-6wl44"
Mar 09 09:17:12 crc kubenswrapper[4861]: I0309 09:17:12.634331 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt"
Mar 09 09:17:12 crc kubenswrapper[4861]: I0309 09:17:12.634500 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt"
Mar 09 09:17:12 crc kubenswrapper[4861]: I0309 09:17:12.638812 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-nrqjp"]
Mar 09 09:17:12 crc kubenswrapper[4861]: I0309 09:17:12.639704 4861 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="cert-manager/cert-manager-858654f9db-nrqjp" Mar 09 09:17:12 crc kubenswrapper[4861]: I0309 09:17:12.642612 4861 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-7nlsh" Mar 09 09:17:12 crc kubenswrapper[4861]: I0309 09:17:12.643974 4861 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-lvtmf" Mar 09 09:17:12 crc kubenswrapper[4861]: I0309 09:17:12.659516 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-6wl44"] Mar 09 09:17:12 crc kubenswrapper[4861]: I0309 09:17:12.667456 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-nrqjp"] Mar 09 09:17:12 crc kubenswrapper[4861]: I0309 09:17:12.684802 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-g724f"] Mar 09 09:17:12 crc kubenswrapper[4861]: I0309 09:17:12.685531 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-g724f" Mar 09 09:17:12 crc kubenswrapper[4861]: I0309 09:17:12.689917 4861 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-7jg7b" Mar 09 09:17:12 crc kubenswrapper[4861]: I0309 09:17:12.698568 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-g724f"] Mar 09 09:17:12 crc kubenswrapper[4861]: I0309 09:17:12.712878 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85dw6\" (UniqueName: \"kubernetes.io/projected/6f869345-5b73-43d1-9617-bf883a753bb8-kube-api-access-85dw6\") pod \"cert-manager-cainjector-cf98fcc89-6wl44\" (UID: \"6f869345-5b73-43d1-9617-bf883a753bb8\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-6wl44" Mar 09 09:17:12 crc kubenswrapper[4861]: I0309 09:17:12.813844 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfdw6\" (UniqueName: \"kubernetes.io/projected/55473545-bf70-472a-96e5-18cc3bfac07d-kube-api-access-bfdw6\") pod \"cert-manager-webhook-687f57d79b-g724f\" (UID: \"55473545-bf70-472a-96e5-18cc3bfac07d\") " pod="cert-manager/cert-manager-webhook-687f57d79b-g724f" Mar 09 09:17:12 crc kubenswrapper[4861]: I0309 09:17:12.813921 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85dw6\" (UniqueName: \"kubernetes.io/projected/6f869345-5b73-43d1-9617-bf883a753bb8-kube-api-access-85dw6\") pod \"cert-manager-cainjector-cf98fcc89-6wl44\" (UID: \"6f869345-5b73-43d1-9617-bf883a753bb8\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-6wl44" Mar 09 09:17:12 crc kubenswrapper[4861]: I0309 09:17:12.813983 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zllmf\" (UniqueName: 
\"kubernetes.io/projected/07ac624a-3ef3-4179-96d7-aa49ff085d5e-kube-api-access-zllmf\") pod \"cert-manager-858654f9db-nrqjp\" (UID: \"07ac624a-3ef3-4179-96d7-aa49ff085d5e\") " pod="cert-manager/cert-manager-858654f9db-nrqjp" Mar 09 09:17:12 crc kubenswrapper[4861]: I0309 09:17:12.844363 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85dw6\" (UniqueName: \"kubernetes.io/projected/6f869345-5b73-43d1-9617-bf883a753bb8-kube-api-access-85dw6\") pod \"cert-manager-cainjector-cf98fcc89-6wl44\" (UID: \"6f869345-5b73-43d1-9617-bf883a753bb8\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-6wl44" Mar 09 09:17:12 crc kubenswrapper[4861]: I0309 09:17:12.915659 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zllmf\" (UniqueName: \"kubernetes.io/projected/07ac624a-3ef3-4179-96d7-aa49ff085d5e-kube-api-access-zllmf\") pod \"cert-manager-858654f9db-nrqjp\" (UID: \"07ac624a-3ef3-4179-96d7-aa49ff085d5e\") " pod="cert-manager/cert-manager-858654f9db-nrqjp" Mar 09 09:17:12 crc kubenswrapper[4861]: I0309 09:17:12.916155 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bfdw6\" (UniqueName: \"kubernetes.io/projected/55473545-bf70-472a-96e5-18cc3bfac07d-kube-api-access-bfdw6\") pod \"cert-manager-webhook-687f57d79b-g724f\" (UID: \"55473545-bf70-472a-96e5-18cc3bfac07d\") " pod="cert-manager/cert-manager-webhook-687f57d79b-g724f" Mar 09 09:17:12 crc kubenswrapper[4861]: I0309 09:17:12.947765 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfdw6\" (UniqueName: \"kubernetes.io/projected/55473545-bf70-472a-96e5-18cc3bfac07d-kube-api-access-bfdw6\") pod \"cert-manager-webhook-687f57d79b-g724f\" (UID: \"55473545-bf70-472a-96e5-18cc3bfac07d\") " pod="cert-manager/cert-manager-webhook-687f57d79b-g724f" Mar 09 09:17:12 crc kubenswrapper[4861]: I0309 09:17:12.950548 4861 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-zllmf\" (UniqueName: \"kubernetes.io/projected/07ac624a-3ef3-4179-96d7-aa49ff085d5e-kube-api-access-zllmf\") pod \"cert-manager-858654f9db-nrqjp\" (UID: \"07ac624a-3ef3-4179-96d7-aa49ff085d5e\") " pod="cert-manager/cert-manager-858654f9db-nrqjp" Mar 09 09:17:12 crc kubenswrapper[4861]: I0309 09:17:12.967641 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-6wl44" Mar 09 09:17:12 crc kubenswrapper[4861]: I0309 09:17:12.974400 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-nrqjp" Mar 09 09:17:13 crc kubenswrapper[4861]: I0309 09:17:13.000008 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-g724f" Mar 09 09:17:13 crc kubenswrapper[4861]: I0309 09:17:13.241144 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-6wl44"] Mar 09 09:17:13 crc kubenswrapper[4861]: I0309 09:17:13.286827 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-nrqjp"] Mar 09 09:17:13 crc kubenswrapper[4861]: W0309 09:17:13.293258 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod07ac624a_3ef3_4179_96d7_aa49ff085d5e.slice/crio-145e9149529861e81310ddb2350be8cc7301d0cd3216ad01355f97d6b612b9c5 WatchSource:0}: Error finding container 145e9149529861e81310ddb2350be8cc7301d0cd3216ad01355f97d6b612b9c5: Status 404 returned error can't find the container with id 145e9149529861e81310ddb2350be8cc7301d0cd3216ad01355f97d6b612b9c5 Mar 09 09:17:13 crc kubenswrapper[4861]: I0309 09:17:13.318429 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-g724f"] Mar 09 09:17:13 crc kubenswrapper[4861]: W0309 
09:17:13.325783 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod55473545_bf70_472a_96e5_18cc3bfac07d.slice/crio-3a61c8e5716ba29b3e14fdf1fe5e045241c4c2b0345af8c56eb9209454dfbc46 WatchSource:0}: Error finding container 3a61c8e5716ba29b3e14fdf1fe5e045241c4c2b0345af8c56eb9209454dfbc46: Status 404 returned error can't find the container with id 3a61c8e5716ba29b3e14fdf1fe5e045241c4c2b0345af8c56eb9209454dfbc46 Mar 09 09:17:13 crc kubenswrapper[4861]: I0309 09:17:13.643088 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-6wl44" event={"ID":"6f869345-5b73-43d1-9617-bf883a753bb8","Type":"ContainerStarted","Data":"3ca3643307ebc9b805b4b2da275d85f712cc53fb3d1a436b24588aae64d4e426"} Mar 09 09:17:13 crc kubenswrapper[4861]: I0309 09:17:13.644787 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-nrqjp" event={"ID":"07ac624a-3ef3-4179-96d7-aa49ff085d5e","Type":"ContainerStarted","Data":"145e9149529861e81310ddb2350be8cc7301d0cd3216ad01355f97d6b612b9c5"} Mar 09 09:17:13 crc kubenswrapper[4861]: I0309 09:17:13.646731 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-g724f" event={"ID":"55473545-bf70-472a-96e5-18cc3bfac07d","Type":"ContainerStarted","Data":"3a61c8e5716ba29b3e14fdf1fe5e045241c4c2b0345af8c56eb9209454dfbc46"} Mar 09 09:17:16 crc kubenswrapper[4861]: I0309 09:17:16.664719 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-g724f" event={"ID":"55473545-bf70-472a-96e5-18cc3bfac07d","Type":"ContainerStarted","Data":"73c6c961f2e85c364a5af7001fb5d13de9c58c928dc257b072ea52151ba57477"} Mar 09 09:17:16 crc kubenswrapper[4861]: I0309 09:17:16.665648 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-g724f" Mar 09 
09:17:16 crc kubenswrapper[4861]: I0309 09:17:16.665681 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-6wl44" event={"ID":"6f869345-5b73-43d1-9617-bf883a753bb8","Type":"ContainerStarted","Data":"761a2b3098258e57135f65d620ad4aee4c993745eea60403fe8881a1ab590c60"} Mar 09 09:17:16 crc kubenswrapper[4861]: I0309 09:17:16.688431 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-g724f" podStartSLOduration=1.9157694699999999 podStartE2EDuration="4.688413854s" podCreationTimestamp="2026-03-09 09:17:12 +0000 UTC" firstStartedPulling="2026-03-09 09:17:13.327936097 +0000 UTC m=+676.412975498" lastFinishedPulling="2026-03-09 09:17:16.100580481 +0000 UTC m=+679.185619882" observedRunningTime="2026-03-09 09:17:16.686037526 +0000 UTC m=+679.771076927" watchObservedRunningTime="2026-03-09 09:17:16.688413854 +0000 UTC m=+679.773453255" Mar 09 09:17:16 crc kubenswrapper[4861]: I0309 09:17:16.707569 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-6wl44" podStartSLOduration=1.9345154359999999 podStartE2EDuration="4.707550191s" podCreationTimestamp="2026-03-09 09:17:12 +0000 UTC" firstStartedPulling="2026-03-09 09:17:13.257535816 +0000 UTC m=+676.342575217" lastFinishedPulling="2026-03-09 09:17:16.030570571 +0000 UTC m=+679.115609972" observedRunningTime="2026-03-09 09:17:16.701883519 +0000 UTC m=+679.786922930" watchObservedRunningTime="2026-03-09 09:17:16.707550191 +0000 UTC m=+679.792589592" Mar 09 09:17:17 crc kubenswrapper[4861]: I0309 09:17:17.673906 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-nrqjp" event={"ID":"07ac624a-3ef3-4179-96d7-aa49ff085d5e","Type":"ContainerStarted","Data":"625eef83043ae3a264d873db9a002fb6ad646dc3a2ff5d8a679cdf109a4b7cf7"} Mar 09 09:17:17 crc kubenswrapper[4861]: I0309 09:17:17.703485 4861 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-nrqjp" podStartSLOduration=1.837209624 podStartE2EDuration="5.70344824s" podCreationTimestamp="2026-03-09 09:17:12 +0000 UTC" firstStartedPulling="2026-03-09 09:17:13.296474877 +0000 UTC m=+676.381514278" lastFinishedPulling="2026-03-09 09:17:17.162713453 +0000 UTC m=+680.247752894" observedRunningTime="2026-03-09 09:17:17.699168387 +0000 UTC m=+680.784207818" watchObservedRunningTime="2026-03-09 09:17:17.70344824 +0000 UTC m=+680.788487721" Mar 09 09:17:22 crc kubenswrapper[4861]: I0309 09:17:22.844614 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-kmjsq"] Mar 09 09:17:22 crc kubenswrapper[4861]: I0309 09:17:22.852295 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-kmjsq" podUID="752be2d4-f338-4c5e-b51e-452fd8391c73" containerName="ovn-controller" containerID="cri-o://c73566e23363dcd42915bf2068aa8e97fe90f6c7afa409450cee62c683cbfa90" gracePeriod=30 Mar 09 09:17:22 crc kubenswrapper[4861]: I0309 09:17:22.852539 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-kmjsq" podUID="752be2d4-f338-4c5e-b51e-452fd8391c73" containerName="sbdb" containerID="cri-o://431c1bbb7e00b5b4c1615bff59cac3ec1f16e7a4bb52478bd3dccb0b441c24a2" gracePeriod=30 Mar 09 09:17:22 crc kubenswrapper[4861]: I0309 09:17:22.852555 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-kmjsq" podUID="752be2d4-f338-4c5e-b51e-452fd8391c73" containerName="northd" containerID="cri-o://77865966595bc249ebb7de4adca17e523e85c0dfab3b83bcc5ef662c7c31b0e0" gracePeriod=30 Mar 09 09:17:22 crc kubenswrapper[4861]: I0309 09:17:22.852588 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-kmjsq" 
podUID="752be2d4-f338-4c5e-b51e-452fd8391c73" containerName="kube-rbac-proxy-node" containerID="cri-o://35204d90abde22c56dbe4579e11f1da589271202a7e059926b799e4ddfb69f63" gracePeriod=30 Mar 09 09:17:22 crc kubenswrapper[4861]: I0309 09:17:22.852647 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-kmjsq" podUID="752be2d4-f338-4c5e-b51e-452fd8391c73" containerName="ovn-acl-logging" containerID="cri-o://04d44b4ad8719d9acaed799be8575081e6d326f27f3f02958ad30441df94d80a" gracePeriod=30 Mar 09 09:17:22 crc kubenswrapper[4861]: I0309 09:17:22.852471 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-kmjsq" podUID="752be2d4-f338-4c5e-b51e-452fd8391c73" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://3404d4ea05e503128dcf25a1314f47abdcd31cf9014df5b10889374c469707e8" gracePeriod=30 Mar 09 09:17:22 crc kubenswrapper[4861]: I0309 09:17:22.852355 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-kmjsq" podUID="752be2d4-f338-4c5e-b51e-452fd8391c73" containerName="nbdb" containerID="cri-o://c141660a01917112b004037f5bfc373a39b90e14dcb37a156d0b0dc60af56e0a" gracePeriod=30 Mar 09 09:17:22 crc kubenswrapper[4861]: I0309 09:17:22.898913 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-kmjsq" podUID="752be2d4-f338-4c5e-b51e-452fd8391c73" containerName="ovnkube-controller" containerID="cri-o://65c9df23fe9acb6770fb9f21b9a56a1b6551aa9c7787d2a41350815ce00dd460" gracePeriod=30 Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.002923 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-g724f" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.120559 4861 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kmjsq_752be2d4-f338-4c5e-b51e-452fd8391c73/ovn-acl-logging/0.log" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.120987 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kmjsq_752be2d4-f338-4c5e-b51e-452fd8391c73/ovn-controller/0.log" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.121405 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-kmjsq" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.190819 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-jt6z4"] Mar 09 09:17:23 crc kubenswrapper[4861]: E0309 09:17:23.191277 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="752be2d4-f338-4c5e-b51e-452fd8391c73" containerName="ovnkube-controller" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.191398 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="752be2d4-f338-4c5e-b51e-452fd8391c73" containerName="ovnkube-controller" Mar 09 09:17:23 crc kubenswrapper[4861]: E0309 09:17:23.191481 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="752be2d4-f338-4c5e-b51e-452fd8391c73" containerName="kubecfg-setup" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.191560 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="752be2d4-f338-4c5e-b51e-452fd8391c73" containerName="kubecfg-setup" Mar 09 09:17:23 crc kubenswrapper[4861]: E0309 09:17:23.191628 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="752be2d4-f338-4c5e-b51e-452fd8391c73" containerName="ovn-controller" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.191730 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="752be2d4-f338-4c5e-b51e-452fd8391c73" containerName="ovn-controller" Mar 09 09:17:23 crc kubenswrapper[4861]: E0309 09:17:23.191903 4861 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="752be2d4-f338-4c5e-b51e-452fd8391c73" containerName="kube-rbac-proxy-node" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.191980 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="752be2d4-f338-4c5e-b51e-452fd8391c73" containerName="kube-rbac-proxy-node" Mar 09 09:17:23 crc kubenswrapper[4861]: E0309 09:17:23.192057 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="752be2d4-f338-4c5e-b51e-452fd8391c73" containerName="kube-rbac-proxy-ovn-metrics" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.192131 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="752be2d4-f338-4c5e-b51e-452fd8391c73" containerName="kube-rbac-proxy-ovn-metrics" Mar 09 09:17:23 crc kubenswrapper[4861]: E0309 09:17:23.192201 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="752be2d4-f338-4c5e-b51e-452fd8391c73" containerName="sbdb" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.192264 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="752be2d4-f338-4c5e-b51e-452fd8391c73" containerName="sbdb" Mar 09 09:17:23 crc kubenswrapper[4861]: E0309 09:17:23.192336 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="752be2d4-f338-4c5e-b51e-452fd8391c73" containerName="ovn-acl-logging" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.192414 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="752be2d4-f338-4c5e-b51e-452fd8391c73" containerName="ovn-acl-logging" Mar 09 09:17:23 crc kubenswrapper[4861]: E0309 09:17:23.192478 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="752be2d4-f338-4c5e-b51e-452fd8391c73" containerName="nbdb" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.192545 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="752be2d4-f338-4c5e-b51e-452fd8391c73" containerName="nbdb" Mar 09 09:17:23 crc kubenswrapper[4861]: E0309 09:17:23.192613 4861 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="752be2d4-f338-4c5e-b51e-452fd8391c73" containerName="northd" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.192671 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="752be2d4-f338-4c5e-b51e-452fd8391c73" containerName="northd" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.192835 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="752be2d4-f338-4c5e-b51e-452fd8391c73" containerName="ovn-controller" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.192903 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="752be2d4-f338-4c5e-b51e-452fd8391c73" containerName="sbdb" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.192968 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="752be2d4-f338-4c5e-b51e-452fd8391c73" containerName="kube-rbac-proxy-node" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.193036 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="752be2d4-f338-4c5e-b51e-452fd8391c73" containerName="ovnkube-controller" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.193101 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="752be2d4-f338-4c5e-b51e-452fd8391c73" containerName="nbdb" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.193160 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="752be2d4-f338-4c5e-b51e-452fd8391c73" containerName="ovn-acl-logging" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.193220 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="752be2d4-f338-4c5e-b51e-452fd8391c73" containerName="kube-rbac-proxy-ovn-metrics" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.193286 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="752be2d4-f338-4c5e-b51e-452fd8391c73" containerName="northd" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.195270 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-jt6z4" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.286982 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/752be2d4-f338-4c5e-b51e-452fd8391c73-run-ovn\") pod \"752be2d4-f338-4c5e-b51e-452fd8391c73\" (UID: \"752be2d4-f338-4c5e-b51e-452fd8391c73\") " Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.287091 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/752be2d4-f338-4c5e-b51e-452fd8391c73-env-overrides\") pod \"752be2d4-f338-4c5e-b51e-452fd8391c73\" (UID: \"752be2d4-f338-4c5e-b51e-452fd8391c73\") " Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.287142 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/752be2d4-f338-4c5e-b51e-452fd8391c73-ovnkube-config\") pod \"752be2d4-f338-4c5e-b51e-452fd8391c73\" (UID: \"752be2d4-f338-4c5e-b51e-452fd8391c73\") " Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.287179 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/752be2d4-f338-4c5e-b51e-452fd8391c73-ovn-node-metrics-cert\") pod \"752be2d4-f338-4c5e-b51e-452fd8391c73\" (UID: \"752be2d4-f338-4c5e-b51e-452fd8391c73\") " Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.287212 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fxs46\" (UniqueName: \"kubernetes.io/projected/752be2d4-f338-4c5e-b51e-452fd8391c73-kube-api-access-fxs46\") pod \"752be2d4-f338-4c5e-b51e-452fd8391c73\" (UID: \"752be2d4-f338-4c5e-b51e-452fd8391c73\") " Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.287177 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/752be2d4-f338-4c5e-b51e-452fd8391c73-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "752be2d4-f338-4c5e-b51e-452fd8391c73" (UID: "752be2d4-f338-4c5e-b51e-452fd8391c73"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.287278 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/752be2d4-f338-4c5e-b51e-452fd8391c73-host-slash\") pod \"752be2d4-f338-4c5e-b51e-452fd8391c73\" (UID: \"752be2d4-f338-4c5e-b51e-452fd8391c73\") " Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.287305 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/752be2d4-f338-4c5e-b51e-452fd8391c73-etc-openvswitch\") pod \"752be2d4-f338-4c5e-b51e-452fd8391c73\" (UID: \"752be2d4-f338-4c5e-b51e-452fd8391c73\") " Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.287350 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/752be2d4-f338-4c5e-b51e-452fd8391c73-host-var-lib-cni-networks-ovn-kubernetes\") pod \"752be2d4-f338-4c5e-b51e-452fd8391c73\" (UID: \"752be2d4-f338-4c5e-b51e-452fd8391c73\") " Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.287423 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/752be2d4-f338-4c5e-b51e-452fd8391c73-host-cni-bin\") pod \"752be2d4-f338-4c5e-b51e-452fd8391c73\" (UID: \"752be2d4-f338-4c5e-b51e-452fd8391c73\") " Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.287454 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/752be2d4-f338-4c5e-b51e-452fd8391c73-run-openvswitch\") 
pod \"752be2d4-f338-4c5e-b51e-452fd8391c73\" (UID: \"752be2d4-f338-4c5e-b51e-452fd8391c73\") " Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.287480 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/752be2d4-f338-4c5e-b51e-452fd8391c73-host-run-netns\") pod \"752be2d4-f338-4c5e-b51e-452fd8391c73\" (UID: \"752be2d4-f338-4c5e-b51e-452fd8391c73\") " Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.287507 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/752be2d4-f338-4c5e-b51e-452fd8391c73-var-lib-openvswitch\") pod \"752be2d4-f338-4c5e-b51e-452fd8391c73\" (UID: \"752be2d4-f338-4c5e-b51e-452fd8391c73\") " Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.287542 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/752be2d4-f338-4c5e-b51e-452fd8391c73-ovnkube-script-lib\") pod \"752be2d4-f338-4c5e-b51e-452fd8391c73\" (UID: \"752be2d4-f338-4c5e-b51e-452fd8391c73\") " Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.287573 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/752be2d4-f338-4c5e-b51e-452fd8391c73-systemd-units\") pod \"752be2d4-f338-4c5e-b51e-452fd8391c73\" (UID: \"752be2d4-f338-4c5e-b51e-452fd8391c73\") " Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.287606 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/752be2d4-f338-4c5e-b51e-452fd8391c73-run-systemd\") pod \"752be2d4-f338-4c5e-b51e-452fd8391c73\" (UID: \"752be2d4-f338-4c5e-b51e-452fd8391c73\") " Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.287637 4861 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/752be2d4-f338-4c5e-b51e-452fd8391c73-host-cni-netd\") pod \"752be2d4-f338-4c5e-b51e-452fd8391c73\" (UID: \"752be2d4-f338-4c5e-b51e-452fd8391c73\") " Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.287666 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/752be2d4-f338-4c5e-b51e-452fd8391c73-log-socket\") pod \"752be2d4-f338-4c5e-b51e-452fd8391c73\" (UID: \"752be2d4-f338-4c5e-b51e-452fd8391c73\") " Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.287693 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/752be2d4-f338-4c5e-b51e-452fd8391c73-host-run-ovn-kubernetes\") pod \"752be2d4-f338-4c5e-b51e-452fd8391c73\" (UID: \"752be2d4-f338-4c5e-b51e-452fd8391c73\") " Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.287732 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/752be2d4-f338-4c5e-b51e-452fd8391c73-node-log\") pod \"752be2d4-f338-4c5e-b51e-452fd8391c73\" (UID: \"752be2d4-f338-4c5e-b51e-452fd8391c73\") " Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.287762 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/752be2d4-f338-4c5e-b51e-452fd8391c73-host-kubelet\") pod \"752be2d4-f338-4c5e-b51e-452fd8391c73\" (UID: \"752be2d4-f338-4c5e-b51e-452fd8391c73\") " Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.287817 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/752be2d4-f338-4c5e-b51e-452fd8391c73-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "752be2d4-f338-4c5e-b51e-452fd8391c73" (UID: 
"752be2d4-f338-4c5e-b51e-452fd8391c73"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.287880 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/752be2d4-f338-4c5e-b51e-452fd8391c73-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "752be2d4-f338-4c5e-b51e-452fd8391c73" (UID: "752be2d4-f338-4c5e-b51e-452fd8391c73"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.288050 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/25f54e79-6851-4967-8fba-16c66a7ea099-host-kubelet\") pod \"ovnkube-node-jt6z4\" (UID: \"25f54e79-6851-4967-8fba-16c66a7ea099\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt6z4" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.288117 4861 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/752be2d4-f338-4c5e-b51e-452fd8391c73-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.288489 4861 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/752be2d4-f338-4c5e-b51e-452fd8391c73-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.288531 4861 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/752be2d4-f338-4c5e-b51e-452fd8391c73-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.288572 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/752be2d4-f338-4c5e-b51e-452fd8391c73-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "752be2d4-f338-4c5e-b51e-452fd8391c73" (UID: "752be2d4-f338-4c5e-b51e-452fd8391c73"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.288573 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/752be2d4-f338-4c5e-b51e-452fd8391c73-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "752be2d4-f338-4c5e-b51e-452fd8391c73" (UID: "752be2d4-f338-4c5e-b51e-452fd8391c73"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.288634 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/752be2d4-f338-4c5e-b51e-452fd8391c73-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "752be2d4-f338-4c5e-b51e-452fd8391c73" (UID: "752be2d4-f338-4c5e-b51e-452fd8391c73"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.288610 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/752be2d4-f338-4c5e-b51e-452fd8391c73-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "752be2d4-f338-4c5e-b51e-452fd8391c73" (UID: "752be2d4-f338-4c5e-b51e-452fd8391c73"). InnerVolumeSpecName "run-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.288683 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/752be2d4-f338-4c5e-b51e-452fd8391c73-log-socket" (OuterVolumeSpecName: "log-socket") pod "752be2d4-f338-4c5e-b51e-452fd8391c73" (UID: "752be2d4-f338-4c5e-b51e-452fd8391c73"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.288708 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/752be2d4-f338-4c5e-b51e-452fd8391c73-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "752be2d4-f338-4c5e-b51e-452fd8391c73" (UID: "752be2d4-f338-4c5e-b51e-452fd8391c73"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.288747 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/752be2d4-f338-4c5e-b51e-452fd8391c73-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "752be2d4-f338-4c5e-b51e-452fd8391c73" (UID: "752be2d4-f338-4c5e-b51e-452fd8391c73"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.288801 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/752be2d4-f338-4c5e-b51e-452fd8391c73-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "752be2d4-f338-4c5e-b51e-452fd8391c73" (UID: "752be2d4-f338-4c5e-b51e-452fd8391c73"). InnerVolumeSpecName "host-run-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.288837 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/752be2d4-f338-4c5e-b51e-452fd8391c73-node-log" (OuterVolumeSpecName: "node-log") pod "752be2d4-f338-4c5e-b51e-452fd8391c73" (UID: "752be2d4-f338-4c5e-b51e-452fd8391c73"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.288870 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/752be2d4-f338-4c5e-b51e-452fd8391c73-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "752be2d4-f338-4c5e-b51e-452fd8391c73" (UID: "752be2d4-f338-4c5e-b51e-452fd8391c73"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.288915 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/752be2d4-f338-4c5e-b51e-452fd8391c73-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "752be2d4-f338-4c5e-b51e-452fd8391c73" (UID: "752be2d4-f338-4c5e-b51e-452fd8391c73"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.288948 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/752be2d4-f338-4c5e-b51e-452fd8391c73-host-slash" (OuterVolumeSpecName: "host-slash") pod "752be2d4-f338-4c5e-b51e-452fd8391c73" (UID: "752be2d4-f338-4c5e-b51e-452fd8391c73"). InnerVolumeSpecName "host-slash". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.288990 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/752be2d4-f338-4c5e-b51e-452fd8391c73-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "752be2d4-f338-4c5e-b51e-452fd8391c73" (UID: "752be2d4-f338-4c5e-b51e-452fd8391c73"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.288420 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/752be2d4-f338-4c5e-b51e-452fd8391c73-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "752be2d4-f338-4c5e-b51e-452fd8391c73" (UID: "752be2d4-f338-4c5e-b51e-452fd8391c73"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.301097 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/752be2d4-f338-4c5e-b51e-452fd8391c73-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "752be2d4-f338-4c5e-b51e-452fd8391c73" (UID: "752be2d4-f338-4c5e-b51e-452fd8391c73"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.301355 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/752be2d4-f338-4c5e-b51e-452fd8391c73-kube-api-access-fxs46" (OuterVolumeSpecName: "kube-api-access-fxs46") pod "752be2d4-f338-4c5e-b51e-452fd8391c73" (UID: "752be2d4-f338-4c5e-b51e-452fd8391c73"). InnerVolumeSpecName "kube-api-access-fxs46". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.310402 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/752be2d4-f338-4c5e-b51e-452fd8391c73-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "752be2d4-f338-4c5e-b51e-452fd8391c73" (UID: "752be2d4-f338-4c5e-b51e-452fd8391c73"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.389903 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/25f54e79-6851-4967-8fba-16c66a7ea099-host-cni-netd\") pod \"ovnkube-node-jt6z4\" (UID: \"25f54e79-6851-4967-8fba-16c66a7ea099\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt6z4" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.389985 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/25f54e79-6851-4967-8fba-16c66a7ea099-run-systemd\") pod \"ovnkube-node-jt6z4\" (UID: \"25f54e79-6851-4967-8fba-16c66a7ea099\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt6z4" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.390013 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/25f54e79-6851-4967-8fba-16c66a7ea099-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-jt6z4\" (UID: \"25f54e79-6851-4967-8fba-16c66a7ea099\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt6z4" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.390036 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/25f54e79-6851-4967-8fba-16c66a7ea099-systemd-units\") pod \"ovnkube-node-jt6z4\" (UID: \"25f54e79-6851-4967-8fba-16c66a7ea099\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt6z4" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.390057 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/25f54e79-6851-4967-8fba-16c66a7ea099-run-openvswitch\") pod \"ovnkube-node-jt6z4\" (UID: \"25f54e79-6851-4967-8fba-16c66a7ea099\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt6z4" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.390088 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/25f54e79-6851-4967-8fba-16c66a7ea099-host-kubelet\") pod \"ovnkube-node-jt6z4\" (UID: \"25f54e79-6851-4967-8fba-16c66a7ea099\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt6z4" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.390111 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/25f54e79-6851-4967-8fba-16c66a7ea099-etc-openvswitch\") pod \"ovnkube-node-jt6z4\" (UID: \"25f54e79-6851-4967-8fba-16c66a7ea099\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt6z4" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.390158 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/25f54e79-6851-4967-8fba-16c66a7ea099-host-kubelet\") pod \"ovnkube-node-jt6z4\" (UID: \"25f54e79-6851-4967-8fba-16c66a7ea099\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt6z4" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.390254 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/25f54e79-6851-4967-8fba-16c66a7ea099-ovn-node-metrics-cert\") pod \"ovnkube-node-jt6z4\" (UID: \"25f54e79-6851-4967-8fba-16c66a7ea099\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt6z4" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.390306 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/25f54e79-6851-4967-8fba-16c66a7ea099-var-lib-openvswitch\") pod \"ovnkube-node-jt6z4\" (UID: \"25f54e79-6851-4967-8fba-16c66a7ea099\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt6z4" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.390322 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bb8t8\" (UniqueName: \"kubernetes.io/projected/25f54e79-6851-4967-8fba-16c66a7ea099-kube-api-access-bb8t8\") pod \"ovnkube-node-jt6z4\" (UID: \"25f54e79-6851-4967-8fba-16c66a7ea099\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt6z4" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.390400 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/25f54e79-6851-4967-8fba-16c66a7ea099-host-run-netns\") pod \"ovnkube-node-jt6z4\" (UID: \"25f54e79-6851-4967-8fba-16c66a7ea099\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt6z4" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.390433 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/25f54e79-6851-4967-8fba-16c66a7ea099-node-log\") pod \"ovnkube-node-jt6z4\" (UID: \"25f54e79-6851-4967-8fba-16c66a7ea099\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt6z4" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.390456 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"run-ovn\" (UniqueName: \"kubernetes.io/host-path/25f54e79-6851-4967-8fba-16c66a7ea099-run-ovn\") pod \"ovnkube-node-jt6z4\" (UID: \"25f54e79-6851-4967-8fba-16c66a7ea099\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt6z4" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.390481 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/25f54e79-6851-4967-8fba-16c66a7ea099-ovnkube-script-lib\") pod \"ovnkube-node-jt6z4\" (UID: \"25f54e79-6851-4967-8fba-16c66a7ea099\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt6z4" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.390502 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/25f54e79-6851-4967-8fba-16c66a7ea099-host-run-ovn-kubernetes\") pod \"ovnkube-node-jt6z4\" (UID: \"25f54e79-6851-4967-8fba-16c66a7ea099\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt6z4" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.390524 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/25f54e79-6851-4967-8fba-16c66a7ea099-host-cni-bin\") pod \"ovnkube-node-jt6z4\" (UID: \"25f54e79-6851-4967-8fba-16c66a7ea099\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt6z4" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.390584 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/25f54e79-6851-4967-8fba-16c66a7ea099-log-socket\") pod \"ovnkube-node-jt6z4\" (UID: \"25f54e79-6851-4967-8fba-16c66a7ea099\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt6z4" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.390622 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/25f54e79-6851-4967-8fba-16c66a7ea099-ovnkube-config\") pod \"ovnkube-node-jt6z4\" (UID: \"25f54e79-6851-4967-8fba-16c66a7ea099\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt6z4" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.390644 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/25f54e79-6851-4967-8fba-16c66a7ea099-env-overrides\") pod \"ovnkube-node-jt6z4\" (UID: \"25f54e79-6851-4967-8fba-16c66a7ea099\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt6z4" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.390712 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/25f54e79-6851-4967-8fba-16c66a7ea099-host-slash\") pod \"ovnkube-node-jt6z4\" (UID: \"25f54e79-6851-4967-8fba-16c66a7ea099\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt6z4" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.390787 4861 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/752be2d4-f338-4c5e-b51e-452fd8391c73-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.390803 4861 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/752be2d4-f338-4c5e-b51e-452fd8391c73-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.390815 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fxs46\" (UniqueName: \"kubernetes.io/projected/752be2d4-f338-4c5e-b51e-452fd8391c73-kube-api-access-fxs46\") on node \"crc\" DevicePath \"\"" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.390835 4861 reconciler_common.go:293] "Volume detached for 
volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/752be2d4-f338-4c5e-b51e-452fd8391c73-host-slash\") on node \"crc\" DevicePath \"\"" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.390846 4861 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/752be2d4-f338-4c5e-b51e-452fd8391c73-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.390859 4861 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/752be2d4-f338-4c5e-b51e-452fd8391c73-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.390871 4861 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/752be2d4-f338-4c5e-b51e-452fd8391c73-host-cni-bin\") on node \"crc\" DevicePath \"\"" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.390882 4861 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/752be2d4-f338-4c5e-b51e-452fd8391c73-run-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.390891 4861 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/752be2d4-f338-4c5e-b51e-452fd8391c73-host-run-netns\") on node \"crc\" DevicePath \"\"" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.390901 4861 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/752be2d4-f338-4c5e-b51e-452fd8391c73-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.390911 4861 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/752be2d4-f338-4c5e-b51e-452fd8391c73-systemd-units\") on node \"crc\" DevicePath \"\"" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.390921 4861 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/752be2d4-f338-4c5e-b51e-452fd8391c73-run-systemd\") on node \"crc\" DevicePath \"\"" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.390932 4861 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/752be2d4-f338-4c5e-b51e-452fd8391c73-host-cni-netd\") on node \"crc\" DevicePath \"\"" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.390942 4861 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/752be2d4-f338-4c5e-b51e-452fd8391c73-log-socket\") on node \"crc\" DevicePath \"\"" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.390954 4861 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/752be2d4-f338-4c5e-b51e-452fd8391c73-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.390966 4861 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/752be2d4-f338-4c5e-b51e-452fd8391c73-node-log\") on node \"crc\" DevicePath \"\"" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.390976 4861 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/752be2d4-f338-4c5e-b51e-452fd8391c73-host-kubelet\") on node \"crc\" DevicePath \"\"" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.492243 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/25f54e79-6851-4967-8fba-16c66a7ea099-host-slash\") pod \"ovnkube-node-jt6z4\" (UID: 
\"25f54e79-6851-4967-8fba-16c66a7ea099\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt6z4" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.492394 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/25f54e79-6851-4967-8fba-16c66a7ea099-host-cni-netd\") pod \"ovnkube-node-jt6z4\" (UID: \"25f54e79-6851-4967-8fba-16c66a7ea099\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt6z4" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.492416 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/25f54e79-6851-4967-8fba-16c66a7ea099-host-slash\") pod \"ovnkube-node-jt6z4\" (UID: \"25f54e79-6851-4967-8fba-16c66a7ea099\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt6z4" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.492430 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/25f54e79-6851-4967-8fba-16c66a7ea099-run-systemd\") pod \"ovnkube-node-jt6z4\" (UID: \"25f54e79-6851-4967-8fba-16c66a7ea099\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt6z4" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.492490 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/25f54e79-6851-4967-8fba-16c66a7ea099-run-systemd\") pod \"ovnkube-node-jt6z4\" (UID: \"25f54e79-6851-4967-8fba-16c66a7ea099\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt6z4" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.492503 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/25f54e79-6851-4967-8fba-16c66a7ea099-host-cni-netd\") pod \"ovnkube-node-jt6z4\" (UID: \"25f54e79-6851-4967-8fba-16c66a7ea099\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt6z4" Mar 09 09:17:23 crc 
kubenswrapper[4861]: I0309 09:17:23.492553 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/25f54e79-6851-4967-8fba-16c66a7ea099-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-jt6z4\" (UID: \"25f54e79-6851-4967-8fba-16c66a7ea099\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt6z4" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.492591 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/25f54e79-6851-4967-8fba-16c66a7ea099-systemd-units\") pod \"ovnkube-node-jt6z4\" (UID: \"25f54e79-6851-4967-8fba-16c66a7ea099\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt6z4" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.492595 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/25f54e79-6851-4967-8fba-16c66a7ea099-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-jt6z4\" (UID: \"25f54e79-6851-4967-8fba-16c66a7ea099\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt6z4" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.492632 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/25f54e79-6851-4967-8fba-16c66a7ea099-run-openvswitch\") pod \"ovnkube-node-jt6z4\" (UID: \"25f54e79-6851-4967-8fba-16c66a7ea099\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt6z4" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.492685 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/25f54e79-6851-4967-8fba-16c66a7ea099-etc-openvswitch\") pod \"ovnkube-node-jt6z4\" (UID: \"25f54e79-6851-4967-8fba-16c66a7ea099\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt6z4" 
Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.492709 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/25f54e79-6851-4967-8fba-16c66a7ea099-run-openvswitch\") pod \"ovnkube-node-jt6z4\" (UID: \"25f54e79-6851-4967-8fba-16c66a7ea099\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt6z4" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.492704 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/25f54e79-6851-4967-8fba-16c66a7ea099-systemd-units\") pod \"ovnkube-node-jt6z4\" (UID: \"25f54e79-6851-4967-8fba-16c66a7ea099\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt6z4" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.492735 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/25f54e79-6851-4967-8fba-16c66a7ea099-ovn-node-metrics-cert\") pod \"ovnkube-node-jt6z4\" (UID: \"25f54e79-6851-4967-8fba-16c66a7ea099\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt6z4" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.492783 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bb8t8\" (UniqueName: \"kubernetes.io/projected/25f54e79-6851-4967-8fba-16c66a7ea099-kube-api-access-bb8t8\") pod \"ovnkube-node-jt6z4\" (UID: \"25f54e79-6851-4967-8fba-16c66a7ea099\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt6z4" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.492804 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/25f54e79-6851-4967-8fba-16c66a7ea099-etc-openvswitch\") pod \"ovnkube-node-jt6z4\" (UID: \"25f54e79-6851-4967-8fba-16c66a7ea099\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt6z4" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.492809 4861 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/25f54e79-6851-4967-8fba-16c66a7ea099-var-lib-openvswitch\") pod \"ovnkube-node-jt6z4\" (UID: \"25f54e79-6851-4967-8fba-16c66a7ea099\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt6z4" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.492828 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/25f54e79-6851-4967-8fba-16c66a7ea099-var-lib-openvswitch\") pod \"ovnkube-node-jt6z4\" (UID: \"25f54e79-6851-4967-8fba-16c66a7ea099\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt6z4" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.492872 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/25f54e79-6851-4967-8fba-16c66a7ea099-host-run-netns\") pod \"ovnkube-node-jt6z4\" (UID: \"25f54e79-6851-4967-8fba-16c66a7ea099\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt6z4" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.492904 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/25f54e79-6851-4967-8fba-16c66a7ea099-node-log\") pod \"ovnkube-node-jt6z4\" (UID: \"25f54e79-6851-4967-8fba-16c66a7ea099\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt6z4" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.492920 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/25f54e79-6851-4967-8fba-16c66a7ea099-run-ovn\") pod \"ovnkube-node-jt6z4\" (UID: \"25f54e79-6851-4967-8fba-16c66a7ea099\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt6z4" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.492941 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/25f54e79-6851-4967-8fba-16c66a7ea099-ovnkube-script-lib\") pod \"ovnkube-node-jt6z4\" (UID: \"25f54e79-6851-4967-8fba-16c66a7ea099\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt6z4" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.492958 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/25f54e79-6851-4967-8fba-16c66a7ea099-host-cni-bin\") pod \"ovnkube-node-jt6z4\" (UID: \"25f54e79-6851-4967-8fba-16c66a7ea099\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt6z4" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.492973 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/25f54e79-6851-4967-8fba-16c66a7ea099-log-socket\") pod \"ovnkube-node-jt6z4\" (UID: \"25f54e79-6851-4967-8fba-16c66a7ea099\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt6z4" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.492992 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/25f54e79-6851-4967-8fba-16c66a7ea099-host-run-ovn-kubernetes\") pod \"ovnkube-node-jt6z4\" (UID: \"25f54e79-6851-4967-8fba-16c66a7ea099\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt6z4" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.493006 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/25f54e79-6851-4967-8fba-16c66a7ea099-ovnkube-config\") pod \"ovnkube-node-jt6z4\" (UID: \"25f54e79-6851-4967-8fba-16c66a7ea099\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt6z4" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.493024 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/25f54e79-6851-4967-8fba-16c66a7ea099-env-overrides\") pod \"ovnkube-node-jt6z4\" (UID: \"25f54e79-6851-4967-8fba-16c66a7ea099\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt6z4" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.493515 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/25f54e79-6851-4967-8fba-16c66a7ea099-node-log\") pod \"ovnkube-node-jt6z4\" (UID: \"25f54e79-6851-4967-8fba-16c66a7ea099\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt6z4" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.493556 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/25f54e79-6851-4967-8fba-16c66a7ea099-host-run-netns\") pod \"ovnkube-node-jt6z4\" (UID: \"25f54e79-6851-4967-8fba-16c66a7ea099\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt6z4" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.493581 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/25f54e79-6851-4967-8fba-16c66a7ea099-env-overrides\") pod \"ovnkube-node-jt6z4\" (UID: \"25f54e79-6851-4967-8fba-16c66a7ea099\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt6z4" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.493587 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/25f54e79-6851-4967-8fba-16c66a7ea099-log-socket\") pod \"ovnkube-node-jt6z4\" (UID: \"25f54e79-6851-4967-8fba-16c66a7ea099\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt6z4" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.493784 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/25f54e79-6851-4967-8fba-16c66a7ea099-host-cni-bin\") pod \"ovnkube-node-jt6z4\" (UID: 
\"25f54e79-6851-4967-8fba-16c66a7ea099\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt6z4" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.493785 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/25f54e79-6851-4967-8fba-16c66a7ea099-run-ovn\") pod \"ovnkube-node-jt6z4\" (UID: \"25f54e79-6851-4967-8fba-16c66a7ea099\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt6z4" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.493799 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/25f54e79-6851-4967-8fba-16c66a7ea099-host-run-ovn-kubernetes\") pod \"ovnkube-node-jt6z4\" (UID: \"25f54e79-6851-4967-8fba-16c66a7ea099\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt6z4" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.494243 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/25f54e79-6851-4967-8fba-16c66a7ea099-ovnkube-config\") pod \"ovnkube-node-jt6z4\" (UID: \"25f54e79-6851-4967-8fba-16c66a7ea099\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt6z4" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.494473 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/25f54e79-6851-4967-8fba-16c66a7ea099-ovnkube-script-lib\") pod \"ovnkube-node-jt6z4\" (UID: \"25f54e79-6851-4967-8fba-16c66a7ea099\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt6z4" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.498869 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/25f54e79-6851-4967-8fba-16c66a7ea099-ovn-node-metrics-cert\") pod \"ovnkube-node-jt6z4\" (UID: \"25f54e79-6851-4967-8fba-16c66a7ea099\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt6z4" 
Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.511325 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bb8t8\" (UniqueName: \"kubernetes.io/projected/25f54e79-6851-4967-8fba-16c66a7ea099-kube-api-access-bb8t8\") pod \"ovnkube-node-jt6z4\" (UID: \"25f54e79-6851-4967-8fba-16c66a7ea099\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt6z4" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.520456 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-jt6z4" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.722642 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-dnjcp_2a7b6abe-370e-4514-9777-7483bb64e1f0/kube-multus/0.log" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.723168 4861 generic.go:334] "Generic (PLEG): container finished" podID="2a7b6abe-370e-4514-9777-7483bb64e1f0" containerID="8c38cd133d2ed36cdd55c33e0dbf2336c41cb7343be3c65e5b854b019ee99b4b" exitCode=2 Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.723239 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-dnjcp" event={"ID":"2a7b6abe-370e-4514-9777-7483bb64e1f0","Type":"ContainerDied","Data":"8c38cd133d2ed36cdd55c33e0dbf2336c41cb7343be3c65e5b854b019ee99b4b"} Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.723906 4861 scope.go:117] "RemoveContainer" containerID="8c38cd133d2ed36cdd55c33e0dbf2336c41cb7343be3c65e5b854b019ee99b4b" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.725193 4861 generic.go:334] "Generic (PLEG): container finished" podID="25f54e79-6851-4967-8fba-16c66a7ea099" containerID="d288367c6be532fb3ca3b5ff034c9ec88e42902289b1f8fd7496b9caddcd9736" exitCode=0 Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.725256 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jt6z4" 
event={"ID":"25f54e79-6851-4967-8fba-16c66a7ea099","Type":"ContainerDied","Data":"d288367c6be532fb3ca3b5ff034c9ec88e42902289b1f8fd7496b9caddcd9736"} Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.725292 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jt6z4" event={"ID":"25f54e79-6851-4967-8fba-16c66a7ea099","Type":"ContainerStarted","Data":"182831289e0a16a95a98921af72d00a649cb29bcf674d2a517b076ce474044f9"} Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.734881 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kmjsq_752be2d4-f338-4c5e-b51e-452fd8391c73/ovn-acl-logging/0.log" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.735659 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kmjsq_752be2d4-f338-4c5e-b51e-452fd8391c73/ovn-controller/0.log" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.736522 4861 generic.go:334] "Generic (PLEG): container finished" podID="752be2d4-f338-4c5e-b51e-452fd8391c73" containerID="65c9df23fe9acb6770fb9f21b9a56a1b6551aa9c7787d2a41350815ce00dd460" exitCode=0 Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.736638 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kmjsq" event={"ID":"752be2d4-f338-4c5e-b51e-452fd8391c73","Type":"ContainerDied","Data":"65c9df23fe9acb6770fb9f21b9a56a1b6551aa9c7787d2a41350815ce00dd460"} Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.736674 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-kmjsq" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.736699 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kmjsq" event={"ID":"752be2d4-f338-4c5e-b51e-452fd8391c73","Type":"ContainerDied","Data":"431c1bbb7e00b5b4c1615bff59cac3ec1f16e7a4bb52478bd3dccb0b441c24a2"} Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.736727 4861 scope.go:117] "RemoveContainer" containerID="65c9df23fe9acb6770fb9f21b9a56a1b6551aa9c7787d2a41350815ce00dd460" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.736664 4861 generic.go:334] "Generic (PLEG): container finished" podID="752be2d4-f338-4c5e-b51e-452fd8391c73" containerID="431c1bbb7e00b5b4c1615bff59cac3ec1f16e7a4bb52478bd3dccb0b441c24a2" exitCode=0 Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.736952 4861 generic.go:334] "Generic (PLEG): container finished" podID="752be2d4-f338-4c5e-b51e-452fd8391c73" containerID="c141660a01917112b004037f5bfc373a39b90e14dcb37a156d0b0dc60af56e0a" exitCode=0 Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.737040 4861 generic.go:334] "Generic (PLEG): container finished" podID="752be2d4-f338-4c5e-b51e-452fd8391c73" containerID="77865966595bc249ebb7de4adca17e523e85c0dfab3b83bcc5ef662c7c31b0e0" exitCode=0 Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.737138 4861 generic.go:334] "Generic (PLEG): container finished" podID="752be2d4-f338-4c5e-b51e-452fd8391c73" containerID="3404d4ea05e503128dcf25a1314f47abdcd31cf9014df5b10889374c469707e8" exitCode=0 Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.737228 4861 generic.go:334] "Generic (PLEG): container finished" podID="752be2d4-f338-4c5e-b51e-452fd8391c73" containerID="35204d90abde22c56dbe4579e11f1da589271202a7e059926b799e4ddfb69f63" exitCode=0 Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.737344 4861 generic.go:334] "Generic (PLEG): container finished" 
podID="752be2d4-f338-4c5e-b51e-452fd8391c73" containerID="04d44b4ad8719d9acaed799be8575081e6d326f27f3f02958ad30441df94d80a" exitCode=143 Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.737466 4861 generic.go:334] "Generic (PLEG): container finished" podID="752be2d4-f338-4c5e-b51e-452fd8391c73" containerID="c73566e23363dcd42915bf2068aa8e97fe90f6c7afa409450cee62c683cbfa90" exitCode=143 Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.737043 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kmjsq" event={"ID":"752be2d4-f338-4c5e-b51e-452fd8391c73","Type":"ContainerDied","Data":"c141660a01917112b004037f5bfc373a39b90e14dcb37a156d0b0dc60af56e0a"} Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.737679 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kmjsq" event={"ID":"752be2d4-f338-4c5e-b51e-452fd8391c73","Type":"ContainerDied","Data":"77865966595bc249ebb7de4adca17e523e85c0dfab3b83bcc5ef662c7c31b0e0"} Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.737792 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kmjsq" event={"ID":"752be2d4-f338-4c5e-b51e-452fd8391c73","Type":"ContainerDied","Data":"3404d4ea05e503128dcf25a1314f47abdcd31cf9014df5b10889374c469707e8"} Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.737894 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kmjsq" event={"ID":"752be2d4-f338-4c5e-b51e-452fd8391c73","Type":"ContainerDied","Data":"35204d90abde22c56dbe4579e11f1da589271202a7e059926b799e4ddfb69f63"} Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.738126 4861 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"04d44b4ad8719d9acaed799be8575081e6d326f27f3f02958ad30441df94d80a"} Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.738233 4861 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c73566e23363dcd42915bf2068aa8e97fe90f6c7afa409450cee62c683cbfa90"} Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.738322 4861 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5d78e61177fed79e734e99406acb1d44f730e81d601827c7be96d7dec219dd93"} Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.738464 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kmjsq" event={"ID":"752be2d4-f338-4c5e-b51e-452fd8391c73","Type":"ContainerDied","Data":"04d44b4ad8719d9acaed799be8575081e6d326f27f3f02958ad30441df94d80a"} Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.738564 4861 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"65c9df23fe9acb6770fb9f21b9a56a1b6551aa9c7787d2a41350815ce00dd460"} Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.738641 4861 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"431c1bbb7e00b5b4c1615bff59cac3ec1f16e7a4bb52478bd3dccb0b441c24a2"} Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.738695 4861 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c141660a01917112b004037f5bfc373a39b90e14dcb37a156d0b0dc60af56e0a"} Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.738776 4861 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"77865966595bc249ebb7de4adca17e523e85c0dfab3b83bcc5ef662c7c31b0e0"} Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.738847 4861 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3404d4ea05e503128dcf25a1314f47abdcd31cf9014df5b10889374c469707e8"} Mar 09 
09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.738898 4861 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"35204d90abde22c56dbe4579e11f1da589271202a7e059926b799e4ddfb69f63"} Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.738970 4861 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"04d44b4ad8719d9acaed799be8575081e6d326f27f3f02958ad30441df94d80a"} Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.739022 4861 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c73566e23363dcd42915bf2068aa8e97fe90f6c7afa409450cee62c683cbfa90"} Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.739164 4861 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5d78e61177fed79e734e99406acb1d44f730e81d601827c7be96d7dec219dd93"} Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.739276 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kmjsq" event={"ID":"752be2d4-f338-4c5e-b51e-452fd8391c73","Type":"ContainerDied","Data":"c73566e23363dcd42915bf2068aa8e97fe90f6c7afa409450cee62c683cbfa90"} Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.739357 4861 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"65c9df23fe9acb6770fb9f21b9a56a1b6551aa9c7787d2a41350815ce00dd460"} Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.739455 4861 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"431c1bbb7e00b5b4c1615bff59cac3ec1f16e7a4bb52478bd3dccb0b441c24a2"} Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.739534 4861 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"c141660a01917112b004037f5bfc373a39b90e14dcb37a156d0b0dc60af56e0a"} Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.739617 4861 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"77865966595bc249ebb7de4adca17e523e85c0dfab3b83bcc5ef662c7c31b0e0"} Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.739705 4861 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3404d4ea05e503128dcf25a1314f47abdcd31cf9014df5b10889374c469707e8"} Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.739786 4861 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"35204d90abde22c56dbe4579e11f1da589271202a7e059926b799e4ddfb69f63"} Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.739862 4861 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"04d44b4ad8719d9acaed799be8575081e6d326f27f3f02958ad30441df94d80a"} Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.739951 4861 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c73566e23363dcd42915bf2068aa8e97fe90f6c7afa409450cee62c683cbfa90"} Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.740038 4861 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5d78e61177fed79e734e99406acb1d44f730e81d601827c7be96d7dec219dd93"} Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.740126 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kmjsq" event={"ID":"752be2d4-f338-4c5e-b51e-452fd8391c73","Type":"ContainerDied","Data":"69d927fb2c5fc26050c07a81d44ae2ec32cc0bcac8f5d9c11fb0c415816890a9"} Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.740206 4861 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"65c9df23fe9acb6770fb9f21b9a56a1b6551aa9c7787d2a41350815ce00dd460"} Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.740295 4861 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"431c1bbb7e00b5b4c1615bff59cac3ec1f16e7a4bb52478bd3dccb0b441c24a2"} Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.740424 4861 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c141660a01917112b004037f5bfc373a39b90e14dcb37a156d0b0dc60af56e0a"} Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.740502 4861 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"77865966595bc249ebb7de4adca17e523e85c0dfab3b83bcc5ef662c7c31b0e0"} Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.740796 4861 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3404d4ea05e503128dcf25a1314f47abdcd31cf9014df5b10889374c469707e8"} Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.741863 4861 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"35204d90abde22c56dbe4579e11f1da589271202a7e059926b799e4ddfb69f63"} Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.742008 4861 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"04d44b4ad8719d9acaed799be8575081e6d326f27f3f02958ad30441df94d80a"} Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.742120 4861 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c73566e23363dcd42915bf2068aa8e97fe90f6c7afa409450cee62c683cbfa90"} Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.742255 4861 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5d78e61177fed79e734e99406acb1d44f730e81d601827c7be96d7dec219dd93"} Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.761802 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-kmjsq"] Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.764459 4861 scope.go:117] "RemoveContainer" containerID="431c1bbb7e00b5b4c1615bff59cac3ec1f16e7a4bb52478bd3dccb0b441c24a2" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.766194 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-kmjsq"] Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.784668 4861 scope.go:117] "RemoveContainer" containerID="c141660a01917112b004037f5bfc373a39b90e14dcb37a156d0b0dc60af56e0a" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.815116 4861 scope.go:117] "RemoveContainer" containerID="77865966595bc249ebb7de4adca17e523e85c0dfab3b83bcc5ef662c7c31b0e0" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.832097 4861 scope.go:117] "RemoveContainer" containerID="3404d4ea05e503128dcf25a1314f47abdcd31cf9014df5b10889374c469707e8" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.864636 4861 scope.go:117] "RemoveContainer" containerID="35204d90abde22c56dbe4579e11f1da589271202a7e059926b799e4ddfb69f63" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.877662 4861 scope.go:117] "RemoveContainer" containerID="04d44b4ad8719d9acaed799be8575081e6d326f27f3f02958ad30441df94d80a" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.891783 4861 scope.go:117] "RemoveContainer" containerID="c73566e23363dcd42915bf2068aa8e97fe90f6c7afa409450cee62c683cbfa90" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.921034 4861 scope.go:117] "RemoveContainer" containerID="5d78e61177fed79e734e99406acb1d44f730e81d601827c7be96d7dec219dd93" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 
09:17:23.936063 4861 scope.go:117] "RemoveContainer" containerID="65c9df23fe9acb6770fb9f21b9a56a1b6551aa9c7787d2a41350815ce00dd460" Mar 09 09:17:23 crc kubenswrapper[4861]: E0309 09:17:23.936501 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65c9df23fe9acb6770fb9f21b9a56a1b6551aa9c7787d2a41350815ce00dd460\": container with ID starting with 65c9df23fe9acb6770fb9f21b9a56a1b6551aa9c7787d2a41350815ce00dd460 not found: ID does not exist" containerID="65c9df23fe9acb6770fb9f21b9a56a1b6551aa9c7787d2a41350815ce00dd460" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.936532 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65c9df23fe9acb6770fb9f21b9a56a1b6551aa9c7787d2a41350815ce00dd460"} err="failed to get container status \"65c9df23fe9acb6770fb9f21b9a56a1b6551aa9c7787d2a41350815ce00dd460\": rpc error: code = NotFound desc = could not find container \"65c9df23fe9acb6770fb9f21b9a56a1b6551aa9c7787d2a41350815ce00dd460\": container with ID starting with 65c9df23fe9acb6770fb9f21b9a56a1b6551aa9c7787d2a41350815ce00dd460 not found: ID does not exist" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.936554 4861 scope.go:117] "RemoveContainer" containerID="431c1bbb7e00b5b4c1615bff59cac3ec1f16e7a4bb52478bd3dccb0b441c24a2" Mar 09 09:17:23 crc kubenswrapper[4861]: E0309 09:17:23.936972 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"431c1bbb7e00b5b4c1615bff59cac3ec1f16e7a4bb52478bd3dccb0b441c24a2\": container with ID starting with 431c1bbb7e00b5b4c1615bff59cac3ec1f16e7a4bb52478bd3dccb0b441c24a2 not found: ID does not exist" containerID="431c1bbb7e00b5b4c1615bff59cac3ec1f16e7a4bb52478bd3dccb0b441c24a2" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.937018 4861 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"431c1bbb7e00b5b4c1615bff59cac3ec1f16e7a4bb52478bd3dccb0b441c24a2"} err="failed to get container status \"431c1bbb7e00b5b4c1615bff59cac3ec1f16e7a4bb52478bd3dccb0b441c24a2\": rpc error: code = NotFound desc = could not find container \"431c1bbb7e00b5b4c1615bff59cac3ec1f16e7a4bb52478bd3dccb0b441c24a2\": container with ID starting with 431c1bbb7e00b5b4c1615bff59cac3ec1f16e7a4bb52478bd3dccb0b441c24a2 not found: ID does not exist" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.937049 4861 scope.go:117] "RemoveContainer" containerID="c141660a01917112b004037f5bfc373a39b90e14dcb37a156d0b0dc60af56e0a" Mar 09 09:17:23 crc kubenswrapper[4861]: E0309 09:17:23.937437 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c141660a01917112b004037f5bfc373a39b90e14dcb37a156d0b0dc60af56e0a\": container with ID starting with c141660a01917112b004037f5bfc373a39b90e14dcb37a156d0b0dc60af56e0a not found: ID does not exist" containerID="c141660a01917112b004037f5bfc373a39b90e14dcb37a156d0b0dc60af56e0a" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.937463 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c141660a01917112b004037f5bfc373a39b90e14dcb37a156d0b0dc60af56e0a"} err="failed to get container status \"c141660a01917112b004037f5bfc373a39b90e14dcb37a156d0b0dc60af56e0a\": rpc error: code = NotFound desc = could not find container \"c141660a01917112b004037f5bfc373a39b90e14dcb37a156d0b0dc60af56e0a\": container with ID starting with c141660a01917112b004037f5bfc373a39b90e14dcb37a156d0b0dc60af56e0a not found: ID does not exist" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.937482 4861 scope.go:117] "RemoveContainer" containerID="77865966595bc249ebb7de4adca17e523e85c0dfab3b83bcc5ef662c7c31b0e0" Mar 09 09:17:23 crc kubenswrapper[4861]: E0309 09:17:23.937752 4861 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"77865966595bc249ebb7de4adca17e523e85c0dfab3b83bcc5ef662c7c31b0e0\": container with ID starting with 77865966595bc249ebb7de4adca17e523e85c0dfab3b83bcc5ef662c7c31b0e0 not found: ID does not exist" containerID="77865966595bc249ebb7de4adca17e523e85c0dfab3b83bcc5ef662c7c31b0e0" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.937780 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77865966595bc249ebb7de4adca17e523e85c0dfab3b83bcc5ef662c7c31b0e0"} err="failed to get container status \"77865966595bc249ebb7de4adca17e523e85c0dfab3b83bcc5ef662c7c31b0e0\": rpc error: code = NotFound desc = could not find container \"77865966595bc249ebb7de4adca17e523e85c0dfab3b83bcc5ef662c7c31b0e0\": container with ID starting with 77865966595bc249ebb7de4adca17e523e85c0dfab3b83bcc5ef662c7c31b0e0 not found: ID does not exist" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.937795 4861 scope.go:117] "RemoveContainer" containerID="3404d4ea05e503128dcf25a1314f47abdcd31cf9014df5b10889374c469707e8" Mar 09 09:17:23 crc kubenswrapper[4861]: E0309 09:17:23.938006 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3404d4ea05e503128dcf25a1314f47abdcd31cf9014df5b10889374c469707e8\": container with ID starting with 3404d4ea05e503128dcf25a1314f47abdcd31cf9014df5b10889374c469707e8 not found: ID does not exist" containerID="3404d4ea05e503128dcf25a1314f47abdcd31cf9014df5b10889374c469707e8" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.938032 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3404d4ea05e503128dcf25a1314f47abdcd31cf9014df5b10889374c469707e8"} err="failed to get container status \"3404d4ea05e503128dcf25a1314f47abdcd31cf9014df5b10889374c469707e8\": rpc error: code = NotFound desc = could not find container 
\"3404d4ea05e503128dcf25a1314f47abdcd31cf9014df5b10889374c469707e8\": container with ID starting with 3404d4ea05e503128dcf25a1314f47abdcd31cf9014df5b10889374c469707e8 not found: ID does not exist" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.938048 4861 scope.go:117] "RemoveContainer" containerID="35204d90abde22c56dbe4579e11f1da589271202a7e059926b799e4ddfb69f63" Mar 09 09:17:23 crc kubenswrapper[4861]: E0309 09:17:23.938216 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35204d90abde22c56dbe4579e11f1da589271202a7e059926b799e4ddfb69f63\": container with ID starting with 35204d90abde22c56dbe4579e11f1da589271202a7e059926b799e4ddfb69f63 not found: ID does not exist" containerID="35204d90abde22c56dbe4579e11f1da589271202a7e059926b799e4ddfb69f63" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.938240 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35204d90abde22c56dbe4579e11f1da589271202a7e059926b799e4ddfb69f63"} err="failed to get container status \"35204d90abde22c56dbe4579e11f1da589271202a7e059926b799e4ddfb69f63\": rpc error: code = NotFound desc = could not find container \"35204d90abde22c56dbe4579e11f1da589271202a7e059926b799e4ddfb69f63\": container with ID starting with 35204d90abde22c56dbe4579e11f1da589271202a7e059926b799e4ddfb69f63 not found: ID does not exist" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.938256 4861 scope.go:117] "RemoveContainer" containerID="04d44b4ad8719d9acaed799be8575081e6d326f27f3f02958ad30441df94d80a" Mar 09 09:17:23 crc kubenswrapper[4861]: E0309 09:17:23.938542 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"04d44b4ad8719d9acaed799be8575081e6d326f27f3f02958ad30441df94d80a\": container with ID starting with 04d44b4ad8719d9acaed799be8575081e6d326f27f3f02958ad30441df94d80a not found: ID does not exist" 
containerID="04d44b4ad8719d9acaed799be8575081e6d326f27f3f02958ad30441df94d80a" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.938569 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04d44b4ad8719d9acaed799be8575081e6d326f27f3f02958ad30441df94d80a"} err="failed to get container status \"04d44b4ad8719d9acaed799be8575081e6d326f27f3f02958ad30441df94d80a\": rpc error: code = NotFound desc = could not find container \"04d44b4ad8719d9acaed799be8575081e6d326f27f3f02958ad30441df94d80a\": container with ID starting with 04d44b4ad8719d9acaed799be8575081e6d326f27f3f02958ad30441df94d80a not found: ID does not exist" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.938585 4861 scope.go:117] "RemoveContainer" containerID="c73566e23363dcd42915bf2068aa8e97fe90f6c7afa409450cee62c683cbfa90" Mar 09 09:17:23 crc kubenswrapper[4861]: E0309 09:17:23.938823 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c73566e23363dcd42915bf2068aa8e97fe90f6c7afa409450cee62c683cbfa90\": container with ID starting with c73566e23363dcd42915bf2068aa8e97fe90f6c7afa409450cee62c683cbfa90 not found: ID does not exist" containerID="c73566e23363dcd42915bf2068aa8e97fe90f6c7afa409450cee62c683cbfa90" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.938906 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c73566e23363dcd42915bf2068aa8e97fe90f6c7afa409450cee62c683cbfa90"} err="failed to get container status \"c73566e23363dcd42915bf2068aa8e97fe90f6c7afa409450cee62c683cbfa90\": rpc error: code = NotFound desc = could not find container \"c73566e23363dcd42915bf2068aa8e97fe90f6c7afa409450cee62c683cbfa90\": container with ID starting with c73566e23363dcd42915bf2068aa8e97fe90f6c7afa409450cee62c683cbfa90 not found: ID does not exist" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.938922 4861 scope.go:117] 
"RemoveContainer" containerID="5d78e61177fed79e734e99406acb1d44f730e81d601827c7be96d7dec219dd93" Mar 09 09:17:23 crc kubenswrapper[4861]: E0309 09:17:23.939243 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d78e61177fed79e734e99406acb1d44f730e81d601827c7be96d7dec219dd93\": container with ID starting with 5d78e61177fed79e734e99406acb1d44f730e81d601827c7be96d7dec219dd93 not found: ID does not exist" containerID="5d78e61177fed79e734e99406acb1d44f730e81d601827c7be96d7dec219dd93" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.939302 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d78e61177fed79e734e99406acb1d44f730e81d601827c7be96d7dec219dd93"} err="failed to get container status \"5d78e61177fed79e734e99406acb1d44f730e81d601827c7be96d7dec219dd93\": rpc error: code = NotFound desc = could not find container \"5d78e61177fed79e734e99406acb1d44f730e81d601827c7be96d7dec219dd93\": container with ID starting with 5d78e61177fed79e734e99406acb1d44f730e81d601827c7be96d7dec219dd93 not found: ID does not exist" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.939322 4861 scope.go:117] "RemoveContainer" containerID="65c9df23fe9acb6770fb9f21b9a56a1b6551aa9c7787d2a41350815ce00dd460" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.939622 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65c9df23fe9acb6770fb9f21b9a56a1b6551aa9c7787d2a41350815ce00dd460"} err="failed to get container status \"65c9df23fe9acb6770fb9f21b9a56a1b6551aa9c7787d2a41350815ce00dd460\": rpc error: code = NotFound desc = could not find container \"65c9df23fe9acb6770fb9f21b9a56a1b6551aa9c7787d2a41350815ce00dd460\": container with ID starting with 65c9df23fe9acb6770fb9f21b9a56a1b6551aa9c7787d2a41350815ce00dd460 not found: ID does not exist" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.939674 4861 
scope.go:117] "RemoveContainer" containerID="431c1bbb7e00b5b4c1615bff59cac3ec1f16e7a4bb52478bd3dccb0b441c24a2" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.939945 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"431c1bbb7e00b5b4c1615bff59cac3ec1f16e7a4bb52478bd3dccb0b441c24a2"} err="failed to get container status \"431c1bbb7e00b5b4c1615bff59cac3ec1f16e7a4bb52478bd3dccb0b441c24a2\": rpc error: code = NotFound desc = could not find container \"431c1bbb7e00b5b4c1615bff59cac3ec1f16e7a4bb52478bd3dccb0b441c24a2\": container with ID starting with 431c1bbb7e00b5b4c1615bff59cac3ec1f16e7a4bb52478bd3dccb0b441c24a2 not found: ID does not exist" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.939996 4861 scope.go:117] "RemoveContainer" containerID="c141660a01917112b004037f5bfc373a39b90e14dcb37a156d0b0dc60af56e0a" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.940297 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c141660a01917112b004037f5bfc373a39b90e14dcb37a156d0b0dc60af56e0a"} err="failed to get container status \"c141660a01917112b004037f5bfc373a39b90e14dcb37a156d0b0dc60af56e0a\": rpc error: code = NotFound desc = could not find container \"c141660a01917112b004037f5bfc373a39b90e14dcb37a156d0b0dc60af56e0a\": container with ID starting with c141660a01917112b004037f5bfc373a39b90e14dcb37a156d0b0dc60af56e0a not found: ID does not exist" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.940357 4861 scope.go:117] "RemoveContainer" containerID="77865966595bc249ebb7de4adca17e523e85c0dfab3b83bcc5ef662c7c31b0e0" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.940667 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77865966595bc249ebb7de4adca17e523e85c0dfab3b83bcc5ef662c7c31b0e0"} err="failed to get container status \"77865966595bc249ebb7de4adca17e523e85c0dfab3b83bcc5ef662c7c31b0e0\": rpc 
error: code = NotFound desc = could not find container \"77865966595bc249ebb7de4adca17e523e85c0dfab3b83bcc5ef662c7c31b0e0\": container with ID starting with 77865966595bc249ebb7de4adca17e523e85c0dfab3b83bcc5ef662c7c31b0e0 not found: ID does not exist" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.940690 4861 scope.go:117] "RemoveContainer" containerID="3404d4ea05e503128dcf25a1314f47abdcd31cf9014df5b10889374c469707e8" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.940910 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3404d4ea05e503128dcf25a1314f47abdcd31cf9014df5b10889374c469707e8"} err="failed to get container status \"3404d4ea05e503128dcf25a1314f47abdcd31cf9014df5b10889374c469707e8\": rpc error: code = NotFound desc = could not find container \"3404d4ea05e503128dcf25a1314f47abdcd31cf9014df5b10889374c469707e8\": container with ID starting with 3404d4ea05e503128dcf25a1314f47abdcd31cf9014df5b10889374c469707e8 not found: ID does not exist" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.940944 4861 scope.go:117] "RemoveContainer" containerID="35204d90abde22c56dbe4579e11f1da589271202a7e059926b799e4ddfb69f63" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.941624 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35204d90abde22c56dbe4579e11f1da589271202a7e059926b799e4ddfb69f63"} err="failed to get container status \"35204d90abde22c56dbe4579e11f1da589271202a7e059926b799e4ddfb69f63\": rpc error: code = NotFound desc = could not find container \"35204d90abde22c56dbe4579e11f1da589271202a7e059926b799e4ddfb69f63\": container with ID starting with 35204d90abde22c56dbe4579e11f1da589271202a7e059926b799e4ddfb69f63 not found: ID does not exist" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.941707 4861 scope.go:117] "RemoveContainer" containerID="04d44b4ad8719d9acaed799be8575081e6d326f27f3f02958ad30441df94d80a" Mar 09 09:17:23 crc 
kubenswrapper[4861]: I0309 09:17:23.942116 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04d44b4ad8719d9acaed799be8575081e6d326f27f3f02958ad30441df94d80a"} err="failed to get container status \"04d44b4ad8719d9acaed799be8575081e6d326f27f3f02958ad30441df94d80a\": rpc error: code = NotFound desc = could not find container \"04d44b4ad8719d9acaed799be8575081e6d326f27f3f02958ad30441df94d80a\": container with ID starting with 04d44b4ad8719d9acaed799be8575081e6d326f27f3f02958ad30441df94d80a not found: ID does not exist" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.942140 4861 scope.go:117] "RemoveContainer" containerID="c73566e23363dcd42915bf2068aa8e97fe90f6c7afa409450cee62c683cbfa90" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.942540 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c73566e23363dcd42915bf2068aa8e97fe90f6c7afa409450cee62c683cbfa90"} err="failed to get container status \"c73566e23363dcd42915bf2068aa8e97fe90f6c7afa409450cee62c683cbfa90\": rpc error: code = NotFound desc = could not find container \"c73566e23363dcd42915bf2068aa8e97fe90f6c7afa409450cee62c683cbfa90\": container with ID starting with c73566e23363dcd42915bf2068aa8e97fe90f6c7afa409450cee62c683cbfa90 not found: ID does not exist" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.942595 4861 scope.go:117] "RemoveContainer" containerID="5d78e61177fed79e734e99406acb1d44f730e81d601827c7be96d7dec219dd93" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.942944 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d78e61177fed79e734e99406acb1d44f730e81d601827c7be96d7dec219dd93"} err="failed to get container status \"5d78e61177fed79e734e99406acb1d44f730e81d601827c7be96d7dec219dd93\": rpc error: code = NotFound desc = could not find container \"5d78e61177fed79e734e99406acb1d44f730e81d601827c7be96d7dec219dd93\": container 
with ID starting with 5d78e61177fed79e734e99406acb1d44f730e81d601827c7be96d7dec219dd93 not found: ID does not exist" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.942968 4861 scope.go:117] "RemoveContainer" containerID="65c9df23fe9acb6770fb9f21b9a56a1b6551aa9c7787d2a41350815ce00dd460" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.943207 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65c9df23fe9acb6770fb9f21b9a56a1b6551aa9c7787d2a41350815ce00dd460"} err="failed to get container status \"65c9df23fe9acb6770fb9f21b9a56a1b6551aa9c7787d2a41350815ce00dd460\": rpc error: code = NotFound desc = could not find container \"65c9df23fe9acb6770fb9f21b9a56a1b6551aa9c7787d2a41350815ce00dd460\": container with ID starting with 65c9df23fe9acb6770fb9f21b9a56a1b6551aa9c7787d2a41350815ce00dd460 not found: ID does not exist" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.943231 4861 scope.go:117] "RemoveContainer" containerID="431c1bbb7e00b5b4c1615bff59cac3ec1f16e7a4bb52478bd3dccb0b441c24a2" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.943486 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"431c1bbb7e00b5b4c1615bff59cac3ec1f16e7a4bb52478bd3dccb0b441c24a2"} err="failed to get container status \"431c1bbb7e00b5b4c1615bff59cac3ec1f16e7a4bb52478bd3dccb0b441c24a2\": rpc error: code = NotFound desc = could not find container \"431c1bbb7e00b5b4c1615bff59cac3ec1f16e7a4bb52478bd3dccb0b441c24a2\": container with ID starting with 431c1bbb7e00b5b4c1615bff59cac3ec1f16e7a4bb52478bd3dccb0b441c24a2 not found: ID does not exist" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.943507 4861 scope.go:117] "RemoveContainer" containerID="c141660a01917112b004037f5bfc373a39b90e14dcb37a156d0b0dc60af56e0a" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.943801 4861 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"c141660a01917112b004037f5bfc373a39b90e14dcb37a156d0b0dc60af56e0a"} err="failed to get container status \"c141660a01917112b004037f5bfc373a39b90e14dcb37a156d0b0dc60af56e0a\": rpc error: code = NotFound desc = could not find container \"c141660a01917112b004037f5bfc373a39b90e14dcb37a156d0b0dc60af56e0a\": container with ID starting with c141660a01917112b004037f5bfc373a39b90e14dcb37a156d0b0dc60af56e0a not found: ID does not exist" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.943860 4861 scope.go:117] "RemoveContainer" containerID="77865966595bc249ebb7de4adca17e523e85c0dfab3b83bcc5ef662c7c31b0e0" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.944073 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77865966595bc249ebb7de4adca17e523e85c0dfab3b83bcc5ef662c7c31b0e0"} err="failed to get container status \"77865966595bc249ebb7de4adca17e523e85c0dfab3b83bcc5ef662c7c31b0e0\": rpc error: code = NotFound desc = could not find container \"77865966595bc249ebb7de4adca17e523e85c0dfab3b83bcc5ef662c7c31b0e0\": container with ID starting with 77865966595bc249ebb7de4adca17e523e85c0dfab3b83bcc5ef662c7c31b0e0 not found: ID does not exist" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.944125 4861 scope.go:117] "RemoveContainer" containerID="3404d4ea05e503128dcf25a1314f47abdcd31cf9014df5b10889374c469707e8" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.944336 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3404d4ea05e503128dcf25a1314f47abdcd31cf9014df5b10889374c469707e8"} err="failed to get container status \"3404d4ea05e503128dcf25a1314f47abdcd31cf9014df5b10889374c469707e8\": rpc error: code = NotFound desc = could not find container \"3404d4ea05e503128dcf25a1314f47abdcd31cf9014df5b10889374c469707e8\": container with ID starting with 3404d4ea05e503128dcf25a1314f47abdcd31cf9014df5b10889374c469707e8 not found: ID does not 
exist" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.944474 4861 scope.go:117] "RemoveContainer" containerID="35204d90abde22c56dbe4579e11f1da589271202a7e059926b799e4ddfb69f63" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.945129 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35204d90abde22c56dbe4579e11f1da589271202a7e059926b799e4ddfb69f63"} err="failed to get container status \"35204d90abde22c56dbe4579e11f1da589271202a7e059926b799e4ddfb69f63\": rpc error: code = NotFound desc = could not find container \"35204d90abde22c56dbe4579e11f1da589271202a7e059926b799e4ddfb69f63\": container with ID starting with 35204d90abde22c56dbe4579e11f1da589271202a7e059926b799e4ddfb69f63 not found: ID does not exist" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.945150 4861 scope.go:117] "RemoveContainer" containerID="04d44b4ad8719d9acaed799be8575081e6d326f27f3f02958ad30441df94d80a" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.945485 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04d44b4ad8719d9acaed799be8575081e6d326f27f3f02958ad30441df94d80a"} err="failed to get container status \"04d44b4ad8719d9acaed799be8575081e6d326f27f3f02958ad30441df94d80a\": rpc error: code = NotFound desc = could not find container \"04d44b4ad8719d9acaed799be8575081e6d326f27f3f02958ad30441df94d80a\": container with ID starting with 04d44b4ad8719d9acaed799be8575081e6d326f27f3f02958ad30441df94d80a not found: ID does not exist" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.945507 4861 scope.go:117] "RemoveContainer" containerID="c73566e23363dcd42915bf2068aa8e97fe90f6c7afa409450cee62c683cbfa90" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.945842 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c73566e23363dcd42915bf2068aa8e97fe90f6c7afa409450cee62c683cbfa90"} err="failed to get container status 
\"c73566e23363dcd42915bf2068aa8e97fe90f6c7afa409450cee62c683cbfa90\": rpc error: code = NotFound desc = could not find container \"c73566e23363dcd42915bf2068aa8e97fe90f6c7afa409450cee62c683cbfa90\": container with ID starting with c73566e23363dcd42915bf2068aa8e97fe90f6c7afa409450cee62c683cbfa90 not found: ID does not exist" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.945864 4861 scope.go:117] "RemoveContainer" containerID="5d78e61177fed79e734e99406acb1d44f730e81d601827c7be96d7dec219dd93" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.946170 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d78e61177fed79e734e99406acb1d44f730e81d601827c7be96d7dec219dd93"} err="failed to get container status \"5d78e61177fed79e734e99406acb1d44f730e81d601827c7be96d7dec219dd93\": rpc error: code = NotFound desc = could not find container \"5d78e61177fed79e734e99406acb1d44f730e81d601827c7be96d7dec219dd93\": container with ID starting with 5d78e61177fed79e734e99406acb1d44f730e81d601827c7be96d7dec219dd93 not found: ID does not exist" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.946197 4861 scope.go:117] "RemoveContainer" containerID="65c9df23fe9acb6770fb9f21b9a56a1b6551aa9c7787d2a41350815ce00dd460" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.946429 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65c9df23fe9acb6770fb9f21b9a56a1b6551aa9c7787d2a41350815ce00dd460"} err="failed to get container status \"65c9df23fe9acb6770fb9f21b9a56a1b6551aa9c7787d2a41350815ce00dd460\": rpc error: code = NotFound desc = could not find container \"65c9df23fe9acb6770fb9f21b9a56a1b6551aa9c7787d2a41350815ce00dd460\": container with ID starting with 65c9df23fe9acb6770fb9f21b9a56a1b6551aa9c7787d2a41350815ce00dd460 not found: ID does not exist" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.946452 4861 scope.go:117] "RemoveContainer" 
containerID="431c1bbb7e00b5b4c1615bff59cac3ec1f16e7a4bb52478bd3dccb0b441c24a2" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.946766 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"431c1bbb7e00b5b4c1615bff59cac3ec1f16e7a4bb52478bd3dccb0b441c24a2"} err="failed to get container status \"431c1bbb7e00b5b4c1615bff59cac3ec1f16e7a4bb52478bd3dccb0b441c24a2\": rpc error: code = NotFound desc = could not find container \"431c1bbb7e00b5b4c1615bff59cac3ec1f16e7a4bb52478bd3dccb0b441c24a2\": container with ID starting with 431c1bbb7e00b5b4c1615bff59cac3ec1f16e7a4bb52478bd3dccb0b441c24a2 not found: ID does not exist" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.946788 4861 scope.go:117] "RemoveContainer" containerID="c141660a01917112b004037f5bfc373a39b90e14dcb37a156d0b0dc60af56e0a" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.946999 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c141660a01917112b004037f5bfc373a39b90e14dcb37a156d0b0dc60af56e0a"} err="failed to get container status \"c141660a01917112b004037f5bfc373a39b90e14dcb37a156d0b0dc60af56e0a\": rpc error: code = NotFound desc = could not find container \"c141660a01917112b004037f5bfc373a39b90e14dcb37a156d0b0dc60af56e0a\": container with ID starting with c141660a01917112b004037f5bfc373a39b90e14dcb37a156d0b0dc60af56e0a not found: ID does not exist" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.947023 4861 scope.go:117] "RemoveContainer" containerID="77865966595bc249ebb7de4adca17e523e85c0dfab3b83bcc5ef662c7c31b0e0" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.947300 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77865966595bc249ebb7de4adca17e523e85c0dfab3b83bcc5ef662c7c31b0e0"} err="failed to get container status \"77865966595bc249ebb7de4adca17e523e85c0dfab3b83bcc5ef662c7c31b0e0\": rpc error: code = NotFound desc = could 
not find container \"77865966595bc249ebb7de4adca17e523e85c0dfab3b83bcc5ef662c7c31b0e0\": container with ID starting with 77865966595bc249ebb7de4adca17e523e85c0dfab3b83bcc5ef662c7c31b0e0 not found: ID does not exist" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.947321 4861 scope.go:117] "RemoveContainer" containerID="3404d4ea05e503128dcf25a1314f47abdcd31cf9014df5b10889374c469707e8" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.947564 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3404d4ea05e503128dcf25a1314f47abdcd31cf9014df5b10889374c469707e8"} err="failed to get container status \"3404d4ea05e503128dcf25a1314f47abdcd31cf9014df5b10889374c469707e8\": rpc error: code = NotFound desc = could not find container \"3404d4ea05e503128dcf25a1314f47abdcd31cf9014df5b10889374c469707e8\": container with ID starting with 3404d4ea05e503128dcf25a1314f47abdcd31cf9014df5b10889374c469707e8 not found: ID does not exist" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.947594 4861 scope.go:117] "RemoveContainer" containerID="35204d90abde22c56dbe4579e11f1da589271202a7e059926b799e4ddfb69f63" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.947947 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35204d90abde22c56dbe4579e11f1da589271202a7e059926b799e4ddfb69f63"} err="failed to get container status \"35204d90abde22c56dbe4579e11f1da589271202a7e059926b799e4ddfb69f63\": rpc error: code = NotFound desc = could not find container \"35204d90abde22c56dbe4579e11f1da589271202a7e059926b799e4ddfb69f63\": container with ID starting with 35204d90abde22c56dbe4579e11f1da589271202a7e059926b799e4ddfb69f63 not found: ID does not exist" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.948014 4861 scope.go:117] "RemoveContainer" containerID="04d44b4ad8719d9acaed799be8575081e6d326f27f3f02958ad30441df94d80a" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 
09:17:23.948365 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04d44b4ad8719d9acaed799be8575081e6d326f27f3f02958ad30441df94d80a"} err="failed to get container status \"04d44b4ad8719d9acaed799be8575081e6d326f27f3f02958ad30441df94d80a\": rpc error: code = NotFound desc = could not find container \"04d44b4ad8719d9acaed799be8575081e6d326f27f3f02958ad30441df94d80a\": container with ID starting with 04d44b4ad8719d9acaed799be8575081e6d326f27f3f02958ad30441df94d80a not found: ID does not exist" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.948430 4861 scope.go:117] "RemoveContainer" containerID="c73566e23363dcd42915bf2068aa8e97fe90f6c7afa409450cee62c683cbfa90" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.948898 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c73566e23363dcd42915bf2068aa8e97fe90f6c7afa409450cee62c683cbfa90"} err="failed to get container status \"c73566e23363dcd42915bf2068aa8e97fe90f6c7afa409450cee62c683cbfa90\": rpc error: code = NotFound desc = could not find container \"c73566e23363dcd42915bf2068aa8e97fe90f6c7afa409450cee62c683cbfa90\": container with ID starting with c73566e23363dcd42915bf2068aa8e97fe90f6c7afa409450cee62c683cbfa90 not found: ID does not exist" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.948940 4861 scope.go:117] "RemoveContainer" containerID="5d78e61177fed79e734e99406acb1d44f730e81d601827c7be96d7dec219dd93" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.949335 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d78e61177fed79e734e99406acb1d44f730e81d601827c7be96d7dec219dd93"} err="failed to get container status \"5d78e61177fed79e734e99406acb1d44f730e81d601827c7be96d7dec219dd93\": rpc error: code = NotFound desc = could not find container \"5d78e61177fed79e734e99406acb1d44f730e81d601827c7be96d7dec219dd93\": container with ID starting with 
5d78e61177fed79e734e99406acb1d44f730e81d601827c7be96d7dec219dd93 not found: ID does not exist" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.949387 4861 scope.go:117] "RemoveContainer" containerID="65c9df23fe9acb6770fb9f21b9a56a1b6551aa9c7787d2a41350815ce00dd460" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.949635 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65c9df23fe9acb6770fb9f21b9a56a1b6551aa9c7787d2a41350815ce00dd460"} err="failed to get container status \"65c9df23fe9acb6770fb9f21b9a56a1b6551aa9c7787d2a41350815ce00dd460\": rpc error: code = NotFound desc = could not find container \"65c9df23fe9acb6770fb9f21b9a56a1b6551aa9c7787d2a41350815ce00dd460\": container with ID starting with 65c9df23fe9acb6770fb9f21b9a56a1b6551aa9c7787d2a41350815ce00dd460 not found: ID does not exist" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.949665 4861 scope.go:117] "RemoveContainer" containerID="431c1bbb7e00b5b4c1615bff59cac3ec1f16e7a4bb52478bd3dccb0b441c24a2" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.949865 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"431c1bbb7e00b5b4c1615bff59cac3ec1f16e7a4bb52478bd3dccb0b441c24a2"} err="failed to get container status \"431c1bbb7e00b5b4c1615bff59cac3ec1f16e7a4bb52478bd3dccb0b441c24a2\": rpc error: code = NotFound desc = could not find container \"431c1bbb7e00b5b4c1615bff59cac3ec1f16e7a4bb52478bd3dccb0b441c24a2\": container with ID starting with 431c1bbb7e00b5b4c1615bff59cac3ec1f16e7a4bb52478bd3dccb0b441c24a2 not found: ID does not exist" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.949895 4861 scope.go:117] "RemoveContainer" containerID="c141660a01917112b004037f5bfc373a39b90e14dcb37a156d0b0dc60af56e0a" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.950107 4861 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"c141660a01917112b004037f5bfc373a39b90e14dcb37a156d0b0dc60af56e0a"} err="failed to get container status \"c141660a01917112b004037f5bfc373a39b90e14dcb37a156d0b0dc60af56e0a\": rpc error: code = NotFound desc = could not find container \"c141660a01917112b004037f5bfc373a39b90e14dcb37a156d0b0dc60af56e0a\": container with ID starting with c141660a01917112b004037f5bfc373a39b90e14dcb37a156d0b0dc60af56e0a not found: ID does not exist" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.950136 4861 scope.go:117] "RemoveContainer" containerID="77865966595bc249ebb7de4adca17e523e85c0dfab3b83bcc5ef662c7c31b0e0" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.950340 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77865966595bc249ebb7de4adca17e523e85c0dfab3b83bcc5ef662c7c31b0e0"} err="failed to get container status \"77865966595bc249ebb7de4adca17e523e85c0dfab3b83bcc5ef662c7c31b0e0\": rpc error: code = NotFound desc = could not find container \"77865966595bc249ebb7de4adca17e523e85c0dfab3b83bcc5ef662c7c31b0e0\": container with ID starting with 77865966595bc249ebb7de4adca17e523e85c0dfab3b83bcc5ef662c7c31b0e0 not found: ID does not exist" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.950391 4861 scope.go:117] "RemoveContainer" containerID="3404d4ea05e503128dcf25a1314f47abdcd31cf9014df5b10889374c469707e8" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.950635 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3404d4ea05e503128dcf25a1314f47abdcd31cf9014df5b10889374c469707e8"} err="failed to get container status \"3404d4ea05e503128dcf25a1314f47abdcd31cf9014df5b10889374c469707e8\": rpc error: code = NotFound desc = could not find container \"3404d4ea05e503128dcf25a1314f47abdcd31cf9014df5b10889374c469707e8\": container with ID starting with 3404d4ea05e503128dcf25a1314f47abdcd31cf9014df5b10889374c469707e8 not found: ID does not 
exist" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.950665 4861 scope.go:117] "RemoveContainer" containerID="35204d90abde22c56dbe4579e11f1da589271202a7e059926b799e4ddfb69f63" Mar 09 09:17:23 crc kubenswrapper[4861]: I0309 09:17:23.950925 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35204d90abde22c56dbe4579e11f1da589271202a7e059926b799e4ddfb69f63"} err="failed to get container status \"35204d90abde22c56dbe4579e11f1da589271202a7e059926b799e4ddfb69f63\": rpc error: code = NotFound desc = could not find container \"35204d90abde22c56dbe4579e11f1da589271202a7e059926b799e4ddfb69f63\": container with ID starting with 35204d90abde22c56dbe4579e11f1da589271202a7e059926b799e4ddfb69f63 not found: ID does not exist" Mar 09 09:17:24 crc kubenswrapper[4861]: I0309 09:17:24.752327 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-dnjcp_2a7b6abe-370e-4514-9777-7483bb64e1f0/kube-multus/0.log" Mar 09 09:17:24 crc kubenswrapper[4861]: I0309 09:17:24.752884 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-dnjcp" event={"ID":"2a7b6abe-370e-4514-9777-7483bb64e1f0","Type":"ContainerStarted","Data":"c266da6f372bfa6be995ac9b87ccae2e4e267b5d647389323c4c68b524e6725a"} Mar 09 09:17:24 crc kubenswrapper[4861]: I0309 09:17:24.757009 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jt6z4" event={"ID":"25f54e79-6851-4967-8fba-16c66a7ea099","Type":"ContainerStarted","Data":"b8f1f2b44e46d147529f0153e1816205b4a1085b9073f07cab6d37559da152ce"} Mar 09 09:17:24 crc kubenswrapper[4861]: I0309 09:17:24.757030 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jt6z4" event={"ID":"25f54e79-6851-4967-8fba-16c66a7ea099","Type":"ContainerStarted","Data":"c0d00e8a7cde0a61553033b0e7497996524813f1fb8d1ad56ac212e8b37b9b8d"} Mar 09 09:17:24 crc kubenswrapper[4861]: I0309 
09:17:24.757043 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jt6z4" event={"ID":"25f54e79-6851-4967-8fba-16c66a7ea099","Type":"ContainerStarted","Data":"3735ab7e96fd7627c3283e0c257835059504b8bf21bbdf9cb9cc2b1487641903"} Mar 09 09:17:24 crc kubenswrapper[4861]: I0309 09:17:24.757052 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jt6z4" event={"ID":"25f54e79-6851-4967-8fba-16c66a7ea099","Type":"ContainerStarted","Data":"2e436fde8b83be83167cf0324b9273280340c7f383ffe0795170a7e1974644ad"} Mar 09 09:17:24 crc kubenswrapper[4861]: I0309 09:17:24.757061 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jt6z4" event={"ID":"25f54e79-6851-4967-8fba-16c66a7ea099","Type":"ContainerStarted","Data":"056993fa84c729dd664f1cf8f1397faba5188bde1a2e5335e008e12beca55c31"} Mar 09 09:17:24 crc kubenswrapper[4861]: I0309 09:17:24.757069 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jt6z4" event={"ID":"25f54e79-6851-4967-8fba-16c66a7ea099","Type":"ContainerStarted","Data":"adcb9d7c219a6fd1bc16b2c627dc1e2aefd36703b7c56b4ed2a08e56009df234"} Mar 09 09:17:25 crc kubenswrapper[4861]: I0309 09:17:25.671303 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="752be2d4-f338-4c5e-b51e-452fd8391c73" path="/var/lib/kubelet/pods/752be2d4-f338-4c5e-b51e-452fd8391c73/volumes" Mar 09 09:17:26 crc kubenswrapper[4861]: I0309 09:17:26.773749 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jt6z4" event={"ID":"25f54e79-6851-4967-8fba-16c66a7ea099","Type":"ContainerStarted","Data":"73a9f1744a66ef41abca270dbe271fc2116c18e611cede954c8dbe0afe519712"} Mar 09 09:17:29 crc kubenswrapper[4861]: I0309 09:17:29.800213 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jt6z4" 
event={"ID":"25f54e79-6851-4967-8fba-16c66a7ea099","Type":"ContainerStarted","Data":"1e2d974bc860be588e21f15a27fae7dac1816450b9ebe1182ba65156799b110b"} Mar 09 09:17:29 crc kubenswrapper[4861]: I0309 09:17:29.800961 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-jt6z4" Mar 09 09:17:29 crc kubenswrapper[4861]: I0309 09:17:29.800984 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-jt6z4" Mar 09 09:17:29 crc kubenswrapper[4861]: I0309 09:17:29.830546 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-jt6z4" Mar 09 09:17:29 crc kubenswrapper[4861]: I0309 09:17:29.840355 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-jt6z4" podStartSLOduration=6.840335674 podStartE2EDuration="6.840335674s" podCreationTimestamp="2026-03-09 09:17:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:17:29.835165587 +0000 UTC m=+692.920205008" watchObservedRunningTime="2026-03-09 09:17:29.840335674 +0000 UTC m=+692.925375085" Mar 09 09:17:30 crc kubenswrapper[4861]: I0309 09:17:30.806877 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-jt6z4" Mar 09 09:17:30 crc kubenswrapper[4861]: I0309 09:17:30.842074 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-jt6z4" Mar 09 09:17:50 crc kubenswrapper[4861]: I0309 09:17:50.087022 4861 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 09 09:17:53 crc kubenswrapper[4861]: I0309 09:17:53.549247 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-ovn-kubernetes/ovnkube-node-jt6z4" Mar 09 09:18:00 crc kubenswrapper[4861]: I0309 09:18:00.131747 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550798-cd9xx"] Mar 09 09:18:00 crc kubenswrapper[4861]: I0309 09:18:00.134098 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550798-cd9xx" Mar 09 09:18:00 crc kubenswrapper[4861]: I0309 09:18:00.140276 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550798-cd9xx"] Mar 09 09:18:00 crc kubenswrapper[4861]: I0309 09:18:00.141080 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 09:18:00 crc kubenswrapper[4861]: I0309 09:18:00.141257 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k86l8" Mar 09 09:18:00 crc kubenswrapper[4861]: I0309 09:18:00.141443 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 09:18:00 crc kubenswrapper[4861]: I0309 09:18:00.169611 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdd2d\" (UniqueName: \"kubernetes.io/projected/e992bde7-1734-4260-9ab3-0da5ab187665-kube-api-access-mdd2d\") pod \"auto-csr-approver-29550798-cd9xx\" (UID: \"e992bde7-1734-4260-9ab3-0da5ab187665\") " pod="openshift-infra/auto-csr-approver-29550798-cd9xx" Mar 09 09:18:00 crc kubenswrapper[4861]: I0309 09:18:00.234276 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82ggwqp"] Mar 09 09:18:00 crc kubenswrapper[4861]: I0309 09:18:00.235675 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82ggwqp" Mar 09 09:18:00 crc kubenswrapper[4861]: I0309 09:18:00.237755 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 09 09:18:00 crc kubenswrapper[4861]: I0309 09:18:00.244039 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82ggwqp"] Mar 09 09:18:00 crc kubenswrapper[4861]: I0309 09:18:00.270667 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdd2d\" (UniqueName: \"kubernetes.io/projected/e992bde7-1734-4260-9ab3-0da5ab187665-kube-api-access-mdd2d\") pod \"auto-csr-approver-29550798-cd9xx\" (UID: \"e992bde7-1734-4260-9ab3-0da5ab187665\") " pod="openshift-infra/auto-csr-approver-29550798-cd9xx" Mar 09 09:18:00 crc kubenswrapper[4861]: I0309 09:18:00.270722 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8668713c-12cf-457c-a09c-5302f11d19cc-bundle\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82ggwqp\" (UID: \"8668713c-12cf-457c-a09c-5302f11d19cc\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82ggwqp" Mar 09 09:18:00 crc kubenswrapper[4861]: I0309 09:18:00.270767 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzfv7\" (UniqueName: \"kubernetes.io/projected/8668713c-12cf-457c-a09c-5302f11d19cc-kube-api-access-tzfv7\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82ggwqp\" (UID: \"8668713c-12cf-457c-a09c-5302f11d19cc\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82ggwqp" Mar 09 09:18:00 crc kubenswrapper[4861]: I0309 09:18:00.270799 4861 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8668713c-12cf-457c-a09c-5302f11d19cc-util\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82ggwqp\" (UID: \"8668713c-12cf-457c-a09c-5302f11d19cc\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82ggwqp" Mar 09 09:18:00 crc kubenswrapper[4861]: I0309 09:18:00.288649 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdd2d\" (UniqueName: \"kubernetes.io/projected/e992bde7-1734-4260-9ab3-0da5ab187665-kube-api-access-mdd2d\") pod \"auto-csr-approver-29550798-cd9xx\" (UID: \"e992bde7-1734-4260-9ab3-0da5ab187665\") " pod="openshift-infra/auto-csr-approver-29550798-cd9xx" Mar 09 09:18:00 crc kubenswrapper[4861]: I0309 09:18:00.372263 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8668713c-12cf-457c-a09c-5302f11d19cc-bundle\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82ggwqp\" (UID: \"8668713c-12cf-457c-a09c-5302f11d19cc\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82ggwqp" Mar 09 09:18:00 crc kubenswrapper[4861]: I0309 09:18:00.372697 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzfv7\" (UniqueName: \"kubernetes.io/projected/8668713c-12cf-457c-a09c-5302f11d19cc-kube-api-access-tzfv7\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82ggwqp\" (UID: \"8668713c-12cf-457c-a09c-5302f11d19cc\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82ggwqp" Mar 09 09:18:00 crc kubenswrapper[4861]: I0309 09:18:00.372759 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8668713c-12cf-457c-a09c-5302f11d19cc-util\") pod 
\"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82ggwqp\" (UID: \"8668713c-12cf-457c-a09c-5302f11d19cc\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82ggwqp" Mar 09 09:18:00 crc kubenswrapper[4861]: I0309 09:18:00.373476 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8668713c-12cf-457c-a09c-5302f11d19cc-bundle\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82ggwqp\" (UID: \"8668713c-12cf-457c-a09c-5302f11d19cc\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82ggwqp" Mar 09 09:18:00 crc kubenswrapper[4861]: I0309 09:18:00.373611 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8668713c-12cf-457c-a09c-5302f11d19cc-util\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82ggwqp\" (UID: \"8668713c-12cf-457c-a09c-5302f11d19cc\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82ggwqp" Mar 09 09:18:00 crc kubenswrapper[4861]: I0309 09:18:00.399632 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzfv7\" (UniqueName: \"kubernetes.io/projected/8668713c-12cf-457c-a09c-5302f11d19cc-kube-api-access-tzfv7\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82ggwqp\" (UID: \"8668713c-12cf-457c-a09c-5302f11d19cc\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82ggwqp" Mar 09 09:18:00 crc kubenswrapper[4861]: I0309 09:18:00.448884 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550798-cd9xx" Mar 09 09:18:00 crc kubenswrapper[4861]: I0309 09:18:00.558820 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82ggwqp" Mar 09 09:18:00 crc kubenswrapper[4861]: I0309 09:18:00.748986 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550798-cd9xx"] Mar 09 09:18:00 crc kubenswrapper[4861]: I0309 09:18:00.797393 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82ggwqp"] Mar 09 09:18:00 crc kubenswrapper[4861]: W0309 09:18:00.801035 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8668713c_12cf_457c_a09c_5302f11d19cc.slice/crio-c8b8fc63fc538a21a754a242e28a4b16c19423477390a62124d7be197ddb292e WatchSource:0}: Error finding container c8b8fc63fc538a21a754a242e28a4b16c19423477390a62124d7be197ddb292e: Status 404 returned error can't find the container with id c8b8fc63fc538a21a754a242e28a4b16c19423477390a62124d7be197ddb292e Mar 09 09:18:00 crc kubenswrapper[4861]: I0309 09:18:00.999640 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82ggwqp" event={"ID":"8668713c-12cf-457c-a09c-5302f11d19cc","Type":"ContainerStarted","Data":"99c18196f5a967a0ce654b18a5e0e561325846a011c44d979b6cfa9ae3bb397d"} Mar 09 09:18:00 crc kubenswrapper[4861]: I0309 09:18:00.999688 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82ggwqp" event={"ID":"8668713c-12cf-457c-a09c-5302f11d19cc","Type":"ContainerStarted","Data":"c8b8fc63fc538a21a754a242e28a4b16c19423477390a62124d7be197ddb292e"} Mar 09 09:18:01 crc kubenswrapper[4861]: I0309 09:18:01.000432 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550798-cd9xx" 
event={"ID":"e992bde7-1734-4260-9ab3-0da5ab187665","Type":"ContainerStarted","Data":"2320ea9a83e4caa1995add4ca702c31277dcafe1bc731da2ce3b65354084f054"} Mar 09 09:18:02 crc kubenswrapper[4861]: I0309 09:18:02.007394 4861 generic.go:334] "Generic (PLEG): container finished" podID="8668713c-12cf-457c-a09c-5302f11d19cc" containerID="99c18196f5a967a0ce654b18a5e0e561325846a011c44d979b6cfa9ae3bb397d" exitCode=0 Mar 09 09:18:02 crc kubenswrapper[4861]: I0309 09:18:02.007491 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82ggwqp" event={"ID":"8668713c-12cf-457c-a09c-5302f11d19cc","Type":"ContainerDied","Data":"99c18196f5a967a0ce654b18a5e0e561325846a011c44d979b6cfa9ae3bb397d"} Mar 09 09:18:02 crc kubenswrapper[4861]: I0309 09:18:02.010252 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550798-cd9xx" event={"ID":"e992bde7-1734-4260-9ab3-0da5ab187665","Type":"ContainerStarted","Data":"ad8e73462be5f0224a635ca2365c40f471e78fa9cded994a064d5de52f0a2d76"} Mar 09 09:18:02 crc kubenswrapper[4861]: I0309 09:18:02.042357 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-rn59d"] Mar 09 09:18:02 crc kubenswrapper[4861]: I0309 09:18:02.043541 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rn59d" Mar 09 09:18:02 crc kubenswrapper[4861]: I0309 09:18:02.046899 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29550798-cd9xx" podStartSLOduration=1.127620637 podStartE2EDuration="2.046870656s" podCreationTimestamp="2026-03-09 09:18:00 +0000 UTC" firstStartedPulling="2026-03-09 09:18:00.75586378 +0000 UTC m=+723.840903181" lastFinishedPulling="2026-03-09 09:18:01.675113759 +0000 UTC m=+724.760153200" observedRunningTime="2026-03-09 09:18:02.040198653 +0000 UTC m=+725.125238054" watchObservedRunningTime="2026-03-09 09:18:02.046870656 +0000 UTC m=+725.131910067" Mar 09 09:18:02 crc kubenswrapper[4861]: I0309 09:18:02.063272 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rn59d"] Mar 09 09:18:02 crc kubenswrapper[4861]: I0309 09:18:02.102332 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8fdbf2a5-6d7a-425d-8935-288135585e5c-catalog-content\") pod \"redhat-operators-rn59d\" (UID: \"8fdbf2a5-6d7a-425d-8935-288135585e5c\") " pod="openshift-marketplace/redhat-operators-rn59d" Mar 09 09:18:02 crc kubenswrapper[4861]: I0309 09:18:02.102755 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8fdbf2a5-6d7a-425d-8935-288135585e5c-utilities\") pod \"redhat-operators-rn59d\" (UID: \"8fdbf2a5-6d7a-425d-8935-288135585e5c\") " pod="openshift-marketplace/redhat-operators-rn59d" Mar 09 09:18:02 crc kubenswrapper[4861]: I0309 09:18:02.102781 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjt8d\" (UniqueName: \"kubernetes.io/projected/8fdbf2a5-6d7a-425d-8935-288135585e5c-kube-api-access-wjt8d\") pod 
\"redhat-operators-rn59d\" (UID: \"8fdbf2a5-6d7a-425d-8935-288135585e5c\") " pod="openshift-marketplace/redhat-operators-rn59d" Mar 09 09:18:02 crc kubenswrapper[4861]: I0309 09:18:02.203940 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8fdbf2a5-6d7a-425d-8935-288135585e5c-utilities\") pod \"redhat-operators-rn59d\" (UID: \"8fdbf2a5-6d7a-425d-8935-288135585e5c\") " pod="openshift-marketplace/redhat-operators-rn59d" Mar 09 09:18:02 crc kubenswrapper[4861]: I0309 09:18:02.203988 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wjt8d\" (UniqueName: \"kubernetes.io/projected/8fdbf2a5-6d7a-425d-8935-288135585e5c-kube-api-access-wjt8d\") pod \"redhat-operators-rn59d\" (UID: \"8fdbf2a5-6d7a-425d-8935-288135585e5c\") " pod="openshift-marketplace/redhat-operators-rn59d" Mar 09 09:18:02 crc kubenswrapper[4861]: I0309 09:18:02.204033 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8fdbf2a5-6d7a-425d-8935-288135585e5c-catalog-content\") pod \"redhat-operators-rn59d\" (UID: \"8fdbf2a5-6d7a-425d-8935-288135585e5c\") " pod="openshift-marketplace/redhat-operators-rn59d" Mar 09 09:18:02 crc kubenswrapper[4861]: I0309 09:18:02.204421 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8fdbf2a5-6d7a-425d-8935-288135585e5c-catalog-content\") pod \"redhat-operators-rn59d\" (UID: \"8fdbf2a5-6d7a-425d-8935-288135585e5c\") " pod="openshift-marketplace/redhat-operators-rn59d" Mar 09 09:18:02 crc kubenswrapper[4861]: I0309 09:18:02.204557 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8fdbf2a5-6d7a-425d-8935-288135585e5c-utilities\") pod \"redhat-operators-rn59d\" (UID: 
\"8fdbf2a5-6d7a-425d-8935-288135585e5c\") " pod="openshift-marketplace/redhat-operators-rn59d" Mar 09 09:18:02 crc kubenswrapper[4861]: I0309 09:18:02.222742 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjt8d\" (UniqueName: \"kubernetes.io/projected/8fdbf2a5-6d7a-425d-8935-288135585e5c-kube-api-access-wjt8d\") pod \"redhat-operators-rn59d\" (UID: \"8fdbf2a5-6d7a-425d-8935-288135585e5c\") " pod="openshift-marketplace/redhat-operators-rn59d" Mar 09 09:18:02 crc kubenswrapper[4861]: I0309 09:18:02.366419 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rn59d" Mar 09 09:18:02 crc kubenswrapper[4861]: I0309 09:18:02.590186 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rn59d"] Mar 09 09:18:02 crc kubenswrapper[4861]: W0309 09:18:02.600771 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8fdbf2a5_6d7a_425d_8935_288135585e5c.slice/crio-024365e1327ae3c9a39de3e8037d73343167ecddc5a4a4aef719a2a76b65780b WatchSource:0}: Error finding container 024365e1327ae3c9a39de3e8037d73343167ecddc5a4a4aef719a2a76b65780b: Status 404 returned error can't find the container with id 024365e1327ae3c9a39de3e8037d73343167ecddc5a4a4aef719a2a76b65780b Mar 09 09:18:03 crc kubenswrapper[4861]: I0309 09:18:03.021940 4861 generic.go:334] "Generic (PLEG): container finished" podID="8fdbf2a5-6d7a-425d-8935-288135585e5c" containerID="0ef89735e68e246bb02311f8bd85b818333126c9f00e53f56ac3298c1e86b1e9" exitCode=0 Mar 09 09:18:03 crc kubenswrapper[4861]: I0309 09:18:03.022109 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rn59d" event={"ID":"8fdbf2a5-6d7a-425d-8935-288135585e5c","Type":"ContainerDied","Data":"0ef89735e68e246bb02311f8bd85b818333126c9f00e53f56ac3298c1e86b1e9"} Mar 09 09:18:03 crc 
kubenswrapper[4861]: I0309 09:18:03.022155 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rn59d" event={"ID":"8fdbf2a5-6d7a-425d-8935-288135585e5c","Type":"ContainerStarted","Data":"024365e1327ae3c9a39de3e8037d73343167ecddc5a4a4aef719a2a76b65780b"} Mar 09 09:18:03 crc kubenswrapper[4861]: I0309 09:18:03.026479 4861 generic.go:334] "Generic (PLEG): container finished" podID="e992bde7-1734-4260-9ab3-0da5ab187665" containerID="ad8e73462be5f0224a635ca2365c40f471e78fa9cded994a064d5de52f0a2d76" exitCode=0 Mar 09 09:18:03 crc kubenswrapper[4861]: I0309 09:18:03.026531 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550798-cd9xx" event={"ID":"e992bde7-1734-4260-9ab3-0da5ab187665","Type":"ContainerDied","Data":"ad8e73462be5f0224a635ca2365c40f471e78fa9cded994a064d5de52f0a2d76"} Mar 09 09:18:04 crc kubenswrapper[4861]: I0309 09:18:04.037163 4861 generic.go:334] "Generic (PLEG): container finished" podID="8668713c-12cf-457c-a09c-5302f11d19cc" containerID="86b7eda6f9691f93633ce5d5d957773bc4c2761293f6167b38eaf44d3bd7ddef" exitCode=0 Mar 09 09:18:04 crc kubenswrapper[4861]: I0309 09:18:04.037312 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82ggwqp" event={"ID":"8668713c-12cf-457c-a09c-5302f11d19cc","Type":"ContainerDied","Data":"86b7eda6f9691f93633ce5d5d957773bc4c2761293f6167b38eaf44d3bd7ddef"} Mar 09 09:18:04 crc kubenswrapper[4861]: I0309 09:18:04.378784 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550798-cd9xx" Mar 09 09:18:04 crc kubenswrapper[4861]: I0309 09:18:04.437522 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mdd2d\" (UniqueName: \"kubernetes.io/projected/e992bde7-1734-4260-9ab3-0da5ab187665-kube-api-access-mdd2d\") pod \"e992bde7-1734-4260-9ab3-0da5ab187665\" (UID: \"e992bde7-1734-4260-9ab3-0da5ab187665\") " Mar 09 09:18:04 crc kubenswrapper[4861]: I0309 09:18:04.443264 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e992bde7-1734-4260-9ab3-0da5ab187665-kube-api-access-mdd2d" (OuterVolumeSpecName: "kube-api-access-mdd2d") pod "e992bde7-1734-4260-9ab3-0da5ab187665" (UID: "e992bde7-1734-4260-9ab3-0da5ab187665"). InnerVolumeSpecName "kube-api-access-mdd2d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:18:04 crc kubenswrapper[4861]: I0309 09:18:04.539234 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mdd2d\" (UniqueName: \"kubernetes.io/projected/e992bde7-1734-4260-9ab3-0da5ab187665-kube-api-access-mdd2d\") on node \"crc\" DevicePath \"\"" Mar 09 09:18:05 crc kubenswrapper[4861]: I0309 09:18:05.049533 4861 generic.go:334] "Generic (PLEG): container finished" podID="8668713c-12cf-457c-a09c-5302f11d19cc" containerID="8f4dcb38dcde7a6eab327c5b5153b9ff7cda90f4a4adad5bff1462d28b351120" exitCode=0 Mar 09 09:18:05 crc kubenswrapper[4861]: I0309 09:18:05.050034 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82ggwqp" event={"ID":"8668713c-12cf-457c-a09c-5302f11d19cc","Type":"ContainerDied","Data":"8f4dcb38dcde7a6eab327c5b5153b9ff7cda90f4a4adad5bff1462d28b351120"} Mar 09 09:18:05 crc kubenswrapper[4861]: I0309 09:18:05.051777 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550798-cd9xx" 
event={"ID":"e992bde7-1734-4260-9ab3-0da5ab187665","Type":"ContainerDied","Data":"2320ea9a83e4caa1995add4ca702c31277dcafe1bc731da2ce3b65354084f054"} Mar 09 09:18:05 crc kubenswrapper[4861]: I0309 09:18:05.051833 4861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2320ea9a83e4caa1995add4ca702c31277dcafe1bc731da2ce3b65354084f054" Mar 09 09:18:05 crc kubenswrapper[4861]: I0309 09:18:05.051892 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550798-cd9xx" Mar 09 09:18:05 crc kubenswrapper[4861]: I0309 09:18:05.059238 4861 generic.go:334] "Generic (PLEG): container finished" podID="8fdbf2a5-6d7a-425d-8935-288135585e5c" containerID="70fc447257f2601448d2f0633947753bf590ecc84cbfb84218f4979a71d34202" exitCode=0 Mar 09 09:18:05 crc kubenswrapper[4861]: I0309 09:18:05.059447 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rn59d" event={"ID":"8fdbf2a5-6d7a-425d-8935-288135585e5c","Type":"ContainerDied","Data":"70fc447257f2601448d2f0633947753bf590ecc84cbfb84218f4979a71d34202"} Mar 09 09:18:05 crc kubenswrapper[4861]: I0309 09:18:05.437594 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550792-9bqqs"] Mar 09 09:18:05 crc kubenswrapper[4861]: I0309 09:18:05.448255 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550792-9bqqs"] Mar 09 09:18:05 crc kubenswrapper[4861]: I0309 09:18:05.672007 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0326dab3-8e2c-4b7c-8bf0-6a7686493119" path="/var/lib/kubelet/pods/0326dab3-8e2c-4b7c-8bf0-6a7686493119/volumes" Mar 09 09:18:06 crc kubenswrapper[4861]: I0309 09:18:06.066965 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rn59d" 
event={"ID":"8fdbf2a5-6d7a-425d-8935-288135585e5c","Type":"ContainerStarted","Data":"3c918934429c10c570d0ff547a064fe7a550212d3fb7268f69a0dee50e0a07c0"} Mar 09 09:18:06 crc kubenswrapper[4861]: I0309 09:18:06.100314 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-rn59d" podStartSLOduration=1.6395507120000001 podStartE2EDuration="4.100294429s" podCreationTimestamp="2026-03-09 09:18:02 +0000 UTC" firstStartedPulling="2026-03-09 09:18:03.025023564 +0000 UTC m=+726.110062965" lastFinishedPulling="2026-03-09 09:18:05.485767271 +0000 UTC m=+728.570806682" observedRunningTime="2026-03-09 09:18:06.098402473 +0000 UTC m=+729.183441954" watchObservedRunningTime="2026-03-09 09:18:06.100294429 +0000 UTC m=+729.185333840" Mar 09 09:18:06 crc kubenswrapper[4861]: I0309 09:18:06.326723 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82ggwqp" Mar 09 09:18:06 crc kubenswrapper[4861]: I0309 09:18:06.363713 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8668713c-12cf-457c-a09c-5302f11d19cc-bundle\") pod \"8668713c-12cf-457c-a09c-5302f11d19cc\" (UID: \"8668713c-12cf-457c-a09c-5302f11d19cc\") " Mar 09 09:18:06 crc kubenswrapper[4861]: I0309 09:18:06.363752 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8668713c-12cf-457c-a09c-5302f11d19cc-util\") pod \"8668713c-12cf-457c-a09c-5302f11d19cc\" (UID: \"8668713c-12cf-457c-a09c-5302f11d19cc\") " Mar 09 09:18:06 crc kubenswrapper[4861]: I0309 09:18:06.363779 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tzfv7\" (UniqueName: \"kubernetes.io/projected/8668713c-12cf-457c-a09c-5302f11d19cc-kube-api-access-tzfv7\") pod 
\"8668713c-12cf-457c-a09c-5302f11d19cc\" (UID: \"8668713c-12cf-457c-a09c-5302f11d19cc\") " Mar 09 09:18:06 crc kubenswrapper[4861]: I0309 09:18:06.364420 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8668713c-12cf-457c-a09c-5302f11d19cc-bundle" (OuterVolumeSpecName: "bundle") pod "8668713c-12cf-457c-a09c-5302f11d19cc" (UID: "8668713c-12cf-457c-a09c-5302f11d19cc"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:18:06 crc kubenswrapper[4861]: I0309 09:18:06.368595 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8668713c-12cf-457c-a09c-5302f11d19cc-kube-api-access-tzfv7" (OuterVolumeSpecName: "kube-api-access-tzfv7") pod "8668713c-12cf-457c-a09c-5302f11d19cc" (UID: "8668713c-12cf-457c-a09c-5302f11d19cc"). InnerVolumeSpecName "kube-api-access-tzfv7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:18:06 crc kubenswrapper[4861]: I0309 09:18:06.386322 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8668713c-12cf-457c-a09c-5302f11d19cc-util" (OuterVolumeSpecName: "util") pod "8668713c-12cf-457c-a09c-5302f11d19cc" (UID: "8668713c-12cf-457c-a09c-5302f11d19cc"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:18:06 crc kubenswrapper[4861]: I0309 09:18:06.465038 4861 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8668713c-12cf-457c-a09c-5302f11d19cc-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 09:18:06 crc kubenswrapper[4861]: I0309 09:18:06.465065 4861 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8668713c-12cf-457c-a09c-5302f11d19cc-util\") on node \"crc\" DevicePath \"\"" Mar 09 09:18:06 crc kubenswrapper[4861]: I0309 09:18:06.465074 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tzfv7\" (UniqueName: \"kubernetes.io/projected/8668713c-12cf-457c-a09c-5302f11d19cc-kube-api-access-tzfv7\") on node \"crc\" DevicePath \"\"" Mar 09 09:18:07 crc kubenswrapper[4861]: I0309 09:18:07.077111 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82ggwqp" Mar 09 09:18:07 crc kubenswrapper[4861]: I0309 09:18:07.077147 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82ggwqp" event={"ID":"8668713c-12cf-457c-a09c-5302f11d19cc","Type":"ContainerDied","Data":"c8b8fc63fc538a21a754a242e28a4b16c19423477390a62124d7be197ddb292e"} Mar 09 09:18:07 crc kubenswrapper[4861]: I0309 09:18:07.077208 4861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c8b8fc63fc538a21a754a242e28a4b16c19423477390a62124d7be197ddb292e" Mar 09 09:18:10 crc kubenswrapper[4861]: I0309 09:18:10.367457 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-75c5dccd6c-bgvbh"] Mar 09 09:18:10 crc kubenswrapper[4861]: E0309 09:18:10.367993 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8668713c-12cf-457c-a09c-5302f11d19cc" 
containerName="extract" Mar 09 09:18:10 crc kubenswrapper[4861]: I0309 09:18:10.368007 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="8668713c-12cf-457c-a09c-5302f11d19cc" containerName="extract" Mar 09 09:18:10 crc kubenswrapper[4861]: E0309 09:18:10.368022 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e992bde7-1734-4260-9ab3-0da5ab187665" containerName="oc" Mar 09 09:18:10 crc kubenswrapper[4861]: I0309 09:18:10.368030 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="e992bde7-1734-4260-9ab3-0da5ab187665" containerName="oc" Mar 09 09:18:10 crc kubenswrapper[4861]: E0309 09:18:10.368055 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8668713c-12cf-457c-a09c-5302f11d19cc" containerName="pull" Mar 09 09:18:10 crc kubenswrapper[4861]: I0309 09:18:10.368069 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="8668713c-12cf-457c-a09c-5302f11d19cc" containerName="pull" Mar 09 09:18:10 crc kubenswrapper[4861]: E0309 09:18:10.368085 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8668713c-12cf-457c-a09c-5302f11d19cc" containerName="util" Mar 09 09:18:10 crc kubenswrapper[4861]: I0309 09:18:10.368098 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="8668713c-12cf-457c-a09c-5302f11d19cc" containerName="util" Mar 09 09:18:10 crc kubenswrapper[4861]: I0309 09:18:10.368231 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="8668713c-12cf-457c-a09c-5302f11d19cc" containerName="extract" Mar 09 09:18:10 crc kubenswrapper[4861]: I0309 09:18:10.368243 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="e992bde7-1734-4260-9ab3-0da5ab187665" containerName="oc" Mar 09 09:18:10 crc kubenswrapper[4861]: I0309 09:18:10.368678 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-bgvbh" Mar 09 09:18:10 crc kubenswrapper[4861]: I0309 09:18:10.371650 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Mar 09 09:18:10 crc kubenswrapper[4861]: I0309 09:18:10.372030 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-dshpr" Mar 09 09:18:10 crc kubenswrapper[4861]: I0309 09:18:10.373787 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Mar 09 09:18:10 crc kubenswrapper[4861]: I0309 09:18:10.416959 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mg87\" (UniqueName: \"kubernetes.io/projected/ec451b1d-d99e-48c4-a550-83bac053d5dc-kube-api-access-6mg87\") pod \"nmstate-operator-75c5dccd6c-bgvbh\" (UID: \"ec451b1d-d99e-48c4-a550-83bac053d5dc\") " pod="openshift-nmstate/nmstate-operator-75c5dccd6c-bgvbh" Mar 09 09:18:10 crc kubenswrapper[4861]: I0309 09:18:10.418586 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-75c5dccd6c-bgvbh"] Mar 09 09:18:10 crc kubenswrapper[4861]: I0309 09:18:10.518416 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mg87\" (UniqueName: \"kubernetes.io/projected/ec451b1d-d99e-48c4-a550-83bac053d5dc-kube-api-access-6mg87\") pod \"nmstate-operator-75c5dccd6c-bgvbh\" (UID: \"ec451b1d-d99e-48c4-a550-83bac053d5dc\") " pod="openshift-nmstate/nmstate-operator-75c5dccd6c-bgvbh" Mar 09 09:18:10 crc kubenswrapper[4861]: I0309 09:18:10.536017 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mg87\" (UniqueName: \"kubernetes.io/projected/ec451b1d-d99e-48c4-a550-83bac053d5dc-kube-api-access-6mg87\") pod \"nmstate-operator-75c5dccd6c-bgvbh\" (UID: 
\"ec451b1d-d99e-48c4-a550-83bac053d5dc\") " pod="openshift-nmstate/nmstate-operator-75c5dccd6c-bgvbh" Mar 09 09:18:10 crc kubenswrapper[4861]: I0309 09:18:10.685394 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-bgvbh" Mar 09 09:18:10 crc kubenswrapper[4861]: I0309 09:18:10.906124 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-75c5dccd6c-bgvbh"] Mar 09 09:18:11 crc kubenswrapper[4861]: I0309 09:18:11.103110 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-bgvbh" event={"ID":"ec451b1d-d99e-48c4-a550-83bac053d5dc","Type":"ContainerStarted","Data":"1e4dfc31b9a9b0ded6c3efc1a142124be061b51c3eca4e582484fb9f4a57105b"} Mar 09 09:18:12 crc kubenswrapper[4861]: I0309 09:18:12.367528 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-rn59d" Mar 09 09:18:12 crc kubenswrapper[4861]: I0309 09:18:12.367637 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-rn59d" Mar 09 09:18:12 crc kubenswrapper[4861]: I0309 09:18:12.435109 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-rn59d" Mar 09 09:18:13 crc kubenswrapper[4861]: I0309 09:18:13.162612 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-rn59d" Mar 09 09:18:14 crc kubenswrapper[4861]: I0309 09:18:14.124759 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-bgvbh" event={"ID":"ec451b1d-d99e-48c4-a550-83bac053d5dc","Type":"ContainerStarted","Data":"81178b04d057526a53e5c938e7f5062e864f7d93a07c5b0b69f43f9137ff23f1"} Mar 09 09:18:14 crc kubenswrapper[4861]: I0309 09:18:14.158878 4861 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-bgvbh" podStartSLOduration=1.627390661 podStartE2EDuration="4.158843709s" podCreationTimestamp="2026-03-09 09:18:10 +0000 UTC" firstStartedPulling="2026-03-09 09:18:10.917852669 +0000 UTC m=+734.002892070" lastFinishedPulling="2026-03-09 09:18:13.449305707 +0000 UTC m=+736.534345118" observedRunningTime="2026-03-09 09:18:14.152682031 +0000 UTC m=+737.237721472" watchObservedRunningTime="2026-03-09 09:18:14.158843709 +0000 UTC m=+737.243883150" Mar 09 09:18:14 crc kubenswrapper[4861]: I0309 09:18:14.829962 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rn59d"] Mar 09 09:18:15 crc kubenswrapper[4861]: I0309 09:18:15.139660 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-rn59d" podUID="8fdbf2a5-6d7a-425d-8935-288135585e5c" containerName="registry-server" containerID="cri-o://3c918934429c10c570d0ff547a064fe7a550212d3fb7268f69a0dee50e0a07c0" gracePeriod=2 Mar 09 09:18:17 crc kubenswrapper[4861]: I0309 09:18:17.159035 4861 generic.go:334] "Generic (PLEG): container finished" podID="8fdbf2a5-6d7a-425d-8935-288135585e5c" containerID="3c918934429c10c570d0ff547a064fe7a550212d3fb7268f69a0dee50e0a07c0" exitCode=0 Mar 09 09:18:17 crc kubenswrapper[4861]: I0309 09:18:17.159098 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rn59d" event={"ID":"8fdbf2a5-6d7a-425d-8935-288135585e5c","Type":"ContainerDied","Data":"3c918934429c10c570d0ff547a064fe7a550212d3fb7268f69a0dee50e0a07c0"} Mar 09 09:18:17 crc kubenswrapper[4861]: I0309 09:18:17.389417 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rn59d" Mar 09 09:18:17 crc kubenswrapper[4861]: I0309 09:18:17.509236 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8fdbf2a5-6d7a-425d-8935-288135585e5c-catalog-content\") pod \"8fdbf2a5-6d7a-425d-8935-288135585e5c\" (UID: \"8fdbf2a5-6d7a-425d-8935-288135585e5c\") " Mar 09 09:18:17 crc kubenswrapper[4861]: I0309 09:18:17.509325 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wjt8d\" (UniqueName: \"kubernetes.io/projected/8fdbf2a5-6d7a-425d-8935-288135585e5c-kube-api-access-wjt8d\") pod \"8fdbf2a5-6d7a-425d-8935-288135585e5c\" (UID: \"8fdbf2a5-6d7a-425d-8935-288135585e5c\") " Mar 09 09:18:17 crc kubenswrapper[4861]: I0309 09:18:17.509527 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8fdbf2a5-6d7a-425d-8935-288135585e5c-utilities\") pod \"8fdbf2a5-6d7a-425d-8935-288135585e5c\" (UID: \"8fdbf2a5-6d7a-425d-8935-288135585e5c\") " Mar 09 09:18:17 crc kubenswrapper[4861]: I0309 09:18:17.510320 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8fdbf2a5-6d7a-425d-8935-288135585e5c-utilities" (OuterVolumeSpecName: "utilities") pod "8fdbf2a5-6d7a-425d-8935-288135585e5c" (UID: "8fdbf2a5-6d7a-425d-8935-288135585e5c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:18:17 crc kubenswrapper[4861]: I0309 09:18:17.518759 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8fdbf2a5-6d7a-425d-8935-288135585e5c-kube-api-access-wjt8d" (OuterVolumeSpecName: "kube-api-access-wjt8d") pod "8fdbf2a5-6d7a-425d-8935-288135585e5c" (UID: "8fdbf2a5-6d7a-425d-8935-288135585e5c"). InnerVolumeSpecName "kube-api-access-wjt8d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:18:17 crc kubenswrapper[4861]: I0309 09:18:17.610732 4861 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8fdbf2a5-6d7a-425d-8935-288135585e5c-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 09:18:17 crc kubenswrapper[4861]: I0309 09:18:17.610770 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wjt8d\" (UniqueName: \"kubernetes.io/projected/8fdbf2a5-6d7a-425d-8935-288135585e5c-kube-api-access-wjt8d\") on node \"crc\" DevicePath \"\"" Mar 09 09:18:17 crc kubenswrapper[4861]: I0309 09:18:17.673074 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8fdbf2a5-6d7a-425d-8935-288135585e5c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8fdbf2a5-6d7a-425d-8935-288135585e5c" (UID: "8fdbf2a5-6d7a-425d-8935-288135585e5c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:18:17 crc kubenswrapper[4861]: I0309 09:18:17.712965 4861 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8fdbf2a5-6d7a-425d-8935-288135585e5c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 09:18:18 crc kubenswrapper[4861]: I0309 09:18:18.166839 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rn59d" event={"ID":"8fdbf2a5-6d7a-425d-8935-288135585e5c","Type":"ContainerDied","Data":"024365e1327ae3c9a39de3e8037d73343167ecddc5a4a4aef719a2a76b65780b"} Mar 09 09:18:18 crc kubenswrapper[4861]: I0309 09:18:18.166908 4861 scope.go:117] "RemoveContainer" containerID="3c918934429c10c570d0ff547a064fe7a550212d3fb7268f69a0dee50e0a07c0" Mar 09 09:18:18 crc kubenswrapper[4861]: I0309 09:18:18.166929 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rn59d" Mar 09 09:18:18 crc kubenswrapper[4861]: I0309 09:18:18.189807 4861 scope.go:117] "RemoveContainer" containerID="70fc447257f2601448d2f0633947753bf590ecc84cbfb84218f4979a71d34202" Mar 09 09:18:18 crc kubenswrapper[4861]: I0309 09:18:18.218278 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rn59d"] Mar 09 09:18:18 crc kubenswrapper[4861]: I0309 09:18:18.227356 4861 scope.go:117] "RemoveContainer" containerID="0ef89735e68e246bb02311f8bd85b818333126c9f00e53f56ac3298c1e86b1e9" Mar 09 09:18:18 crc kubenswrapper[4861]: I0309 09:18:18.227746 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-rn59d"] Mar 09 09:18:19 crc kubenswrapper[4861]: I0309 09:18:19.666752 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8fdbf2a5-6d7a-425d-8935-288135585e5c" path="/var/lib/kubelet/pods/8fdbf2a5-6d7a-425d-8935-288135585e5c/volumes" Mar 09 09:18:20 crc kubenswrapper[4861]: I0309 09:18:20.190775 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-69594cc75-wz5b2"] Mar 09 09:18:20 crc kubenswrapper[4861]: E0309 09:18:20.191072 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fdbf2a5-6d7a-425d-8935-288135585e5c" containerName="extract-utilities" Mar 09 09:18:20 crc kubenswrapper[4861]: I0309 09:18:20.191095 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fdbf2a5-6d7a-425d-8935-288135585e5c" containerName="extract-utilities" Mar 09 09:18:20 crc kubenswrapper[4861]: E0309 09:18:20.191115 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fdbf2a5-6d7a-425d-8935-288135585e5c" containerName="registry-server" Mar 09 09:18:20 crc kubenswrapper[4861]: I0309 09:18:20.191126 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fdbf2a5-6d7a-425d-8935-288135585e5c" containerName="registry-server" Mar 
09 09:18:20 crc kubenswrapper[4861]: E0309 09:18:20.191148 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fdbf2a5-6d7a-425d-8935-288135585e5c" containerName="extract-content" Mar 09 09:18:20 crc kubenswrapper[4861]: I0309 09:18:20.191159 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fdbf2a5-6d7a-425d-8935-288135585e5c" containerName="extract-content" Mar 09 09:18:20 crc kubenswrapper[4861]: I0309 09:18:20.191321 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fdbf2a5-6d7a-425d-8935-288135585e5c" containerName="registry-server" Mar 09 09:18:20 crc kubenswrapper[4861]: I0309 09:18:20.192198 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-69594cc75-wz5b2" Mar 09 09:18:20 crc kubenswrapper[4861]: I0309 09:18:20.194424 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-srgmr" Mar 09 09:18:20 crc kubenswrapper[4861]: I0309 09:18:20.207005 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-69594cc75-wz5b2"] Mar 09 09:18:20 crc kubenswrapper[4861]: I0309 09:18:20.234045 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-786f45cff4-27zgp"] Mar 09 09:18:20 crc kubenswrapper[4861]: I0309 09:18:20.234959 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-786f45cff4-27zgp" Mar 09 09:18:20 crc kubenswrapper[4861]: I0309 09:18:20.237017 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Mar 09 09:18:20 crc kubenswrapper[4861]: I0309 09:18:20.248987 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kdjh\" (UniqueName: \"kubernetes.io/projected/194a66ef-2b30-40b7-bbfd-5a2c3a51ad55-kube-api-access-5kdjh\") pod \"nmstate-metrics-69594cc75-wz5b2\" (UID: \"194a66ef-2b30-40b7-bbfd-5a2c3a51ad55\") " pod="openshift-nmstate/nmstate-metrics-69594cc75-wz5b2" Mar 09 09:18:20 crc kubenswrapper[4861]: I0309 09:18:20.252024 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-jb6vv"] Mar 09 09:18:20 crc kubenswrapper[4861]: I0309 09:18:20.253414 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jb6vv" Mar 09 09:18:20 crc kubenswrapper[4861]: I0309 09:18:20.256568 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-786f45cff4-27zgp"] Mar 09 09:18:20 crc kubenswrapper[4861]: I0309 09:18:20.272651 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-qvk5q"] Mar 09 09:18:20 crc kubenswrapper[4861]: I0309 09:18:20.273398 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-qvk5q" Mar 09 09:18:20 crc kubenswrapper[4861]: I0309 09:18:20.277837 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jb6vv"] Mar 09 09:18:20 crc kubenswrapper[4861]: I0309 09:18:20.349995 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktrln\" (UniqueName: \"kubernetes.io/projected/09f364f8-e2b6-4ffe-b51a-37af17081bf8-kube-api-access-ktrln\") pod \"nmstate-handler-qvk5q\" (UID: \"09f364f8-e2b6-4ffe-b51a-37af17081bf8\") " pod="openshift-nmstate/nmstate-handler-qvk5q" Mar 09 09:18:20 crc kubenswrapper[4861]: I0309 09:18:20.350057 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/f313fc1b-02bd-4bbd-bfdb-a18a300ed8bb-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-27zgp\" (UID: \"f313fc1b-02bd-4bbd-bfdb-a18a300ed8bb\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-27zgp" Mar 09 09:18:20 crc kubenswrapper[4861]: I0309 09:18:20.350075 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/09f364f8-e2b6-4ffe-b51a-37af17081bf8-ovs-socket\") pod \"nmstate-handler-qvk5q\" (UID: \"09f364f8-e2b6-4ffe-b51a-37af17081bf8\") " pod="openshift-nmstate/nmstate-handler-qvk5q" Mar 09 09:18:20 crc kubenswrapper[4861]: I0309 09:18:20.350109 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d45d7871-0d6e-4307-a739-9ae6d31f6ee1-utilities\") pod \"community-operators-jb6vv\" (UID: \"d45d7871-0d6e-4307-a739-9ae6d31f6ee1\") " pod="openshift-marketplace/community-operators-jb6vv" Mar 09 09:18:20 crc kubenswrapper[4861]: I0309 09:18:20.350134 4861 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-5kdjh\" (UniqueName: \"kubernetes.io/projected/194a66ef-2b30-40b7-bbfd-5a2c3a51ad55-kube-api-access-5kdjh\") pod \"nmstate-metrics-69594cc75-wz5b2\" (UID: \"194a66ef-2b30-40b7-bbfd-5a2c3a51ad55\") " pod="openshift-nmstate/nmstate-metrics-69594cc75-wz5b2" Mar 09 09:18:20 crc kubenswrapper[4861]: I0309 09:18:20.350151 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vt6gq\" (UniqueName: \"kubernetes.io/projected/f313fc1b-02bd-4bbd-bfdb-a18a300ed8bb-kube-api-access-vt6gq\") pod \"nmstate-webhook-786f45cff4-27zgp\" (UID: \"f313fc1b-02bd-4bbd-bfdb-a18a300ed8bb\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-27zgp" Mar 09 09:18:20 crc kubenswrapper[4861]: I0309 09:18:20.350169 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/09f364f8-e2b6-4ffe-b51a-37af17081bf8-dbus-socket\") pod \"nmstate-handler-qvk5q\" (UID: \"09f364f8-e2b6-4ffe-b51a-37af17081bf8\") " pod="openshift-nmstate/nmstate-handler-qvk5q" Mar 09 09:18:20 crc kubenswrapper[4861]: I0309 09:18:20.350200 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khww2\" (UniqueName: \"kubernetes.io/projected/d45d7871-0d6e-4307-a739-9ae6d31f6ee1-kube-api-access-khww2\") pod \"community-operators-jb6vv\" (UID: \"d45d7871-0d6e-4307-a739-9ae6d31f6ee1\") " pod="openshift-marketplace/community-operators-jb6vv" Mar 09 09:18:20 crc kubenswrapper[4861]: I0309 09:18:20.350217 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d45d7871-0d6e-4307-a739-9ae6d31f6ee1-catalog-content\") pod \"community-operators-jb6vv\" (UID: \"d45d7871-0d6e-4307-a739-9ae6d31f6ee1\") " pod="openshift-marketplace/community-operators-jb6vv" Mar 09 09:18:20 crc 
kubenswrapper[4861]: I0309 09:18:20.350231 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/09f364f8-e2b6-4ffe-b51a-37af17081bf8-nmstate-lock\") pod \"nmstate-handler-qvk5q\" (UID: \"09f364f8-e2b6-4ffe-b51a-37af17081bf8\") " pod="openshift-nmstate/nmstate-handler-qvk5q" Mar 09 09:18:20 crc kubenswrapper[4861]: I0309 09:18:20.367254 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-mgp5h"] Mar 09 09:18:20 crc kubenswrapper[4861]: I0309 09:18:20.367943 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-mgp5h" Mar 09 09:18:20 crc kubenswrapper[4861]: I0309 09:18:20.370494 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Mar 09 09:18:20 crc kubenswrapper[4861]: I0309 09:18:20.371663 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Mar 09 09:18:20 crc kubenswrapper[4861]: I0309 09:18:20.371897 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-rcwtd" Mar 09 09:18:20 crc kubenswrapper[4861]: I0309 09:18:20.378496 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kdjh\" (UniqueName: \"kubernetes.io/projected/194a66ef-2b30-40b7-bbfd-5a2c3a51ad55-kube-api-access-5kdjh\") pod \"nmstate-metrics-69594cc75-wz5b2\" (UID: \"194a66ef-2b30-40b7-bbfd-5a2c3a51ad55\") " pod="openshift-nmstate/nmstate-metrics-69594cc75-wz5b2" Mar 09 09:18:20 crc kubenswrapper[4861]: I0309 09:18:20.384479 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-mgp5h"] Mar 09 09:18:20 crc kubenswrapper[4861]: I0309 09:18:20.451671 4861 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/d29132f4-b735-40f3-94da-033f1174963f-plugin-serving-cert\") pod \"nmstate-console-plugin-5dcbbd79cf-mgp5h\" (UID: \"d29132f4-b735-40f3-94da-033f1174963f\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-mgp5h" Mar 09 09:18:20 crc kubenswrapper[4861]: I0309 09:18:20.452017 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d45d7871-0d6e-4307-a739-9ae6d31f6ee1-utilities\") pod \"community-operators-jb6vv\" (UID: \"d45d7871-0d6e-4307-a739-9ae6d31f6ee1\") " pod="openshift-marketplace/community-operators-jb6vv" Mar 09 09:18:20 crc kubenswrapper[4861]: I0309 09:18:20.452048 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vt6gq\" (UniqueName: \"kubernetes.io/projected/f313fc1b-02bd-4bbd-bfdb-a18a300ed8bb-kube-api-access-vt6gq\") pod \"nmstate-webhook-786f45cff4-27zgp\" (UID: \"f313fc1b-02bd-4bbd-bfdb-a18a300ed8bb\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-27zgp" Mar 09 09:18:20 crc kubenswrapper[4861]: I0309 09:18:20.452070 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/09f364f8-e2b6-4ffe-b51a-37af17081bf8-dbus-socket\") pod \"nmstate-handler-qvk5q\" (UID: \"09f364f8-e2b6-4ffe-b51a-37af17081bf8\") " pod="openshift-nmstate/nmstate-handler-qvk5q" Mar 09 09:18:20 crc kubenswrapper[4861]: I0309 09:18:20.452107 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khww2\" (UniqueName: \"kubernetes.io/projected/d45d7871-0d6e-4307-a739-9ae6d31f6ee1-kube-api-access-khww2\") pod \"community-operators-jb6vv\" (UID: \"d45d7871-0d6e-4307-a739-9ae6d31f6ee1\") " pod="openshift-marketplace/community-operators-jb6vv" Mar 09 09:18:20 crc kubenswrapper[4861]: I0309 
09:18:20.452122 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d45d7871-0d6e-4307-a739-9ae6d31f6ee1-catalog-content\") pod \"community-operators-jb6vv\" (UID: \"d45d7871-0d6e-4307-a739-9ae6d31f6ee1\") " pod="openshift-marketplace/community-operators-jb6vv" Mar 09 09:18:20 crc kubenswrapper[4861]: I0309 09:18:20.452140 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/09f364f8-e2b6-4ffe-b51a-37af17081bf8-nmstate-lock\") pod \"nmstate-handler-qvk5q\" (UID: \"09f364f8-e2b6-4ffe-b51a-37af17081bf8\") " pod="openshift-nmstate/nmstate-handler-qvk5q" Mar 09 09:18:20 crc kubenswrapper[4861]: I0309 09:18:20.452165 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktrln\" (UniqueName: \"kubernetes.io/projected/09f364f8-e2b6-4ffe-b51a-37af17081bf8-kube-api-access-ktrln\") pod \"nmstate-handler-qvk5q\" (UID: \"09f364f8-e2b6-4ffe-b51a-37af17081bf8\") " pod="openshift-nmstate/nmstate-handler-qvk5q" Mar 09 09:18:20 crc kubenswrapper[4861]: I0309 09:18:20.452197 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkjpz\" (UniqueName: \"kubernetes.io/projected/d29132f4-b735-40f3-94da-033f1174963f-kube-api-access-hkjpz\") pod \"nmstate-console-plugin-5dcbbd79cf-mgp5h\" (UID: \"d29132f4-b735-40f3-94da-033f1174963f\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-mgp5h" Mar 09 09:18:20 crc kubenswrapper[4861]: I0309 09:18:20.452221 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/d29132f4-b735-40f3-94da-033f1174963f-nginx-conf\") pod \"nmstate-console-plugin-5dcbbd79cf-mgp5h\" (UID: \"d29132f4-b735-40f3-94da-033f1174963f\") " 
pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-mgp5h" Mar 09 09:18:20 crc kubenswrapper[4861]: I0309 09:18:20.452244 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/f313fc1b-02bd-4bbd-bfdb-a18a300ed8bb-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-27zgp\" (UID: \"f313fc1b-02bd-4bbd-bfdb-a18a300ed8bb\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-27zgp" Mar 09 09:18:20 crc kubenswrapper[4861]: I0309 09:18:20.452260 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/09f364f8-e2b6-4ffe-b51a-37af17081bf8-ovs-socket\") pod \"nmstate-handler-qvk5q\" (UID: \"09f364f8-e2b6-4ffe-b51a-37af17081bf8\") " pod="openshift-nmstate/nmstate-handler-qvk5q" Mar 09 09:18:20 crc kubenswrapper[4861]: I0309 09:18:20.452321 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/09f364f8-e2b6-4ffe-b51a-37af17081bf8-ovs-socket\") pod \"nmstate-handler-qvk5q\" (UID: \"09f364f8-e2b6-4ffe-b51a-37af17081bf8\") " pod="openshift-nmstate/nmstate-handler-qvk5q" Mar 09 09:18:20 crc kubenswrapper[4861]: I0309 09:18:20.452351 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/09f364f8-e2b6-4ffe-b51a-37af17081bf8-nmstate-lock\") pod \"nmstate-handler-qvk5q\" (UID: \"09f364f8-e2b6-4ffe-b51a-37af17081bf8\") " pod="openshift-nmstate/nmstate-handler-qvk5q" Mar 09 09:18:20 crc kubenswrapper[4861]: I0309 09:18:20.452471 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d45d7871-0d6e-4307-a739-9ae6d31f6ee1-utilities\") pod \"community-operators-jb6vv\" (UID: \"d45d7871-0d6e-4307-a739-9ae6d31f6ee1\") " pod="openshift-marketplace/community-operators-jb6vv" Mar 09 09:18:20 crc kubenswrapper[4861]: I0309 
09:18:20.452709 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d45d7871-0d6e-4307-a739-9ae6d31f6ee1-catalog-content\") pod \"community-operators-jb6vv\" (UID: \"d45d7871-0d6e-4307-a739-9ae6d31f6ee1\") " pod="openshift-marketplace/community-operators-jb6vv" Mar 09 09:18:20 crc kubenswrapper[4861]: I0309 09:18:20.452836 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/09f364f8-e2b6-4ffe-b51a-37af17081bf8-dbus-socket\") pod \"nmstate-handler-qvk5q\" (UID: \"09f364f8-e2b6-4ffe-b51a-37af17081bf8\") " pod="openshift-nmstate/nmstate-handler-qvk5q" Mar 09 09:18:20 crc kubenswrapper[4861]: I0309 09:18:20.457419 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/f313fc1b-02bd-4bbd-bfdb-a18a300ed8bb-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-27zgp\" (UID: \"f313fc1b-02bd-4bbd-bfdb-a18a300ed8bb\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-27zgp" Mar 09 09:18:20 crc kubenswrapper[4861]: I0309 09:18:20.469242 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khww2\" (UniqueName: \"kubernetes.io/projected/d45d7871-0d6e-4307-a739-9ae6d31f6ee1-kube-api-access-khww2\") pod \"community-operators-jb6vv\" (UID: \"d45d7871-0d6e-4307-a739-9ae6d31f6ee1\") " pod="openshift-marketplace/community-operators-jb6vv" Mar 09 09:18:20 crc kubenswrapper[4861]: I0309 09:18:20.474169 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vt6gq\" (UniqueName: \"kubernetes.io/projected/f313fc1b-02bd-4bbd-bfdb-a18a300ed8bb-kube-api-access-vt6gq\") pod \"nmstate-webhook-786f45cff4-27zgp\" (UID: \"f313fc1b-02bd-4bbd-bfdb-a18a300ed8bb\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-27zgp" Mar 09 09:18:20 crc kubenswrapper[4861]: I0309 09:18:20.480124 4861 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktrln\" (UniqueName: \"kubernetes.io/projected/09f364f8-e2b6-4ffe-b51a-37af17081bf8-kube-api-access-ktrln\") pod \"nmstate-handler-qvk5q\" (UID: \"09f364f8-e2b6-4ffe-b51a-37af17081bf8\") " pod="openshift-nmstate/nmstate-handler-qvk5q" Mar 09 09:18:20 crc kubenswrapper[4861]: I0309 09:18:20.512293 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-69594cc75-wz5b2" Mar 09 09:18:20 crc kubenswrapper[4861]: I0309 09:18:20.550785 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-786f45cff4-27zgp" Mar 09 09:18:20 crc kubenswrapper[4861]: I0309 09:18:20.553923 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/d29132f4-b735-40f3-94da-033f1174963f-plugin-serving-cert\") pod \"nmstate-console-plugin-5dcbbd79cf-mgp5h\" (UID: \"d29132f4-b735-40f3-94da-033f1174963f\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-mgp5h" Mar 09 09:18:20 crc kubenswrapper[4861]: I0309 09:18:20.554033 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hkjpz\" (UniqueName: \"kubernetes.io/projected/d29132f4-b735-40f3-94da-033f1174963f-kube-api-access-hkjpz\") pod \"nmstate-console-plugin-5dcbbd79cf-mgp5h\" (UID: \"d29132f4-b735-40f3-94da-033f1174963f\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-mgp5h" Mar 09 09:18:20 crc kubenswrapper[4861]: I0309 09:18:20.554068 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/d29132f4-b735-40f3-94da-033f1174963f-nginx-conf\") pod \"nmstate-console-plugin-5dcbbd79cf-mgp5h\" (UID: \"d29132f4-b735-40f3-94da-033f1174963f\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-mgp5h" Mar 09 09:18:20 
crc kubenswrapper[4861]: E0309 09:18:20.554231 4861 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Mar 09 09:18:20 crc kubenswrapper[4861]: E0309 09:18:20.554296 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d29132f4-b735-40f3-94da-033f1174963f-plugin-serving-cert podName:d29132f4-b735-40f3-94da-033f1174963f nodeName:}" failed. No retries permitted until 2026-03-09 09:18:21.054277579 +0000 UTC m=+744.139316980 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/d29132f4-b735-40f3-94da-033f1174963f-plugin-serving-cert") pod "nmstate-console-plugin-5dcbbd79cf-mgp5h" (UID: "d29132f4-b735-40f3-94da-033f1174963f") : secret "plugin-serving-cert" not found Mar 09 09:18:20 crc kubenswrapper[4861]: I0309 09:18:20.555137 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/d29132f4-b735-40f3-94da-033f1174963f-nginx-conf\") pod \"nmstate-console-plugin-5dcbbd79cf-mgp5h\" (UID: \"d29132f4-b735-40f3-94da-033f1174963f\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-mgp5h" Mar 09 09:18:20 crc kubenswrapper[4861]: I0309 09:18:20.565968 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jb6vv" Mar 09 09:18:20 crc kubenswrapper[4861]: I0309 09:18:20.581265 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkjpz\" (UniqueName: \"kubernetes.io/projected/d29132f4-b735-40f3-94da-033f1174963f-kube-api-access-hkjpz\") pod \"nmstate-console-plugin-5dcbbd79cf-mgp5h\" (UID: \"d29132f4-b735-40f3-94da-033f1174963f\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-mgp5h" Mar 09 09:18:20 crc kubenswrapper[4861]: I0309 09:18:20.591221 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-qvk5q" Mar 09 09:18:20 crc kubenswrapper[4861]: I0309 09:18:20.593942 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-5d7868ff7-8vxms"] Mar 09 09:18:20 crc kubenswrapper[4861]: I0309 09:18:20.594655 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5d7868ff7-8vxms" Mar 09 09:18:20 crc kubenswrapper[4861]: I0309 09:18:20.611477 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5d7868ff7-8vxms"] Mar 09 09:18:20 crc kubenswrapper[4861]: I0309 09:18:20.657153 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5113fde6-e331-4c76-abdd-2016bd8facc1-console-serving-cert\") pod \"console-5d7868ff7-8vxms\" (UID: \"5113fde6-e331-4c76-abdd-2016bd8facc1\") " pod="openshift-console/console-5d7868ff7-8vxms" Mar 09 09:18:20 crc kubenswrapper[4861]: I0309 09:18:20.657213 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5113fde6-e331-4c76-abdd-2016bd8facc1-console-config\") pod \"console-5d7868ff7-8vxms\" (UID: \"5113fde6-e331-4c76-abdd-2016bd8facc1\") " pod="openshift-console/console-5d7868ff7-8vxms" Mar 09 09:18:20 crc kubenswrapper[4861]: I0309 09:18:20.657233 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5113fde6-e331-4c76-abdd-2016bd8facc1-trusted-ca-bundle\") pod \"console-5d7868ff7-8vxms\" (UID: \"5113fde6-e331-4c76-abdd-2016bd8facc1\") " pod="openshift-console/console-5d7868ff7-8vxms" Mar 09 09:18:20 crc kubenswrapper[4861]: I0309 09:18:20.657255 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5113fde6-e331-4c76-abdd-2016bd8facc1-console-oauth-config\") pod \"console-5d7868ff7-8vxms\" (UID: \"5113fde6-e331-4c76-abdd-2016bd8facc1\") " pod="openshift-console/console-5d7868ff7-8vxms" Mar 09 09:18:20 crc kubenswrapper[4861]: I0309 09:18:20.657302 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5113fde6-e331-4c76-abdd-2016bd8facc1-service-ca\") pod \"console-5d7868ff7-8vxms\" (UID: \"5113fde6-e331-4c76-abdd-2016bd8facc1\") " pod="openshift-console/console-5d7868ff7-8vxms" Mar 09 09:18:20 crc kubenswrapper[4861]: I0309 09:18:20.657385 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5113fde6-e331-4c76-abdd-2016bd8facc1-oauth-serving-cert\") pod \"console-5d7868ff7-8vxms\" (UID: \"5113fde6-e331-4c76-abdd-2016bd8facc1\") " pod="openshift-console/console-5d7868ff7-8vxms" Mar 09 09:18:20 crc kubenswrapper[4861]: I0309 09:18:20.657413 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nc8rf\" (UniqueName: \"kubernetes.io/projected/5113fde6-e331-4c76-abdd-2016bd8facc1-kube-api-access-nc8rf\") pod \"console-5d7868ff7-8vxms\" (UID: \"5113fde6-e331-4c76-abdd-2016bd8facc1\") " pod="openshift-console/console-5d7868ff7-8vxms" Mar 09 09:18:20 crc kubenswrapper[4861]: I0309 09:18:20.758962 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5113fde6-e331-4c76-abdd-2016bd8facc1-oauth-serving-cert\") pod \"console-5d7868ff7-8vxms\" (UID: \"5113fde6-e331-4c76-abdd-2016bd8facc1\") " pod="openshift-console/console-5d7868ff7-8vxms" Mar 09 09:18:20 crc kubenswrapper[4861]: I0309 09:18:20.759226 4861 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-nc8rf\" (UniqueName: \"kubernetes.io/projected/5113fde6-e331-4c76-abdd-2016bd8facc1-kube-api-access-nc8rf\") pod \"console-5d7868ff7-8vxms\" (UID: \"5113fde6-e331-4c76-abdd-2016bd8facc1\") " pod="openshift-console/console-5d7868ff7-8vxms" Mar 09 09:18:20 crc kubenswrapper[4861]: I0309 09:18:20.759249 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5113fde6-e331-4c76-abdd-2016bd8facc1-console-serving-cert\") pod \"console-5d7868ff7-8vxms\" (UID: \"5113fde6-e331-4c76-abdd-2016bd8facc1\") " pod="openshift-console/console-5d7868ff7-8vxms" Mar 09 09:18:20 crc kubenswrapper[4861]: I0309 09:18:20.759270 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5113fde6-e331-4c76-abdd-2016bd8facc1-console-config\") pod \"console-5d7868ff7-8vxms\" (UID: \"5113fde6-e331-4c76-abdd-2016bd8facc1\") " pod="openshift-console/console-5d7868ff7-8vxms" Mar 09 09:18:20 crc kubenswrapper[4861]: I0309 09:18:20.759287 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5113fde6-e331-4c76-abdd-2016bd8facc1-trusted-ca-bundle\") pod \"console-5d7868ff7-8vxms\" (UID: \"5113fde6-e331-4c76-abdd-2016bd8facc1\") " pod="openshift-console/console-5d7868ff7-8vxms" Mar 09 09:18:20 crc kubenswrapper[4861]: I0309 09:18:20.759306 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5113fde6-e331-4c76-abdd-2016bd8facc1-console-oauth-config\") pod \"console-5d7868ff7-8vxms\" (UID: \"5113fde6-e331-4c76-abdd-2016bd8facc1\") " pod="openshift-console/console-5d7868ff7-8vxms" Mar 09 09:18:20 crc kubenswrapper[4861]: I0309 09:18:20.759336 4861 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5113fde6-e331-4c76-abdd-2016bd8facc1-service-ca\") pod \"console-5d7868ff7-8vxms\" (UID: \"5113fde6-e331-4c76-abdd-2016bd8facc1\") " pod="openshift-console/console-5d7868ff7-8vxms" Mar 09 09:18:20 crc kubenswrapper[4861]: I0309 09:18:20.760084 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5113fde6-e331-4c76-abdd-2016bd8facc1-service-ca\") pod \"console-5d7868ff7-8vxms\" (UID: \"5113fde6-e331-4c76-abdd-2016bd8facc1\") " pod="openshift-console/console-5d7868ff7-8vxms" Mar 09 09:18:20 crc kubenswrapper[4861]: I0309 09:18:20.760599 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5113fde6-e331-4c76-abdd-2016bd8facc1-oauth-serving-cert\") pod \"console-5d7868ff7-8vxms\" (UID: \"5113fde6-e331-4c76-abdd-2016bd8facc1\") " pod="openshift-console/console-5d7868ff7-8vxms" Mar 09 09:18:20 crc kubenswrapper[4861]: I0309 09:18:20.761429 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5113fde6-e331-4c76-abdd-2016bd8facc1-console-config\") pod \"console-5d7868ff7-8vxms\" (UID: \"5113fde6-e331-4c76-abdd-2016bd8facc1\") " pod="openshift-console/console-5d7868ff7-8vxms" Mar 09 09:18:20 crc kubenswrapper[4861]: I0309 09:18:20.762023 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5113fde6-e331-4c76-abdd-2016bd8facc1-trusted-ca-bundle\") pod \"console-5d7868ff7-8vxms\" (UID: \"5113fde6-e331-4c76-abdd-2016bd8facc1\") " pod="openshift-console/console-5d7868ff7-8vxms" Mar 09 09:18:20 crc kubenswrapper[4861]: I0309 09:18:20.766630 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/5113fde6-e331-4c76-abdd-2016bd8facc1-console-oauth-config\") pod \"console-5d7868ff7-8vxms\" (UID: \"5113fde6-e331-4c76-abdd-2016bd8facc1\") " pod="openshift-console/console-5d7868ff7-8vxms" Mar 09 09:18:20 crc kubenswrapper[4861]: I0309 09:18:20.776184 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5113fde6-e331-4c76-abdd-2016bd8facc1-console-serving-cert\") pod \"console-5d7868ff7-8vxms\" (UID: \"5113fde6-e331-4c76-abdd-2016bd8facc1\") " pod="openshift-console/console-5d7868ff7-8vxms" Mar 09 09:18:20 crc kubenswrapper[4861]: I0309 09:18:20.791283 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nc8rf\" (UniqueName: \"kubernetes.io/projected/5113fde6-e331-4c76-abdd-2016bd8facc1-kube-api-access-nc8rf\") pod \"console-5d7868ff7-8vxms\" (UID: \"5113fde6-e331-4c76-abdd-2016bd8facc1\") " pod="openshift-console/console-5d7868ff7-8vxms" Mar 09 09:18:20 crc kubenswrapper[4861]: I0309 09:18:20.884528 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-69594cc75-wz5b2"] Mar 09 09:18:20 crc kubenswrapper[4861]: I0309 09:18:20.922705 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5d7868ff7-8vxms" Mar 09 09:18:20 crc kubenswrapper[4861]: I0309 09:18:20.936267 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jb6vv"] Mar 09 09:18:21 crc kubenswrapper[4861]: I0309 09:18:21.064165 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/d29132f4-b735-40f3-94da-033f1174963f-plugin-serving-cert\") pod \"nmstate-console-plugin-5dcbbd79cf-mgp5h\" (UID: \"d29132f4-b735-40f3-94da-033f1174963f\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-mgp5h" Mar 09 09:18:21 crc kubenswrapper[4861]: I0309 09:18:21.071068 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/d29132f4-b735-40f3-94da-033f1174963f-plugin-serving-cert\") pod \"nmstate-console-plugin-5dcbbd79cf-mgp5h\" (UID: \"d29132f4-b735-40f3-94da-033f1174963f\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-mgp5h" Mar 09 09:18:21 crc kubenswrapper[4861]: I0309 09:18:21.139502 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5d7868ff7-8vxms"] Mar 09 09:18:21 crc kubenswrapper[4861]: W0309 09:18:21.144436 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5113fde6_e331_4c76_abdd_2016bd8facc1.slice/crio-a8eceae2cb28648b9406b1148fdea0117dd25d1ea94e8094026356eedb1595fd WatchSource:0}: Error finding container a8eceae2cb28648b9406b1148fdea0117dd25d1ea94e8094026356eedb1595fd: Status 404 returned error can't find the container with id a8eceae2cb28648b9406b1148fdea0117dd25d1ea94e8094026356eedb1595fd Mar 09 09:18:21 crc kubenswrapper[4861]: I0309 09:18:21.157210 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-786f45cff4-27zgp"] Mar 09 09:18:21 crc 
kubenswrapper[4861]: W0309 09:18:21.163541 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf313fc1b_02bd_4bbd_bfdb_a18a300ed8bb.slice/crio-0d8ac9085bab54bf3f8acae23c99df93274ddd31fff8f2de1ae1f9191bd72370 WatchSource:0}: Error finding container 0d8ac9085bab54bf3f8acae23c99df93274ddd31fff8f2de1ae1f9191bd72370: Status 404 returned error can't find the container with id 0d8ac9085bab54bf3f8acae23c99df93274ddd31fff8f2de1ae1f9191bd72370 Mar 09 09:18:21 crc kubenswrapper[4861]: I0309 09:18:21.190247 4861 generic.go:334] "Generic (PLEG): container finished" podID="d45d7871-0d6e-4307-a739-9ae6d31f6ee1" containerID="89877e982b80d31bf72bb67312668848aca782aee35e2af0835f52863432d6eb" exitCode=0 Mar 09 09:18:21 crc kubenswrapper[4861]: I0309 09:18:21.190303 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jb6vv" event={"ID":"d45d7871-0d6e-4307-a739-9ae6d31f6ee1","Type":"ContainerDied","Data":"89877e982b80d31bf72bb67312668848aca782aee35e2af0835f52863432d6eb"} Mar 09 09:18:21 crc kubenswrapper[4861]: I0309 09:18:21.190356 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jb6vv" event={"ID":"d45d7871-0d6e-4307-a739-9ae6d31f6ee1","Type":"ContainerStarted","Data":"5c049116a9d00690cdf7bdbea888b95c491f9bedef53c3f55c739a87968ee1e0"} Mar 09 09:18:21 crc kubenswrapper[4861]: I0309 09:18:21.191979 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5d7868ff7-8vxms" event={"ID":"5113fde6-e331-4c76-abdd-2016bd8facc1","Type":"ContainerStarted","Data":"a8eceae2cb28648b9406b1148fdea0117dd25d1ea94e8094026356eedb1595fd"} Mar 09 09:18:21 crc kubenswrapper[4861]: I0309 09:18:21.195263 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-69594cc75-wz5b2" 
event={"ID":"194a66ef-2b30-40b7-bbfd-5a2c3a51ad55","Type":"ContainerStarted","Data":"c0e182bfa2b6710743969a07267d17927e97201cfc6a3b268e7a3afc91d4a623"} Mar 09 09:18:21 crc kubenswrapper[4861]: I0309 09:18:21.196480 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-qvk5q" event={"ID":"09f364f8-e2b6-4ffe-b51a-37af17081bf8","Type":"ContainerStarted","Data":"97b9cb4edf918c1658979d0002d3e0a83a2dc8d0001af87fcaf6206394023b3c"} Mar 09 09:18:21 crc kubenswrapper[4861]: I0309 09:18:21.197517 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-786f45cff4-27zgp" event={"ID":"f313fc1b-02bd-4bbd-bfdb-a18a300ed8bb","Type":"ContainerStarted","Data":"0d8ac9085bab54bf3f8acae23c99df93274ddd31fff8f2de1ae1f9191bd72370"} Mar 09 09:18:21 crc kubenswrapper[4861]: I0309 09:18:21.307605 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-mgp5h" Mar 09 09:18:21 crc kubenswrapper[4861]: I0309 09:18:21.487088 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-mgp5h"] Mar 09 09:18:22 crc kubenswrapper[4861]: I0309 09:18:22.204585 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-mgp5h" event={"ID":"d29132f4-b735-40f3-94da-033f1174963f","Type":"ContainerStarted","Data":"b56ea0fbd2ac09126daf09c937fabf4e8528eb9a2049e45ede298ce3c5ae0c87"} Mar 09 09:18:22 crc kubenswrapper[4861]: I0309 09:18:22.207987 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jb6vv" event={"ID":"d45d7871-0d6e-4307-a739-9ae6d31f6ee1","Type":"ContainerStarted","Data":"1dc636adff0c2b4e3b5b51945777df97a2c74f1e9fdc53a52b4b3ceb42db3da3"} Mar 09 09:18:22 crc kubenswrapper[4861]: I0309 09:18:22.209339 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-console/console-5d7868ff7-8vxms" event={"ID":"5113fde6-e331-4c76-abdd-2016bd8facc1","Type":"ContainerStarted","Data":"d7a2519484c187713cc59d1a46606839c645becf76abadf2563fda8064e9802c"} Mar 09 09:18:22 crc kubenswrapper[4861]: I0309 09:18:22.251101 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5d7868ff7-8vxms" podStartSLOduration=2.251078593 podStartE2EDuration="2.251078593s" podCreationTimestamp="2026-03-09 09:18:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:18:22.245915544 +0000 UTC m=+745.330954965" watchObservedRunningTime="2026-03-09 09:18:22.251078593 +0000 UTC m=+745.336117994" Mar 09 09:18:23 crc kubenswrapper[4861]: I0309 09:18:23.222610 4861 generic.go:334] "Generic (PLEG): container finished" podID="d45d7871-0d6e-4307-a739-9ae6d31f6ee1" containerID="1dc636adff0c2b4e3b5b51945777df97a2c74f1e9fdc53a52b4b3ceb42db3da3" exitCode=0 Mar 09 09:18:23 crc kubenswrapper[4861]: I0309 09:18:23.223641 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jb6vv" event={"ID":"d45d7871-0d6e-4307-a739-9ae6d31f6ee1","Type":"ContainerDied","Data":"1dc636adff0c2b4e3b5b51945777df97a2c74f1e9fdc53a52b4b3ceb42db3da3"} Mar 09 09:18:24 crc kubenswrapper[4861]: I0309 09:18:24.232838 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-786f45cff4-27zgp" event={"ID":"f313fc1b-02bd-4bbd-bfdb-a18a300ed8bb","Type":"ContainerStarted","Data":"ee96dcd4decc431c6ac454aea3bfb3a82e965b99ab5f6d529bc5a2f8cf7c9b23"} Mar 09 09:18:24 crc kubenswrapper[4861]: I0309 09:18:24.233256 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-786f45cff4-27zgp" Mar 09 09:18:24 crc kubenswrapper[4861]: I0309 09:18:24.234732 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-nmstate/nmstate-metrics-69594cc75-wz5b2" event={"ID":"194a66ef-2b30-40b7-bbfd-5a2c3a51ad55","Type":"ContainerStarted","Data":"8e5e9a82df361bf63ee1814beb704f451d2c3d30e3af5739b8bcc5da58c1e2ba"} Mar 09 09:18:24 crc kubenswrapper[4861]: I0309 09:18:24.236506 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-qvk5q" event={"ID":"09f364f8-e2b6-4ffe-b51a-37af17081bf8","Type":"ContainerStarted","Data":"a1ecede0ce8fed42d8e441e5f0e4ff78bab5a994c3683f050f9b48106abf8ab1"} Mar 09 09:18:24 crc kubenswrapper[4861]: I0309 09:18:24.236752 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-qvk5q" Mar 09 09:18:24 crc kubenswrapper[4861]: I0309 09:18:24.257969 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-786f45cff4-27zgp" podStartSLOduration=2.202110578 podStartE2EDuration="4.257947681s" podCreationTimestamp="2026-03-09 09:18:20 +0000 UTC" firstStartedPulling="2026-03-09 09:18:21.165594624 +0000 UTC m=+744.250634025" lastFinishedPulling="2026-03-09 09:18:23.221431727 +0000 UTC m=+746.306471128" observedRunningTime="2026-03-09 09:18:24.251683569 +0000 UTC m=+747.336723060" watchObservedRunningTime="2026-03-09 09:18:24.257947681 +0000 UTC m=+747.342987082" Mar 09 09:18:24 crc kubenswrapper[4861]: I0309 09:18:24.269951 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-qvk5q" podStartSLOduration=1.735359739 podStartE2EDuration="4.269935947s" podCreationTimestamp="2026-03-09 09:18:20 +0000 UTC" firstStartedPulling="2026-03-09 09:18:20.666676805 +0000 UTC m=+743.751716206" lastFinishedPulling="2026-03-09 09:18:23.201253013 +0000 UTC m=+746.286292414" observedRunningTime="2026-03-09 09:18:24.26899032 +0000 UTC m=+747.354029771" watchObservedRunningTime="2026-03-09 09:18:24.269935947 +0000 UTC m=+747.354975348" Mar 09 09:18:24 crc kubenswrapper[4861]: 
I0309 09:18:24.605868 4861 patch_prober.go:28] interesting pod/machine-config-daemon-5g7gc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 09:18:24 crc kubenswrapper[4861]: I0309 09:18:24.606236 4861 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 09:18:25 crc kubenswrapper[4861]: I0309 09:18:25.244850 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-mgp5h" event={"ID":"d29132f4-b735-40f3-94da-033f1174963f","Type":"ContainerStarted","Data":"ada0cf9f1451722b60c25b7f81a86c1005d7735763de0d92c8c1308a3ab9a43a"} Mar 09 09:18:25 crc kubenswrapper[4861]: I0309 09:18:25.250259 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jb6vv" event={"ID":"d45d7871-0d6e-4307-a739-9ae6d31f6ee1","Type":"ContainerStarted","Data":"28badb1f6ded10564a77f80d961e4c6ece938042b81e6499d6bb4168d7ec877c"} Mar 09 09:18:25 crc kubenswrapper[4861]: I0309 09:18:25.284017 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-mgp5h" podStartSLOduration=2.542850698 podStartE2EDuration="5.283998883s" podCreationTimestamp="2026-03-09 09:18:20 +0000 UTC" firstStartedPulling="2026-03-09 09:18:21.494035729 +0000 UTC m=+744.579075130" lastFinishedPulling="2026-03-09 09:18:24.235183914 +0000 UTC m=+747.320223315" observedRunningTime="2026-03-09 09:18:25.263255154 +0000 UTC m=+748.348294565" watchObservedRunningTime="2026-03-09 09:18:25.283998883 +0000 UTC 
m=+748.369038294" Mar 09 09:18:25 crc kubenswrapper[4861]: I0309 09:18:25.286382 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-jb6vv" podStartSLOduration=2.033572141 podStartE2EDuration="5.286356971s" podCreationTimestamp="2026-03-09 09:18:20 +0000 UTC" firstStartedPulling="2026-03-09 09:18:21.191732819 +0000 UTC m=+744.276772230" lastFinishedPulling="2026-03-09 09:18:24.444517609 +0000 UTC m=+747.529557060" observedRunningTime="2026-03-09 09:18:25.280733488 +0000 UTC m=+748.365772899" watchObservedRunningTime="2026-03-09 09:18:25.286356971 +0000 UTC m=+748.371396382" Mar 09 09:18:28 crc kubenswrapper[4861]: I0309 09:18:28.281547 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-69594cc75-wz5b2" event={"ID":"194a66ef-2b30-40b7-bbfd-5a2c3a51ad55","Type":"ContainerStarted","Data":"ddb7fb210ba1117491d9c791f2ca790f7fe2843a5316c8b76065e66506816499"} Mar 09 09:18:28 crc kubenswrapper[4861]: I0309 09:18:28.307325 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-69594cc75-wz5b2" podStartSLOduration=2.103543191 podStartE2EDuration="8.307295945s" podCreationTimestamp="2026-03-09 09:18:20 +0000 UTC" firstStartedPulling="2026-03-09 09:18:20.913647707 +0000 UTC m=+743.998687108" lastFinishedPulling="2026-03-09 09:18:27.117400461 +0000 UTC m=+750.202439862" observedRunningTime="2026-03-09 09:18:28.304794263 +0000 UTC m=+751.389833714" watchObservedRunningTime="2026-03-09 09:18:28.307295945 +0000 UTC m=+751.392335386" Mar 09 09:18:30 crc kubenswrapper[4861]: I0309 09:18:30.567039 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-jb6vv" Mar 09 09:18:30 crc kubenswrapper[4861]: I0309 09:18:30.567481 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-jb6vv" Mar 09 
09:18:30 crc kubenswrapper[4861]: I0309 09:18:30.627312 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-qvk5q" Mar 09 09:18:30 crc kubenswrapper[4861]: I0309 09:18:30.643096 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-jb6vv" Mar 09 09:18:30 crc kubenswrapper[4861]: I0309 09:18:30.924510 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5d7868ff7-8vxms" Mar 09 09:18:30 crc kubenswrapper[4861]: I0309 09:18:30.925507 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-5d7868ff7-8vxms" Mar 09 09:18:30 crc kubenswrapper[4861]: I0309 09:18:30.931755 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5d7868ff7-8vxms" Mar 09 09:18:31 crc kubenswrapper[4861]: I0309 09:18:31.311926 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5d7868ff7-8vxms" Mar 09 09:18:31 crc kubenswrapper[4861]: I0309 09:18:31.396798 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-qh9sg"] Mar 09 09:18:31 crc kubenswrapper[4861]: I0309 09:18:31.414953 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-jb6vv" Mar 09 09:18:31 crc kubenswrapper[4861]: I0309 09:18:31.475092 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jb6vv"] Mar 09 09:18:33 crc kubenswrapper[4861]: I0309 09:18:33.319758 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-jb6vv" podUID="d45d7871-0d6e-4307-a739-9ae6d31f6ee1" containerName="registry-server" containerID="cri-o://28badb1f6ded10564a77f80d961e4c6ece938042b81e6499d6bb4168d7ec877c" 
gracePeriod=2 Mar 09 09:18:33 crc kubenswrapper[4861]: I0309 09:18:33.708832 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jb6vv" Mar 09 09:18:33 crc kubenswrapper[4861]: I0309 09:18:33.760329 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-khww2\" (UniqueName: \"kubernetes.io/projected/d45d7871-0d6e-4307-a739-9ae6d31f6ee1-kube-api-access-khww2\") pod \"d45d7871-0d6e-4307-a739-9ae6d31f6ee1\" (UID: \"d45d7871-0d6e-4307-a739-9ae6d31f6ee1\") " Mar 09 09:18:33 crc kubenswrapper[4861]: I0309 09:18:33.760413 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d45d7871-0d6e-4307-a739-9ae6d31f6ee1-utilities\") pod \"d45d7871-0d6e-4307-a739-9ae6d31f6ee1\" (UID: \"d45d7871-0d6e-4307-a739-9ae6d31f6ee1\") " Mar 09 09:18:33 crc kubenswrapper[4861]: I0309 09:18:33.760436 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d45d7871-0d6e-4307-a739-9ae6d31f6ee1-catalog-content\") pod \"d45d7871-0d6e-4307-a739-9ae6d31f6ee1\" (UID: \"d45d7871-0d6e-4307-a739-9ae6d31f6ee1\") " Mar 09 09:18:33 crc kubenswrapper[4861]: I0309 09:18:33.761278 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d45d7871-0d6e-4307-a739-9ae6d31f6ee1-utilities" (OuterVolumeSpecName: "utilities") pod "d45d7871-0d6e-4307-a739-9ae6d31f6ee1" (UID: "d45d7871-0d6e-4307-a739-9ae6d31f6ee1"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:18:33 crc kubenswrapper[4861]: I0309 09:18:33.765594 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d45d7871-0d6e-4307-a739-9ae6d31f6ee1-kube-api-access-khww2" (OuterVolumeSpecName: "kube-api-access-khww2") pod "d45d7871-0d6e-4307-a739-9ae6d31f6ee1" (UID: "d45d7871-0d6e-4307-a739-9ae6d31f6ee1"). InnerVolumeSpecName "kube-api-access-khww2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:18:33 crc kubenswrapper[4861]: I0309 09:18:33.813309 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d45d7871-0d6e-4307-a739-9ae6d31f6ee1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d45d7871-0d6e-4307-a739-9ae6d31f6ee1" (UID: "d45d7871-0d6e-4307-a739-9ae6d31f6ee1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:18:33 crc kubenswrapper[4861]: I0309 09:18:33.861734 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-khww2\" (UniqueName: \"kubernetes.io/projected/d45d7871-0d6e-4307-a739-9ae6d31f6ee1-kube-api-access-khww2\") on node \"crc\" DevicePath \"\"" Mar 09 09:18:33 crc kubenswrapper[4861]: I0309 09:18:33.861766 4861 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d45d7871-0d6e-4307-a739-9ae6d31f6ee1-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 09:18:33 crc kubenswrapper[4861]: I0309 09:18:33.861777 4861 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d45d7871-0d6e-4307-a739-9ae6d31f6ee1-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 09:18:34 crc kubenswrapper[4861]: I0309 09:18:34.331157 4861 generic.go:334] "Generic (PLEG): container finished" podID="d45d7871-0d6e-4307-a739-9ae6d31f6ee1" 
containerID="28badb1f6ded10564a77f80d961e4c6ece938042b81e6499d6bb4168d7ec877c" exitCode=0 Mar 09 09:18:34 crc kubenswrapper[4861]: I0309 09:18:34.331216 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jb6vv" event={"ID":"d45d7871-0d6e-4307-a739-9ae6d31f6ee1","Type":"ContainerDied","Data":"28badb1f6ded10564a77f80d961e4c6ece938042b81e6499d6bb4168d7ec877c"} Mar 09 09:18:34 crc kubenswrapper[4861]: I0309 09:18:34.331565 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jb6vv" event={"ID":"d45d7871-0d6e-4307-a739-9ae6d31f6ee1","Type":"ContainerDied","Data":"5c049116a9d00690cdf7bdbea888b95c491f9bedef53c3f55c739a87968ee1e0"} Mar 09 09:18:34 crc kubenswrapper[4861]: I0309 09:18:34.331595 4861 scope.go:117] "RemoveContainer" containerID="28badb1f6ded10564a77f80d961e4c6ece938042b81e6499d6bb4168d7ec877c" Mar 09 09:18:34 crc kubenswrapper[4861]: I0309 09:18:34.331271 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jb6vv" Mar 09 09:18:34 crc kubenswrapper[4861]: I0309 09:18:34.360845 4861 scope.go:117] "RemoveContainer" containerID="1dc636adff0c2b4e3b5b51945777df97a2c74f1e9fdc53a52b4b3ceb42db3da3" Mar 09 09:18:34 crc kubenswrapper[4861]: I0309 09:18:34.383228 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jb6vv"] Mar 09 09:18:34 crc kubenswrapper[4861]: I0309 09:18:34.390935 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-jb6vv"] Mar 09 09:18:34 crc kubenswrapper[4861]: I0309 09:18:34.408087 4861 scope.go:117] "RemoveContainer" containerID="89877e982b80d31bf72bb67312668848aca782aee35e2af0835f52863432d6eb" Mar 09 09:18:34 crc kubenswrapper[4861]: I0309 09:18:34.424492 4861 scope.go:117] "RemoveContainer" containerID="28badb1f6ded10564a77f80d961e4c6ece938042b81e6499d6bb4168d7ec877c" Mar 09 09:18:34 crc kubenswrapper[4861]: E0309 09:18:34.425216 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28badb1f6ded10564a77f80d961e4c6ece938042b81e6499d6bb4168d7ec877c\": container with ID starting with 28badb1f6ded10564a77f80d961e4c6ece938042b81e6499d6bb4168d7ec877c not found: ID does not exist" containerID="28badb1f6ded10564a77f80d961e4c6ece938042b81e6499d6bb4168d7ec877c" Mar 09 09:18:34 crc kubenswrapper[4861]: I0309 09:18:34.425447 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28badb1f6ded10564a77f80d961e4c6ece938042b81e6499d6bb4168d7ec877c"} err="failed to get container status \"28badb1f6ded10564a77f80d961e4c6ece938042b81e6499d6bb4168d7ec877c\": rpc error: code = NotFound desc = could not find container \"28badb1f6ded10564a77f80d961e4c6ece938042b81e6499d6bb4168d7ec877c\": container with ID starting with 28badb1f6ded10564a77f80d961e4c6ece938042b81e6499d6bb4168d7ec877c not 
found: ID does not exist" Mar 09 09:18:34 crc kubenswrapper[4861]: I0309 09:18:34.425610 4861 scope.go:117] "RemoveContainer" containerID="1dc636adff0c2b4e3b5b51945777df97a2c74f1e9fdc53a52b4b3ceb42db3da3" Mar 09 09:18:34 crc kubenswrapper[4861]: E0309 09:18:34.426121 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1dc636adff0c2b4e3b5b51945777df97a2c74f1e9fdc53a52b4b3ceb42db3da3\": container with ID starting with 1dc636adff0c2b4e3b5b51945777df97a2c74f1e9fdc53a52b4b3ceb42db3da3 not found: ID does not exist" containerID="1dc636adff0c2b4e3b5b51945777df97a2c74f1e9fdc53a52b4b3ceb42db3da3" Mar 09 09:18:34 crc kubenswrapper[4861]: I0309 09:18:34.426151 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1dc636adff0c2b4e3b5b51945777df97a2c74f1e9fdc53a52b4b3ceb42db3da3"} err="failed to get container status \"1dc636adff0c2b4e3b5b51945777df97a2c74f1e9fdc53a52b4b3ceb42db3da3\": rpc error: code = NotFound desc = could not find container \"1dc636adff0c2b4e3b5b51945777df97a2c74f1e9fdc53a52b4b3ceb42db3da3\": container with ID starting with 1dc636adff0c2b4e3b5b51945777df97a2c74f1e9fdc53a52b4b3ceb42db3da3 not found: ID does not exist" Mar 09 09:18:34 crc kubenswrapper[4861]: I0309 09:18:34.426171 4861 scope.go:117] "RemoveContainer" containerID="89877e982b80d31bf72bb67312668848aca782aee35e2af0835f52863432d6eb" Mar 09 09:18:34 crc kubenswrapper[4861]: E0309 09:18:34.426644 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89877e982b80d31bf72bb67312668848aca782aee35e2af0835f52863432d6eb\": container with ID starting with 89877e982b80d31bf72bb67312668848aca782aee35e2af0835f52863432d6eb not found: ID does not exist" containerID="89877e982b80d31bf72bb67312668848aca782aee35e2af0835f52863432d6eb" Mar 09 09:18:34 crc kubenswrapper[4861]: I0309 09:18:34.426738 4861 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89877e982b80d31bf72bb67312668848aca782aee35e2af0835f52863432d6eb"} err="failed to get container status \"89877e982b80d31bf72bb67312668848aca782aee35e2af0835f52863432d6eb\": rpc error: code = NotFound desc = could not find container \"89877e982b80d31bf72bb67312668848aca782aee35e2af0835f52863432d6eb\": container with ID starting with 89877e982b80d31bf72bb67312668848aca782aee35e2af0835f52863432d6eb not found: ID does not exist" Mar 09 09:18:34 crc kubenswrapper[4861]: I0309 09:18:34.713667 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-g8xgb"] Mar 09 09:18:34 crc kubenswrapper[4861]: E0309 09:18:34.714023 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d45d7871-0d6e-4307-a739-9ae6d31f6ee1" containerName="registry-server" Mar 09 09:18:34 crc kubenswrapper[4861]: I0309 09:18:34.714044 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="d45d7871-0d6e-4307-a739-9ae6d31f6ee1" containerName="registry-server" Mar 09 09:18:34 crc kubenswrapper[4861]: E0309 09:18:34.714059 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d45d7871-0d6e-4307-a739-9ae6d31f6ee1" containerName="extract-utilities" Mar 09 09:18:34 crc kubenswrapper[4861]: I0309 09:18:34.714070 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="d45d7871-0d6e-4307-a739-9ae6d31f6ee1" containerName="extract-utilities" Mar 09 09:18:34 crc kubenswrapper[4861]: E0309 09:18:34.714097 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d45d7871-0d6e-4307-a739-9ae6d31f6ee1" containerName="extract-content" Mar 09 09:18:34 crc kubenswrapper[4861]: I0309 09:18:34.714111 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="d45d7871-0d6e-4307-a739-9ae6d31f6ee1" containerName="extract-content" Mar 09 09:18:34 crc kubenswrapper[4861]: I0309 09:18:34.714283 4861 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="d45d7871-0d6e-4307-a739-9ae6d31f6ee1" containerName="registry-server" Mar 09 09:18:34 crc kubenswrapper[4861]: I0309 09:18:34.715535 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-g8xgb" Mar 09 09:18:34 crc kubenswrapper[4861]: I0309 09:18:34.718109 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-g8xgb"] Mar 09 09:18:34 crc kubenswrapper[4861]: I0309 09:18:34.779585 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/efd4992d-ceb0-465e-925d-4f8784c8c803-catalog-content\") pod \"certified-operators-g8xgb\" (UID: \"efd4992d-ceb0-465e-925d-4f8784c8c803\") " pod="openshift-marketplace/certified-operators-g8xgb" Mar 09 09:18:34 crc kubenswrapper[4861]: I0309 09:18:34.779997 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/efd4992d-ceb0-465e-925d-4f8784c8c803-utilities\") pod \"certified-operators-g8xgb\" (UID: \"efd4992d-ceb0-465e-925d-4f8784c8c803\") " pod="openshift-marketplace/certified-operators-g8xgb" Mar 09 09:18:34 crc kubenswrapper[4861]: I0309 09:18:34.780083 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8d9w\" (UniqueName: \"kubernetes.io/projected/efd4992d-ceb0-465e-925d-4f8784c8c803-kube-api-access-c8d9w\") pod \"certified-operators-g8xgb\" (UID: \"efd4992d-ceb0-465e-925d-4f8784c8c803\") " pod="openshift-marketplace/certified-operators-g8xgb" Mar 09 09:18:34 crc kubenswrapper[4861]: I0309 09:18:34.881485 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/efd4992d-ceb0-465e-925d-4f8784c8c803-catalog-content\") pod 
\"certified-operators-g8xgb\" (UID: \"efd4992d-ceb0-465e-925d-4f8784c8c803\") " pod="openshift-marketplace/certified-operators-g8xgb" Mar 09 09:18:34 crc kubenswrapper[4861]: I0309 09:18:34.881907 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/efd4992d-ceb0-465e-925d-4f8784c8c803-utilities\") pod \"certified-operators-g8xgb\" (UID: \"efd4992d-ceb0-465e-925d-4f8784c8c803\") " pod="openshift-marketplace/certified-operators-g8xgb" Mar 09 09:18:34 crc kubenswrapper[4861]: I0309 09:18:34.882100 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8d9w\" (UniqueName: \"kubernetes.io/projected/efd4992d-ceb0-465e-925d-4f8784c8c803-kube-api-access-c8d9w\") pod \"certified-operators-g8xgb\" (UID: \"efd4992d-ceb0-465e-925d-4f8784c8c803\") " pod="openshift-marketplace/certified-operators-g8xgb" Mar 09 09:18:34 crc kubenswrapper[4861]: I0309 09:18:34.882403 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/efd4992d-ceb0-465e-925d-4f8784c8c803-catalog-content\") pod \"certified-operators-g8xgb\" (UID: \"efd4992d-ceb0-465e-925d-4f8784c8c803\") " pod="openshift-marketplace/certified-operators-g8xgb" Mar 09 09:18:34 crc kubenswrapper[4861]: I0309 09:18:34.882428 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/efd4992d-ceb0-465e-925d-4f8784c8c803-utilities\") pod \"certified-operators-g8xgb\" (UID: \"efd4992d-ceb0-465e-925d-4f8784c8c803\") " pod="openshift-marketplace/certified-operators-g8xgb" Mar 09 09:18:34 crc kubenswrapper[4861]: I0309 09:18:34.910406 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8d9w\" (UniqueName: \"kubernetes.io/projected/efd4992d-ceb0-465e-925d-4f8784c8c803-kube-api-access-c8d9w\") pod \"certified-operators-g8xgb\" (UID: 
\"efd4992d-ceb0-465e-925d-4f8784c8c803\") " pod="openshift-marketplace/certified-operators-g8xgb" Mar 09 09:18:35 crc kubenswrapper[4861]: I0309 09:18:35.043044 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-g8xgb" Mar 09 09:18:35 crc kubenswrapper[4861]: I0309 09:18:35.493970 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-g8xgb"] Mar 09 09:18:35 crc kubenswrapper[4861]: W0309 09:18:35.498630 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podefd4992d_ceb0_465e_925d_4f8784c8c803.slice/crio-a45c54596e195b5e87f6be828e16027ba6f711c911430b1253f69848cbb29a78 WatchSource:0}: Error finding container a45c54596e195b5e87f6be828e16027ba6f711c911430b1253f69848cbb29a78: Status 404 returned error can't find the container with id a45c54596e195b5e87f6be828e16027ba6f711c911430b1253f69848cbb29a78 Mar 09 09:18:35 crc kubenswrapper[4861]: I0309 09:18:35.665700 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d45d7871-0d6e-4307-a739-9ae6d31f6ee1" path="/var/lib/kubelet/pods/d45d7871-0d6e-4307-a739-9ae6d31f6ee1/volumes" Mar 09 09:18:36 crc kubenswrapper[4861]: I0309 09:18:36.357050 4861 generic.go:334] "Generic (PLEG): container finished" podID="efd4992d-ceb0-465e-925d-4f8784c8c803" containerID="dae4fea06f99dea35d2ff2f209bf85e3ed0f44e10047ff5ceff4f4734083faba" exitCode=0 Mar 09 09:18:36 crc kubenswrapper[4861]: I0309 09:18:36.357164 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g8xgb" event={"ID":"efd4992d-ceb0-465e-925d-4f8784c8c803","Type":"ContainerDied","Data":"dae4fea06f99dea35d2ff2f209bf85e3ed0f44e10047ff5ceff4f4734083faba"} Mar 09 09:18:36 crc kubenswrapper[4861]: I0309 09:18:36.357584 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-g8xgb" event={"ID":"efd4992d-ceb0-465e-925d-4f8784c8c803","Type":"ContainerStarted","Data":"a45c54596e195b5e87f6be828e16027ba6f711c911430b1253f69848cbb29a78"} Mar 09 09:18:37 crc kubenswrapper[4861]: I0309 09:18:37.378410 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g8xgb" event={"ID":"efd4992d-ceb0-465e-925d-4f8784c8c803","Type":"ContainerStarted","Data":"ff5d0d3c183b5e153945f524d408bf926b704ce8aeb33e7be65db1366dbcc118"} Mar 09 09:18:38 crc kubenswrapper[4861]: I0309 09:18:38.389346 4861 generic.go:334] "Generic (PLEG): container finished" podID="efd4992d-ceb0-465e-925d-4f8784c8c803" containerID="ff5d0d3c183b5e153945f524d408bf926b704ce8aeb33e7be65db1366dbcc118" exitCode=0 Mar 09 09:18:38 crc kubenswrapper[4861]: I0309 09:18:38.389467 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g8xgb" event={"ID":"efd4992d-ceb0-465e-925d-4f8784c8c803","Type":"ContainerDied","Data":"ff5d0d3c183b5e153945f524d408bf926b704ce8aeb33e7be65db1366dbcc118"} Mar 09 09:18:39 crc kubenswrapper[4861]: I0309 09:18:39.398453 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g8xgb" event={"ID":"efd4992d-ceb0-465e-925d-4f8784c8c803","Type":"ContainerStarted","Data":"38a0030576cf4060ff77abf6c999affadd566a85ecd6bd94e1c2c5f20622b906"} Mar 09 09:18:39 crc kubenswrapper[4861]: I0309 09:18:39.425432 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-g8xgb" podStartSLOduration=2.825137071 podStartE2EDuration="5.425403119s" podCreationTimestamp="2026-03-09 09:18:34 +0000 UTC" firstStartedPulling="2026-03-09 09:18:36.359132162 +0000 UTC m=+759.444171573" lastFinishedPulling="2026-03-09 09:18:38.95939822 +0000 UTC m=+762.044437621" observedRunningTime="2026-03-09 09:18:39.420787035 +0000 UTC m=+762.505826436" 
watchObservedRunningTime="2026-03-09 09:18:39.425403119 +0000 UTC m=+762.510442560" Mar 09 09:18:40 crc kubenswrapper[4861]: I0309 09:18:40.564276 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-786f45cff4-27zgp" Mar 09 09:18:45 crc kubenswrapper[4861]: I0309 09:18:45.043183 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-g8xgb" Mar 09 09:18:45 crc kubenswrapper[4861]: I0309 09:18:45.045052 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-g8xgb" Mar 09 09:18:45 crc kubenswrapper[4861]: I0309 09:18:45.088843 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-g8xgb" Mar 09 09:18:45 crc kubenswrapper[4861]: I0309 09:18:45.509891 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-g8xgb" Mar 09 09:18:45 crc kubenswrapper[4861]: I0309 09:18:45.558814 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-g8xgb"] Mar 09 09:18:47 crc kubenswrapper[4861]: I0309 09:18:47.463158 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-g8xgb" podUID="efd4992d-ceb0-465e-925d-4f8784c8c803" containerName="registry-server" containerID="cri-o://38a0030576cf4060ff77abf6c999affadd566a85ecd6bd94e1c2c5f20622b906" gracePeriod=2 Mar 09 09:18:48 crc kubenswrapper[4861]: I0309 09:18:48.483165 4861 generic.go:334] "Generic (PLEG): container finished" podID="efd4992d-ceb0-465e-925d-4f8784c8c803" containerID="38a0030576cf4060ff77abf6c999affadd566a85ecd6bd94e1c2c5f20622b906" exitCode=0 Mar 09 09:18:48 crc kubenswrapper[4861]: I0309 09:18:48.483238 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-g8xgb" event={"ID":"efd4992d-ceb0-465e-925d-4f8784c8c803","Type":"ContainerDied","Data":"38a0030576cf4060ff77abf6c999affadd566a85ecd6bd94e1c2c5f20622b906"} Mar 09 09:18:49 crc kubenswrapper[4861]: I0309 09:18:49.100499 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-g8xgb" Mar 09 09:18:49 crc kubenswrapper[4861]: I0309 09:18:49.189541 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c8d9w\" (UniqueName: \"kubernetes.io/projected/efd4992d-ceb0-465e-925d-4f8784c8c803-kube-api-access-c8d9w\") pod \"efd4992d-ceb0-465e-925d-4f8784c8c803\" (UID: \"efd4992d-ceb0-465e-925d-4f8784c8c803\") " Mar 09 09:18:49 crc kubenswrapper[4861]: I0309 09:18:49.189970 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/efd4992d-ceb0-465e-925d-4f8784c8c803-catalog-content\") pod \"efd4992d-ceb0-465e-925d-4f8784c8c803\" (UID: \"efd4992d-ceb0-465e-925d-4f8784c8c803\") " Mar 09 09:18:49 crc kubenswrapper[4861]: I0309 09:18:49.190018 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/efd4992d-ceb0-465e-925d-4f8784c8c803-utilities\") pod \"efd4992d-ceb0-465e-925d-4f8784c8c803\" (UID: \"efd4992d-ceb0-465e-925d-4f8784c8c803\") " Mar 09 09:18:49 crc kubenswrapper[4861]: I0309 09:18:49.191039 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/efd4992d-ceb0-465e-925d-4f8784c8c803-utilities" (OuterVolumeSpecName: "utilities") pod "efd4992d-ceb0-465e-925d-4f8784c8c803" (UID: "efd4992d-ceb0-465e-925d-4f8784c8c803"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:18:49 crc kubenswrapper[4861]: I0309 09:18:49.195094 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efd4992d-ceb0-465e-925d-4f8784c8c803-kube-api-access-c8d9w" (OuterVolumeSpecName: "kube-api-access-c8d9w") pod "efd4992d-ceb0-465e-925d-4f8784c8c803" (UID: "efd4992d-ceb0-465e-925d-4f8784c8c803"). InnerVolumeSpecName "kube-api-access-c8d9w". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:18:49 crc kubenswrapper[4861]: I0309 09:18:49.240074 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/efd4992d-ceb0-465e-925d-4f8784c8c803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "efd4992d-ceb0-465e-925d-4f8784c8c803" (UID: "efd4992d-ceb0-465e-925d-4f8784c8c803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:18:49 crc kubenswrapper[4861]: I0309 09:18:49.292501 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c8d9w\" (UniqueName: \"kubernetes.io/projected/efd4992d-ceb0-465e-925d-4f8784c8c803-kube-api-access-c8d9w\") on node \"crc\" DevicePath \"\"" Mar 09 09:18:49 crc kubenswrapper[4861]: I0309 09:18:49.292542 4861 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/efd4992d-ceb0-465e-925d-4f8784c8c803-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 09:18:49 crc kubenswrapper[4861]: I0309 09:18:49.292555 4861 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/efd4992d-ceb0-465e-925d-4f8784c8c803-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 09:18:49 crc kubenswrapper[4861]: I0309 09:18:49.492634 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g8xgb" 
event={"ID":"efd4992d-ceb0-465e-925d-4f8784c8c803","Type":"ContainerDied","Data":"a45c54596e195b5e87f6be828e16027ba6f711c911430b1253f69848cbb29a78"} Mar 09 09:18:49 crc kubenswrapper[4861]: I0309 09:18:49.492801 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-g8xgb" Mar 09 09:18:49 crc kubenswrapper[4861]: I0309 09:18:49.493025 4861 scope.go:117] "RemoveContainer" containerID="38a0030576cf4060ff77abf6c999affadd566a85ecd6bd94e1c2c5f20622b906" Mar 09 09:18:49 crc kubenswrapper[4861]: I0309 09:18:49.511087 4861 scope.go:117] "RemoveContainer" containerID="ff5d0d3c183b5e153945f524d408bf926b704ce8aeb33e7be65db1366dbcc118" Mar 09 09:18:49 crc kubenswrapper[4861]: I0309 09:18:49.525285 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-g8xgb"] Mar 09 09:18:49 crc kubenswrapper[4861]: I0309 09:18:49.531198 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-g8xgb"] Mar 09 09:18:49 crc kubenswrapper[4861]: I0309 09:18:49.551530 4861 scope.go:117] "RemoveContainer" containerID="dae4fea06f99dea35d2ff2f209bf85e3ed0f44e10047ff5ceff4f4734083faba" Mar 09 09:18:49 crc kubenswrapper[4861]: I0309 09:18:49.666159 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efd4992d-ceb0-465e-925d-4f8784c8c803" path="/var/lib/kubelet/pods/efd4992d-ceb0-465e-925d-4f8784c8c803/volumes" Mar 09 09:18:54 crc kubenswrapper[4861]: I0309 09:18:54.606807 4861 patch_prober.go:28] interesting pod/machine-config-daemon-5g7gc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 09:18:54 crc kubenswrapper[4861]: I0309 09:18:54.607444 4861 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 09:18:56 crc kubenswrapper[4861]: I0309 09:18:56.451568 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-qh9sg" podUID="85a3bbcb-e663-4a97-980c-606c979409d7" containerName="console" containerID="cri-o://b7d6201629b31402e71bf297f3c26e30d1eebf556b0e934c4a6cae1f37951cd1" gracePeriod=15 Mar 09 09:18:56 crc kubenswrapper[4861]: I0309 09:18:56.815798 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4s7b95"] Mar 09 09:18:56 crc kubenswrapper[4861]: E0309 09:18:56.816068 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efd4992d-ceb0-465e-925d-4f8784c8c803" containerName="registry-server" Mar 09 09:18:56 crc kubenswrapper[4861]: I0309 09:18:56.816102 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="efd4992d-ceb0-465e-925d-4f8784c8c803" containerName="registry-server" Mar 09 09:18:56 crc kubenswrapper[4861]: E0309 09:18:56.816117 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efd4992d-ceb0-465e-925d-4f8784c8c803" containerName="extract-utilities" Mar 09 09:18:56 crc kubenswrapper[4861]: I0309 09:18:56.816123 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="efd4992d-ceb0-465e-925d-4f8784c8c803" containerName="extract-utilities" Mar 09 09:18:56 crc kubenswrapper[4861]: E0309 09:18:56.816133 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efd4992d-ceb0-465e-925d-4f8784c8c803" containerName="extract-content" Mar 09 09:18:56 crc kubenswrapper[4861]: I0309 09:18:56.816140 4861 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="efd4992d-ceb0-465e-925d-4f8784c8c803" containerName="extract-content" Mar 09 09:18:56 crc kubenswrapper[4861]: I0309 09:18:56.816280 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="efd4992d-ceb0-465e-925d-4f8784c8c803" containerName="registry-server" Mar 09 09:18:56 crc kubenswrapper[4861]: I0309 09:18:56.817228 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4s7b95" Mar 09 09:18:56 crc kubenswrapper[4861]: I0309 09:18:56.818856 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 09 09:18:56 crc kubenswrapper[4861]: I0309 09:18:56.826352 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4s7b95"] Mar 09 09:18:56 crc kubenswrapper[4861]: I0309 09:18:56.861953 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-qh9sg_85a3bbcb-e663-4a97-980c-606c979409d7/console/0.log" Mar 09 09:18:56 crc kubenswrapper[4861]: I0309 09:18:56.862013 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-qh9sg" Mar 09 09:18:56 crc kubenswrapper[4861]: I0309 09:18:56.895058 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4410072b-3e11-4357-9ce8-2c754f336515-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4s7b95\" (UID: \"4410072b-3e11-4357-9ce8-2c754f336515\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4s7b95" Mar 09 09:18:56 crc kubenswrapper[4861]: I0309 09:18:56.895139 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4410072b-3e11-4357-9ce8-2c754f336515-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4s7b95\" (UID: \"4410072b-3e11-4357-9ce8-2c754f336515\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4s7b95" Mar 09 09:18:56 crc kubenswrapper[4861]: I0309 09:18:56.895228 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nfzj\" (UniqueName: \"kubernetes.io/projected/4410072b-3e11-4357-9ce8-2c754f336515-kube-api-access-9nfzj\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4s7b95\" (UID: \"4410072b-3e11-4357-9ce8-2c754f336515\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4s7b95" Mar 09 09:18:56 crc kubenswrapper[4861]: I0309 09:18:56.995940 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/85a3bbcb-e663-4a97-980c-606c979409d7-service-ca\") pod \"85a3bbcb-e663-4a97-980c-606c979409d7\" (UID: \"85a3bbcb-e663-4a97-980c-606c979409d7\") " Mar 09 09:18:56 crc kubenswrapper[4861]: I0309 09:18:56.996005 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/85a3bbcb-e663-4a97-980c-606c979409d7-console-serving-cert\") pod \"85a3bbcb-e663-4a97-980c-606c979409d7\" (UID: \"85a3bbcb-e663-4a97-980c-606c979409d7\") " Mar 09 09:18:56 crc kubenswrapper[4861]: I0309 09:18:56.996070 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/85a3bbcb-e663-4a97-980c-606c979409d7-console-config\") pod \"85a3bbcb-e663-4a97-980c-606c979409d7\" (UID: \"85a3bbcb-e663-4a97-980c-606c979409d7\") " Mar 09 09:18:56 crc kubenswrapper[4861]: I0309 09:18:56.996141 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8pkvv\" (UniqueName: \"kubernetes.io/projected/85a3bbcb-e663-4a97-980c-606c979409d7-kube-api-access-8pkvv\") pod \"85a3bbcb-e663-4a97-980c-606c979409d7\" (UID: \"85a3bbcb-e663-4a97-980c-606c979409d7\") " Mar 09 09:18:56 crc kubenswrapper[4861]: I0309 09:18:56.996180 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/85a3bbcb-e663-4a97-980c-606c979409d7-oauth-serving-cert\") pod \"85a3bbcb-e663-4a97-980c-606c979409d7\" (UID: \"85a3bbcb-e663-4a97-980c-606c979409d7\") " Mar 09 09:18:56 crc kubenswrapper[4861]: I0309 09:18:56.996284 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/85a3bbcb-e663-4a97-980c-606c979409d7-console-oauth-config\") pod \"85a3bbcb-e663-4a97-980c-606c979409d7\" (UID: \"85a3bbcb-e663-4a97-980c-606c979409d7\") " Mar 09 09:18:56 crc kubenswrapper[4861]: I0309 09:18:56.996322 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/85a3bbcb-e663-4a97-980c-606c979409d7-trusted-ca-bundle\") pod \"85a3bbcb-e663-4a97-980c-606c979409d7\" (UID: 
\"85a3bbcb-e663-4a97-980c-606c979409d7\") " Mar 09 09:18:56 crc kubenswrapper[4861]: I0309 09:18:56.996528 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9nfzj\" (UniqueName: \"kubernetes.io/projected/4410072b-3e11-4357-9ce8-2c754f336515-kube-api-access-9nfzj\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4s7b95\" (UID: \"4410072b-3e11-4357-9ce8-2c754f336515\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4s7b95" Mar 09 09:18:56 crc kubenswrapper[4861]: I0309 09:18:56.996611 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4410072b-3e11-4357-9ce8-2c754f336515-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4s7b95\" (UID: \"4410072b-3e11-4357-9ce8-2c754f336515\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4s7b95" Mar 09 09:18:56 crc kubenswrapper[4861]: I0309 09:18:56.996683 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4410072b-3e11-4357-9ce8-2c754f336515-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4s7b95\" (UID: \"4410072b-3e11-4357-9ce8-2c754f336515\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4s7b95" Mar 09 09:18:56 crc kubenswrapper[4861]: I0309 09:18:56.996901 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85a3bbcb-e663-4a97-980c-606c979409d7-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "85a3bbcb-e663-4a97-980c-606c979409d7" (UID: "85a3bbcb-e663-4a97-980c-606c979409d7"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:18:56 crc kubenswrapper[4861]: I0309 09:18:56.997048 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85a3bbcb-e663-4a97-980c-606c979409d7-service-ca" (OuterVolumeSpecName: "service-ca") pod "85a3bbcb-e663-4a97-980c-606c979409d7" (UID: "85a3bbcb-e663-4a97-980c-606c979409d7"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:18:56 crc kubenswrapper[4861]: I0309 09:18:56.997201 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85a3bbcb-e663-4a97-980c-606c979409d7-console-config" (OuterVolumeSpecName: "console-config") pod "85a3bbcb-e663-4a97-980c-606c979409d7" (UID: "85a3bbcb-e663-4a97-980c-606c979409d7"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:18:56 crc kubenswrapper[4861]: I0309 09:18:56.997494 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85a3bbcb-e663-4a97-980c-606c979409d7-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "85a3bbcb-e663-4a97-980c-606c979409d7" (UID: "85a3bbcb-e663-4a97-980c-606c979409d7"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:18:56 crc kubenswrapper[4861]: I0309 09:18:56.997596 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4410072b-3e11-4357-9ce8-2c754f336515-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4s7b95\" (UID: \"4410072b-3e11-4357-9ce8-2c754f336515\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4s7b95" Mar 09 09:18:56 crc kubenswrapper[4861]: I0309 09:18:56.997653 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4410072b-3e11-4357-9ce8-2c754f336515-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4s7b95\" (UID: \"4410072b-3e11-4357-9ce8-2c754f336515\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4s7b95" Mar 09 09:18:57 crc kubenswrapper[4861]: I0309 09:18:57.001861 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85a3bbcb-e663-4a97-980c-606c979409d7-kube-api-access-8pkvv" (OuterVolumeSpecName: "kube-api-access-8pkvv") pod "85a3bbcb-e663-4a97-980c-606c979409d7" (UID: "85a3bbcb-e663-4a97-980c-606c979409d7"). InnerVolumeSpecName "kube-api-access-8pkvv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:18:57 crc kubenswrapper[4861]: I0309 09:18:57.002568 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85a3bbcb-e663-4a97-980c-606c979409d7-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "85a3bbcb-e663-4a97-980c-606c979409d7" (UID: "85a3bbcb-e663-4a97-980c-606c979409d7"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:18:57 crc kubenswrapper[4861]: I0309 09:18:57.002859 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85a3bbcb-e663-4a97-980c-606c979409d7-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "85a3bbcb-e663-4a97-980c-606c979409d7" (UID: "85a3bbcb-e663-4a97-980c-606c979409d7"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:18:57 crc kubenswrapper[4861]: I0309 09:18:57.011843 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nfzj\" (UniqueName: \"kubernetes.io/projected/4410072b-3e11-4357-9ce8-2c754f336515-kube-api-access-9nfzj\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4s7b95\" (UID: \"4410072b-3e11-4357-9ce8-2c754f336515\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4s7b95" Mar 09 09:18:57 crc kubenswrapper[4861]: I0309 09:18:57.097933 4861 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/85a3bbcb-e663-4a97-980c-606c979409d7-service-ca\") on node \"crc\" DevicePath \"\"" Mar 09 09:18:57 crc kubenswrapper[4861]: I0309 09:18:57.097965 4861 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/85a3bbcb-e663-4a97-980c-606c979409d7-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 09:18:57 crc kubenswrapper[4861]: I0309 09:18:57.097975 4861 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/85a3bbcb-e663-4a97-980c-606c979409d7-console-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:18:57 crc kubenswrapper[4861]: I0309 09:18:57.097984 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8pkvv\" (UniqueName: 
\"kubernetes.io/projected/85a3bbcb-e663-4a97-980c-606c979409d7-kube-api-access-8pkvv\") on node \"crc\" DevicePath \"\"" Mar 09 09:18:57 crc kubenswrapper[4861]: I0309 09:18:57.097992 4861 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/85a3bbcb-e663-4a97-980c-606c979409d7-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 09:18:57 crc kubenswrapper[4861]: I0309 09:18:57.098001 4861 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/85a3bbcb-e663-4a97-980c-606c979409d7-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:18:57 crc kubenswrapper[4861]: I0309 09:18:57.098009 4861 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/85a3bbcb-e663-4a97-980c-606c979409d7-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 09:18:57 crc kubenswrapper[4861]: I0309 09:18:57.177894 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4s7b95" Mar 09 09:18:57 crc kubenswrapper[4861]: I0309 09:18:57.455440 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4s7b95"] Mar 09 09:18:57 crc kubenswrapper[4861]: W0309 09:18:57.463335 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4410072b_3e11_4357_9ce8_2c754f336515.slice/crio-71df96bcbcf2211317daa8fbca64134a713f7910fa3709e0813ce2e09be4ea11 WatchSource:0}: Error finding container 71df96bcbcf2211317daa8fbca64134a713f7910fa3709e0813ce2e09be4ea11: Status 404 returned error can't find the container with id 71df96bcbcf2211317daa8fbca64134a713f7910fa3709e0813ce2e09be4ea11 Mar 09 09:18:57 crc kubenswrapper[4861]: I0309 09:18:57.551979 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-qh9sg_85a3bbcb-e663-4a97-980c-606c979409d7/console/0.log" Mar 09 09:18:57 crc kubenswrapper[4861]: I0309 09:18:57.552033 4861 generic.go:334] "Generic (PLEG): container finished" podID="85a3bbcb-e663-4a97-980c-606c979409d7" containerID="b7d6201629b31402e71bf297f3c26e30d1eebf556b0e934c4a6cae1f37951cd1" exitCode=2 Mar 09 09:18:57 crc kubenswrapper[4861]: I0309 09:18:57.552108 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-qh9sg" Mar 09 09:18:57 crc kubenswrapper[4861]: I0309 09:18:57.552126 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-qh9sg" event={"ID":"85a3bbcb-e663-4a97-980c-606c979409d7","Type":"ContainerDied","Data":"b7d6201629b31402e71bf297f3c26e30d1eebf556b0e934c4a6cae1f37951cd1"} Mar 09 09:18:57 crc kubenswrapper[4861]: I0309 09:18:57.552154 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-qh9sg" event={"ID":"85a3bbcb-e663-4a97-980c-606c979409d7","Type":"ContainerDied","Data":"7a6771f05c9be9277ced67aa4fa1ab94db723fa076712b772ef67122474e30f3"} Mar 09 09:18:57 crc kubenswrapper[4861]: I0309 09:18:57.552172 4861 scope.go:117] "RemoveContainer" containerID="b7d6201629b31402e71bf297f3c26e30d1eebf556b0e934c4a6cae1f37951cd1" Mar 09 09:18:57 crc kubenswrapper[4861]: I0309 09:18:57.553796 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4s7b95" event={"ID":"4410072b-3e11-4357-9ce8-2c754f336515","Type":"ContainerStarted","Data":"71df96bcbcf2211317daa8fbca64134a713f7910fa3709e0813ce2e09be4ea11"} Mar 09 09:18:57 crc kubenswrapper[4861]: I0309 09:18:57.567694 4861 scope.go:117] "RemoveContainer" containerID="b7d6201629b31402e71bf297f3c26e30d1eebf556b0e934c4a6cae1f37951cd1" Mar 09 09:18:57 crc kubenswrapper[4861]: E0309 09:18:57.568140 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7d6201629b31402e71bf297f3c26e30d1eebf556b0e934c4a6cae1f37951cd1\": container with ID starting with b7d6201629b31402e71bf297f3c26e30d1eebf556b0e934c4a6cae1f37951cd1 not found: ID does not exist" containerID="b7d6201629b31402e71bf297f3c26e30d1eebf556b0e934c4a6cae1f37951cd1" Mar 09 09:18:57 crc kubenswrapper[4861]: I0309 09:18:57.568224 4861 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7d6201629b31402e71bf297f3c26e30d1eebf556b0e934c4a6cae1f37951cd1"} err="failed to get container status \"b7d6201629b31402e71bf297f3c26e30d1eebf556b0e934c4a6cae1f37951cd1\": rpc error: code = NotFound desc = could not find container \"b7d6201629b31402e71bf297f3c26e30d1eebf556b0e934c4a6cae1f37951cd1\": container with ID starting with b7d6201629b31402e71bf297f3c26e30d1eebf556b0e934c4a6cae1f37951cd1 not found: ID does not exist" Mar 09 09:18:57 crc kubenswrapper[4861]: I0309 09:18:57.624635 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-qh9sg"] Mar 09 09:18:57 crc kubenswrapper[4861]: I0309 09:18:57.628271 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-qh9sg"] Mar 09 09:18:57 crc kubenswrapper[4861]: I0309 09:18:57.666392 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85a3bbcb-e663-4a97-980c-606c979409d7" path="/var/lib/kubelet/pods/85a3bbcb-e663-4a97-980c-606c979409d7/volumes" Mar 09 09:18:58 crc kubenswrapper[4861]: I0309 09:18:58.563634 4861 generic.go:334] "Generic (PLEG): container finished" podID="4410072b-3e11-4357-9ce8-2c754f336515" containerID="b9805490ff05a3e35a9da88cc378421048aa0dceccc7ea53622436d1b6d8e7bc" exitCode=0 Mar 09 09:18:58 crc kubenswrapper[4861]: I0309 09:18:58.563747 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4s7b95" event={"ID":"4410072b-3e11-4357-9ce8-2c754f336515","Type":"ContainerDied","Data":"b9805490ff05a3e35a9da88cc378421048aa0dceccc7ea53622436d1b6d8e7bc"} Mar 09 09:19:00 crc kubenswrapper[4861]: I0309 09:19:00.332946 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-8dgpl"] Mar 09 09:19:00 crc kubenswrapper[4861]: E0309 09:19:00.333810 4861 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="85a3bbcb-e663-4a97-980c-606c979409d7" containerName="console" Mar 09 09:19:00 crc kubenswrapper[4861]: I0309 09:19:00.333836 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="85a3bbcb-e663-4a97-980c-606c979409d7" containerName="console" Mar 09 09:19:00 crc kubenswrapper[4861]: I0309 09:19:00.334091 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="85a3bbcb-e663-4a97-980c-606c979409d7" containerName="console" Mar 09 09:19:00 crc kubenswrapper[4861]: I0309 09:19:00.335601 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8dgpl" Mar 09 09:19:00 crc kubenswrapper[4861]: I0309 09:19:00.353910 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8dgpl"] Mar 09 09:19:00 crc kubenswrapper[4861]: I0309 09:19:00.455272 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5d1c31f-69d9-46c6-a2c6-94a3f9f04010-catalog-content\") pod \"redhat-marketplace-8dgpl\" (UID: \"c5d1c31f-69d9-46c6-a2c6-94a3f9f04010\") " pod="openshift-marketplace/redhat-marketplace-8dgpl" Mar 09 09:19:00 crc kubenswrapper[4861]: I0309 09:19:00.455341 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxfxl\" (UniqueName: \"kubernetes.io/projected/c5d1c31f-69d9-46c6-a2c6-94a3f9f04010-kube-api-access-bxfxl\") pod \"redhat-marketplace-8dgpl\" (UID: \"c5d1c31f-69d9-46c6-a2c6-94a3f9f04010\") " pod="openshift-marketplace/redhat-marketplace-8dgpl" Mar 09 09:19:00 crc kubenswrapper[4861]: I0309 09:19:00.455421 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c5d1c31f-69d9-46c6-a2c6-94a3f9f04010-utilities\") pod \"redhat-marketplace-8dgpl\" (UID: \"c5d1c31f-69d9-46c6-a2c6-94a3f9f04010\") 
" pod="openshift-marketplace/redhat-marketplace-8dgpl" Mar 09 09:19:00 crc kubenswrapper[4861]: I0309 09:19:00.556653 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c5d1c31f-69d9-46c6-a2c6-94a3f9f04010-utilities\") pod \"redhat-marketplace-8dgpl\" (UID: \"c5d1c31f-69d9-46c6-a2c6-94a3f9f04010\") " pod="openshift-marketplace/redhat-marketplace-8dgpl" Mar 09 09:19:00 crc kubenswrapper[4861]: I0309 09:19:00.556788 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5d1c31f-69d9-46c6-a2c6-94a3f9f04010-catalog-content\") pod \"redhat-marketplace-8dgpl\" (UID: \"c5d1c31f-69d9-46c6-a2c6-94a3f9f04010\") " pod="openshift-marketplace/redhat-marketplace-8dgpl" Mar 09 09:19:00 crc kubenswrapper[4861]: I0309 09:19:00.556816 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxfxl\" (UniqueName: \"kubernetes.io/projected/c5d1c31f-69d9-46c6-a2c6-94a3f9f04010-kube-api-access-bxfxl\") pod \"redhat-marketplace-8dgpl\" (UID: \"c5d1c31f-69d9-46c6-a2c6-94a3f9f04010\") " pod="openshift-marketplace/redhat-marketplace-8dgpl" Mar 09 09:19:00 crc kubenswrapper[4861]: I0309 09:19:00.557773 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c5d1c31f-69d9-46c6-a2c6-94a3f9f04010-utilities\") pod \"redhat-marketplace-8dgpl\" (UID: \"c5d1c31f-69d9-46c6-a2c6-94a3f9f04010\") " pod="openshift-marketplace/redhat-marketplace-8dgpl" Mar 09 09:19:00 crc kubenswrapper[4861]: I0309 09:19:00.557963 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5d1c31f-69d9-46c6-a2c6-94a3f9f04010-catalog-content\") pod \"redhat-marketplace-8dgpl\" (UID: \"c5d1c31f-69d9-46c6-a2c6-94a3f9f04010\") " 
pod="openshift-marketplace/redhat-marketplace-8dgpl" Mar 09 09:19:00 crc kubenswrapper[4861]: I0309 09:19:00.593357 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxfxl\" (UniqueName: \"kubernetes.io/projected/c5d1c31f-69d9-46c6-a2c6-94a3f9f04010-kube-api-access-bxfxl\") pod \"redhat-marketplace-8dgpl\" (UID: \"c5d1c31f-69d9-46c6-a2c6-94a3f9f04010\") " pod="openshift-marketplace/redhat-marketplace-8dgpl" Mar 09 09:19:00 crc kubenswrapper[4861]: I0309 09:19:00.698641 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8dgpl" Mar 09 09:19:00 crc kubenswrapper[4861]: I0309 09:19:00.939397 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8dgpl"] Mar 09 09:19:00 crc kubenswrapper[4861]: W0309 09:19:00.946243 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc5d1c31f_69d9_46c6_a2c6_94a3f9f04010.slice/crio-453759b8faddb44035217c262e4c778f79f6c87d29f207ab38e98b6d08b54ea0 WatchSource:0}: Error finding container 453759b8faddb44035217c262e4c778f79f6c87d29f207ab38e98b6d08b54ea0: Status 404 returned error can't find the container with id 453759b8faddb44035217c262e4c778f79f6c87d29f207ab38e98b6d08b54ea0 Mar 09 09:19:01 crc kubenswrapper[4861]: I0309 09:19:01.584648 4861 generic.go:334] "Generic (PLEG): container finished" podID="c5d1c31f-69d9-46c6-a2c6-94a3f9f04010" containerID="99a8228d4e40b8047399d471bfaff7b0d3259454a26a1b68313eb1f67fb2ea71" exitCode=0 Mar 09 09:19:01 crc kubenswrapper[4861]: I0309 09:19:01.584700 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8dgpl" event={"ID":"c5d1c31f-69d9-46c6-a2c6-94a3f9f04010","Type":"ContainerDied","Data":"99a8228d4e40b8047399d471bfaff7b0d3259454a26a1b68313eb1f67fb2ea71"} Mar 09 09:19:01 crc kubenswrapper[4861]: I0309 09:19:01.584757 4861 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8dgpl" event={"ID":"c5d1c31f-69d9-46c6-a2c6-94a3f9f04010","Type":"ContainerStarted","Data":"453759b8faddb44035217c262e4c778f79f6c87d29f207ab38e98b6d08b54ea0"} Mar 09 09:19:01 crc kubenswrapper[4861]: I0309 09:19:01.587130 4861 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 09 09:19:01 crc kubenswrapper[4861]: I0309 09:19:01.588008 4861 generic.go:334] "Generic (PLEG): container finished" podID="4410072b-3e11-4357-9ce8-2c754f336515" containerID="47a37caab68446bfc37729fbc560249c8489e60f04a7c79cfb352c3dd4bb9df9" exitCode=0 Mar 09 09:19:01 crc kubenswrapper[4861]: I0309 09:19:01.588043 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4s7b95" event={"ID":"4410072b-3e11-4357-9ce8-2c754f336515","Type":"ContainerDied","Data":"47a37caab68446bfc37729fbc560249c8489e60f04a7c79cfb352c3dd4bb9df9"} Mar 09 09:19:02 crc kubenswrapper[4861]: I0309 09:19:02.596186 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8dgpl" event={"ID":"c5d1c31f-69d9-46c6-a2c6-94a3f9f04010","Type":"ContainerStarted","Data":"d1246b723309c40ff1d7ef73feaf645bd8466513499efffb31c85062e29841ab"} Mar 09 09:19:02 crc kubenswrapper[4861]: I0309 09:19:02.600140 4861 generic.go:334] "Generic (PLEG): container finished" podID="4410072b-3e11-4357-9ce8-2c754f336515" containerID="59a461e97d196c30ff700b0938859293c866b90abfbc58c88dbec4204d59e4db" exitCode=0 Mar 09 09:19:02 crc kubenswrapper[4861]: I0309 09:19:02.600189 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4s7b95" event={"ID":"4410072b-3e11-4357-9ce8-2c754f336515","Type":"ContainerDied","Data":"59a461e97d196c30ff700b0938859293c866b90abfbc58c88dbec4204d59e4db"} Mar 09 09:19:02 crc 
kubenswrapper[4861]: I0309 09:19:02.703083 4861 scope.go:117] "RemoveContainer" containerID="32762a937646933984976064473504d2b6624d005ad9b09d571b815645b18e11" Mar 09 09:19:03 crc kubenswrapper[4861]: I0309 09:19:03.606737 4861 generic.go:334] "Generic (PLEG): container finished" podID="c5d1c31f-69d9-46c6-a2c6-94a3f9f04010" containerID="d1246b723309c40ff1d7ef73feaf645bd8466513499efffb31c85062e29841ab" exitCode=0 Mar 09 09:19:03 crc kubenswrapper[4861]: I0309 09:19:03.606792 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8dgpl" event={"ID":"c5d1c31f-69d9-46c6-a2c6-94a3f9f04010","Type":"ContainerDied","Data":"d1246b723309c40ff1d7ef73feaf645bd8466513499efffb31c85062e29841ab"} Mar 09 09:19:03 crc kubenswrapper[4861]: I0309 09:19:03.881441 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4s7b95" Mar 09 09:19:04 crc kubenswrapper[4861]: I0309 09:19:04.003911 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4410072b-3e11-4357-9ce8-2c754f336515-util\") pod \"4410072b-3e11-4357-9ce8-2c754f336515\" (UID: \"4410072b-3e11-4357-9ce8-2c754f336515\") " Mar 09 09:19:04 crc kubenswrapper[4861]: I0309 09:19:04.003969 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9nfzj\" (UniqueName: \"kubernetes.io/projected/4410072b-3e11-4357-9ce8-2c754f336515-kube-api-access-9nfzj\") pod \"4410072b-3e11-4357-9ce8-2c754f336515\" (UID: \"4410072b-3e11-4357-9ce8-2c754f336515\") " Mar 09 09:19:04 crc kubenswrapper[4861]: I0309 09:19:04.004018 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4410072b-3e11-4357-9ce8-2c754f336515-bundle\") pod \"4410072b-3e11-4357-9ce8-2c754f336515\" (UID: 
\"4410072b-3e11-4357-9ce8-2c754f336515\") " Mar 09 09:19:04 crc kubenswrapper[4861]: I0309 09:19:04.005726 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4410072b-3e11-4357-9ce8-2c754f336515-bundle" (OuterVolumeSpecName: "bundle") pod "4410072b-3e11-4357-9ce8-2c754f336515" (UID: "4410072b-3e11-4357-9ce8-2c754f336515"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:19:04 crc kubenswrapper[4861]: I0309 09:19:04.012816 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4410072b-3e11-4357-9ce8-2c754f336515-kube-api-access-9nfzj" (OuterVolumeSpecName: "kube-api-access-9nfzj") pod "4410072b-3e11-4357-9ce8-2c754f336515" (UID: "4410072b-3e11-4357-9ce8-2c754f336515"). InnerVolumeSpecName "kube-api-access-9nfzj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:19:04 crc kubenswrapper[4861]: I0309 09:19:04.027322 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4410072b-3e11-4357-9ce8-2c754f336515-util" (OuterVolumeSpecName: "util") pod "4410072b-3e11-4357-9ce8-2c754f336515" (UID: "4410072b-3e11-4357-9ce8-2c754f336515"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:19:04 crc kubenswrapper[4861]: I0309 09:19:04.105903 4861 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4410072b-3e11-4357-9ce8-2c754f336515-util\") on node \"crc\" DevicePath \"\"" Mar 09 09:19:04 crc kubenswrapper[4861]: I0309 09:19:04.105952 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9nfzj\" (UniqueName: \"kubernetes.io/projected/4410072b-3e11-4357-9ce8-2c754f336515-kube-api-access-9nfzj\") on node \"crc\" DevicePath \"\"" Mar 09 09:19:04 crc kubenswrapper[4861]: I0309 09:19:04.105973 4861 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4410072b-3e11-4357-9ce8-2c754f336515-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 09:19:04 crc kubenswrapper[4861]: I0309 09:19:04.620045 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4s7b95" event={"ID":"4410072b-3e11-4357-9ce8-2c754f336515","Type":"ContainerDied","Data":"71df96bcbcf2211317daa8fbca64134a713f7910fa3709e0813ce2e09be4ea11"} Mar 09 09:19:04 crc kubenswrapper[4861]: I0309 09:19:04.620431 4861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="71df96bcbcf2211317daa8fbca64134a713f7910fa3709e0813ce2e09be4ea11" Mar 09 09:19:04 crc kubenswrapper[4861]: I0309 09:19:04.620154 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4s7b95" Mar 09 09:19:04 crc kubenswrapper[4861]: I0309 09:19:04.624537 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8dgpl" event={"ID":"c5d1c31f-69d9-46c6-a2c6-94a3f9f04010","Type":"ContainerStarted","Data":"d0a89210eb8502e2286b86d68eb37397a142dd34ed8f27506ebd0de47fabac74"} Mar 09 09:19:04 crc kubenswrapper[4861]: I0309 09:19:04.651152 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-8dgpl" podStartSLOduration=1.900846546 podStartE2EDuration="4.651133135s" podCreationTimestamp="2026-03-09 09:19:00 +0000 UTC" firstStartedPulling="2026-03-09 09:19:01.586613952 +0000 UTC m=+784.671653353" lastFinishedPulling="2026-03-09 09:19:04.336900541 +0000 UTC m=+787.421939942" observedRunningTime="2026-03-09 09:19:04.647793169 +0000 UTC m=+787.732832580" watchObservedRunningTime="2026-03-09 09:19:04.651133135 +0000 UTC m=+787.736172536" Mar 09 09:19:10 crc kubenswrapper[4861]: I0309 09:19:10.699507 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-8dgpl" Mar 09 09:19:10 crc kubenswrapper[4861]: I0309 09:19:10.700055 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-8dgpl" Mar 09 09:19:10 crc kubenswrapper[4861]: I0309 09:19:10.739835 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-8dgpl" Mar 09 09:19:11 crc kubenswrapper[4861]: I0309 09:19:11.729892 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-8dgpl" Mar 09 09:19:13 crc kubenswrapper[4861]: I0309 09:19:13.120625 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8dgpl"] 
Mar 09 09:19:13 crc kubenswrapper[4861]: I0309 09:19:13.334273 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-57f84dc5b8-mv5gg"] Mar 09 09:19:13 crc kubenswrapper[4861]: E0309 09:19:13.334500 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4410072b-3e11-4357-9ce8-2c754f336515" containerName="util" Mar 09 09:19:13 crc kubenswrapper[4861]: I0309 09:19:13.334513 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="4410072b-3e11-4357-9ce8-2c754f336515" containerName="util" Mar 09 09:19:13 crc kubenswrapper[4861]: E0309 09:19:13.334521 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4410072b-3e11-4357-9ce8-2c754f336515" containerName="extract" Mar 09 09:19:13 crc kubenswrapper[4861]: I0309 09:19:13.334527 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="4410072b-3e11-4357-9ce8-2c754f336515" containerName="extract" Mar 09 09:19:13 crc kubenswrapper[4861]: E0309 09:19:13.334533 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4410072b-3e11-4357-9ce8-2c754f336515" containerName="pull" Mar 09 09:19:13 crc kubenswrapper[4861]: I0309 09:19:13.334539 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="4410072b-3e11-4357-9ce8-2c754f336515" containerName="pull" Mar 09 09:19:13 crc kubenswrapper[4861]: I0309 09:19:13.334632 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="4410072b-3e11-4357-9ce8-2c754f336515" containerName="extract" Mar 09 09:19:13 crc kubenswrapper[4861]: I0309 09:19:13.335211 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-57f84dc5b8-mv5gg" Mar 09 09:19:13 crc kubenswrapper[4861]: I0309 09:19:13.337496 4861 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Mar 09 09:19:13 crc kubenswrapper[4861]: I0309 09:19:13.337603 4861 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-97sqh" Mar 09 09:19:13 crc kubenswrapper[4861]: I0309 09:19:13.338947 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Mar 09 09:19:13 crc kubenswrapper[4861]: I0309 09:19:13.339238 4861 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Mar 09 09:19:13 crc kubenswrapper[4861]: I0309 09:19:13.339428 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Mar 09 09:19:13 crc kubenswrapper[4861]: I0309 09:19:13.355110 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-57f84dc5b8-mv5gg"] Mar 09 09:19:13 crc kubenswrapper[4861]: I0309 09:19:13.425532 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zw4r8\" (UniqueName: \"kubernetes.io/projected/bcfd9e4a-11a5-40dc-aa44-e6348ac2069b-kube-api-access-zw4r8\") pod \"metallb-operator-controller-manager-57f84dc5b8-mv5gg\" (UID: \"bcfd9e4a-11a5-40dc-aa44-e6348ac2069b\") " pod="metallb-system/metallb-operator-controller-manager-57f84dc5b8-mv5gg" Mar 09 09:19:13 crc kubenswrapper[4861]: I0309 09:19:13.425571 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/bcfd9e4a-11a5-40dc-aa44-e6348ac2069b-webhook-cert\") pod 
\"metallb-operator-controller-manager-57f84dc5b8-mv5gg\" (UID: \"bcfd9e4a-11a5-40dc-aa44-e6348ac2069b\") " pod="metallb-system/metallb-operator-controller-manager-57f84dc5b8-mv5gg" Mar 09 09:19:13 crc kubenswrapper[4861]: I0309 09:19:13.425650 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/bcfd9e4a-11a5-40dc-aa44-e6348ac2069b-apiservice-cert\") pod \"metallb-operator-controller-manager-57f84dc5b8-mv5gg\" (UID: \"bcfd9e4a-11a5-40dc-aa44-e6348ac2069b\") " pod="metallb-system/metallb-operator-controller-manager-57f84dc5b8-mv5gg" Mar 09 09:19:13 crc kubenswrapper[4861]: I0309 09:19:13.526539 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/bcfd9e4a-11a5-40dc-aa44-e6348ac2069b-apiservice-cert\") pod \"metallb-operator-controller-manager-57f84dc5b8-mv5gg\" (UID: \"bcfd9e4a-11a5-40dc-aa44-e6348ac2069b\") " pod="metallb-system/metallb-operator-controller-manager-57f84dc5b8-mv5gg" Mar 09 09:19:13 crc kubenswrapper[4861]: I0309 09:19:13.526646 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zw4r8\" (UniqueName: \"kubernetes.io/projected/bcfd9e4a-11a5-40dc-aa44-e6348ac2069b-kube-api-access-zw4r8\") pod \"metallb-operator-controller-manager-57f84dc5b8-mv5gg\" (UID: \"bcfd9e4a-11a5-40dc-aa44-e6348ac2069b\") " pod="metallb-system/metallb-operator-controller-manager-57f84dc5b8-mv5gg" Mar 09 09:19:13 crc kubenswrapper[4861]: I0309 09:19:13.526671 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/bcfd9e4a-11a5-40dc-aa44-e6348ac2069b-webhook-cert\") pod \"metallb-operator-controller-manager-57f84dc5b8-mv5gg\" (UID: \"bcfd9e4a-11a5-40dc-aa44-e6348ac2069b\") " pod="metallb-system/metallb-operator-controller-manager-57f84dc5b8-mv5gg" Mar 09 09:19:13 crc 
kubenswrapper[4861]: I0309 09:19:13.533648 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/bcfd9e4a-11a5-40dc-aa44-e6348ac2069b-webhook-cert\") pod \"metallb-operator-controller-manager-57f84dc5b8-mv5gg\" (UID: \"bcfd9e4a-11a5-40dc-aa44-e6348ac2069b\") " pod="metallb-system/metallb-operator-controller-manager-57f84dc5b8-mv5gg" Mar 09 09:19:13 crc kubenswrapper[4861]: I0309 09:19:13.543982 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/bcfd9e4a-11a5-40dc-aa44-e6348ac2069b-apiservice-cert\") pod \"metallb-operator-controller-manager-57f84dc5b8-mv5gg\" (UID: \"bcfd9e4a-11a5-40dc-aa44-e6348ac2069b\") " pod="metallb-system/metallb-operator-controller-manager-57f84dc5b8-mv5gg" Mar 09 09:19:13 crc kubenswrapper[4861]: I0309 09:19:13.544661 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zw4r8\" (UniqueName: \"kubernetes.io/projected/bcfd9e4a-11a5-40dc-aa44-e6348ac2069b-kube-api-access-zw4r8\") pod \"metallb-operator-controller-manager-57f84dc5b8-mv5gg\" (UID: \"bcfd9e4a-11a5-40dc-aa44-e6348ac2069b\") " pod="metallb-system/metallb-operator-controller-manager-57f84dc5b8-mv5gg" Mar 09 09:19:13 crc kubenswrapper[4861]: I0309 09:19:13.650278 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-57f84dc5b8-mv5gg" Mar 09 09:19:13 crc kubenswrapper[4861]: I0309 09:19:13.675597 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-7569c9dcdc-v9vw2"] Mar 09 09:19:13 crc kubenswrapper[4861]: I0309 09:19:13.676394 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7569c9dcdc-v9vw2" Mar 09 09:19:13 crc kubenswrapper[4861]: I0309 09:19:13.678943 4861 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Mar 09 09:19:13 crc kubenswrapper[4861]: I0309 09:19:13.679228 4861 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 09 09:19:13 crc kubenswrapper[4861]: I0309 09:19:13.680035 4861 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-ppfzd" Mar 09 09:19:13 crc kubenswrapper[4861]: I0309 09:19:13.692544 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-7569c9dcdc-v9vw2"] Mar 09 09:19:13 crc kubenswrapper[4861]: I0309 09:19:13.694596 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-8dgpl" podUID="c5d1c31f-69d9-46c6-a2c6-94a3f9f04010" containerName="registry-server" containerID="cri-o://d0a89210eb8502e2286b86d68eb37397a142dd34ed8f27506ebd0de47fabac74" gracePeriod=2 Mar 09 09:19:13 crc kubenswrapper[4861]: I0309 09:19:13.730738 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9eb5d549-165a-4d97-8526-e082c80ed71b-webhook-cert\") pod \"metallb-operator-webhook-server-7569c9dcdc-v9vw2\" (UID: \"9eb5d549-165a-4d97-8526-e082c80ed71b\") " pod="metallb-system/metallb-operator-webhook-server-7569c9dcdc-v9vw2" Mar 09 09:19:13 crc kubenswrapper[4861]: I0309 09:19:13.730798 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9eb5d549-165a-4d97-8526-e082c80ed71b-apiservice-cert\") pod \"metallb-operator-webhook-server-7569c9dcdc-v9vw2\" (UID: 
\"9eb5d549-165a-4d97-8526-e082c80ed71b\") " pod="metallb-system/metallb-operator-webhook-server-7569c9dcdc-v9vw2" Mar 09 09:19:13 crc kubenswrapper[4861]: I0309 09:19:13.730842 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x92m8\" (UniqueName: \"kubernetes.io/projected/9eb5d549-165a-4d97-8526-e082c80ed71b-kube-api-access-x92m8\") pod \"metallb-operator-webhook-server-7569c9dcdc-v9vw2\" (UID: \"9eb5d549-165a-4d97-8526-e082c80ed71b\") " pod="metallb-system/metallb-operator-webhook-server-7569c9dcdc-v9vw2" Mar 09 09:19:13 crc kubenswrapper[4861]: I0309 09:19:13.839239 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9eb5d549-165a-4d97-8526-e082c80ed71b-webhook-cert\") pod \"metallb-operator-webhook-server-7569c9dcdc-v9vw2\" (UID: \"9eb5d549-165a-4d97-8526-e082c80ed71b\") " pod="metallb-system/metallb-operator-webhook-server-7569c9dcdc-v9vw2" Mar 09 09:19:13 crc kubenswrapper[4861]: I0309 09:19:13.839350 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9eb5d549-165a-4d97-8526-e082c80ed71b-apiservice-cert\") pod \"metallb-operator-webhook-server-7569c9dcdc-v9vw2\" (UID: \"9eb5d549-165a-4d97-8526-e082c80ed71b\") " pod="metallb-system/metallb-operator-webhook-server-7569c9dcdc-v9vw2" Mar 09 09:19:13 crc kubenswrapper[4861]: I0309 09:19:13.839436 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x92m8\" (UniqueName: \"kubernetes.io/projected/9eb5d549-165a-4d97-8526-e082c80ed71b-kube-api-access-x92m8\") pod \"metallb-operator-webhook-server-7569c9dcdc-v9vw2\" (UID: \"9eb5d549-165a-4d97-8526-e082c80ed71b\") " pod="metallb-system/metallb-operator-webhook-server-7569c9dcdc-v9vw2" Mar 09 09:19:13 crc kubenswrapper[4861]: I0309 09:19:13.850193 4861 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9eb5d549-165a-4d97-8526-e082c80ed71b-apiservice-cert\") pod \"metallb-operator-webhook-server-7569c9dcdc-v9vw2\" (UID: \"9eb5d549-165a-4d97-8526-e082c80ed71b\") " pod="metallb-system/metallb-operator-webhook-server-7569c9dcdc-v9vw2" Mar 09 09:19:13 crc kubenswrapper[4861]: I0309 09:19:13.851309 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9eb5d549-165a-4d97-8526-e082c80ed71b-webhook-cert\") pod \"metallb-operator-webhook-server-7569c9dcdc-v9vw2\" (UID: \"9eb5d549-165a-4d97-8526-e082c80ed71b\") " pod="metallb-system/metallb-operator-webhook-server-7569c9dcdc-v9vw2" Mar 09 09:19:13 crc kubenswrapper[4861]: I0309 09:19:13.857283 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x92m8\" (UniqueName: \"kubernetes.io/projected/9eb5d549-165a-4d97-8526-e082c80ed71b-kube-api-access-x92m8\") pod \"metallb-operator-webhook-server-7569c9dcdc-v9vw2\" (UID: \"9eb5d549-165a-4d97-8526-e082c80ed71b\") " pod="metallb-system/metallb-operator-webhook-server-7569c9dcdc-v9vw2" Mar 09 09:19:14 crc kubenswrapper[4861]: I0309 09:19:14.032649 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7569c9dcdc-v9vw2" Mar 09 09:19:14 crc kubenswrapper[4861]: I0309 09:19:14.061225 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8dgpl" Mar 09 09:19:14 crc kubenswrapper[4861]: I0309 09:19:14.144365 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5d1c31f-69d9-46c6-a2c6-94a3f9f04010-catalog-content\") pod \"c5d1c31f-69d9-46c6-a2c6-94a3f9f04010\" (UID: \"c5d1c31f-69d9-46c6-a2c6-94a3f9f04010\") " Mar 09 09:19:14 crc kubenswrapper[4861]: I0309 09:19:14.144472 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bxfxl\" (UniqueName: \"kubernetes.io/projected/c5d1c31f-69d9-46c6-a2c6-94a3f9f04010-kube-api-access-bxfxl\") pod \"c5d1c31f-69d9-46c6-a2c6-94a3f9f04010\" (UID: \"c5d1c31f-69d9-46c6-a2c6-94a3f9f04010\") " Mar 09 09:19:14 crc kubenswrapper[4861]: I0309 09:19:14.144531 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c5d1c31f-69d9-46c6-a2c6-94a3f9f04010-utilities\") pod \"c5d1c31f-69d9-46c6-a2c6-94a3f9f04010\" (UID: \"c5d1c31f-69d9-46c6-a2c6-94a3f9f04010\") " Mar 09 09:19:14 crc kubenswrapper[4861]: I0309 09:19:14.145592 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5d1c31f-69d9-46c6-a2c6-94a3f9f04010-utilities" (OuterVolumeSpecName: "utilities") pod "c5d1c31f-69d9-46c6-a2c6-94a3f9f04010" (UID: "c5d1c31f-69d9-46c6-a2c6-94a3f9f04010"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:19:14 crc kubenswrapper[4861]: I0309 09:19:14.149833 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5d1c31f-69d9-46c6-a2c6-94a3f9f04010-kube-api-access-bxfxl" (OuterVolumeSpecName: "kube-api-access-bxfxl") pod "c5d1c31f-69d9-46c6-a2c6-94a3f9f04010" (UID: "c5d1c31f-69d9-46c6-a2c6-94a3f9f04010"). InnerVolumeSpecName "kube-api-access-bxfxl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:19:14 crc kubenswrapper[4861]: I0309 09:19:14.157989 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-57f84dc5b8-mv5gg"] Mar 09 09:19:14 crc kubenswrapper[4861]: I0309 09:19:14.180606 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5d1c31f-69d9-46c6-a2c6-94a3f9f04010-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c5d1c31f-69d9-46c6-a2c6-94a3f9f04010" (UID: "c5d1c31f-69d9-46c6-a2c6-94a3f9f04010"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:19:14 crc kubenswrapper[4861]: I0309 09:19:14.245589 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bxfxl\" (UniqueName: \"kubernetes.io/projected/c5d1c31f-69d9-46c6-a2c6-94a3f9f04010-kube-api-access-bxfxl\") on node \"crc\" DevicePath \"\"" Mar 09 09:19:14 crc kubenswrapper[4861]: I0309 09:19:14.245633 4861 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c5d1c31f-69d9-46c6-a2c6-94a3f9f04010-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 09:19:14 crc kubenswrapper[4861]: I0309 09:19:14.245644 4861 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5d1c31f-69d9-46c6-a2c6-94a3f9f04010-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 09:19:14 crc kubenswrapper[4861]: I0309 09:19:14.263832 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-7569c9dcdc-v9vw2"] Mar 09 09:19:14 crc kubenswrapper[4861]: W0309 09:19:14.273052 4861 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9eb5d549_165a_4d97_8526_e082c80ed71b.slice/crio-9c9859d91251eee07aa00c93b0ee125b2243fadf00ea23bfa8dd4dcaf7dbad61 WatchSource:0}: Error finding container 9c9859d91251eee07aa00c93b0ee125b2243fadf00ea23bfa8dd4dcaf7dbad61: Status 404 returned error can't find the container with id 9c9859d91251eee07aa00c93b0ee125b2243fadf00ea23bfa8dd4dcaf7dbad61 Mar 09 09:19:14 crc kubenswrapper[4861]: I0309 09:19:14.702419 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-57f84dc5b8-mv5gg" event={"ID":"bcfd9e4a-11a5-40dc-aa44-e6348ac2069b","Type":"ContainerStarted","Data":"e178ae5123bf7f9c653d3e5c8c8688de9fb290f8a67a80e67483caf2c124ba55"} Mar 09 09:19:14 crc kubenswrapper[4861]: I0309 09:19:14.703850 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-7569c9dcdc-v9vw2" event={"ID":"9eb5d549-165a-4d97-8526-e082c80ed71b","Type":"ContainerStarted","Data":"9c9859d91251eee07aa00c93b0ee125b2243fadf00ea23bfa8dd4dcaf7dbad61"} Mar 09 09:19:14 crc kubenswrapper[4861]: I0309 09:19:14.706773 4861 generic.go:334] "Generic (PLEG): container finished" podID="c5d1c31f-69d9-46c6-a2c6-94a3f9f04010" containerID="d0a89210eb8502e2286b86d68eb37397a142dd34ed8f27506ebd0de47fabac74" exitCode=0 Mar 09 09:19:14 crc kubenswrapper[4861]: I0309 09:19:14.706820 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8dgpl" event={"ID":"c5d1c31f-69d9-46c6-a2c6-94a3f9f04010","Type":"ContainerDied","Data":"d0a89210eb8502e2286b86d68eb37397a142dd34ed8f27506ebd0de47fabac74"} Mar 09 09:19:14 crc kubenswrapper[4861]: I0309 09:19:14.706852 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8dgpl" event={"ID":"c5d1c31f-69d9-46c6-a2c6-94a3f9f04010","Type":"ContainerDied","Data":"453759b8faddb44035217c262e4c778f79f6c87d29f207ab38e98b6d08b54ea0"} Mar 09 
09:19:14 crc kubenswrapper[4861]: I0309 09:19:14.706879 4861 scope.go:117] "RemoveContainer" containerID="d0a89210eb8502e2286b86d68eb37397a142dd34ed8f27506ebd0de47fabac74" Mar 09 09:19:14 crc kubenswrapper[4861]: I0309 09:19:14.707030 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8dgpl" Mar 09 09:19:14 crc kubenswrapper[4861]: I0309 09:19:14.738687 4861 scope.go:117] "RemoveContainer" containerID="d1246b723309c40ff1d7ef73feaf645bd8466513499efffb31c85062e29841ab" Mar 09 09:19:14 crc kubenswrapper[4861]: I0309 09:19:14.754703 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8dgpl"] Mar 09 09:19:14 crc kubenswrapper[4861]: I0309 09:19:14.759318 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-8dgpl"] Mar 09 09:19:14 crc kubenswrapper[4861]: I0309 09:19:14.774755 4861 scope.go:117] "RemoveContainer" containerID="99a8228d4e40b8047399d471bfaff7b0d3259454a26a1b68313eb1f67fb2ea71" Mar 09 09:19:14 crc kubenswrapper[4861]: I0309 09:19:14.799040 4861 scope.go:117] "RemoveContainer" containerID="d0a89210eb8502e2286b86d68eb37397a142dd34ed8f27506ebd0de47fabac74" Mar 09 09:19:14 crc kubenswrapper[4861]: E0309 09:19:14.800894 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d0a89210eb8502e2286b86d68eb37397a142dd34ed8f27506ebd0de47fabac74\": container with ID starting with d0a89210eb8502e2286b86d68eb37397a142dd34ed8f27506ebd0de47fabac74 not found: ID does not exist" containerID="d0a89210eb8502e2286b86d68eb37397a142dd34ed8f27506ebd0de47fabac74" Mar 09 09:19:14 crc kubenswrapper[4861]: I0309 09:19:14.800937 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0a89210eb8502e2286b86d68eb37397a142dd34ed8f27506ebd0de47fabac74"} err="failed to get container status 
\"d0a89210eb8502e2286b86d68eb37397a142dd34ed8f27506ebd0de47fabac74\": rpc error: code = NotFound desc = could not find container \"d0a89210eb8502e2286b86d68eb37397a142dd34ed8f27506ebd0de47fabac74\": container with ID starting with d0a89210eb8502e2286b86d68eb37397a142dd34ed8f27506ebd0de47fabac74 not found: ID does not exist" Mar 09 09:19:14 crc kubenswrapper[4861]: I0309 09:19:14.800967 4861 scope.go:117] "RemoveContainer" containerID="d1246b723309c40ff1d7ef73feaf645bd8466513499efffb31c85062e29841ab" Mar 09 09:19:14 crc kubenswrapper[4861]: E0309 09:19:14.801483 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1246b723309c40ff1d7ef73feaf645bd8466513499efffb31c85062e29841ab\": container with ID starting with d1246b723309c40ff1d7ef73feaf645bd8466513499efffb31c85062e29841ab not found: ID does not exist" containerID="d1246b723309c40ff1d7ef73feaf645bd8466513499efffb31c85062e29841ab" Mar 09 09:19:14 crc kubenswrapper[4861]: I0309 09:19:14.801518 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1246b723309c40ff1d7ef73feaf645bd8466513499efffb31c85062e29841ab"} err="failed to get container status \"d1246b723309c40ff1d7ef73feaf645bd8466513499efffb31c85062e29841ab\": rpc error: code = NotFound desc = could not find container \"d1246b723309c40ff1d7ef73feaf645bd8466513499efffb31c85062e29841ab\": container with ID starting with d1246b723309c40ff1d7ef73feaf645bd8466513499efffb31c85062e29841ab not found: ID does not exist" Mar 09 09:19:14 crc kubenswrapper[4861]: I0309 09:19:14.801544 4861 scope.go:117] "RemoveContainer" containerID="99a8228d4e40b8047399d471bfaff7b0d3259454a26a1b68313eb1f67fb2ea71" Mar 09 09:19:14 crc kubenswrapper[4861]: E0309 09:19:14.801966 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"99a8228d4e40b8047399d471bfaff7b0d3259454a26a1b68313eb1f67fb2ea71\": container with ID starting with 99a8228d4e40b8047399d471bfaff7b0d3259454a26a1b68313eb1f67fb2ea71 not found: ID does not exist" containerID="99a8228d4e40b8047399d471bfaff7b0d3259454a26a1b68313eb1f67fb2ea71" Mar 09 09:19:14 crc kubenswrapper[4861]: I0309 09:19:14.801992 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99a8228d4e40b8047399d471bfaff7b0d3259454a26a1b68313eb1f67fb2ea71"} err="failed to get container status \"99a8228d4e40b8047399d471bfaff7b0d3259454a26a1b68313eb1f67fb2ea71\": rpc error: code = NotFound desc = could not find container \"99a8228d4e40b8047399d471bfaff7b0d3259454a26a1b68313eb1f67fb2ea71\": container with ID starting with 99a8228d4e40b8047399d471bfaff7b0d3259454a26a1b68313eb1f67fb2ea71 not found: ID does not exist" Mar 09 09:19:15 crc kubenswrapper[4861]: I0309 09:19:15.673141 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5d1c31f-69d9-46c6-a2c6-94a3f9f04010" path="/var/lib/kubelet/pods/c5d1c31f-69d9-46c6-a2c6-94a3f9f04010/volumes" Mar 09 09:19:20 crc kubenswrapper[4861]: I0309 09:19:20.751173 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-57f84dc5b8-mv5gg" event={"ID":"bcfd9e4a-11a5-40dc-aa44-e6348ac2069b","Type":"ContainerStarted","Data":"184d419bfdcb102aea1b504ac5d0206be0bcc109892efe2de00d72990c5e6af9"} Mar 09 09:19:20 crc kubenswrapper[4861]: I0309 09:19:20.751880 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-57f84dc5b8-mv5gg" Mar 09 09:19:20 crc kubenswrapper[4861]: I0309 09:19:20.756875 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-7569c9dcdc-v9vw2" 
event={"ID":"9eb5d549-165a-4d97-8526-e082c80ed71b","Type":"ContainerStarted","Data":"48b15a40f2d84f856e4a49462902514b9f8e7b900ba9204d19e9fbd2c6a93129"} Mar 09 09:19:20 crc kubenswrapper[4861]: I0309 09:19:20.757054 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-7569c9dcdc-v9vw2" Mar 09 09:19:20 crc kubenswrapper[4861]: I0309 09:19:20.779887 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-57f84dc5b8-mv5gg" podStartSLOduration=1.914164001 podStartE2EDuration="7.779868623s" podCreationTimestamp="2026-03-09 09:19:13 +0000 UTC" firstStartedPulling="2026-03-09 09:19:14.161244868 +0000 UTC m=+797.246284269" lastFinishedPulling="2026-03-09 09:19:20.02694945 +0000 UTC m=+803.111988891" observedRunningTime="2026-03-09 09:19:20.773314365 +0000 UTC m=+803.858353776" watchObservedRunningTime="2026-03-09 09:19:20.779868623 +0000 UTC m=+803.864908024" Mar 09 09:19:20 crc kubenswrapper[4861]: I0309 09:19:20.804292 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-7569c9dcdc-v9vw2" podStartSLOduration=2.024144508 podStartE2EDuration="7.804265608s" podCreationTimestamp="2026-03-09 09:19:13 +0000 UTC" firstStartedPulling="2026-03-09 09:19:14.275051555 +0000 UTC m=+797.360090956" lastFinishedPulling="2026-03-09 09:19:20.055172645 +0000 UTC m=+803.140212056" observedRunningTime="2026-03-09 09:19:20.797230635 +0000 UTC m=+803.882270056" watchObservedRunningTime="2026-03-09 09:19:20.804265608 +0000 UTC m=+803.889305029" Mar 09 09:19:24 crc kubenswrapper[4861]: I0309 09:19:24.606028 4861 patch_prober.go:28] interesting pod/machine-config-daemon-5g7gc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Mar 09 09:19:24 crc kubenswrapper[4861]: I0309 09:19:24.606499 4861 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 09:19:24 crc kubenswrapper[4861]: I0309 09:19:24.606565 4861 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" Mar 09 09:19:24 crc kubenswrapper[4861]: I0309 09:19:24.607288 4861 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e0df4d92a9184d4707aae8be303bba70127fc6e2155c0877c558c19c847ce33b"} pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 09 09:19:24 crc kubenswrapper[4861]: I0309 09:19:24.607517 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" containerName="machine-config-daemon" containerID="cri-o://e0df4d92a9184d4707aae8be303bba70127fc6e2155c0877c558c19c847ce33b" gracePeriod=600 Mar 09 09:19:24 crc kubenswrapper[4861]: I0309 09:19:24.789442 4861 generic.go:334] "Generic (PLEG): container finished" podID="6f7875e3-174f-4c67-8675-d878de74aa4f" containerID="e0df4d92a9184d4707aae8be303bba70127fc6e2155c0877c558c19c847ce33b" exitCode=0 Mar 09 09:19:24 crc kubenswrapper[4861]: I0309 09:19:24.789494 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" 
event={"ID":"6f7875e3-174f-4c67-8675-d878de74aa4f","Type":"ContainerDied","Data":"e0df4d92a9184d4707aae8be303bba70127fc6e2155c0877c558c19c847ce33b"} Mar 09 09:19:24 crc kubenswrapper[4861]: I0309 09:19:24.789782 4861 scope.go:117] "RemoveContainer" containerID="647e93428b4f8c4a06ed5cd023dd7ae9e5817d85b57d9999c6b9891ba5cdb78e" Mar 09 09:19:26 crc kubenswrapper[4861]: I0309 09:19:26.806452 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" event={"ID":"6f7875e3-174f-4c67-8675-d878de74aa4f","Type":"ContainerStarted","Data":"496f42623ddb6e7fafcb7e74986e05b309e695c4366816fec0ffd01b5c0a1be9"} Mar 09 09:19:34 crc kubenswrapper[4861]: I0309 09:19:34.039340 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-7569c9dcdc-v9vw2" Mar 09 09:19:53 crc kubenswrapper[4861]: I0309 09:19:53.652690 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-57f84dc5b8-mv5gg" Mar 09 09:19:54 crc kubenswrapper[4861]: I0309 09:19:54.345708 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-2qds9"] Mar 09 09:19:54 crc kubenswrapper[4861]: E0309 09:19:54.346360 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5d1c31f-69d9-46c6-a2c6-94a3f9f04010" containerName="extract-utilities" Mar 09 09:19:54 crc kubenswrapper[4861]: I0309 09:19:54.346413 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5d1c31f-69d9-46c6-a2c6-94a3f9f04010" containerName="extract-utilities" Mar 09 09:19:54 crc kubenswrapper[4861]: E0309 09:19:54.346443 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5d1c31f-69d9-46c6-a2c6-94a3f9f04010" containerName="extract-content" Mar 09 09:19:54 crc kubenswrapper[4861]: I0309 09:19:54.346455 4861 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="c5d1c31f-69d9-46c6-a2c6-94a3f9f04010" containerName="extract-content" Mar 09 09:19:54 crc kubenswrapper[4861]: E0309 09:19:54.346472 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5d1c31f-69d9-46c6-a2c6-94a3f9f04010" containerName="registry-server" Mar 09 09:19:54 crc kubenswrapper[4861]: I0309 09:19:54.346482 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5d1c31f-69d9-46c6-a2c6-94a3f9f04010" containerName="registry-server" Mar 09 09:19:54 crc kubenswrapper[4861]: I0309 09:19:54.346679 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5d1c31f-69d9-46c6-a2c6-94a3f9f04010" containerName="registry-server" Mar 09 09:19:54 crc kubenswrapper[4861]: I0309 09:19:54.347246 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-2qds9" Mar 09 09:19:54 crc kubenswrapper[4861]: I0309 09:19:54.349209 4861 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Mar 09 09:19:54 crc kubenswrapper[4861]: I0309 09:19:54.349267 4861 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-shjnt" Mar 09 09:19:54 crc kubenswrapper[4861]: I0309 09:19:54.354105 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-p5vjx"] Mar 09 09:19:54 crc kubenswrapper[4861]: I0309 09:19:54.356268 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-p5vjx" Mar 09 09:19:54 crc kubenswrapper[4861]: I0309 09:19:54.357681 4861 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Mar 09 09:19:54 crc kubenswrapper[4861]: I0309 09:19:54.359936 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Mar 09 09:19:54 crc kubenswrapper[4861]: I0309 09:19:54.369492 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-2qds9"] Mar 09 09:19:54 crc kubenswrapper[4861]: I0309 09:19:54.382454 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/416fe11c-136c-4b38-9f85-2ba8df311664-reloader\") pod \"frr-k8s-p5vjx\" (UID: \"416fe11c-136c-4b38-9f85-2ba8df311664\") " pod="metallb-system/frr-k8s-p5vjx" Mar 09 09:19:54 crc kubenswrapper[4861]: I0309 09:19:54.382506 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/416fe11c-136c-4b38-9f85-2ba8df311664-frr-sockets\") pod \"frr-k8s-p5vjx\" (UID: \"416fe11c-136c-4b38-9f85-2ba8df311664\") " pod="metallb-system/frr-k8s-p5vjx" Mar 09 09:19:54 crc kubenswrapper[4861]: I0309 09:19:54.382566 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/416fe11c-136c-4b38-9f85-2ba8df311664-frr-conf\") pod \"frr-k8s-p5vjx\" (UID: \"416fe11c-136c-4b38-9f85-2ba8df311664\") " pod="metallb-system/frr-k8s-p5vjx" Mar 09 09:19:54 crc kubenswrapper[4861]: I0309 09:19:54.382586 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/416fe11c-136c-4b38-9f85-2ba8df311664-metrics\") pod \"frr-k8s-p5vjx\" (UID: 
\"416fe11c-136c-4b38-9f85-2ba8df311664\") " pod="metallb-system/frr-k8s-p5vjx" Mar 09 09:19:54 crc kubenswrapper[4861]: I0309 09:19:54.382607 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b5b4c14f-550c-483f-8a1d-5b596130b713-cert\") pod \"frr-k8s-webhook-server-7f989f654f-2qds9\" (UID: \"b5b4c14f-550c-483f-8a1d-5b596130b713\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-2qds9" Mar 09 09:19:54 crc kubenswrapper[4861]: I0309 09:19:54.382640 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxhwz\" (UniqueName: \"kubernetes.io/projected/416fe11c-136c-4b38-9f85-2ba8df311664-kube-api-access-bxhwz\") pod \"frr-k8s-p5vjx\" (UID: \"416fe11c-136c-4b38-9f85-2ba8df311664\") " pod="metallb-system/frr-k8s-p5vjx" Mar 09 09:19:54 crc kubenswrapper[4861]: I0309 09:19:54.382686 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9rpl\" (UniqueName: \"kubernetes.io/projected/b5b4c14f-550c-483f-8a1d-5b596130b713-kube-api-access-m9rpl\") pod \"frr-k8s-webhook-server-7f989f654f-2qds9\" (UID: \"b5b4c14f-550c-483f-8a1d-5b596130b713\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-2qds9" Mar 09 09:19:54 crc kubenswrapper[4861]: I0309 09:19:54.382718 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/416fe11c-136c-4b38-9f85-2ba8df311664-metrics-certs\") pod \"frr-k8s-p5vjx\" (UID: \"416fe11c-136c-4b38-9f85-2ba8df311664\") " pod="metallb-system/frr-k8s-p5vjx" Mar 09 09:19:54 crc kubenswrapper[4861]: I0309 09:19:54.382744 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/416fe11c-136c-4b38-9f85-2ba8df311664-frr-startup\") pod 
\"frr-k8s-p5vjx\" (UID: \"416fe11c-136c-4b38-9f85-2ba8df311664\") " pod="metallb-system/frr-k8s-p5vjx" Mar 09 09:19:54 crc kubenswrapper[4861]: I0309 09:19:54.469486 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-knlc5"] Mar 09 09:19:54 crc kubenswrapper[4861]: I0309 09:19:54.470597 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-knlc5" Mar 09 09:19:54 crc kubenswrapper[4861]: I0309 09:19:54.478882 4861 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Mar 09 09:19:54 crc kubenswrapper[4861]: I0309 09:19:54.479159 4861 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-9t9hj" Mar 09 09:19:54 crc kubenswrapper[4861]: I0309 09:19:54.481703 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-86ddb6bd46-hxljm"] Mar 09 09:19:54 crc kubenswrapper[4861]: I0309 09:19:54.482224 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Mar 09 09:19:54 crc kubenswrapper[4861]: I0309 09:19:54.482548 4861 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Mar 09 09:19:54 crc kubenswrapper[4861]: I0309 09:19:54.482721 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-86ddb6bd46-hxljm" Mar 09 09:19:54 crc kubenswrapper[4861]: I0309 09:19:54.483833 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9rpl\" (UniqueName: \"kubernetes.io/projected/b5b4c14f-550c-483f-8a1d-5b596130b713-kube-api-access-m9rpl\") pod \"frr-k8s-webhook-server-7f989f654f-2qds9\" (UID: \"b5b4c14f-550c-483f-8a1d-5b596130b713\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-2qds9" Mar 09 09:19:54 crc kubenswrapper[4861]: I0309 09:19:54.483891 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/416fe11c-136c-4b38-9f85-2ba8df311664-metrics-certs\") pod \"frr-k8s-p5vjx\" (UID: \"416fe11c-136c-4b38-9f85-2ba8df311664\") " pod="metallb-system/frr-k8s-p5vjx" Mar 09 09:19:54 crc kubenswrapper[4861]: I0309 09:19:54.483925 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/416fe11c-136c-4b38-9f85-2ba8df311664-frr-startup\") pod \"frr-k8s-p5vjx\" (UID: \"416fe11c-136c-4b38-9f85-2ba8df311664\") " pod="metallb-system/frr-k8s-p5vjx" Mar 09 09:19:54 crc kubenswrapper[4861]: I0309 09:19:54.483979 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/48e2ee3b-9d66-4ffb-a9e7-f30dbe52d03b-metallb-excludel2\") pod \"speaker-knlc5\" (UID: \"48e2ee3b-9d66-4ffb-a9e7-f30dbe52d03b\") " pod="metallb-system/speaker-knlc5" Mar 09 09:19:54 crc kubenswrapper[4861]: I0309 09:19:54.484005 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/416fe11c-136c-4b38-9f85-2ba8df311664-reloader\") pod \"frr-k8s-p5vjx\" (UID: \"416fe11c-136c-4b38-9f85-2ba8df311664\") " pod="metallb-system/frr-k8s-p5vjx" Mar 09 09:19:54 crc 
kubenswrapper[4861]: I0309 09:19:54.484031 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/416fe11c-136c-4b38-9f85-2ba8df311664-frr-sockets\") pod \"frr-k8s-p5vjx\" (UID: \"416fe11c-136c-4b38-9f85-2ba8df311664\") " pod="metallb-system/frr-k8s-p5vjx" Mar 09 09:19:54 crc kubenswrapper[4861]: I0309 09:19:54.484059 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blkwh\" (UniqueName: \"kubernetes.io/projected/48e2ee3b-9d66-4ffb-a9e7-f30dbe52d03b-kube-api-access-blkwh\") pod \"speaker-knlc5\" (UID: \"48e2ee3b-9d66-4ffb-a9e7-f30dbe52d03b\") " pod="metallb-system/speaker-knlc5" Mar 09 09:19:54 crc kubenswrapper[4861]: I0309 09:19:54.484087 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/48e2ee3b-9d66-4ffb-a9e7-f30dbe52d03b-metrics-certs\") pod \"speaker-knlc5\" (UID: \"48e2ee3b-9d66-4ffb-a9e7-f30dbe52d03b\") " pod="metallb-system/speaker-knlc5" Mar 09 09:19:54 crc kubenswrapper[4861]: I0309 09:19:54.484138 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/416fe11c-136c-4b38-9f85-2ba8df311664-frr-conf\") pod \"frr-k8s-p5vjx\" (UID: \"416fe11c-136c-4b38-9f85-2ba8df311664\") " pod="metallb-system/frr-k8s-p5vjx" Mar 09 09:19:54 crc kubenswrapper[4861]: I0309 09:19:54.484162 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/416fe11c-136c-4b38-9f85-2ba8df311664-metrics\") pod \"frr-k8s-p5vjx\" (UID: \"416fe11c-136c-4b38-9f85-2ba8df311664\") " pod="metallb-system/frr-k8s-p5vjx" Mar 09 09:19:54 crc kubenswrapper[4861]: I0309 09:19:54.484188 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/b5b4c14f-550c-483f-8a1d-5b596130b713-cert\") pod \"frr-k8s-webhook-server-7f989f654f-2qds9\" (UID: \"b5b4c14f-550c-483f-8a1d-5b596130b713\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-2qds9" Mar 09 09:19:54 crc kubenswrapper[4861]: I0309 09:19:54.484230 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/48e2ee3b-9d66-4ffb-a9e7-f30dbe52d03b-memberlist\") pod \"speaker-knlc5\" (UID: \"48e2ee3b-9d66-4ffb-a9e7-f30dbe52d03b\") " pod="metallb-system/speaker-knlc5" Mar 09 09:19:54 crc kubenswrapper[4861]: I0309 09:19:54.484258 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxhwz\" (UniqueName: \"kubernetes.io/projected/416fe11c-136c-4b38-9f85-2ba8df311664-kube-api-access-bxhwz\") pod \"frr-k8s-p5vjx\" (UID: \"416fe11c-136c-4b38-9f85-2ba8df311664\") " pod="metallb-system/frr-k8s-p5vjx" Mar 09 09:19:54 crc kubenswrapper[4861]: I0309 09:19:54.484640 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/416fe11c-136c-4b38-9f85-2ba8df311664-reloader\") pod \"frr-k8s-p5vjx\" (UID: \"416fe11c-136c-4b38-9f85-2ba8df311664\") " pod="metallb-system/frr-k8s-p5vjx" Mar 09 09:19:54 crc kubenswrapper[4861]: I0309 09:19:54.484916 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/416fe11c-136c-4b38-9f85-2ba8df311664-frr-sockets\") pod \"frr-k8s-p5vjx\" (UID: \"416fe11c-136c-4b38-9f85-2ba8df311664\") " pod="metallb-system/frr-k8s-p5vjx" Mar 09 09:19:54 crc kubenswrapper[4861]: I0309 09:19:54.485189 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/416fe11c-136c-4b38-9f85-2ba8df311664-frr-conf\") pod \"frr-k8s-p5vjx\" (UID: \"416fe11c-136c-4b38-9f85-2ba8df311664\") " 
pod="metallb-system/frr-k8s-p5vjx" Mar 09 09:19:54 crc kubenswrapper[4861]: E0309 09:19:54.485412 4861 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Mar 09 09:19:54 crc kubenswrapper[4861]: I0309 09:19:54.485435 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/416fe11c-136c-4b38-9f85-2ba8df311664-metrics\") pod \"frr-k8s-p5vjx\" (UID: \"416fe11c-136c-4b38-9f85-2ba8df311664\") " pod="metallb-system/frr-k8s-p5vjx" Mar 09 09:19:54 crc kubenswrapper[4861]: E0309 09:19:54.485462 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b5b4c14f-550c-483f-8a1d-5b596130b713-cert podName:b5b4c14f-550c-483f-8a1d-5b596130b713 nodeName:}" failed. No retries permitted until 2026-03-09 09:19:54.98544826 +0000 UTC m=+838.070487661 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b5b4c14f-550c-483f-8a1d-5b596130b713-cert") pod "frr-k8s-webhook-server-7f989f654f-2qds9" (UID: "b5b4c14f-550c-483f-8a1d-5b596130b713") : secret "frr-k8s-webhook-server-cert" not found Mar 09 09:19:54 crc kubenswrapper[4861]: I0309 09:19:54.486255 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/416fe11c-136c-4b38-9f85-2ba8df311664-frr-startup\") pod \"frr-k8s-p5vjx\" (UID: \"416fe11c-136c-4b38-9f85-2ba8df311664\") " pod="metallb-system/frr-k8s-p5vjx" Mar 09 09:19:54 crc kubenswrapper[4861]: I0309 09:19:54.490255 4861 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Mar 09 09:19:54 crc kubenswrapper[4861]: I0309 09:19:54.491462 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/416fe11c-136c-4b38-9f85-2ba8df311664-metrics-certs\") pod \"frr-k8s-p5vjx\" 
(UID: \"416fe11c-136c-4b38-9f85-2ba8df311664\") " pod="metallb-system/frr-k8s-p5vjx" Mar 09 09:19:54 crc kubenswrapper[4861]: I0309 09:19:54.495411 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-86ddb6bd46-hxljm"] Mar 09 09:19:54 crc kubenswrapper[4861]: I0309 09:19:54.505905 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9rpl\" (UniqueName: \"kubernetes.io/projected/b5b4c14f-550c-483f-8a1d-5b596130b713-kube-api-access-m9rpl\") pod \"frr-k8s-webhook-server-7f989f654f-2qds9\" (UID: \"b5b4c14f-550c-483f-8a1d-5b596130b713\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-2qds9" Mar 09 09:19:54 crc kubenswrapper[4861]: I0309 09:19:54.509887 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxhwz\" (UniqueName: \"kubernetes.io/projected/416fe11c-136c-4b38-9f85-2ba8df311664-kube-api-access-bxhwz\") pod \"frr-k8s-p5vjx\" (UID: \"416fe11c-136c-4b38-9f85-2ba8df311664\") " pod="metallb-system/frr-k8s-p5vjx" Mar 09 09:19:54 crc kubenswrapper[4861]: I0309 09:19:54.586282 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/48e2ee3b-9d66-4ffb-a9e7-f30dbe52d03b-memberlist\") pod \"speaker-knlc5\" (UID: \"48e2ee3b-9d66-4ffb-a9e7-f30dbe52d03b\") " pod="metallb-system/speaker-knlc5" Mar 09 09:19:54 crc kubenswrapper[4861]: I0309 09:19:54.586436 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2e2a9a00-e47a-4d97-9b06-58dd635a7a55-metrics-certs\") pod \"controller-86ddb6bd46-hxljm\" (UID: \"2e2a9a00-e47a-4d97-9b06-58dd635a7a55\") " pod="metallb-system/controller-86ddb6bd46-hxljm" Mar 09 09:19:54 crc kubenswrapper[4861]: I0309 09:19:54.586482 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgxhd\" 
(UniqueName: \"kubernetes.io/projected/2e2a9a00-e47a-4d97-9b06-58dd635a7a55-kube-api-access-lgxhd\") pod \"controller-86ddb6bd46-hxljm\" (UID: \"2e2a9a00-e47a-4d97-9b06-58dd635a7a55\") " pod="metallb-system/controller-86ddb6bd46-hxljm" Mar 09 09:19:54 crc kubenswrapper[4861]: I0309 09:19:54.586529 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/48e2ee3b-9d66-4ffb-a9e7-f30dbe52d03b-metallb-excludel2\") pod \"speaker-knlc5\" (UID: \"48e2ee3b-9d66-4ffb-a9e7-f30dbe52d03b\") " pod="metallb-system/speaker-knlc5" Mar 09 09:19:54 crc kubenswrapper[4861]: E0309 09:19:54.586569 4861 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 09 09:19:54 crc kubenswrapper[4861]: E0309 09:19:54.586677 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/48e2ee3b-9d66-4ffb-a9e7-f30dbe52d03b-memberlist podName:48e2ee3b-9d66-4ffb-a9e7-f30dbe52d03b nodeName:}" failed. No retries permitted until 2026-03-09 09:19:55.086649633 +0000 UTC m=+838.171689034 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/48e2ee3b-9d66-4ffb-a9e7-f30dbe52d03b-memberlist") pod "speaker-knlc5" (UID: "48e2ee3b-9d66-4ffb-a9e7-f30dbe52d03b") : secret "metallb-memberlist" not found Mar 09 09:19:54 crc kubenswrapper[4861]: I0309 09:19:54.586712 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-blkwh\" (UniqueName: \"kubernetes.io/projected/48e2ee3b-9d66-4ffb-a9e7-f30dbe52d03b-kube-api-access-blkwh\") pod \"speaker-knlc5\" (UID: \"48e2ee3b-9d66-4ffb-a9e7-f30dbe52d03b\") " pod="metallb-system/speaker-knlc5" Mar 09 09:19:54 crc kubenswrapper[4861]: I0309 09:19:54.586778 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/48e2ee3b-9d66-4ffb-a9e7-f30dbe52d03b-metrics-certs\") pod \"speaker-knlc5\" (UID: \"48e2ee3b-9d66-4ffb-a9e7-f30dbe52d03b\") " pod="metallb-system/speaker-knlc5" Mar 09 09:19:54 crc kubenswrapper[4861]: I0309 09:19:54.586808 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2e2a9a00-e47a-4d97-9b06-58dd635a7a55-cert\") pod \"controller-86ddb6bd46-hxljm\" (UID: \"2e2a9a00-e47a-4d97-9b06-58dd635a7a55\") " pod="metallb-system/controller-86ddb6bd46-hxljm" Mar 09 09:19:54 crc kubenswrapper[4861]: I0309 09:19:54.587675 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/48e2ee3b-9d66-4ffb-a9e7-f30dbe52d03b-metallb-excludel2\") pod \"speaker-knlc5\" (UID: \"48e2ee3b-9d66-4ffb-a9e7-f30dbe52d03b\") " pod="metallb-system/speaker-knlc5" Mar 09 09:19:54 crc kubenswrapper[4861]: I0309 09:19:54.593978 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/48e2ee3b-9d66-4ffb-a9e7-f30dbe52d03b-metrics-certs\") pod \"speaker-knlc5\" 
(UID: \"48e2ee3b-9d66-4ffb-a9e7-f30dbe52d03b\") " pod="metallb-system/speaker-knlc5" Mar 09 09:19:54 crc kubenswrapper[4861]: I0309 09:19:54.611125 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-blkwh\" (UniqueName: \"kubernetes.io/projected/48e2ee3b-9d66-4ffb-a9e7-f30dbe52d03b-kube-api-access-blkwh\") pod \"speaker-knlc5\" (UID: \"48e2ee3b-9d66-4ffb-a9e7-f30dbe52d03b\") " pod="metallb-system/speaker-knlc5" Mar 09 09:19:54 crc kubenswrapper[4861]: I0309 09:19:54.688478 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2e2a9a00-e47a-4d97-9b06-58dd635a7a55-metrics-certs\") pod \"controller-86ddb6bd46-hxljm\" (UID: \"2e2a9a00-e47a-4d97-9b06-58dd635a7a55\") " pod="metallb-system/controller-86ddb6bd46-hxljm" Mar 09 09:19:54 crc kubenswrapper[4861]: I0309 09:19:54.688531 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lgxhd\" (UniqueName: \"kubernetes.io/projected/2e2a9a00-e47a-4d97-9b06-58dd635a7a55-kube-api-access-lgxhd\") pod \"controller-86ddb6bd46-hxljm\" (UID: \"2e2a9a00-e47a-4d97-9b06-58dd635a7a55\") " pod="metallb-system/controller-86ddb6bd46-hxljm" Mar 09 09:19:54 crc kubenswrapper[4861]: I0309 09:19:54.688572 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2e2a9a00-e47a-4d97-9b06-58dd635a7a55-cert\") pod \"controller-86ddb6bd46-hxljm\" (UID: \"2e2a9a00-e47a-4d97-9b06-58dd635a7a55\") " pod="metallb-system/controller-86ddb6bd46-hxljm" Mar 09 09:19:54 crc kubenswrapper[4861]: E0309 09:19:54.688690 4861 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Mar 09 09:19:54 crc kubenswrapper[4861]: E0309 09:19:54.688782 4861 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/2e2a9a00-e47a-4d97-9b06-58dd635a7a55-metrics-certs podName:2e2a9a00-e47a-4d97-9b06-58dd635a7a55 nodeName:}" failed. No retries permitted until 2026-03-09 09:19:55.188756102 +0000 UTC m=+838.273795523 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2e2a9a00-e47a-4d97-9b06-58dd635a7a55-metrics-certs") pod "controller-86ddb6bd46-hxljm" (UID: "2e2a9a00-e47a-4d97-9b06-58dd635a7a55") : secret "controller-certs-secret" not found Mar 09 09:19:54 crc kubenswrapper[4861]: I0309 09:19:54.689925 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-p5vjx" Mar 09 09:19:54 crc kubenswrapper[4861]: I0309 09:19:54.693687 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2e2a9a00-e47a-4d97-9b06-58dd635a7a55-cert\") pod \"controller-86ddb6bd46-hxljm\" (UID: \"2e2a9a00-e47a-4d97-9b06-58dd635a7a55\") " pod="metallb-system/controller-86ddb6bd46-hxljm" Mar 09 09:19:54 crc kubenswrapper[4861]: I0309 09:19:54.703776 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgxhd\" (UniqueName: \"kubernetes.io/projected/2e2a9a00-e47a-4d97-9b06-58dd635a7a55-kube-api-access-lgxhd\") pod \"controller-86ddb6bd46-hxljm\" (UID: \"2e2a9a00-e47a-4d97-9b06-58dd635a7a55\") " pod="metallb-system/controller-86ddb6bd46-hxljm" Mar 09 09:19:54 crc kubenswrapper[4861]: I0309 09:19:54.994020 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b5b4c14f-550c-483f-8a1d-5b596130b713-cert\") pod \"frr-k8s-webhook-server-7f989f654f-2qds9\" (UID: \"b5b4c14f-550c-483f-8a1d-5b596130b713\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-2qds9" Mar 09 09:19:55 crc kubenswrapper[4861]: I0309 09:19:55.000175 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"cert\" (UniqueName: \"kubernetes.io/secret/b5b4c14f-550c-483f-8a1d-5b596130b713-cert\") pod \"frr-k8s-webhook-server-7f989f654f-2qds9\" (UID: \"b5b4c14f-550c-483f-8a1d-5b596130b713\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-2qds9" Mar 09 09:19:55 crc kubenswrapper[4861]: I0309 09:19:55.022576 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-p5vjx" event={"ID":"416fe11c-136c-4b38-9f85-2ba8df311664","Type":"ContainerStarted","Data":"4fb1a475b8cbac68a22cb6c64814321ad5569134445dd680c2ecd51082a918c9"} Mar 09 09:19:55 crc kubenswrapper[4861]: I0309 09:19:55.095837 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/48e2ee3b-9d66-4ffb-a9e7-f30dbe52d03b-memberlist\") pod \"speaker-knlc5\" (UID: \"48e2ee3b-9d66-4ffb-a9e7-f30dbe52d03b\") " pod="metallb-system/speaker-knlc5" Mar 09 09:19:55 crc kubenswrapper[4861]: E0309 09:19:55.096049 4861 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 09 09:19:55 crc kubenswrapper[4861]: E0309 09:19:55.096173 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/48e2ee3b-9d66-4ffb-a9e7-f30dbe52d03b-memberlist podName:48e2ee3b-9d66-4ffb-a9e7-f30dbe52d03b nodeName:}" failed. No retries permitted until 2026-03-09 09:19:56.096141727 +0000 UTC m=+839.181181168 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/48e2ee3b-9d66-4ffb-a9e7-f30dbe52d03b-memberlist") pod "speaker-knlc5" (UID: "48e2ee3b-9d66-4ffb-a9e7-f30dbe52d03b") : secret "metallb-memberlist" not found Mar 09 09:19:55 crc kubenswrapper[4861]: I0309 09:19:55.197299 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2e2a9a00-e47a-4d97-9b06-58dd635a7a55-metrics-certs\") pod \"controller-86ddb6bd46-hxljm\" (UID: \"2e2a9a00-e47a-4d97-9b06-58dd635a7a55\") " pod="metallb-system/controller-86ddb6bd46-hxljm" Mar 09 09:19:55 crc kubenswrapper[4861]: I0309 09:19:55.200876 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2e2a9a00-e47a-4d97-9b06-58dd635a7a55-metrics-certs\") pod \"controller-86ddb6bd46-hxljm\" (UID: \"2e2a9a00-e47a-4d97-9b06-58dd635a7a55\") " pod="metallb-system/controller-86ddb6bd46-hxljm" Mar 09 09:19:55 crc kubenswrapper[4861]: I0309 09:19:55.276900 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-2qds9" Mar 09 09:19:55 crc kubenswrapper[4861]: I0309 09:19:55.459339 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-86ddb6bd46-hxljm" Mar 09 09:19:55 crc kubenswrapper[4861]: I0309 09:19:55.500247 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-2qds9"] Mar 09 09:19:55 crc kubenswrapper[4861]: I0309 09:19:55.865320 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-86ddb6bd46-hxljm"] Mar 09 09:19:56 crc kubenswrapper[4861]: I0309 09:19:56.029939 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-2qds9" event={"ID":"b5b4c14f-550c-483f-8a1d-5b596130b713","Type":"ContainerStarted","Data":"8c69f97b7f8f760c9cb8669c1b0c0ddf4291c234e9a7f573efc03bb11b7f897d"} Mar 09 09:19:56 crc kubenswrapper[4861]: I0309 09:19:56.032084 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-hxljm" event={"ID":"2e2a9a00-e47a-4d97-9b06-58dd635a7a55","Type":"ContainerStarted","Data":"8d00ef4b2d09286b55372539083e2915411b6c6fd0e43cec0b8029544268f2c3"} Mar 09 09:19:56 crc kubenswrapper[4861]: I0309 09:19:56.109428 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/48e2ee3b-9d66-4ffb-a9e7-f30dbe52d03b-memberlist\") pod \"speaker-knlc5\" (UID: \"48e2ee3b-9d66-4ffb-a9e7-f30dbe52d03b\") " pod="metallb-system/speaker-knlc5" Mar 09 09:19:56 crc kubenswrapper[4861]: I0309 09:19:56.120409 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/48e2ee3b-9d66-4ffb-a9e7-f30dbe52d03b-memberlist\") pod \"speaker-knlc5\" (UID: \"48e2ee3b-9d66-4ffb-a9e7-f30dbe52d03b\") " pod="metallb-system/speaker-knlc5" Mar 09 09:19:56 crc kubenswrapper[4861]: I0309 09:19:56.286820 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-knlc5" Mar 09 09:19:57 crc kubenswrapper[4861]: I0309 09:19:57.040405 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-hxljm" event={"ID":"2e2a9a00-e47a-4d97-9b06-58dd635a7a55","Type":"ContainerStarted","Data":"6eb88cb4d16a3ac6f2db553312fc0fe7e3a7b5b2722c08ee08acb26bdb4b611d"} Mar 09 09:19:57 crc kubenswrapper[4861]: I0309 09:19:57.040695 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-86ddb6bd46-hxljm" Mar 09 09:19:57 crc kubenswrapper[4861]: I0309 09:19:57.040708 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-hxljm" event={"ID":"2e2a9a00-e47a-4d97-9b06-58dd635a7a55","Type":"ContainerStarted","Data":"fe40a50fae1fdcb40f889be0c475d7fe3fd1a06fda3390eeb845e33fa9185280"} Mar 09 09:19:57 crc kubenswrapper[4861]: I0309 09:19:57.044485 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-knlc5" event={"ID":"48e2ee3b-9d66-4ffb-a9e7-f30dbe52d03b","Type":"ContainerStarted","Data":"432ca5a72b60c5f4d42e6c792b32e604e11695fcd0e2bd6e931b10bf136a663b"} Mar 09 09:19:57 crc kubenswrapper[4861]: I0309 09:19:57.044515 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-knlc5" event={"ID":"48e2ee3b-9d66-4ffb-a9e7-f30dbe52d03b","Type":"ContainerStarted","Data":"1787402235dbe7da40e5adfc5698503f87f2114dbd0e27855c11734fbc7bf0cc"} Mar 09 09:19:57 crc kubenswrapper[4861]: I0309 09:19:57.044525 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-knlc5" event={"ID":"48e2ee3b-9d66-4ffb-a9e7-f30dbe52d03b","Type":"ContainerStarted","Data":"bc6bef73dd95f7da565022d0634869a69b8e73371efc804cb1f969630c414ff0"} Mar 09 09:19:57 crc kubenswrapper[4861]: I0309 09:19:57.044912 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-knlc5" Mar 09 09:19:57 crc 
kubenswrapper[4861]: I0309 09:19:57.053841 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-86ddb6bd46-hxljm" podStartSLOduration=3.053827065 podStartE2EDuration="3.053827065s" podCreationTimestamp="2026-03-09 09:19:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:19:57.052509477 +0000 UTC m=+840.137548878" watchObservedRunningTime="2026-03-09 09:19:57.053827065 +0000 UTC m=+840.138866466" Mar 09 09:19:57 crc kubenswrapper[4861]: I0309 09:19:57.070179 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-knlc5" podStartSLOduration=3.0701642160000002 podStartE2EDuration="3.070164216s" podCreationTimestamp="2026-03-09 09:19:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:19:57.066273804 +0000 UTC m=+840.151313215" watchObservedRunningTime="2026-03-09 09:19:57.070164216 +0000 UTC m=+840.155203617" Mar 09 09:20:00 crc kubenswrapper[4861]: I0309 09:20:00.122176 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550800-twnhd"] Mar 09 09:20:00 crc kubenswrapper[4861]: I0309 09:20:00.123179 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550800-twnhd" Mar 09 09:20:00 crc kubenswrapper[4861]: I0309 09:20:00.126325 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 09:20:00 crc kubenswrapper[4861]: I0309 09:20:00.126422 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 09:20:00 crc kubenswrapper[4861]: I0309 09:20:00.126589 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k86l8" Mar 09 09:20:00 crc kubenswrapper[4861]: I0309 09:20:00.129289 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550800-twnhd"] Mar 09 09:20:00 crc kubenswrapper[4861]: I0309 09:20:00.165499 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54tf6\" (UniqueName: \"kubernetes.io/projected/677b13c8-497a-4807-a2ef-13f1bfe09db3-kube-api-access-54tf6\") pod \"auto-csr-approver-29550800-twnhd\" (UID: \"677b13c8-497a-4807-a2ef-13f1bfe09db3\") " pod="openshift-infra/auto-csr-approver-29550800-twnhd" Mar 09 09:20:00 crc kubenswrapper[4861]: I0309 09:20:00.266105 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54tf6\" (UniqueName: \"kubernetes.io/projected/677b13c8-497a-4807-a2ef-13f1bfe09db3-kube-api-access-54tf6\") pod \"auto-csr-approver-29550800-twnhd\" (UID: \"677b13c8-497a-4807-a2ef-13f1bfe09db3\") " pod="openshift-infra/auto-csr-approver-29550800-twnhd" Mar 09 09:20:00 crc kubenswrapper[4861]: I0309 09:20:00.285729 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54tf6\" (UniqueName: \"kubernetes.io/projected/677b13c8-497a-4807-a2ef-13f1bfe09db3-kube-api-access-54tf6\") pod \"auto-csr-approver-29550800-twnhd\" (UID: \"677b13c8-497a-4807-a2ef-13f1bfe09db3\") " 
pod="openshift-infra/auto-csr-approver-29550800-twnhd" Mar 09 09:20:00 crc kubenswrapper[4861]: I0309 09:20:00.448858 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550800-twnhd" Mar 09 09:20:02 crc kubenswrapper[4861]: I0309 09:20:02.187243 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550800-twnhd"] Mar 09 09:20:02 crc kubenswrapper[4861]: W0309 09:20:02.198140 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod677b13c8_497a_4807_a2ef_13f1bfe09db3.slice/crio-a5828f5e80eae086c19779e0a33bc2eee1faa32d124e276ed6c0c4fc8f071eac WatchSource:0}: Error finding container a5828f5e80eae086c19779e0a33bc2eee1faa32d124e276ed6c0c4fc8f071eac: Status 404 returned error can't find the container with id a5828f5e80eae086c19779e0a33bc2eee1faa32d124e276ed6c0c4fc8f071eac Mar 09 09:20:03 crc kubenswrapper[4861]: I0309 09:20:03.085322 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-2qds9" event={"ID":"b5b4c14f-550c-483f-8a1d-5b596130b713","Type":"ContainerStarted","Data":"9486f12aee6fca6e648aa274791b3e9e83242a23986e3fb09e3c3be6fe15a11b"} Mar 09 09:20:03 crc kubenswrapper[4861]: I0309 09:20:03.085474 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-2qds9" Mar 09 09:20:03 crc kubenswrapper[4861]: I0309 09:20:03.087567 4861 generic.go:334] "Generic (PLEG): container finished" podID="416fe11c-136c-4b38-9f85-2ba8df311664" containerID="f08dfd0a838c7d7409818d7272f56b8a26c015cd6835c0ed1c388e7dabe09aeb" exitCode=0 Mar 09 09:20:03 crc kubenswrapper[4861]: I0309 09:20:03.087627 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-p5vjx" 
event={"ID":"416fe11c-136c-4b38-9f85-2ba8df311664","Type":"ContainerDied","Data":"f08dfd0a838c7d7409818d7272f56b8a26c015cd6835c0ed1c388e7dabe09aeb"} Mar 09 09:20:03 crc kubenswrapper[4861]: I0309 09:20:03.089697 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550800-twnhd" event={"ID":"677b13c8-497a-4807-a2ef-13f1bfe09db3","Type":"ContainerStarted","Data":"a5828f5e80eae086c19779e0a33bc2eee1faa32d124e276ed6c0c4fc8f071eac"} Mar 09 09:20:03 crc kubenswrapper[4861]: I0309 09:20:03.111145 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-2qds9" podStartSLOduration=2.548850168 podStartE2EDuration="9.111115486s" podCreationTimestamp="2026-03-09 09:19:54 +0000 UTC" firstStartedPulling="2026-03-09 09:19:55.538521082 +0000 UTC m=+838.623560483" lastFinishedPulling="2026-03-09 09:20:02.10078641 +0000 UTC m=+845.185825801" observedRunningTime="2026-03-09 09:20:03.103599129 +0000 UTC m=+846.188638560" watchObservedRunningTime="2026-03-09 09:20:03.111115486 +0000 UTC m=+846.196154907" Mar 09 09:20:04 crc kubenswrapper[4861]: I0309 09:20:04.096840 4861 generic.go:334] "Generic (PLEG): container finished" podID="416fe11c-136c-4b38-9f85-2ba8df311664" containerID="d853d6c4c07b1c4a7c7892a3259f921bbcb5a68dad979f05be52c53c4fd5d519" exitCode=0 Mar 09 09:20:04 crc kubenswrapper[4861]: I0309 09:20:04.096952 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-p5vjx" event={"ID":"416fe11c-136c-4b38-9f85-2ba8df311664","Type":"ContainerDied","Data":"d853d6c4c07b1c4a7c7892a3259f921bbcb5a68dad979f05be52c53c4fd5d519"} Mar 09 09:20:04 crc kubenswrapper[4861]: I0309 09:20:04.100557 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550800-twnhd" event={"ID":"677b13c8-497a-4807-a2ef-13f1bfe09db3","Type":"ContainerStarted","Data":"585f3fac79c08536861720b7ac44790472fcc0efb6247b2bbeebe0f35028a3af"} Mar 
09 09:20:04 crc kubenswrapper[4861]: I0309 09:20:04.148625 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29550800-twnhd" podStartSLOduration=2.6015689699999998 podStartE2EDuration="4.148607474s" podCreationTimestamp="2026-03-09 09:20:00 +0000 UTC" firstStartedPulling="2026-03-09 09:20:02.200760231 +0000 UTC m=+845.285799632" lastFinishedPulling="2026-03-09 09:20:03.747798735 +0000 UTC m=+846.832838136" observedRunningTime="2026-03-09 09:20:04.147053619 +0000 UTC m=+847.232093060" watchObservedRunningTime="2026-03-09 09:20:04.148607474 +0000 UTC m=+847.233646885" Mar 09 09:20:05 crc kubenswrapper[4861]: I0309 09:20:05.107266 4861 generic.go:334] "Generic (PLEG): container finished" podID="416fe11c-136c-4b38-9f85-2ba8df311664" containerID="571d0ae5643ca283edc9bacf1aeec826c20251e1d8096421e406a9add2c1787f" exitCode=0 Mar 09 09:20:05 crc kubenswrapper[4861]: I0309 09:20:05.107527 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-p5vjx" event={"ID":"416fe11c-136c-4b38-9f85-2ba8df311664","Type":"ContainerDied","Data":"571d0ae5643ca283edc9bacf1aeec826c20251e1d8096421e406a9add2c1787f"} Mar 09 09:20:05 crc kubenswrapper[4861]: I0309 09:20:05.128901 4861 generic.go:334] "Generic (PLEG): container finished" podID="677b13c8-497a-4807-a2ef-13f1bfe09db3" containerID="585f3fac79c08536861720b7ac44790472fcc0efb6247b2bbeebe0f35028a3af" exitCode=0 Mar 09 09:20:05 crc kubenswrapper[4861]: I0309 09:20:05.129067 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550800-twnhd" event={"ID":"677b13c8-497a-4807-a2ef-13f1bfe09db3","Type":"ContainerDied","Data":"585f3fac79c08536861720b7ac44790472fcc0efb6247b2bbeebe0f35028a3af"} Mar 09 09:20:06 crc kubenswrapper[4861]: I0309 09:20:06.143580 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-p5vjx" 
event={"ID":"416fe11c-136c-4b38-9f85-2ba8df311664","Type":"ContainerStarted","Data":"8f988bc5aae81a9bc4a6a5081629f2843ba0879f29c087d3da72279eaa49aa08"} Mar 09 09:20:06 crc kubenswrapper[4861]: I0309 09:20:06.144069 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-p5vjx" event={"ID":"416fe11c-136c-4b38-9f85-2ba8df311664","Type":"ContainerStarted","Data":"2080d2e8822e16be488ef0ffd58ed08f82ed32a40e4042a2f4c187aea6d4c04c"} Mar 09 09:20:06 crc kubenswrapper[4861]: I0309 09:20:06.144101 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-p5vjx" event={"ID":"416fe11c-136c-4b38-9f85-2ba8df311664","Type":"ContainerStarted","Data":"0f8f94cab97b4bb58231ac3e8faddc56fc3cb7def6b80b83fd5790b551ca5ccc"} Mar 09 09:20:06 crc kubenswrapper[4861]: I0309 09:20:06.144125 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-p5vjx" event={"ID":"416fe11c-136c-4b38-9f85-2ba8df311664","Type":"ContainerStarted","Data":"81dda1df6f46e758c5acf3b3b0499225ef6acee1afcd1abf2437240c53fac27d"} Mar 09 09:20:06 crc kubenswrapper[4861]: I0309 09:20:06.291905 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-knlc5" Mar 09 09:20:06 crc kubenswrapper[4861]: I0309 09:20:06.445534 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550800-twnhd" Mar 09 09:20:06 crc kubenswrapper[4861]: I0309 09:20:06.468220 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-54tf6\" (UniqueName: \"kubernetes.io/projected/677b13c8-497a-4807-a2ef-13f1bfe09db3-kube-api-access-54tf6\") pod \"677b13c8-497a-4807-a2ef-13f1bfe09db3\" (UID: \"677b13c8-497a-4807-a2ef-13f1bfe09db3\") " Mar 09 09:20:06 crc kubenswrapper[4861]: I0309 09:20:06.482615 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/677b13c8-497a-4807-a2ef-13f1bfe09db3-kube-api-access-54tf6" (OuterVolumeSpecName: "kube-api-access-54tf6") pod "677b13c8-497a-4807-a2ef-13f1bfe09db3" (UID: "677b13c8-497a-4807-a2ef-13f1bfe09db3"). InnerVolumeSpecName "kube-api-access-54tf6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:20:06 crc kubenswrapper[4861]: I0309 09:20:06.569815 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-54tf6\" (UniqueName: \"kubernetes.io/projected/677b13c8-497a-4807-a2ef-13f1bfe09db3-kube-api-access-54tf6\") on node \"crc\" DevicePath \"\"" Mar 09 09:20:07 crc kubenswrapper[4861]: I0309 09:20:07.163753 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-p5vjx" event={"ID":"416fe11c-136c-4b38-9f85-2ba8df311664","Type":"ContainerStarted","Data":"4357dd3fffadcefe323803169b7feeaaac16df2d907e281f7bc778731a755f38"} Mar 09 09:20:07 crc kubenswrapper[4861]: I0309 09:20:07.163822 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-p5vjx" event={"ID":"416fe11c-136c-4b38-9f85-2ba8df311664","Type":"ContainerStarted","Data":"1de847634801fca55042380de1f0a6805ceb6c941fd7946b24dc237f264985d0"} Mar 09 09:20:07 crc kubenswrapper[4861]: I0309 09:20:07.163952 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-p5vjx" Mar 09 09:20:07 crc 
kubenswrapper[4861]: I0309 09:20:07.168671 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550800-twnhd" event={"ID":"677b13c8-497a-4807-a2ef-13f1bfe09db3","Type":"ContainerDied","Data":"a5828f5e80eae086c19779e0a33bc2eee1faa32d124e276ed6c0c4fc8f071eac"} Mar 09 09:20:07 crc kubenswrapper[4861]: I0309 09:20:07.168710 4861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a5828f5e80eae086c19779e0a33bc2eee1faa32d124e276ed6c0c4fc8f071eac" Mar 09 09:20:07 crc kubenswrapper[4861]: I0309 09:20:07.168764 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550800-twnhd" Mar 09 09:20:07 crc kubenswrapper[4861]: I0309 09:20:07.191300 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-p5vjx" podStartSLOduration=5.923116435 podStartE2EDuration="13.191282389s" podCreationTimestamp="2026-03-09 09:19:54 +0000 UTC" firstStartedPulling="2026-03-09 09:19:54.826014526 +0000 UTC m=+837.911053927" lastFinishedPulling="2026-03-09 09:20:02.09418047 +0000 UTC m=+845.179219881" observedRunningTime="2026-03-09 09:20:07.187769486 +0000 UTC m=+850.272808887" watchObservedRunningTime="2026-03-09 09:20:07.191282389 +0000 UTC m=+850.276321790" Mar 09 09:20:07 crc kubenswrapper[4861]: I0309 09:20:07.212047 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550794-4fbmp"] Mar 09 09:20:07 crc kubenswrapper[4861]: I0309 09:20:07.216096 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550794-4fbmp"] Mar 09 09:20:07 crc kubenswrapper[4861]: I0309 09:20:07.666575 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78d6ee67-9e4d-4e04-a77d-e8d6fb552bc2" path="/var/lib/kubelet/pods/78d6ee67-9e4d-4e04-a77d-e8d6fb552bc2/volumes" Mar 09 09:20:09 crc kubenswrapper[4861]: I0309 09:20:09.266852 4861 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-jvkx5"] Mar 09 09:20:09 crc kubenswrapper[4861]: E0309 09:20:09.267256 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="677b13c8-497a-4807-a2ef-13f1bfe09db3" containerName="oc" Mar 09 09:20:09 crc kubenswrapper[4861]: I0309 09:20:09.267277 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="677b13c8-497a-4807-a2ef-13f1bfe09db3" containerName="oc" Mar 09 09:20:09 crc kubenswrapper[4861]: I0309 09:20:09.267554 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="677b13c8-497a-4807-a2ef-13f1bfe09db3" containerName="oc" Mar 09 09:20:09 crc kubenswrapper[4861]: I0309 09:20:09.268190 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-jvkx5" Mar 09 09:20:09 crc kubenswrapper[4861]: I0309 09:20:09.272811 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-zfgkp" Mar 09 09:20:09 crc kubenswrapper[4861]: I0309 09:20:09.273098 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Mar 09 09:20:09 crc kubenswrapper[4861]: I0309 09:20:09.273397 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Mar 09 09:20:09 crc kubenswrapper[4861]: I0309 09:20:09.273849 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-jvkx5"] Mar 09 09:20:09 crc kubenswrapper[4861]: I0309 09:20:09.305117 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdsmf\" (UniqueName: \"kubernetes.io/projected/6ccf7b1d-069e-4be6-9375-90e65844cfbb-kube-api-access-tdsmf\") pod \"openstack-operator-index-jvkx5\" (UID: \"6ccf7b1d-069e-4be6-9375-90e65844cfbb\") " 
pod="openstack-operators/openstack-operator-index-jvkx5" Mar 09 09:20:09 crc kubenswrapper[4861]: I0309 09:20:09.406066 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdsmf\" (UniqueName: \"kubernetes.io/projected/6ccf7b1d-069e-4be6-9375-90e65844cfbb-kube-api-access-tdsmf\") pod \"openstack-operator-index-jvkx5\" (UID: \"6ccf7b1d-069e-4be6-9375-90e65844cfbb\") " pod="openstack-operators/openstack-operator-index-jvkx5" Mar 09 09:20:09 crc kubenswrapper[4861]: I0309 09:20:09.424319 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdsmf\" (UniqueName: \"kubernetes.io/projected/6ccf7b1d-069e-4be6-9375-90e65844cfbb-kube-api-access-tdsmf\") pod \"openstack-operator-index-jvkx5\" (UID: \"6ccf7b1d-069e-4be6-9375-90e65844cfbb\") " pod="openstack-operators/openstack-operator-index-jvkx5" Mar 09 09:20:09 crc kubenswrapper[4861]: I0309 09:20:09.606737 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-jvkx5" Mar 09 09:20:09 crc kubenswrapper[4861]: I0309 09:20:09.691617 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-p5vjx" Mar 09 09:20:09 crc kubenswrapper[4861]: I0309 09:20:09.748901 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-p5vjx" Mar 09 09:20:09 crc kubenswrapper[4861]: I0309 09:20:09.814762 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-jvkx5"] Mar 09 09:20:09 crc kubenswrapper[4861]: W0309 09:20:09.823051 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6ccf7b1d_069e_4be6_9375_90e65844cfbb.slice/crio-9645c2a7d671eaafc5d9c3bd2b26af2fe331fd6417f5e461fa4d124b58c1d5fa WatchSource:0}: Error finding container 
9645c2a7d671eaafc5d9c3bd2b26af2fe331fd6417f5e461fa4d124b58c1d5fa: Status 404 returned error can't find the container with id 9645c2a7d671eaafc5d9c3bd2b26af2fe331fd6417f5e461fa4d124b58c1d5fa Mar 09 09:20:10 crc kubenswrapper[4861]: I0309 09:20:10.188573 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-jvkx5" event={"ID":"6ccf7b1d-069e-4be6-9375-90e65844cfbb","Type":"ContainerStarted","Data":"9645c2a7d671eaafc5d9c3bd2b26af2fe331fd6417f5e461fa4d124b58c1d5fa"} Mar 09 09:20:11 crc kubenswrapper[4861]: I0309 09:20:11.197544 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-jvkx5" event={"ID":"6ccf7b1d-069e-4be6-9375-90e65844cfbb","Type":"ContainerStarted","Data":"beba3f1ea6ae3f55baea8edc259829ea38244e8b2a4f8997034ba73d5e77ff18"} Mar 09 09:20:11 crc kubenswrapper[4861]: I0309 09:20:11.225319 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-jvkx5" podStartSLOduration=1.063935075 podStartE2EDuration="2.2252903s" podCreationTimestamp="2026-03-09 09:20:09 +0000 UTC" firstStartedPulling="2026-03-09 09:20:09.825262075 +0000 UTC m=+852.910301476" lastFinishedPulling="2026-03-09 09:20:10.9866173 +0000 UTC m=+854.071656701" observedRunningTime="2026-03-09 09:20:11.21565194 +0000 UTC m=+854.300691341" watchObservedRunningTime="2026-03-09 09:20:11.2252903 +0000 UTC m=+854.310329731" Mar 09 09:20:12 crc kubenswrapper[4861]: I0309 09:20:12.634639 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-jvkx5"] Mar 09 09:20:13 crc kubenswrapper[4861]: I0309 09:20:13.209952 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-jvkx5" podUID="6ccf7b1d-069e-4be6-9375-90e65844cfbb" containerName="registry-server" 
containerID="cri-o://beba3f1ea6ae3f55baea8edc259829ea38244e8b2a4f8997034ba73d5e77ff18" gracePeriod=2 Mar 09 09:20:13 crc kubenswrapper[4861]: I0309 09:20:13.239793 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-cqfnf"] Mar 09 09:20:13 crc kubenswrapper[4861]: I0309 09:20:13.240995 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-cqfnf" Mar 09 09:20:13 crc kubenswrapper[4861]: I0309 09:20:13.253004 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-cqfnf"] Mar 09 09:20:13 crc kubenswrapper[4861]: I0309 09:20:13.258590 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j29g5\" (UniqueName: \"kubernetes.io/projected/793c9771-3185-4264-b109-a94fcc50a305-kube-api-access-j29g5\") pod \"openstack-operator-index-cqfnf\" (UID: \"793c9771-3185-4264-b109-a94fcc50a305\") " pod="openstack-operators/openstack-operator-index-cqfnf" Mar 09 09:20:13 crc kubenswrapper[4861]: I0309 09:20:13.360349 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j29g5\" (UniqueName: \"kubernetes.io/projected/793c9771-3185-4264-b109-a94fcc50a305-kube-api-access-j29g5\") pod \"openstack-operator-index-cqfnf\" (UID: \"793c9771-3185-4264-b109-a94fcc50a305\") " pod="openstack-operators/openstack-operator-index-cqfnf" Mar 09 09:20:13 crc kubenswrapper[4861]: I0309 09:20:13.395974 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j29g5\" (UniqueName: \"kubernetes.io/projected/793c9771-3185-4264-b109-a94fcc50a305-kube-api-access-j29g5\") pod \"openstack-operator-index-cqfnf\" (UID: \"793c9771-3185-4264-b109-a94fcc50a305\") " pod="openstack-operators/openstack-operator-index-cqfnf" Mar 09 09:20:13 crc kubenswrapper[4861]: I0309 09:20:13.602801 4861 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-jvkx5" Mar 09 09:20:13 crc kubenswrapper[4861]: I0309 09:20:13.618434 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-cqfnf" Mar 09 09:20:13 crc kubenswrapper[4861]: I0309 09:20:13.663914 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tdsmf\" (UniqueName: \"kubernetes.io/projected/6ccf7b1d-069e-4be6-9375-90e65844cfbb-kube-api-access-tdsmf\") pod \"6ccf7b1d-069e-4be6-9375-90e65844cfbb\" (UID: \"6ccf7b1d-069e-4be6-9375-90e65844cfbb\") " Mar 09 09:20:13 crc kubenswrapper[4861]: I0309 09:20:13.668254 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ccf7b1d-069e-4be6-9375-90e65844cfbb-kube-api-access-tdsmf" (OuterVolumeSpecName: "kube-api-access-tdsmf") pod "6ccf7b1d-069e-4be6-9375-90e65844cfbb" (UID: "6ccf7b1d-069e-4be6-9375-90e65844cfbb"). InnerVolumeSpecName "kube-api-access-tdsmf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:20:13 crc kubenswrapper[4861]: I0309 09:20:13.765521 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tdsmf\" (UniqueName: \"kubernetes.io/projected/6ccf7b1d-069e-4be6-9375-90e65844cfbb-kube-api-access-tdsmf\") on node \"crc\" DevicePath \"\"" Mar 09 09:20:14 crc kubenswrapper[4861]: I0309 09:20:14.004535 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-cqfnf"] Mar 09 09:20:14 crc kubenswrapper[4861]: W0309 09:20:14.009787 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod793c9771_3185_4264_b109_a94fcc50a305.slice/crio-43b1db99e21f7d95c688974052ed2e7b3df5b43e07b5f370682d1c5d011e948e WatchSource:0}: Error finding container 43b1db99e21f7d95c688974052ed2e7b3df5b43e07b5f370682d1c5d011e948e: Status 404 returned error can't find the container with id 43b1db99e21f7d95c688974052ed2e7b3df5b43e07b5f370682d1c5d011e948e Mar 09 09:20:14 crc kubenswrapper[4861]: I0309 09:20:14.218313 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-cqfnf" event={"ID":"793c9771-3185-4264-b109-a94fcc50a305","Type":"ContainerStarted","Data":"43b1db99e21f7d95c688974052ed2e7b3df5b43e07b5f370682d1c5d011e948e"} Mar 09 09:20:14 crc kubenswrapper[4861]: I0309 09:20:14.220838 4861 generic.go:334] "Generic (PLEG): container finished" podID="6ccf7b1d-069e-4be6-9375-90e65844cfbb" containerID="beba3f1ea6ae3f55baea8edc259829ea38244e8b2a4f8997034ba73d5e77ff18" exitCode=0 Mar 09 09:20:14 crc kubenswrapper[4861]: I0309 09:20:14.220902 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-jvkx5" event={"ID":"6ccf7b1d-069e-4be6-9375-90e65844cfbb","Type":"ContainerDied","Data":"beba3f1ea6ae3f55baea8edc259829ea38244e8b2a4f8997034ba73d5e77ff18"} Mar 09 09:20:14 crc 
kubenswrapper[4861]: I0309 09:20:14.220945 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-jvkx5" Mar 09 09:20:14 crc kubenswrapper[4861]: I0309 09:20:14.220979 4861 scope.go:117] "RemoveContainer" containerID="beba3f1ea6ae3f55baea8edc259829ea38244e8b2a4f8997034ba73d5e77ff18" Mar 09 09:20:14 crc kubenswrapper[4861]: I0309 09:20:14.220956 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-jvkx5" event={"ID":"6ccf7b1d-069e-4be6-9375-90e65844cfbb","Type":"ContainerDied","Data":"9645c2a7d671eaafc5d9c3bd2b26af2fe331fd6417f5e461fa4d124b58c1d5fa"} Mar 09 09:20:14 crc kubenswrapper[4861]: I0309 09:20:14.254137 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-jvkx5"] Mar 09 09:20:14 crc kubenswrapper[4861]: I0309 09:20:14.260150 4861 scope.go:117] "RemoveContainer" containerID="beba3f1ea6ae3f55baea8edc259829ea38244e8b2a4f8997034ba73d5e77ff18" Mar 09 09:20:14 crc kubenswrapper[4861]: E0309 09:20:14.260763 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"beba3f1ea6ae3f55baea8edc259829ea38244e8b2a4f8997034ba73d5e77ff18\": container with ID starting with beba3f1ea6ae3f55baea8edc259829ea38244e8b2a4f8997034ba73d5e77ff18 not found: ID does not exist" containerID="beba3f1ea6ae3f55baea8edc259829ea38244e8b2a4f8997034ba73d5e77ff18" Mar 09 09:20:14 crc kubenswrapper[4861]: I0309 09:20:14.260821 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"beba3f1ea6ae3f55baea8edc259829ea38244e8b2a4f8997034ba73d5e77ff18"} err="failed to get container status \"beba3f1ea6ae3f55baea8edc259829ea38244e8b2a4f8997034ba73d5e77ff18\": rpc error: code = NotFound desc = could not find container \"beba3f1ea6ae3f55baea8edc259829ea38244e8b2a4f8997034ba73d5e77ff18\": container with ID starting 
with beba3f1ea6ae3f55baea8edc259829ea38244e8b2a4f8997034ba73d5e77ff18 not found: ID does not exist" Mar 09 09:20:14 crc kubenswrapper[4861]: I0309 09:20:14.262807 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-jvkx5"] Mar 09 09:20:15 crc kubenswrapper[4861]: I0309 09:20:15.228684 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-cqfnf" event={"ID":"793c9771-3185-4264-b109-a94fcc50a305","Type":"ContainerStarted","Data":"3ae4073188b16a363fb10b3bf731d16f4e66323cf4a6fe7f870417f344961e82"} Mar 09 09:20:15 crc kubenswrapper[4861]: I0309 09:20:15.242680 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-cqfnf" podStartSLOduration=1.779463378 podStartE2EDuration="2.242658949s" podCreationTimestamp="2026-03-09 09:20:13 +0000 UTC" firstStartedPulling="2026-03-09 09:20:14.014657109 +0000 UTC m=+857.099696530" lastFinishedPulling="2026-03-09 09:20:14.47785266 +0000 UTC m=+857.562892101" observedRunningTime="2026-03-09 09:20:15.242560046 +0000 UTC m=+858.327599457" watchObservedRunningTime="2026-03-09 09:20:15.242658949 +0000 UTC m=+858.327698360" Mar 09 09:20:15 crc kubenswrapper[4861]: I0309 09:20:15.286022 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-2qds9" Mar 09 09:20:15 crc kubenswrapper[4861]: I0309 09:20:15.463994 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-86ddb6bd46-hxljm" Mar 09 09:20:15 crc kubenswrapper[4861]: I0309 09:20:15.664997 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ccf7b1d-069e-4be6-9375-90e65844cfbb" path="/var/lib/kubelet/pods/6ccf7b1d-069e-4be6-9375-90e65844cfbb/volumes" Mar 09 09:20:23 crc kubenswrapper[4861]: I0309 09:20:23.619363 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openstack-operators/openstack-operator-index-cqfnf" Mar 09 09:20:23 crc kubenswrapper[4861]: I0309 09:20:23.620202 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-cqfnf" Mar 09 09:20:23 crc kubenswrapper[4861]: I0309 09:20:23.667916 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-cqfnf" Mar 09 09:20:24 crc kubenswrapper[4861]: I0309 09:20:24.325111 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-cqfnf" Mar 09 09:20:24 crc kubenswrapper[4861]: I0309 09:20:24.693651 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-p5vjx" Mar 09 09:20:31 crc kubenswrapper[4861]: I0309 09:20:31.087973 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ff3fd74506e7f9e2c6e8ba7181b4867be2b110d904ba1aef106c523cfa6w5lp"] Mar 09 09:20:31 crc kubenswrapper[4861]: E0309 09:20:31.088476 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ccf7b1d-069e-4be6-9375-90e65844cfbb" containerName="registry-server" Mar 09 09:20:31 crc kubenswrapper[4861]: I0309 09:20:31.088490 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ccf7b1d-069e-4be6-9375-90e65844cfbb" containerName="registry-server" Mar 09 09:20:31 crc kubenswrapper[4861]: I0309 09:20:31.088651 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ccf7b1d-069e-4be6-9375-90e65844cfbb" containerName="registry-server" Mar 09 09:20:31 crc kubenswrapper[4861]: I0309 09:20:31.089635 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ff3fd74506e7f9e2c6e8ba7181b4867be2b110d904ba1aef106c523cfa6w5lp" Mar 09 09:20:31 crc kubenswrapper[4861]: I0309 09:20:31.091444 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-vjvvs" Mar 09 09:20:31 crc kubenswrapper[4861]: I0309 09:20:31.096431 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ff3fd74506e7f9e2c6e8ba7181b4867be2b110d904ba1aef106c523cfa6w5lp"] Mar 09 09:20:31 crc kubenswrapper[4861]: I0309 09:20:31.108871 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2530cbbb-c2de-41ce-b6d2-a9593ed9226d-bundle\") pod \"ff3fd74506e7f9e2c6e8ba7181b4867be2b110d904ba1aef106c523cfa6w5lp\" (UID: \"2530cbbb-c2de-41ce-b6d2-a9593ed9226d\") " pod="openstack-operators/ff3fd74506e7f9e2c6e8ba7181b4867be2b110d904ba1aef106c523cfa6w5lp" Mar 09 09:20:31 crc kubenswrapper[4861]: I0309 09:20:31.109123 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5sff9\" (UniqueName: \"kubernetes.io/projected/2530cbbb-c2de-41ce-b6d2-a9593ed9226d-kube-api-access-5sff9\") pod \"ff3fd74506e7f9e2c6e8ba7181b4867be2b110d904ba1aef106c523cfa6w5lp\" (UID: \"2530cbbb-c2de-41ce-b6d2-a9593ed9226d\") " pod="openstack-operators/ff3fd74506e7f9e2c6e8ba7181b4867be2b110d904ba1aef106c523cfa6w5lp" Mar 09 09:20:31 crc kubenswrapper[4861]: I0309 09:20:31.109196 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2530cbbb-c2de-41ce-b6d2-a9593ed9226d-util\") pod \"ff3fd74506e7f9e2c6e8ba7181b4867be2b110d904ba1aef106c523cfa6w5lp\" (UID: \"2530cbbb-c2de-41ce-b6d2-a9593ed9226d\") " pod="openstack-operators/ff3fd74506e7f9e2c6e8ba7181b4867be2b110d904ba1aef106c523cfa6w5lp" Mar 09 09:20:31 crc kubenswrapper[4861]: I0309 
09:20:31.210173 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2530cbbb-c2de-41ce-b6d2-a9593ed9226d-bundle\") pod \"ff3fd74506e7f9e2c6e8ba7181b4867be2b110d904ba1aef106c523cfa6w5lp\" (UID: \"2530cbbb-c2de-41ce-b6d2-a9593ed9226d\") " pod="openstack-operators/ff3fd74506e7f9e2c6e8ba7181b4867be2b110d904ba1aef106c523cfa6w5lp" Mar 09 09:20:31 crc kubenswrapper[4861]: I0309 09:20:31.210243 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5sff9\" (UniqueName: \"kubernetes.io/projected/2530cbbb-c2de-41ce-b6d2-a9593ed9226d-kube-api-access-5sff9\") pod \"ff3fd74506e7f9e2c6e8ba7181b4867be2b110d904ba1aef106c523cfa6w5lp\" (UID: \"2530cbbb-c2de-41ce-b6d2-a9593ed9226d\") " pod="openstack-operators/ff3fd74506e7f9e2c6e8ba7181b4867be2b110d904ba1aef106c523cfa6w5lp" Mar 09 09:20:31 crc kubenswrapper[4861]: I0309 09:20:31.210303 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2530cbbb-c2de-41ce-b6d2-a9593ed9226d-util\") pod \"ff3fd74506e7f9e2c6e8ba7181b4867be2b110d904ba1aef106c523cfa6w5lp\" (UID: \"2530cbbb-c2de-41ce-b6d2-a9593ed9226d\") " pod="openstack-operators/ff3fd74506e7f9e2c6e8ba7181b4867be2b110d904ba1aef106c523cfa6w5lp" Mar 09 09:20:31 crc kubenswrapper[4861]: I0309 09:20:31.210817 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2530cbbb-c2de-41ce-b6d2-a9593ed9226d-util\") pod \"ff3fd74506e7f9e2c6e8ba7181b4867be2b110d904ba1aef106c523cfa6w5lp\" (UID: \"2530cbbb-c2de-41ce-b6d2-a9593ed9226d\") " pod="openstack-operators/ff3fd74506e7f9e2c6e8ba7181b4867be2b110d904ba1aef106c523cfa6w5lp" Mar 09 09:20:31 crc kubenswrapper[4861]: I0309 09:20:31.211083 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/2530cbbb-c2de-41ce-b6d2-a9593ed9226d-bundle\") pod \"ff3fd74506e7f9e2c6e8ba7181b4867be2b110d904ba1aef106c523cfa6w5lp\" (UID: \"2530cbbb-c2de-41ce-b6d2-a9593ed9226d\") " pod="openstack-operators/ff3fd74506e7f9e2c6e8ba7181b4867be2b110d904ba1aef106c523cfa6w5lp" Mar 09 09:20:31 crc kubenswrapper[4861]: I0309 09:20:31.246593 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5sff9\" (UniqueName: \"kubernetes.io/projected/2530cbbb-c2de-41ce-b6d2-a9593ed9226d-kube-api-access-5sff9\") pod \"ff3fd74506e7f9e2c6e8ba7181b4867be2b110d904ba1aef106c523cfa6w5lp\" (UID: \"2530cbbb-c2de-41ce-b6d2-a9593ed9226d\") " pod="openstack-operators/ff3fd74506e7f9e2c6e8ba7181b4867be2b110d904ba1aef106c523cfa6w5lp" Mar 09 09:20:31 crc kubenswrapper[4861]: I0309 09:20:31.414096 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ff3fd74506e7f9e2c6e8ba7181b4867be2b110d904ba1aef106c523cfa6w5lp" Mar 09 09:20:31 crc kubenswrapper[4861]: I0309 09:20:31.641866 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ff3fd74506e7f9e2c6e8ba7181b4867be2b110d904ba1aef106c523cfa6w5lp"] Mar 09 09:20:32 crc kubenswrapper[4861]: I0309 09:20:32.353675 4861 generic.go:334] "Generic (PLEG): container finished" podID="2530cbbb-c2de-41ce-b6d2-a9593ed9226d" containerID="0f82289ab9747b463bfff5565fc8cd248d8fc91223b710671d0d642b2989ac21" exitCode=0 Mar 09 09:20:32 crc kubenswrapper[4861]: I0309 09:20:32.353810 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ff3fd74506e7f9e2c6e8ba7181b4867be2b110d904ba1aef106c523cfa6w5lp" event={"ID":"2530cbbb-c2de-41ce-b6d2-a9593ed9226d","Type":"ContainerDied","Data":"0f82289ab9747b463bfff5565fc8cd248d8fc91223b710671d0d642b2989ac21"} Mar 09 09:20:32 crc kubenswrapper[4861]: I0309 09:20:32.354168 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/ff3fd74506e7f9e2c6e8ba7181b4867be2b110d904ba1aef106c523cfa6w5lp" event={"ID":"2530cbbb-c2de-41ce-b6d2-a9593ed9226d","Type":"ContainerStarted","Data":"838b9b5c4bb792819f332fe29230566727b6402b32547384980afd013fffb7b3"} Mar 09 09:20:34 crc kubenswrapper[4861]: I0309 09:20:34.371699 4861 generic.go:334] "Generic (PLEG): container finished" podID="2530cbbb-c2de-41ce-b6d2-a9593ed9226d" containerID="15c3d8444ee72009876239deeb4abf9542c8f8ca10d56e56a15c876b01b6e464" exitCode=0 Mar 09 09:20:34 crc kubenswrapper[4861]: I0309 09:20:34.371778 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ff3fd74506e7f9e2c6e8ba7181b4867be2b110d904ba1aef106c523cfa6w5lp" event={"ID":"2530cbbb-c2de-41ce-b6d2-a9593ed9226d","Type":"ContainerDied","Data":"15c3d8444ee72009876239deeb4abf9542c8f8ca10d56e56a15c876b01b6e464"} Mar 09 09:20:35 crc kubenswrapper[4861]: I0309 09:20:35.382144 4861 generic.go:334] "Generic (PLEG): container finished" podID="2530cbbb-c2de-41ce-b6d2-a9593ed9226d" containerID="b091172fa28829d8e02c2c2029c56b5c2b830406208860d2686953a2fa12c4f8" exitCode=0 Mar 09 09:20:35 crc kubenswrapper[4861]: I0309 09:20:35.382259 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ff3fd74506e7f9e2c6e8ba7181b4867be2b110d904ba1aef106c523cfa6w5lp" event={"ID":"2530cbbb-c2de-41ce-b6d2-a9593ed9226d","Type":"ContainerDied","Data":"b091172fa28829d8e02c2c2029c56b5c2b830406208860d2686953a2fa12c4f8"} Mar 09 09:20:36 crc kubenswrapper[4861]: I0309 09:20:36.720239 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ff3fd74506e7f9e2c6e8ba7181b4867be2b110d904ba1aef106c523cfa6w5lp" Mar 09 09:20:36 crc kubenswrapper[4861]: I0309 09:20:36.787938 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5sff9\" (UniqueName: \"kubernetes.io/projected/2530cbbb-c2de-41ce-b6d2-a9593ed9226d-kube-api-access-5sff9\") pod \"2530cbbb-c2de-41ce-b6d2-a9593ed9226d\" (UID: \"2530cbbb-c2de-41ce-b6d2-a9593ed9226d\") " Mar 09 09:20:36 crc kubenswrapper[4861]: I0309 09:20:36.787982 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2530cbbb-c2de-41ce-b6d2-a9593ed9226d-bundle\") pod \"2530cbbb-c2de-41ce-b6d2-a9593ed9226d\" (UID: \"2530cbbb-c2de-41ce-b6d2-a9593ed9226d\") " Mar 09 09:20:36 crc kubenswrapper[4861]: I0309 09:20:36.788026 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2530cbbb-c2de-41ce-b6d2-a9593ed9226d-util\") pod \"2530cbbb-c2de-41ce-b6d2-a9593ed9226d\" (UID: \"2530cbbb-c2de-41ce-b6d2-a9593ed9226d\") " Mar 09 09:20:36 crc kubenswrapper[4861]: I0309 09:20:36.788647 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2530cbbb-c2de-41ce-b6d2-a9593ed9226d-bundle" (OuterVolumeSpecName: "bundle") pod "2530cbbb-c2de-41ce-b6d2-a9593ed9226d" (UID: "2530cbbb-c2de-41ce-b6d2-a9593ed9226d"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:20:36 crc kubenswrapper[4861]: I0309 09:20:36.799673 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2530cbbb-c2de-41ce-b6d2-a9593ed9226d-kube-api-access-5sff9" (OuterVolumeSpecName: "kube-api-access-5sff9") pod "2530cbbb-c2de-41ce-b6d2-a9593ed9226d" (UID: "2530cbbb-c2de-41ce-b6d2-a9593ed9226d"). InnerVolumeSpecName "kube-api-access-5sff9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:20:36 crc kubenswrapper[4861]: I0309 09:20:36.801387 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2530cbbb-c2de-41ce-b6d2-a9593ed9226d-util" (OuterVolumeSpecName: "util") pod "2530cbbb-c2de-41ce-b6d2-a9593ed9226d" (UID: "2530cbbb-c2de-41ce-b6d2-a9593ed9226d"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:20:36 crc kubenswrapper[4861]: I0309 09:20:36.889090 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5sff9\" (UniqueName: \"kubernetes.io/projected/2530cbbb-c2de-41ce-b6d2-a9593ed9226d-kube-api-access-5sff9\") on node \"crc\" DevicePath \"\"" Mar 09 09:20:36 crc kubenswrapper[4861]: I0309 09:20:36.889135 4861 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2530cbbb-c2de-41ce-b6d2-a9593ed9226d-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 09:20:36 crc kubenswrapper[4861]: I0309 09:20:36.889149 4861 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2530cbbb-c2de-41ce-b6d2-a9593ed9226d-util\") on node \"crc\" DevicePath \"\"" Mar 09 09:20:37 crc kubenswrapper[4861]: I0309 09:20:37.425447 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ff3fd74506e7f9e2c6e8ba7181b4867be2b110d904ba1aef106c523cfa6w5lp" event={"ID":"2530cbbb-c2de-41ce-b6d2-a9593ed9226d","Type":"ContainerDied","Data":"838b9b5c4bb792819f332fe29230566727b6402b32547384980afd013fffb7b3"} Mar 09 09:20:37 crc kubenswrapper[4861]: I0309 09:20:37.425502 4861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="838b9b5c4bb792819f332fe29230566727b6402b32547384980afd013fffb7b3" Mar 09 09:20:37 crc kubenswrapper[4861]: I0309 09:20:37.425584 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ff3fd74506e7f9e2c6e8ba7181b4867be2b110d904ba1aef106c523cfa6w5lp" Mar 09 09:20:44 crc kubenswrapper[4861]: I0309 09:20:44.470350 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-568b7cf6db-zpgv9"] Mar 09 09:20:44 crc kubenswrapper[4861]: E0309 09:20:44.471329 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2530cbbb-c2de-41ce-b6d2-a9593ed9226d" containerName="pull" Mar 09 09:20:44 crc kubenswrapper[4861]: I0309 09:20:44.471350 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="2530cbbb-c2de-41ce-b6d2-a9593ed9226d" containerName="pull" Mar 09 09:20:44 crc kubenswrapper[4861]: E0309 09:20:44.471402 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2530cbbb-c2de-41ce-b6d2-a9593ed9226d" containerName="util" Mar 09 09:20:44 crc kubenswrapper[4861]: I0309 09:20:44.471415 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="2530cbbb-c2de-41ce-b6d2-a9593ed9226d" containerName="util" Mar 09 09:20:44 crc kubenswrapper[4861]: E0309 09:20:44.471433 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2530cbbb-c2de-41ce-b6d2-a9593ed9226d" containerName="extract" Mar 09 09:20:44 crc kubenswrapper[4861]: I0309 09:20:44.471445 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="2530cbbb-c2de-41ce-b6d2-a9593ed9226d" containerName="extract" Mar 09 09:20:44 crc kubenswrapper[4861]: I0309 09:20:44.471629 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="2530cbbb-c2de-41ce-b6d2-a9593ed9226d" containerName="extract" Mar 09 09:20:44 crc kubenswrapper[4861]: I0309 09:20:44.472233 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-568b7cf6db-zpgv9" Mar 09 09:20:44 crc kubenswrapper[4861]: I0309 09:20:44.475331 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-gpmlp" Mar 09 09:20:44 crc kubenswrapper[4861]: I0309 09:20:44.494972 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ss8k\" (UniqueName: \"kubernetes.io/projected/4f92d232-a96d-4774-8bfb-ece261f9b9d4-kube-api-access-4ss8k\") pod \"openstack-operator-controller-init-568b7cf6db-zpgv9\" (UID: \"4f92d232-a96d-4774-8bfb-ece261f9b9d4\") " pod="openstack-operators/openstack-operator-controller-init-568b7cf6db-zpgv9" Mar 09 09:20:44 crc kubenswrapper[4861]: I0309 09:20:44.502054 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-568b7cf6db-zpgv9"] Mar 09 09:20:44 crc kubenswrapper[4861]: I0309 09:20:44.596483 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ss8k\" (UniqueName: \"kubernetes.io/projected/4f92d232-a96d-4774-8bfb-ece261f9b9d4-kube-api-access-4ss8k\") pod \"openstack-operator-controller-init-568b7cf6db-zpgv9\" (UID: \"4f92d232-a96d-4774-8bfb-ece261f9b9d4\") " pod="openstack-operators/openstack-operator-controller-init-568b7cf6db-zpgv9" Mar 09 09:20:44 crc kubenswrapper[4861]: I0309 09:20:44.613776 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ss8k\" (UniqueName: \"kubernetes.io/projected/4f92d232-a96d-4774-8bfb-ece261f9b9d4-kube-api-access-4ss8k\") pod \"openstack-operator-controller-init-568b7cf6db-zpgv9\" (UID: \"4f92d232-a96d-4774-8bfb-ece261f9b9d4\") " pod="openstack-operators/openstack-operator-controller-init-568b7cf6db-zpgv9" Mar 09 09:20:44 crc kubenswrapper[4861]: I0309 09:20:44.794874 4861 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-568b7cf6db-zpgv9" Mar 09 09:20:45 crc kubenswrapper[4861]: I0309 09:20:45.047924 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-568b7cf6db-zpgv9"] Mar 09 09:20:45 crc kubenswrapper[4861]: W0309 09:20:45.066221 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f92d232_a96d_4774_8bfb_ece261f9b9d4.slice/crio-267c806f92ae38bf3194a75058a40eeda61a3f2e894e36463079cf4d7a1a4774 WatchSource:0}: Error finding container 267c806f92ae38bf3194a75058a40eeda61a3f2e894e36463079cf4d7a1a4774: Status 404 returned error can't find the container with id 267c806f92ae38bf3194a75058a40eeda61a3f2e894e36463079cf4d7a1a4774 Mar 09 09:20:45 crc kubenswrapper[4861]: I0309 09:20:45.490919 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-568b7cf6db-zpgv9" event={"ID":"4f92d232-a96d-4774-8bfb-ece261f9b9d4","Type":"ContainerStarted","Data":"267c806f92ae38bf3194a75058a40eeda61a3f2e894e36463079cf4d7a1a4774"} Mar 09 09:20:50 crc kubenswrapper[4861]: I0309 09:20:50.530353 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-568b7cf6db-zpgv9" event={"ID":"4f92d232-a96d-4774-8bfb-ece261f9b9d4","Type":"ContainerStarted","Data":"98994a2cd67c06e2af436d711d55758fe0cee5ccfb59a70fe3e19bca37cd7ba0"} Mar 09 09:20:50 crc kubenswrapper[4861]: I0309 09:20:50.530979 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-568b7cf6db-zpgv9" Mar 09 09:20:50 crc kubenswrapper[4861]: I0309 09:20:50.565076 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-568b7cf6db-zpgv9" podStartSLOduration=1.910097926 
podStartE2EDuration="6.56505699s" podCreationTimestamp="2026-03-09 09:20:44 +0000 UTC" firstStartedPulling="2026-03-09 09:20:45.072298738 +0000 UTC m=+888.157338139" lastFinishedPulling="2026-03-09 09:20:49.727257802 +0000 UTC m=+892.812297203" observedRunningTime="2026-03-09 09:20:50.561683992 +0000 UTC m=+893.646723393" watchObservedRunningTime="2026-03-09 09:20:50.56505699 +0000 UTC m=+893.650096401" Mar 09 09:20:54 crc kubenswrapper[4861]: I0309 09:20:54.798015 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-568b7cf6db-zpgv9" Mar 09 09:21:03 crc kubenswrapper[4861]: I0309 09:21:03.341092 4861 scope.go:117] "RemoveContainer" containerID="359b847262f54581c00f715fa1d567f5271b3130f3dc14a31f8df8eb7ab9860c" Mar 09 09:21:31 crc kubenswrapper[4861]: I0309 09:21:31.530258 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6db6876945-s4tc6"] Mar 09 09:21:31 crc kubenswrapper[4861]: I0309 09:21:31.532017 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-s4tc6" Mar 09 09:21:31 crc kubenswrapper[4861]: I0309 09:21:31.536684 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-bvm65" Mar 09 09:21:31 crc kubenswrapper[4861]: I0309 09:21:31.542241 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-55d77d7b5c-89f2d"] Mar 09 09:21:31 crc kubenswrapper[4861]: I0309 09:21:31.543517 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-89f2d" Mar 09 09:21:31 crc kubenswrapper[4861]: I0309 09:21:31.547116 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-8bb45" Mar 09 09:21:31 crc kubenswrapper[4861]: I0309 09:21:31.548244 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6db6876945-s4tc6"] Mar 09 09:21:31 crc kubenswrapper[4861]: I0309 09:21:31.581479 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-5d87c9d997-2wbrm"] Mar 09 09:21:31 crc kubenswrapper[4861]: I0309 09:21:31.582431 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-2wbrm" Mar 09 09:21:31 crc kubenswrapper[4861]: I0309 09:21:31.584651 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-4798p" Mar 09 09:21:31 crc kubenswrapper[4861]: I0309 09:21:31.589456 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-5d87c9d997-2wbrm"] Mar 09 09:21:31 crc kubenswrapper[4861]: I0309 09:21:31.592615 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-64db6967f8-pdxgs"] Mar 09 09:21:31 crc kubenswrapper[4861]: I0309 09:21:31.593302 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-pdxgs" Mar 09 09:21:31 crc kubenswrapper[4861]: I0309 09:21:31.597888 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-5886v" Mar 09 09:21:31 crc kubenswrapper[4861]: I0309 09:21:31.617703 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-cf99c678f-jw5nb"] Mar 09 09:21:31 crc kubenswrapper[4861]: I0309 09:21:31.618460 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-jw5nb" Mar 09 09:21:31 crc kubenswrapper[4861]: I0309 09:21:31.626578 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-sf7kz" Mar 09 09:21:31 crc kubenswrapper[4861]: I0309 09:21:31.656790 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-55d77d7b5c-89f2d"] Mar 09 09:21:31 crc kubenswrapper[4861]: I0309 09:21:31.669250 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-bwxj2"] Mar 09 09:21:31 crc kubenswrapper[4861]: I0309 09:21:31.670060 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-cf99c678f-jw5nb"] Mar 09 09:21:31 crc kubenswrapper[4861]: I0309 09:21:31.670153 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-bwxj2" Mar 09 09:21:31 crc kubenswrapper[4861]: I0309 09:21:31.689221 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cb5t\" (UniqueName: \"kubernetes.io/projected/861a14c0-5dcd-4126-b386-65467726a9dd-kube-api-access-5cb5t\") pod \"cinder-operator-controller-manager-55d77d7b5c-89f2d\" (UID: \"861a14c0-5dcd-4126-b386-65467726a9dd\") " pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-89f2d" Mar 09 09:21:31 crc kubenswrapper[4861]: I0309 09:21:31.689275 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7knf\" (UniqueName: \"kubernetes.io/projected/8fb737b1-978f-4f1e-98db-f1c542ef77d9-kube-api-access-w7knf\") pod \"glance-operator-controller-manager-64db6967f8-pdxgs\" (UID: \"8fb737b1-978f-4f1e-98db-f1c542ef77d9\") " pod="openstack-operators/glance-operator-controller-manager-64db6967f8-pdxgs" Mar 09 09:21:31 crc kubenswrapper[4861]: I0309 09:21:31.689302 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5d5l\" (UniqueName: \"kubernetes.io/projected/eaaa08cd-22f8-40a3-9cac-7e29137ea358-kube-api-access-m5d5l\") pod \"barbican-operator-controller-manager-6db6876945-s4tc6\" (UID: \"eaaa08cd-22f8-40a3-9cac-7e29137ea358\") " pod="openstack-operators/barbican-operator-controller-manager-6db6876945-s4tc6" Mar 09 09:21:31 crc kubenswrapper[4861]: I0309 09:21:31.689388 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sswjt\" (UniqueName: \"kubernetes.io/projected/5cccaa46-1901-457b-b093-9edfb512b68f-kube-api-access-sswjt\") pod \"designate-operator-controller-manager-5d87c9d997-2wbrm\" (UID: \"5cccaa46-1901-457b-b093-9edfb512b68f\") " 
pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-2wbrm" Mar 09 09:21:31 crc kubenswrapper[4861]: I0309 09:21:31.689676 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-dqc96" Mar 09 09:21:31 crc kubenswrapper[4861]: I0309 09:21:31.700910 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-f7fcc58b9-srw9z"] Mar 09 09:21:31 crc kubenswrapper[4861]: I0309 09:21:31.701903 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-srw9z" Mar 09 09:21:31 crc kubenswrapper[4861]: I0309 09:21:31.705288 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Mar 09 09:21:31 crc kubenswrapper[4861]: I0309 09:21:31.705321 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-ltsrn" Mar 09 09:21:31 crc kubenswrapper[4861]: I0309 09:21:31.710418 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-bwxj2"] Mar 09 09:21:31 crc kubenswrapper[4861]: I0309 09:21:31.724653 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-64db6967f8-pdxgs"] Mar 09 09:21:31 crc kubenswrapper[4861]: I0309 09:21:31.726103 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-f7fcc58b9-srw9z"] Mar 09 09:21:31 crc kubenswrapper[4861]: I0309 09:21:31.748708 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-545456dc4-blz6f"] Mar 09 09:21:31 crc kubenswrapper[4861]: I0309 09:21:31.749486 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-blz6f" Mar 09 09:21:31 crc kubenswrapper[4861]: I0309 09:21:31.754671 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-xn8nd" Mar 09 09:21:31 crc kubenswrapper[4861]: I0309 09:21:31.760441 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7c789f89c6-pktcs"] Mar 09 09:21:31 crc kubenswrapper[4861]: I0309 09:21:31.761228 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-pktcs" Mar 09 09:21:31 crc kubenswrapper[4861]: I0309 09:21:31.764722 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-czgv8" Mar 09 09:21:31 crc kubenswrapper[4861]: I0309 09:21:31.778421 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7c789f89c6-pktcs"] Mar 09 09:21:31 crc kubenswrapper[4861]: I0309 09:21:31.785159 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-545456dc4-blz6f"] Mar 09 09:21:31 crc kubenswrapper[4861]: I0309 09:21:31.790573 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztrnd\" (UniqueName: \"kubernetes.io/projected/32ddb619-584a-4ff4-a988-63d565043353-kube-api-access-ztrnd\") pod \"horizon-operator-controller-manager-78bc7f9bd9-bwxj2\" (UID: \"32ddb619-584a-4ff4-a988-63d565043353\") " pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-bwxj2" Mar 09 09:21:31 crc kubenswrapper[4861]: I0309 09:21:31.790621 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/1b8226be-5eb4-4156-a168-f843edac34ce-cert\") pod \"infra-operator-controller-manager-f7fcc58b9-srw9z\" (UID: \"1b8226be-5eb4-4156-a168-f843edac34ce\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-srw9z" Mar 09 09:21:31 crc kubenswrapper[4861]: I0309 09:21:31.790686 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sswjt\" (UniqueName: \"kubernetes.io/projected/5cccaa46-1901-457b-b093-9edfb512b68f-kube-api-access-sswjt\") pod \"designate-operator-controller-manager-5d87c9d997-2wbrm\" (UID: \"5cccaa46-1901-457b-b093-9edfb512b68f\") " pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-2wbrm" Mar 09 09:21:31 crc kubenswrapper[4861]: I0309 09:21:31.790787 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbbxw\" (UniqueName: \"kubernetes.io/projected/23566122-1654-40c1-8dd6-577280d0dcec-kube-api-access-pbbxw\") pod \"heat-operator-controller-manager-cf99c678f-jw5nb\" (UID: \"23566122-1654-40c1-8dd6-577280d0dcec\") " pod="openstack-operators/heat-operator-controller-manager-cf99c678f-jw5nb" Mar 09 09:21:31 crc kubenswrapper[4861]: I0309 09:21:31.790844 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5cb5t\" (UniqueName: \"kubernetes.io/projected/861a14c0-5dcd-4126-b386-65467726a9dd-kube-api-access-5cb5t\") pod \"cinder-operator-controller-manager-55d77d7b5c-89f2d\" (UID: \"861a14c0-5dcd-4126-b386-65467726a9dd\") " pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-89f2d" Mar 09 09:21:31 crc kubenswrapper[4861]: I0309 09:21:31.790870 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdv69\" (UniqueName: \"kubernetes.io/projected/1b8226be-5eb4-4156-a168-f843edac34ce-kube-api-access-tdv69\") pod \"infra-operator-controller-manager-f7fcc58b9-srw9z\" 
(UID: \"1b8226be-5eb4-4156-a168-f843edac34ce\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-srw9z" Mar 09 09:21:31 crc kubenswrapper[4861]: I0309 09:21:31.790898 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7knf\" (UniqueName: \"kubernetes.io/projected/8fb737b1-978f-4f1e-98db-f1c542ef77d9-kube-api-access-w7knf\") pod \"glance-operator-controller-manager-64db6967f8-pdxgs\" (UID: \"8fb737b1-978f-4f1e-98db-f1c542ef77d9\") " pod="openstack-operators/glance-operator-controller-manager-64db6967f8-pdxgs" Mar 09 09:21:31 crc kubenswrapper[4861]: I0309 09:21:31.790922 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5d5l\" (UniqueName: \"kubernetes.io/projected/eaaa08cd-22f8-40a3-9cac-7e29137ea358-kube-api-access-m5d5l\") pod \"barbican-operator-controller-manager-6db6876945-s4tc6\" (UID: \"eaaa08cd-22f8-40a3-9cac-7e29137ea358\") " pod="openstack-operators/barbican-operator-controller-manager-6db6876945-s4tc6" Mar 09 09:21:31 crc kubenswrapper[4861]: I0309 09:21:31.793118 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-67d996989d-8vnvb"] Mar 09 09:21:31 crc kubenswrapper[4861]: I0309 09:21:31.793875 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-67d996989d-8vnvb" Mar 09 09:21:31 crc kubenswrapper[4861]: I0309 09:21:31.799442 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-lbhm2"] Mar 09 09:21:31 crc kubenswrapper[4861]: I0309 09:21:31.800437 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-lbhm2" Mar 09 09:21:31 crc kubenswrapper[4861]: I0309 09:21:31.825401 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-67d996989d-8vnvb"] Mar 09 09:21:31 crc kubenswrapper[4861]: I0309 09:21:31.834326 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-r2ddx" Mar 09 09:21:31 crc kubenswrapper[4861]: I0309 09:21:31.834620 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-lrmtd" Mar 09 09:21:31 crc kubenswrapper[4861]: I0309 09:21:31.858399 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5d5l\" (UniqueName: \"kubernetes.io/projected/eaaa08cd-22f8-40a3-9cac-7e29137ea358-kube-api-access-m5d5l\") pod \"barbican-operator-controller-manager-6db6876945-s4tc6\" (UID: \"eaaa08cd-22f8-40a3-9cac-7e29137ea358\") " pod="openstack-operators/barbican-operator-controller-manager-6db6876945-s4tc6" Mar 09 09:21:31 crc kubenswrapper[4861]: I0309 09:21:31.866232 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-s4tc6" Mar 09 09:21:31 crc kubenswrapper[4861]: I0309 09:21:31.890825 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5cb5t\" (UniqueName: \"kubernetes.io/projected/861a14c0-5dcd-4126-b386-65467726a9dd-kube-api-access-5cb5t\") pod \"cinder-operator-controller-manager-55d77d7b5c-89f2d\" (UID: \"861a14c0-5dcd-4126-b386-65467726a9dd\") " pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-89f2d" Mar 09 09:21:31 crc kubenswrapper[4861]: I0309 09:21:31.891085 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-lbhm2"] Mar 09 09:21:31 crc kubenswrapper[4861]: I0309 09:21:31.891704 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbbxw\" (UniqueName: \"kubernetes.io/projected/23566122-1654-40c1-8dd6-577280d0dcec-kube-api-access-pbbxw\") pod \"heat-operator-controller-manager-cf99c678f-jw5nb\" (UID: \"23566122-1654-40c1-8dd6-577280d0dcec\") " pod="openstack-operators/heat-operator-controller-manager-cf99c678f-jw5nb" Mar 09 09:21:31 crc kubenswrapper[4861]: I0309 09:21:31.891746 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdv69\" (UniqueName: \"kubernetes.io/projected/1b8226be-5eb4-4156-a168-f843edac34ce-kube-api-access-tdv69\") pod \"infra-operator-controller-manager-f7fcc58b9-srw9z\" (UID: \"1b8226be-5eb4-4156-a168-f843edac34ce\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-srw9z" Mar 09 09:21:31 crc kubenswrapper[4861]: I0309 09:21:31.891783 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zj4k6\" (UniqueName: \"kubernetes.io/projected/78b36bea-6c3d-4794-b38a-6b4a5b3e9f5d-kube-api-access-zj4k6\") pod 
\"manila-operator-controller-manager-67d996989d-8vnvb\" (UID: \"78b36bea-6c3d-4794-b38a-6b4a5b3e9f5d\") " pod="openstack-operators/manila-operator-controller-manager-67d996989d-8vnvb" Mar 09 09:21:31 crc kubenswrapper[4861]: I0309 09:21:31.891807 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnd94\" (UniqueName: \"kubernetes.io/projected/12c3b94d-baff-4b5d-864e-371f5b3857f5-kube-api-access-pnd94\") pod \"mariadb-operator-controller-manager-7b6bfb6475-lbhm2\" (UID: \"12c3b94d-baff-4b5d-864e-371f5b3857f5\") " pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-lbhm2" Mar 09 09:21:31 crc kubenswrapper[4861]: I0309 09:21:31.891825 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztrnd\" (UniqueName: \"kubernetes.io/projected/32ddb619-584a-4ff4-a988-63d565043353-kube-api-access-ztrnd\") pod \"horizon-operator-controller-manager-78bc7f9bd9-bwxj2\" (UID: \"32ddb619-584a-4ff4-a988-63d565043353\") " pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-bwxj2" Mar 09 09:21:31 crc kubenswrapper[4861]: I0309 09:21:31.891845 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1b8226be-5eb4-4156-a168-f843edac34ce-cert\") pod \"infra-operator-controller-manager-f7fcc58b9-srw9z\" (UID: \"1b8226be-5eb4-4156-a168-f843edac34ce\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-srw9z" Mar 09 09:21:31 crc kubenswrapper[4861]: I0309 09:21:31.891864 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sx67k\" (UniqueName: \"kubernetes.io/projected/e9e766bf-fdea-451e-a58c-a8818fccf4b4-kube-api-access-sx67k\") pod \"ironic-operator-controller-manager-545456dc4-blz6f\" (UID: \"e9e766bf-fdea-451e-a58c-a8818fccf4b4\") " 
pod="openstack-operators/ironic-operator-controller-manager-545456dc4-blz6f" Mar 09 09:21:31 crc kubenswrapper[4861]: I0309 09:21:31.891882 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqzxj\" (UniqueName: \"kubernetes.io/projected/fbbd2d76-31fb-46d7-a422-af5f3e51baaf-kube-api-access-hqzxj\") pod \"keystone-operator-controller-manager-7c789f89c6-pktcs\" (UID: \"fbbd2d76-31fb-46d7-a422-af5f3e51baaf\") " pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-pktcs" Mar 09 09:21:31 crc kubenswrapper[4861]: E0309 09:21:31.892321 4861 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 09 09:21:31 crc kubenswrapper[4861]: E0309 09:21:31.892358 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1b8226be-5eb4-4156-a168-f843edac34ce-cert podName:1b8226be-5eb4-4156-a168-f843edac34ce nodeName:}" failed. No retries permitted until 2026-03-09 09:21:32.392344914 +0000 UTC m=+935.477384315 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1b8226be-5eb4-4156-a168-f843edac34ce-cert") pod "infra-operator-controller-manager-f7fcc58b9-srw9z" (UID: "1b8226be-5eb4-4156-a168-f843edac34ce") : secret "infra-operator-webhook-server-cert" not found Mar 09 09:21:31 crc kubenswrapper[4861]: I0309 09:21:31.894533 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-89f2d" Mar 09 09:21:31 crc kubenswrapper[4861]: I0309 09:21:31.904775 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sswjt\" (UniqueName: \"kubernetes.io/projected/5cccaa46-1901-457b-b093-9edfb512b68f-kube-api-access-sswjt\") pod \"designate-operator-controller-manager-5d87c9d997-2wbrm\" (UID: \"5cccaa46-1901-457b-b093-9edfb512b68f\") " pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-2wbrm" Mar 09 09:21:31 crc kubenswrapper[4861]: I0309 09:21:31.907206 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-54688575f-9dhrv"] Mar 09 09:21:31 crc kubenswrapper[4861]: I0309 09:21:31.907996 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-54688575f-9dhrv" Mar 09 09:21:31 crc kubenswrapper[4861]: I0309 09:21:31.913650 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7knf\" (UniqueName: \"kubernetes.io/projected/8fb737b1-978f-4f1e-98db-f1c542ef77d9-kube-api-access-w7knf\") pod \"glance-operator-controller-manager-64db6967f8-pdxgs\" (UID: \"8fb737b1-978f-4f1e-98db-f1c542ef77d9\") " pod="openstack-operators/glance-operator-controller-manager-64db6967f8-pdxgs" Mar 09 09:21:31 crc kubenswrapper[4861]: I0309 09:21:31.930441 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-74b6b5dc96-lnm2d"] Mar 09 09:21:31 crc kubenswrapper[4861]: I0309 09:21:31.930647 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-pdxgs" Mar 09 09:21:31 crc kubenswrapper[4861]: I0309 09:21:31.931958 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-lnm2d" Mar 09 09:21:31 crc kubenswrapper[4861]: I0309 09:21:31.933098 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-74b6b5dc96-lnm2d"] Mar 09 09:21:31 crc kubenswrapper[4861]: I0309 09:21:31.938701 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-2cb86"] Mar 09 09:21:31 crc kubenswrapper[4861]: I0309 09:21:31.945063 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-6lcq4" Mar 09 09:21:31 crc kubenswrapper[4861]: I0309 09:21:31.946683 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-7fk24" Mar 09 09:21:31 crc kubenswrapper[4861]: I0309 09:21:31.950620 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-2cb86" Mar 09 09:21:31 crc kubenswrapper[4861]: I0309 09:21:31.959012 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-9gt4d" Mar 09 09:21:31 crc kubenswrapper[4861]: I0309 09:21:31.970791 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-54688575f-9dhrv"] Mar 09 09:21:31 crc kubenswrapper[4861]: I0309 09:21:31.970836 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6f64bd8c75j8xx4"] Mar 09 09:21:31 crc kubenswrapper[4861]: I0309 09:21:31.971647 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64bd8c75j8xx4" Mar 09 09:21:31 crc kubenswrapper[4861]: I0309 09:21:31.977698 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbbxw\" (UniqueName: \"kubernetes.io/projected/23566122-1654-40c1-8dd6-577280d0dcec-kube-api-access-pbbxw\") pod \"heat-operator-controller-manager-cf99c678f-jw5nb\" (UID: \"23566122-1654-40c1-8dd6-577280d0dcec\") " pod="openstack-operators/heat-operator-controller-manager-cf99c678f-jw5nb" Mar 09 09:21:31 crc kubenswrapper[4861]: I0309 09:21:31.980979 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdv69\" (UniqueName: \"kubernetes.io/projected/1b8226be-5eb4-4156-a168-f843edac34ce-kube-api-access-tdv69\") pod \"infra-operator-controller-manager-f7fcc58b9-srw9z\" (UID: \"1b8226be-5eb4-4156-a168-f843edac34ce\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-srw9z" Mar 09 09:21:31 crc kubenswrapper[4861]: I0309 09:21:31.981815 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztrnd\" (UniqueName: \"kubernetes.io/projected/32ddb619-584a-4ff4-a988-63d565043353-kube-api-access-ztrnd\") pod \"horizon-operator-controller-manager-78bc7f9bd9-bwxj2\" (UID: \"32ddb619-584a-4ff4-a988-63d565043353\") " pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-bwxj2" Mar 09 09:21:31 crc kubenswrapper[4861]: I0309 09:21:31.988687 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-l6p2s" Mar 09 09:21:31 crc kubenswrapper[4861]: I0309 09:21:31.988859 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Mar 09 09:21:31 crc kubenswrapper[4861]: I0309 09:21:31.997693 4861 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-pnd94\" (UniqueName: \"kubernetes.io/projected/12c3b94d-baff-4b5d-864e-371f5b3857f5-kube-api-access-pnd94\") pod \"mariadb-operator-controller-manager-7b6bfb6475-lbhm2\" (UID: \"12c3b94d-baff-4b5d-864e-371f5b3857f5\") " pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-lbhm2" Mar 09 09:21:31 crc kubenswrapper[4861]: I0309 09:21:31.997744 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sx67k\" (UniqueName: \"kubernetes.io/projected/e9e766bf-fdea-451e-a58c-a8818fccf4b4-kube-api-access-sx67k\") pod \"ironic-operator-controller-manager-545456dc4-blz6f\" (UID: \"e9e766bf-fdea-451e-a58c-a8818fccf4b4\") " pod="openstack-operators/ironic-operator-controller-manager-545456dc4-blz6f" Mar 09 09:21:31 crc kubenswrapper[4861]: I0309 09:21:31.997766 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ch7kl\" (UniqueName: \"kubernetes.io/projected/091caccf-659b-42dd-b9cb-05aeea2548ce-kube-api-access-ch7kl\") pod \"neutron-operator-controller-manager-54688575f-9dhrv\" (UID: \"091caccf-659b-42dd-b9cb-05aeea2548ce\") " pod="openstack-operators/neutron-operator-controller-manager-54688575f-9dhrv" Mar 09 09:21:31 crc kubenswrapper[4861]: I0309 09:21:31.997786 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqzxj\" (UniqueName: \"kubernetes.io/projected/fbbd2d76-31fb-46d7-a422-af5f3e51baaf-kube-api-access-hqzxj\") pod \"keystone-operator-controller-manager-7c789f89c6-pktcs\" (UID: \"fbbd2d76-31fb-46d7-a422-af5f3e51baaf\") " pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-pktcs" Mar 09 09:21:31 crc kubenswrapper[4861]: I0309 09:21:31.997820 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjn5d\" (UniqueName: 
\"kubernetes.io/projected/511b2722-0227-4a4f-931c-e69ad12e60de-kube-api-access-kjn5d\") pod \"nova-operator-controller-manager-74b6b5dc96-lnm2d\" (UID: \"511b2722-0227-4a4f-931c-e69ad12e60de\") " pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-lnm2d" Mar 09 09:21:31 crc kubenswrapper[4861]: I0309 09:21:31.997877 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zj4k6\" (UniqueName: \"kubernetes.io/projected/78b36bea-6c3d-4794-b38a-6b4a5b3e9f5d-kube-api-access-zj4k6\") pod \"manila-operator-controller-manager-67d996989d-8vnvb\" (UID: \"78b36bea-6c3d-4794-b38a-6b4a5b3e9f5d\") " pod="openstack-operators/manila-operator-controller-manager-67d996989d-8vnvb" Mar 09 09:21:32 crc kubenswrapper[4861]: I0309 09:21:32.001723 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-2cb86"] Mar 09 09:21:32 crc kubenswrapper[4861]: I0309 09:21:32.018430 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-bwxj2" Mar 09 09:21:32 crc kubenswrapper[4861]: I0309 09:21:32.033176 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-75684d597f-qqwld"] Mar 09 09:21:32 crc kubenswrapper[4861]: I0309 09:21:32.033323 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnd94\" (UniqueName: \"kubernetes.io/projected/12c3b94d-baff-4b5d-864e-371f5b3857f5-kube-api-access-pnd94\") pod \"mariadb-operator-controller-manager-7b6bfb6475-lbhm2\" (UID: \"12c3b94d-baff-4b5d-864e-371f5b3857f5\") " pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-lbhm2" Mar 09 09:21:32 crc kubenswrapper[4861]: I0309 09:21:32.034136 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqzxj\" (UniqueName: \"kubernetes.io/projected/fbbd2d76-31fb-46d7-a422-af5f3e51baaf-kube-api-access-hqzxj\") pod \"keystone-operator-controller-manager-7c789f89c6-pktcs\" (UID: \"fbbd2d76-31fb-46d7-a422-af5f3e51baaf\") " pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-pktcs" Mar 09 09:21:32 crc kubenswrapper[4861]: I0309 09:21:32.034155 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-qqwld" Mar 09 09:21:32 crc kubenswrapper[4861]: I0309 09:21:32.035837 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-qvs5w" Mar 09 09:21:32 crc kubenswrapper[4861]: I0309 09:21:32.067691 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zj4k6\" (UniqueName: \"kubernetes.io/projected/78b36bea-6c3d-4794-b38a-6b4a5b3e9f5d-kube-api-access-zj4k6\") pod \"manila-operator-controller-manager-67d996989d-8vnvb\" (UID: \"78b36bea-6c3d-4794-b38a-6b4a5b3e9f5d\") " pod="openstack-operators/manila-operator-controller-manager-67d996989d-8vnvb" Mar 09 09:21:32 crc kubenswrapper[4861]: I0309 09:21:32.072075 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6f64bd8c75j8xx4"] Mar 09 09:21:32 crc kubenswrapper[4861]: I0309 09:21:32.084430 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-75684d597f-qqwld"] Mar 09 09:21:32 crc kubenswrapper[4861]: I0309 09:21:32.101911 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/72b49679-1f56-42df-bafc-a899cd2da3cf-cert\") pod \"openstack-baremetal-operator-controller-manager-6f64bd8c75j8xx4\" (UID: \"72b49679-1f56-42df-bafc-a899cd2da3cf\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64bd8c75j8xx4" Mar 09 09:21:32 crc kubenswrapper[4861]: I0309 09:21:32.101960 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ch7kl\" (UniqueName: \"kubernetes.io/projected/091caccf-659b-42dd-b9cb-05aeea2548ce-kube-api-access-ch7kl\") pod \"neutron-operator-controller-manager-54688575f-9dhrv\" (UID: \"091caccf-659b-42dd-b9cb-05aeea2548ce\") 
" pod="openstack-operators/neutron-operator-controller-manager-54688575f-9dhrv" Mar 09 09:21:32 crc kubenswrapper[4861]: I0309 09:21:32.101986 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hq255\" (UniqueName: \"kubernetes.io/projected/72b49679-1f56-42df-bafc-a899cd2da3cf-kube-api-access-hq255\") pod \"openstack-baremetal-operator-controller-manager-6f64bd8c75j8xx4\" (UID: \"72b49679-1f56-42df-bafc-a899cd2da3cf\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64bd8c75j8xx4" Mar 09 09:21:32 crc kubenswrapper[4861]: I0309 09:21:32.102017 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjn5d\" (UniqueName: \"kubernetes.io/projected/511b2722-0227-4a4f-931c-e69ad12e60de-kube-api-access-kjn5d\") pod \"nova-operator-controller-manager-74b6b5dc96-lnm2d\" (UID: \"511b2722-0227-4a4f-931c-e69ad12e60de\") " pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-lnm2d" Mar 09 09:21:32 crc kubenswrapper[4861]: I0309 09:21:32.102050 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8jj6\" (UniqueName: \"kubernetes.io/projected/eec501ad-33c8-4195-8817-3078202db97a-kube-api-access-r8jj6\") pod \"octavia-operator-controller-manager-5d86c7ddb7-2cb86\" (UID: \"eec501ad-33c8-4195-8817-3078202db97a\") " pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-2cb86" Mar 09 09:21:32 crc kubenswrapper[4861]: I0309 09:21:32.102071 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqzq7\" (UniqueName: \"kubernetes.io/projected/73ae93db-3260-4af6-9724-52e8b97a0245-kube-api-access-gqzq7\") pod \"ovn-operator-controller-manager-75684d597f-qqwld\" (UID: \"73ae93db-3260-4af6-9724-52e8b97a0245\") " pod="openstack-operators/ovn-operator-controller-manager-75684d597f-qqwld" Mar 09 
09:21:32 crc kubenswrapper[4861]: I0309 09:21:32.109172 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-pktcs" Mar 09 09:21:32 crc kubenswrapper[4861]: I0309 09:21:32.110444 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-648564c9fc-2bqxz"] Mar 09 09:21:32 crc kubenswrapper[4861]: I0309 09:21:32.111186 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-2bqxz" Mar 09 09:21:32 crc kubenswrapper[4861]: I0309 09:21:32.131837 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-648564c9fc-2bqxz"] Mar 09 09:21:32 crc kubenswrapper[4861]: I0309 09:21:32.133283 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-9b9ff9f4d-hrnq8"] Mar 09 09:21:32 crc kubenswrapper[4861]: I0309 09:21:32.134020 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-hrnq8" Mar 09 09:21:32 crc kubenswrapper[4861]: I0309 09:21:32.140256 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sx67k\" (UniqueName: \"kubernetes.io/projected/e9e766bf-fdea-451e-a58c-a8818fccf4b4-kube-api-access-sx67k\") pod \"ironic-operator-controller-manager-545456dc4-blz6f\" (UID: \"e9e766bf-fdea-451e-a58c-a8818fccf4b4\") " pod="openstack-operators/ironic-operator-controller-manager-545456dc4-blz6f" Mar 09 09:21:32 crc kubenswrapper[4861]: I0309 09:21:32.140712 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-2xhh2" Mar 09 09:21:32 crc kubenswrapper[4861]: I0309 09:21:32.142895 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-tvtsp" Mar 09 09:21:32 crc kubenswrapper[4861]: I0309 09:21:32.154650 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-67d996989d-8vnvb" Mar 09 09:21:32 crc kubenswrapper[4861]: I0309 09:21:32.160941 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ch7kl\" (UniqueName: \"kubernetes.io/projected/091caccf-659b-42dd-b9cb-05aeea2548ce-kube-api-access-ch7kl\") pod \"neutron-operator-controller-manager-54688575f-9dhrv\" (UID: \"091caccf-659b-42dd-b9cb-05aeea2548ce\") " pod="openstack-operators/neutron-operator-controller-manager-54688575f-9dhrv" Mar 09 09:21:32 crc kubenswrapper[4861]: I0309 09:21:32.174958 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjn5d\" (UniqueName: \"kubernetes.io/projected/511b2722-0227-4a4f-931c-e69ad12e60de-kube-api-access-kjn5d\") pod \"nova-operator-controller-manager-74b6b5dc96-lnm2d\" (UID: \"511b2722-0227-4a4f-931c-e69ad12e60de\") " pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-lnm2d" Mar 09 09:21:32 crc kubenswrapper[4861]: I0309 09:21:32.180734 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5fdb694969-vlxrh"] Mar 09 09:21:32 crc kubenswrapper[4861]: I0309 09:21:32.193469 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-vlxrh" Mar 09 09:21:32 crc kubenswrapper[4861]: I0309 09:21:32.203114 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-76vdw" Mar 09 09:21:32 crc kubenswrapper[4861]: I0309 09:21:32.203632 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppqmk\" (UniqueName: \"kubernetes.io/projected/b386f8ad-7867-4d35-83f8-382a379e3c1e-kube-api-access-ppqmk\") pod \"placement-operator-controller-manager-648564c9fc-2bqxz\" (UID: \"b386f8ad-7867-4d35-83f8-382a379e3c1e\") " pod="openstack-operators/placement-operator-controller-manager-648564c9fc-2bqxz" Mar 09 09:21:32 crc kubenswrapper[4861]: I0309 09:21:32.203719 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/72b49679-1f56-42df-bafc-a899cd2da3cf-cert\") pod \"openstack-baremetal-operator-controller-manager-6f64bd8c75j8xx4\" (UID: \"72b49679-1f56-42df-bafc-a899cd2da3cf\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64bd8c75j8xx4" Mar 09 09:21:32 crc kubenswrapper[4861]: I0309 09:21:32.203760 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hq255\" (UniqueName: \"kubernetes.io/projected/72b49679-1f56-42df-bafc-a899cd2da3cf-kube-api-access-hq255\") pod \"openstack-baremetal-operator-controller-manager-6f64bd8c75j8xx4\" (UID: \"72b49679-1f56-42df-bafc-a899cd2da3cf\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64bd8c75j8xx4" Mar 09 09:21:32 crc kubenswrapper[4861]: I0309 09:21:32.203826 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hczlh\" (UniqueName: 
\"kubernetes.io/projected/925518c8-3714-4180-ad1f-9bee534dd0dc-kube-api-access-hczlh\") pod \"swift-operator-controller-manager-9b9ff9f4d-hrnq8\" (UID: \"925518c8-3714-4180-ad1f-9bee534dd0dc\") " pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-hrnq8" Mar 09 09:21:32 crc kubenswrapper[4861]: E0309 09:21:32.203887 4861 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 09 09:21:32 crc kubenswrapper[4861]: E0309 09:21:32.204026 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/72b49679-1f56-42df-bafc-a899cd2da3cf-cert podName:72b49679-1f56-42df-bafc-a899cd2da3cf nodeName:}" failed. No retries permitted until 2026-03-09 09:21:32.703942353 +0000 UTC m=+935.788981744 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/72b49679-1f56-42df-bafc-a899cd2da3cf-cert") pod "openstack-baremetal-operator-controller-manager-6f64bd8c75j8xx4" (UID: "72b49679-1f56-42df-bafc-a899cd2da3cf") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 09 09:21:32 crc kubenswrapper[4861]: I0309 09:21:32.203893 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8jj6\" (UniqueName: \"kubernetes.io/projected/eec501ad-33c8-4195-8817-3078202db97a-kube-api-access-r8jj6\") pod \"octavia-operator-controller-manager-5d86c7ddb7-2cb86\" (UID: \"eec501ad-33c8-4195-8817-3078202db97a\") " pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-2cb86" Mar 09 09:21:32 crc kubenswrapper[4861]: I0309 09:21:32.204260 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqzq7\" (UniqueName: \"kubernetes.io/projected/73ae93db-3260-4af6-9724-52e8b97a0245-kube-api-access-gqzq7\") pod \"ovn-operator-controller-manager-75684d597f-qqwld\" (UID: 
\"73ae93db-3260-4af6-9724-52e8b97a0245\") " pod="openstack-operators/ovn-operator-controller-manager-75684d597f-qqwld" Mar 09 09:21:32 crc kubenswrapper[4861]: I0309 09:21:32.205790 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-2wbrm" Mar 09 09:21:32 crc kubenswrapper[4861]: I0309 09:21:32.233050 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-jw5nb" Mar 09 09:21:32 crc kubenswrapper[4861]: I0309 09:21:32.239524 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8jj6\" (UniqueName: \"kubernetes.io/projected/eec501ad-33c8-4195-8817-3078202db97a-kube-api-access-r8jj6\") pod \"octavia-operator-controller-manager-5d86c7ddb7-2cb86\" (UID: \"eec501ad-33c8-4195-8817-3078202db97a\") " pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-2cb86" Mar 09 09:21:32 crc kubenswrapper[4861]: I0309 09:21:32.240360 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hq255\" (UniqueName: \"kubernetes.io/projected/72b49679-1f56-42df-bafc-a899cd2da3cf-kube-api-access-hq255\") pod \"openstack-baremetal-operator-controller-manager-6f64bd8c75j8xx4\" (UID: \"72b49679-1f56-42df-bafc-a899cd2da3cf\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64bd8c75j8xx4" Mar 09 09:21:32 crc kubenswrapper[4861]: I0309 09:21:32.240545 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5fdb694969-vlxrh"] Mar 09 09:21:32 crc kubenswrapper[4861]: I0309 09:21:32.250594 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-lbhm2" Mar 09 09:21:32 crc kubenswrapper[4861]: I0309 09:21:32.274919 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-9b9ff9f4d-hrnq8"] Mar 09 09:21:32 crc kubenswrapper[4861]: I0309 09:21:32.278232 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqzq7\" (UniqueName: \"kubernetes.io/projected/73ae93db-3260-4af6-9724-52e8b97a0245-kube-api-access-gqzq7\") pod \"ovn-operator-controller-manager-75684d597f-qqwld\" (UID: \"73ae93db-3260-4af6-9724-52e8b97a0245\") " pod="openstack-operators/ovn-operator-controller-manager-75684d597f-qqwld" Mar 09 09:21:32 crc kubenswrapper[4861]: I0309 09:21:32.292844 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-55b5ff4dbb-ztmwj"] Mar 09 09:21:32 crc kubenswrapper[4861]: I0309 09:21:32.293657 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-ztmwj" Mar 09 09:21:32 crc kubenswrapper[4861]: I0309 09:21:32.297687 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-x2hvw" Mar 09 09:21:32 crc kubenswrapper[4861]: I0309 09:21:32.298066 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-54688575f-9dhrv" Mar 09 09:21:32 crc kubenswrapper[4861]: I0309 09:21:32.306411 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hczlh\" (UniqueName: \"kubernetes.io/projected/925518c8-3714-4180-ad1f-9bee534dd0dc-kube-api-access-hczlh\") pod \"swift-operator-controller-manager-9b9ff9f4d-hrnq8\" (UID: \"925518c8-3714-4180-ad1f-9bee534dd0dc\") " pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-hrnq8" Mar 09 09:21:32 crc kubenswrapper[4861]: I0309 09:21:32.306472 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mn7zp\" (UniqueName: \"kubernetes.io/projected/fdd188d8-f434-493e-a8f5-3506031b0f83-kube-api-access-mn7zp\") pod \"telemetry-operator-controller-manager-5fdb694969-vlxrh\" (UID: \"fdd188d8-f434-493e-a8f5-3506031b0f83\") " pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-vlxrh" Mar 09 09:21:32 crc kubenswrapper[4861]: I0309 09:21:32.306533 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppqmk\" (UniqueName: \"kubernetes.io/projected/b386f8ad-7867-4d35-83f8-382a379e3c1e-kube-api-access-ppqmk\") pod \"placement-operator-controller-manager-648564c9fc-2bqxz\" (UID: \"b386f8ad-7867-4d35-83f8-382a379e3c1e\") " pod="openstack-operators/placement-operator-controller-manager-648564c9fc-2bqxz" Mar 09 09:21:32 crc kubenswrapper[4861]: I0309 09:21:32.335798 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-55b5ff4dbb-ztmwj"] Mar 09 09:21:32 crc kubenswrapper[4861]: I0309 09:21:32.336189 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-lnm2d" Mar 09 09:21:32 crc kubenswrapper[4861]: I0309 09:21:32.339115 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-bccc79885-pmwxp"] Mar 09 09:21:32 crc kubenswrapper[4861]: I0309 09:21:32.340869 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-pmwxp" Mar 09 09:21:32 crc kubenswrapper[4861]: I0309 09:21:32.347536 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppqmk\" (UniqueName: \"kubernetes.io/projected/b386f8ad-7867-4d35-83f8-382a379e3c1e-kube-api-access-ppqmk\") pod \"placement-operator-controller-manager-648564c9fc-2bqxz\" (UID: \"b386f8ad-7867-4d35-83f8-382a379e3c1e\") " pod="openstack-operators/placement-operator-controller-manager-648564c9fc-2bqxz" Mar 09 09:21:32 crc kubenswrapper[4861]: I0309 09:21:32.349896 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-74q7n" Mar 09 09:21:32 crc kubenswrapper[4861]: I0309 09:21:32.365527 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hczlh\" (UniqueName: \"kubernetes.io/projected/925518c8-3714-4180-ad1f-9bee534dd0dc-kube-api-access-hczlh\") pod \"swift-operator-controller-manager-9b9ff9f4d-hrnq8\" (UID: \"925518c8-3714-4180-ad1f-9bee534dd0dc\") " pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-hrnq8" Mar 09 09:21:32 crc kubenswrapper[4861]: I0309 09:21:32.386437 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-blz6f" Mar 09 09:21:32 crc kubenswrapper[4861]: I0309 09:21:32.395090 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-bccc79885-pmwxp"] Mar 09 09:21:32 crc kubenswrapper[4861]: I0309 09:21:32.399461 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-2cb86" Mar 09 09:21:32 crc kubenswrapper[4861]: I0309 09:21:32.407554 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mn7zp\" (UniqueName: \"kubernetes.io/projected/fdd188d8-f434-493e-a8f5-3506031b0f83-kube-api-access-mn7zp\") pod \"telemetry-operator-controller-manager-5fdb694969-vlxrh\" (UID: \"fdd188d8-f434-493e-a8f5-3506031b0f83\") " pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-vlxrh" Mar 09 09:21:32 crc kubenswrapper[4861]: I0309 09:21:32.407610 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tmkt\" (UniqueName: \"kubernetes.io/projected/4cbd2609-0983-4da2-a0c7-fa66387e36ae-kube-api-access-4tmkt\") pod \"watcher-operator-controller-manager-bccc79885-pmwxp\" (UID: \"4cbd2609-0983-4da2-a0c7-fa66387e36ae\") " pod="openstack-operators/watcher-operator-controller-manager-bccc79885-pmwxp" Mar 09 09:21:32 crc kubenswrapper[4861]: I0309 09:21:32.407657 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdclz\" (UniqueName: \"kubernetes.io/projected/175e8d6d-930a-484b-b0b3-d45f37da4239-kube-api-access-kdclz\") pod \"test-operator-controller-manager-55b5ff4dbb-ztmwj\" (UID: \"175e8d6d-930a-484b-b0b3-d45f37da4239\") " pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-ztmwj" Mar 09 09:21:32 crc kubenswrapper[4861]: I0309 09:21:32.407694 
4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1b8226be-5eb4-4156-a168-f843edac34ce-cert\") pod \"infra-operator-controller-manager-f7fcc58b9-srw9z\" (UID: \"1b8226be-5eb4-4156-a168-f843edac34ce\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-srw9z" Mar 09 09:21:32 crc kubenswrapper[4861]: E0309 09:21:32.408437 4861 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 09 09:21:32 crc kubenswrapper[4861]: E0309 09:21:32.408499 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1b8226be-5eb4-4156-a168-f843edac34ce-cert podName:1b8226be-5eb4-4156-a168-f843edac34ce nodeName:}" failed. No retries permitted until 2026-03-09 09:21:33.408482332 +0000 UTC m=+936.493521733 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1b8226be-5eb4-4156-a168-f843edac34ce-cert") pod "infra-operator-controller-manager-f7fcc58b9-srw9z" (UID: "1b8226be-5eb4-4156-a168-f843edac34ce") : secret "infra-operator-webhook-server-cert" not found Mar 09 09:21:32 crc kubenswrapper[4861]: I0309 09:21:32.432791 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-59b6c9788f-lfprc"] Mar 09 09:21:32 crc kubenswrapper[4861]: I0309 09:21:32.435010 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mn7zp\" (UniqueName: \"kubernetes.io/projected/fdd188d8-f434-493e-a8f5-3506031b0f83-kube-api-access-mn7zp\") pod \"telemetry-operator-controller-manager-5fdb694969-vlxrh\" (UID: \"fdd188d8-f434-493e-a8f5-3506031b0f83\") " pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-vlxrh" Mar 09 09:21:32 crc kubenswrapper[4861]: I0309 09:21:32.437309 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-59b6c9788f-lfprc" Mar 09 09:21:32 crc kubenswrapper[4861]: I0309 09:21:32.440001 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Mar 09 09:21:32 crc kubenswrapper[4861]: I0309 09:21:32.440171 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-mmqjb" Mar 09 09:21:32 crc kubenswrapper[4861]: I0309 09:21:32.441470 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Mar 09 09:21:32 crc kubenswrapper[4861]: I0309 09:21:32.475924 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-qqwld" Mar 09 09:21:32 crc kubenswrapper[4861]: I0309 09:21:32.491891 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-59b6c9788f-lfprc"] Mar 09 09:21:32 crc kubenswrapper[4861]: I0309 09:21:32.499437 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-2bqxz" Mar 09 09:21:32 crc kubenswrapper[4861]: I0309 09:21:32.511244 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ec436429-c762-4e15-8f82-19a10cdc7941-webhook-certs\") pod \"openstack-operator-controller-manager-59b6c9788f-lfprc\" (UID: \"ec436429-c762-4e15-8f82-19a10cdc7941\") " pod="openstack-operators/openstack-operator-controller-manager-59b6c9788f-lfprc" Mar 09 09:21:32 crc kubenswrapper[4861]: I0309 09:21:32.511291 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tmkt\" (UniqueName: \"kubernetes.io/projected/4cbd2609-0983-4da2-a0c7-fa66387e36ae-kube-api-access-4tmkt\") pod \"watcher-operator-controller-manager-bccc79885-pmwxp\" (UID: \"4cbd2609-0983-4da2-a0c7-fa66387e36ae\") " pod="openstack-operators/watcher-operator-controller-manager-bccc79885-pmwxp" Mar 09 09:21:32 crc kubenswrapper[4861]: I0309 09:21:32.511315 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ec436429-c762-4e15-8f82-19a10cdc7941-metrics-certs\") pod \"openstack-operator-controller-manager-59b6c9788f-lfprc\" (UID: \"ec436429-c762-4e15-8f82-19a10cdc7941\") " pod="openstack-operators/openstack-operator-controller-manager-59b6c9788f-lfprc" Mar 09 09:21:32 crc kubenswrapper[4861]: I0309 09:21:32.511346 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdclz\" (UniqueName: \"kubernetes.io/projected/175e8d6d-930a-484b-b0b3-d45f37da4239-kube-api-access-kdclz\") pod \"test-operator-controller-manager-55b5ff4dbb-ztmwj\" (UID: \"175e8d6d-930a-484b-b0b3-d45f37da4239\") " pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-ztmwj" Mar 09 09:21:32 crc kubenswrapper[4861]: I0309 
09:21:32.511380 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9pht\" (UniqueName: \"kubernetes.io/projected/ec436429-c762-4e15-8f82-19a10cdc7941-kube-api-access-v9pht\") pod \"openstack-operator-controller-manager-59b6c9788f-lfprc\" (UID: \"ec436429-c762-4e15-8f82-19a10cdc7941\") " pod="openstack-operators/openstack-operator-controller-manager-59b6c9788f-lfprc" Mar 09 09:21:32 crc kubenswrapper[4861]: I0309 09:21:32.523284 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-hrnq8" Mar 09 09:21:32 crc kubenswrapper[4861]: I0309 09:21:32.532518 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tmkt\" (UniqueName: \"kubernetes.io/projected/4cbd2609-0983-4da2-a0c7-fa66387e36ae-kube-api-access-4tmkt\") pod \"watcher-operator-controller-manager-bccc79885-pmwxp\" (UID: \"4cbd2609-0983-4da2-a0c7-fa66387e36ae\") " pod="openstack-operators/watcher-operator-controller-manager-bccc79885-pmwxp" Mar 09 09:21:32 crc kubenswrapper[4861]: I0309 09:21:32.535407 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xnfpv"] Mar 09 09:21:32 crc kubenswrapper[4861]: I0309 09:21:32.536244 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xnfpv" Mar 09 09:21:32 crc kubenswrapper[4861]: I0309 09:21:32.537151 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdclz\" (UniqueName: \"kubernetes.io/projected/175e8d6d-930a-484b-b0b3-d45f37da4239-kube-api-access-kdclz\") pod \"test-operator-controller-manager-55b5ff4dbb-ztmwj\" (UID: \"175e8d6d-930a-484b-b0b3-d45f37da4239\") " pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-ztmwj" Mar 09 09:21:32 crc kubenswrapper[4861]: I0309 09:21:32.542600 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-gnnkx" Mar 09 09:21:32 crc kubenswrapper[4861]: I0309 09:21:32.571593 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xnfpv"] Mar 09 09:21:32 crc kubenswrapper[4861]: I0309 09:21:32.578790 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-vlxrh" Mar 09 09:21:32 crc kubenswrapper[4861]: I0309 09:21:32.617143 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9pht\" (UniqueName: \"kubernetes.io/projected/ec436429-c762-4e15-8f82-19a10cdc7941-kube-api-access-v9pht\") pod \"openstack-operator-controller-manager-59b6c9788f-lfprc\" (UID: \"ec436429-c762-4e15-8f82-19a10cdc7941\") " pod="openstack-operators/openstack-operator-controller-manager-59b6c9788f-lfprc" Mar 09 09:21:32 crc kubenswrapper[4861]: I0309 09:21:32.617514 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ec436429-c762-4e15-8f82-19a10cdc7941-webhook-certs\") pod \"openstack-operator-controller-manager-59b6c9788f-lfprc\" (UID: \"ec436429-c762-4e15-8f82-19a10cdc7941\") " pod="openstack-operators/openstack-operator-controller-manager-59b6c9788f-lfprc" Mar 09 09:21:32 crc kubenswrapper[4861]: I0309 09:21:32.617550 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22pjg\" (UniqueName: \"kubernetes.io/projected/e649bda4-59a3-47e6-92e2-910c01b2f7c2-kube-api-access-22pjg\") pod \"rabbitmq-cluster-operator-manager-668c99d594-xnfpv\" (UID: \"e649bda4-59a3-47e6-92e2-910c01b2f7c2\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xnfpv" Mar 09 09:21:32 crc kubenswrapper[4861]: I0309 09:21:32.617576 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ec436429-c762-4e15-8f82-19a10cdc7941-metrics-certs\") pod \"openstack-operator-controller-manager-59b6c9788f-lfprc\" (UID: \"ec436429-c762-4e15-8f82-19a10cdc7941\") " pod="openstack-operators/openstack-operator-controller-manager-59b6c9788f-lfprc" Mar 09 09:21:32 crc kubenswrapper[4861]: E0309 
09:21:32.617703 4861 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 09 09:21:32 crc kubenswrapper[4861]: E0309 09:21:32.617751 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ec436429-c762-4e15-8f82-19a10cdc7941-metrics-certs podName:ec436429-c762-4e15-8f82-19a10cdc7941 nodeName:}" failed. No retries permitted until 2026-03-09 09:21:33.117736718 +0000 UTC m=+936.202776119 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ec436429-c762-4e15-8f82-19a10cdc7941-metrics-certs") pod "openstack-operator-controller-manager-59b6c9788f-lfprc" (UID: "ec436429-c762-4e15-8f82-19a10cdc7941") : secret "metrics-server-cert" not found Mar 09 09:21:32 crc kubenswrapper[4861]: E0309 09:21:32.618054 4861 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 09 09:21:32 crc kubenswrapper[4861]: E0309 09:21:32.618333 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ec436429-c762-4e15-8f82-19a10cdc7941-webhook-certs podName:ec436429-c762-4e15-8f82-19a10cdc7941 nodeName:}" failed. No retries permitted until 2026-03-09 09:21:33.118315736 +0000 UTC m=+936.203355137 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/ec436429-c762-4e15-8f82-19a10cdc7941-webhook-certs") pod "openstack-operator-controller-manager-59b6c9788f-lfprc" (UID: "ec436429-c762-4e15-8f82-19a10cdc7941") : secret "webhook-server-cert" not found Mar 09 09:21:32 crc kubenswrapper[4861]: I0309 09:21:32.623391 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-ztmwj" Mar 09 09:21:32 crc kubenswrapper[4861]: I0309 09:21:32.651683 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6db6876945-s4tc6"] Mar 09 09:21:32 crc kubenswrapper[4861]: I0309 09:21:32.665197 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9pht\" (UniqueName: \"kubernetes.io/projected/ec436429-c762-4e15-8f82-19a10cdc7941-kube-api-access-v9pht\") pod \"openstack-operator-controller-manager-59b6c9788f-lfprc\" (UID: \"ec436429-c762-4e15-8f82-19a10cdc7941\") " pod="openstack-operators/openstack-operator-controller-manager-59b6c9788f-lfprc" Mar 09 09:21:32 crc kubenswrapper[4861]: I0309 09:21:32.679898 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-pmwxp" Mar 09 09:21:32 crc kubenswrapper[4861]: I0309 09:21:32.719084 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22pjg\" (UniqueName: \"kubernetes.io/projected/e649bda4-59a3-47e6-92e2-910c01b2f7c2-kube-api-access-22pjg\") pod \"rabbitmq-cluster-operator-manager-668c99d594-xnfpv\" (UID: \"e649bda4-59a3-47e6-92e2-910c01b2f7c2\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xnfpv" Mar 09 09:21:32 crc kubenswrapper[4861]: I0309 09:21:32.719254 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/72b49679-1f56-42df-bafc-a899cd2da3cf-cert\") pod \"openstack-baremetal-operator-controller-manager-6f64bd8c75j8xx4\" (UID: \"72b49679-1f56-42df-bafc-a899cd2da3cf\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64bd8c75j8xx4" Mar 09 09:21:32 crc kubenswrapper[4861]: E0309 09:21:32.719428 4861 secret.go:188] Couldn't get secret 
openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 09 09:21:32 crc kubenswrapper[4861]: E0309 09:21:32.719490 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/72b49679-1f56-42df-bafc-a899cd2da3cf-cert podName:72b49679-1f56-42df-bafc-a899cd2da3cf nodeName:}" failed. No retries permitted until 2026-03-09 09:21:33.719471773 +0000 UTC m=+936.804511174 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/72b49679-1f56-42df-bafc-a899cd2da3cf-cert") pod "openstack-baremetal-operator-controller-manager-6f64bd8c75j8xx4" (UID: "72b49679-1f56-42df-bafc-a899cd2da3cf") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 09 09:21:32 crc kubenswrapper[4861]: I0309 09:21:32.742728 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-22pjg\" (UniqueName: \"kubernetes.io/projected/e649bda4-59a3-47e6-92e2-910c01b2f7c2-kube-api-access-22pjg\") pod \"rabbitmq-cluster-operator-manager-668c99d594-xnfpv\" (UID: \"e649bda4-59a3-47e6-92e2-910c01b2f7c2\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xnfpv" Mar 09 09:21:32 crc kubenswrapper[4861]: I0309 09:21:32.787527 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-64db6967f8-pdxgs"] Mar 09 09:21:32 crc kubenswrapper[4861]: I0309 09:21:32.877724 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xnfpv" Mar 09 09:21:32 crc kubenswrapper[4861]: I0309 09:21:32.888917 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-s4tc6" event={"ID":"eaaa08cd-22f8-40a3-9cac-7e29137ea358","Type":"ContainerStarted","Data":"4bf37a6a2f53222b01e18ea941c18535e59cfe6fc9bd6fa30d40b242ced45bcc"} Mar 09 09:21:32 crc kubenswrapper[4861]: I0309 09:21:32.900578 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-pdxgs" event={"ID":"8fb737b1-978f-4f1e-98db-f1c542ef77d9","Type":"ContainerStarted","Data":"a356ed8774ddfbad7b5fbfbf98d9840795043247f6946b8cf1003a895acebd70"} Mar 09 09:21:33 crc kubenswrapper[4861]: I0309 09:21:33.125301 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ec436429-c762-4e15-8f82-19a10cdc7941-webhook-certs\") pod \"openstack-operator-controller-manager-59b6c9788f-lfprc\" (UID: \"ec436429-c762-4e15-8f82-19a10cdc7941\") " pod="openstack-operators/openstack-operator-controller-manager-59b6c9788f-lfprc" Mar 09 09:21:33 crc kubenswrapper[4861]: I0309 09:21:33.125611 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ec436429-c762-4e15-8f82-19a10cdc7941-metrics-certs\") pod \"openstack-operator-controller-manager-59b6c9788f-lfprc\" (UID: \"ec436429-c762-4e15-8f82-19a10cdc7941\") " pod="openstack-operators/openstack-operator-controller-manager-59b6c9788f-lfprc" Mar 09 09:21:33 crc kubenswrapper[4861]: E0309 09:21:33.125765 4861 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 09 09:21:33 crc kubenswrapper[4861]: E0309 09:21:33.125812 4861 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/ec436429-c762-4e15-8f82-19a10cdc7941-metrics-certs podName:ec436429-c762-4e15-8f82-19a10cdc7941 nodeName:}" failed. No retries permitted until 2026-03-09 09:21:34.125797621 +0000 UTC m=+937.210837022 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ec436429-c762-4e15-8f82-19a10cdc7941-metrics-certs") pod "openstack-operator-controller-manager-59b6c9788f-lfprc" (UID: "ec436429-c762-4e15-8f82-19a10cdc7941") : secret "metrics-server-cert" not found Mar 09 09:21:33 crc kubenswrapper[4861]: E0309 09:21:33.126114 4861 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 09 09:21:33 crc kubenswrapper[4861]: E0309 09:21:33.126158 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ec436429-c762-4e15-8f82-19a10cdc7941-webhook-certs podName:ec436429-c762-4e15-8f82-19a10cdc7941 nodeName:}" failed. No retries permitted until 2026-03-09 09:21:34.126151462 +0000 UTC m=+937.211190863 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/ec436429-c762-4e15-8f82-19a10cdc7941-webhook-certs") pod "openstack-operator-controller-manager-59b6c9788f-lfprc" (UID: "ec436429-c762-4e15-8f82-19a10cdc7941") : secret "webhook-server-cert" not found Mar 09 09:21:33 crc kubenswrapper[4861]: I0309 09:21:33.151877 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-bwxj2"] Mar 09 09:21:33 crc kubenswrapper[4861]: W0309 09:21:33.177147 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod12c3b94d_baff_4b5d_864e_371f5b3857f5.slice/crio-8bf99c1ff3f58db8cf577abd17ed206fd840814dca524572bdfb0773437ca8e1 WatchSource:0}: Error finding container 8bf99c1ff3f58db8cf577abd17ed206fd840814dca524572bdfb0773437ca8e1: Status 404 returned error can't find the container with id 8bf99c1ff3f58db8cf577abd17ed206fd840814dca524572bdfb0773437ca8e1 Mar 09 09:21:33 crc kubenswrapper[4861]: I0309 09:21:33.178457 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7c789f89c6-pktcs"] Mar 09 09:21:33 crc kubenswrapper[4861]: I0309 09:21:33.186472 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-lbhm2"] Mar 09 09:21:33 crc kubenswrapper[4861]: I0309 09:21:33.191912 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-55d77d7b5c-89f2d"] Mar 09 09:21:33 crc kubenswrapper[4861]: I0309 09:21:33.195886 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-67d996989d-8vnvb"] Mar 09 09:21:33 crc kubenswrapper[4861]: I0309 09:21:33.247785 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/heat-operator-controller-manager-cf99c678f-jw5nb"] Mar 09 09:21:33 crc kubenswrapper[4861]: I0309 09:21:33.253541 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-74b6b5dc96-lnm2d"] Mar 09 09:21:33 crc kubenswrapper[4861]: W0309 09:21:33.255676 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod511b2722_0227_4a4f_931c_e69ad12e60de.slice/crio-02c9ce8f6f2468db237e8ac3faa3c018af4306af1f368ec32807d4a74fb1a285 WatchSource:0}: Error finding container 02c9ce8f6f2468db237e8ac3faa3c018af4306af1f368ec32807d4a74fb1a285: Status 404 returned error can't find the container with id 02c9ce8f6f2468db237e8ac3faa3c018af4306af1f368ec32807d4a74fb1a285 Mar 09 09:21:33 crc kubenswrapper[4861]: W0309 09:21:33.259926 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23566122_1654_40c1_8dd6_577280d0dcec.slice/crio-cc8430f6a80bf0e092c7ef6502ed889e9fa302755671ba69d067a5417eef1e7a WatchSource:0}: Error finding container cc8430f6a80bf0e092c7ef6502ed889e9fa302755671ba69d067a5417eef1e7a: Status 404 returned error can't find the container with id cc8430f6a80bf0e092c7ef6502ed889e9fa302755671ba69d067a5417eef1e7a Mar 09 09:21:33 crc kubenswrapper[4861]: I0309 09:21:33.265924 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-2cb86"] Mar 09 09:21:33 crc kubenswrapper[4861]: I0309 09:21:33.400801 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-54688575f-9dhrv"] Mar 09 09:21:33 crc kubenswrapper[4861]: I0309 09:21:33.431598 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1b8226be-5eb4-4156-a168-f843edac34ce-cert\") pod 
\"infra-operator-controller-manager-f7fcc58b9-srw9z\" (UID: \"1b8226be-5eb4-4156-a168-f843edac34ce\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-srw9z" Mar 09 09:21:33 crc kubenswrapper[4861]: E0309 09:21:33.431738 4861 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 09 09:21:33 crc kubenswrapper[4861]: E0309 09:21:33.431791 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1b8226be-5eb4-4156-a168-f843edac34ce-cert podName:1b8226be-5eb4-4156-a168-f843edac34ce nodeName:}" failed. No retries permitted until 2026-03-09 09:21:35.431776387 +0000 UTC m=+938.516815788 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1b8226be-5eb4-4156-a168-f843edac34ce-cert") pod "infra-operator-controller-manager-f7fcc58b9-srw9z" (UID: "1b8226be-5eb4-4156-a168-f843edac34ce") : secret "infra-operator-webhook-server-cert" not found Mar 09 09:21:33 crc kubenswrapper[4861]: I0309 09:21:33.584456 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5fdb694969-vlxrh"] Mar 09 09:21:33 crc kubenswrapper[4861]: W0309 09:21:33.586934 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfdd188d8_f434_493e_a8f5_3506031b0f83.slice/crio-701744523fa42c00d9a668e9c8cf3c72b15c4cd274648d3c94ad8b1736a9e77f WatchSource:0}: Error finding container 701744523fa42c00d9a668e9c8cf3c72b15c4cd274648d3c94ad8b1736a9e77f: Status 404 returned error can't find the container with id 701744523fa42c00d9a668e9c8cf3c72b15c4cd274648d3c94ad8b1736a9e77f Mar 09 09:21:33 crc kubenswrapper[4861]: W0309 09:21:33.599649 4861 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9e766bf_fdea_451e_a58c_a8818fccf4b4.slice/crio-b34e5dbd3ae7272cb2d7e7914da8e86f5ea01b829b5217d1627d279d8c691e5b WatchSource:0}: Error finding container b34e5dbd3ae7272cb2d7e7914da8e86f5ea01b829b5217d1627d279d8c691e5b: Status 404 returned error can't find the container with id b34e5dbd3ae7272cb2d7e7914da8e86f5ea01b829b5217d1627d279d8c691e5b Mar 09 09:21:33 crc kubenswrapper[4861]: I0309 09:21:33.598140 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-545456dc4-blz6f"] Mar 09 09:21:33 crc kubenswrapper[4861]: I0309 09:21:33.607152 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-9b9ff9f4d-hrnq8"] Mar 09 09:21:33 crc kubenswrapper[4861]: W0309 09:21:33.611319 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb386f8ad_7867_4d35_83f8_382a379e3c1e.slice/crio-0bb25cb5a82291b3f9a3afa12614ee99bf34b050fae9c6001927a888148ed8cd WatchSource:0}: Error finding container 0bb25cb5a82291b3f9a3afa12614ee99bf34b050fae9c6001927a888148ed8cd: Status 404 returned error can't find the container with id 0bb25cb5a82291b3f9a3afa12614ee99bf34b050fae9c6001927a888148ed8cd Mar 09 09:21:33 crc kubenswrapper[4861]: E0309 09:21:33.617423 4861 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:bb939885bd04593ad03af901adb77ee2a2d18529b328c23288c7cc7a2ba5282e,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-ppqmk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-648564c9fc-2bqxz_openstack-operators(b386f8ad-7867-4d35-83f8-382a379e3c1e): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 09 09:21:33 crc kubenswrapper[4861]: E0309 09:21:33.619852 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-2bqxz" podUID="b386f8ad-7867-4d35-83f8-382a379e3c1e" Mar 09 09:21:33 crc kubenswrapper[4861]: I0309 09:21:33.623542 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-55b5ff4dbb-ztmwj"] Mar 09 09:21:33 crc kubenswrapper[4861]: I0309 09:21:33.635906 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-648564c9fc-2bqxz"] Mar 09 09:21:33 crc kubenswrapper[4861]: I0309 09:21:33.643550 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-75684d597f-qqwld"] Mar 09 09:21:33 crc kubenswrapper[4861]: W0309 09:21:33.647575 4861 manager.go:1169] Failed to 
process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod175e8d6d_930a_484b_b0b3_d45f37da4239.slice/crio-3969e30e8a804f65f0584359f70445ba6a41cea492f182e0e2bbc05eb59a219f WatchSource:0}: Error finding container 3969e30e8a804f65f0584359f70445ba6a41cea492f182e0e2bbc05eb59a219f: Status 404 returned error can't find the container with id 3969e30e8a804f65f0584359f70445ba6a41cea492f182e0e2bbc05eb59a219f Mar 09 09:21:33 crc kubenswrapper[4861]: W0309 09:21:33.649257 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5cccaa46_1901_457b_b093_9edfb512b68f.slice/crio-1ff71de76e959692b777920f8114583a3f16bba334c627b4e4ceb34fb909cdf2 WatchSource:0}: Error finding container 1ff71de76e959692b777920f8114583a3f16bba334c627b4e4ceb34fb909cdf2: Status 404 returned error can't find the container with id 1ff71de76e959692b777920f8114583a3f16bba334c627b4e4ceb34fb909cdf2 Mar 09 09:21:33 crc kubenswrapper[4861]: E0309 09:21:33.649512 4861 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:9d03f03aa9a460f1fcac8875064808c03e4ecd0388873bbfb9c7dc58331f3968,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-kdclz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-55b5ff4dbb-ztmwj_openstack-operators(175e8d6d-930a-484b-b0b3-d45f37da4239): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 09 09:21:33 crc kubenswrapper[4861]: I0309 09:21:33.650788 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-bccc79885-pmwxp"] Mar 09 09:21:33 crc kubenswrapper[4861]: E0309 09:21:33.650851 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" 
with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-ztmwj" podUID="175e8d6d-930a-484b-b0b3-d45f37da4239" Mar 09 09:21:33 crc kubenswrapper[4861]: W0309 09:21:33.659212 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode649bda4_59a3_47e6_92e2_910c01b2f7c2.slice/crio-b7ee1cf85c1ee5575ef347fdc48cef0b837c604bc3866b919ab773d882cd68de WatchSource:0}: Error finding container b7ee1cf85c1ee5575ef347fdc48cef0b837c604bc3866b919ab773d882cd68de: Status 404 returned error can't find the container with id b7ee1cf85c1ee5575ef347fdc48cef0b837c604bc3866b919ab773d882cd68de Mar 09 09:21:33 crc kubenswrapper[4861]: E0309 09:21:33.662883 4861 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:9f73c84a9581b5739d8da333c7b64403d7b7ca284b22c624d0effe07f3d2819c,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-gqzq7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-75684d597f-qqwld_openstack-operators(73ae93db-3260-4af6-9724-52e8b97a0245): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 09 09:21:33 crc kubenswrapper[4861]: E0309 09:21:33.662942 4861 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/designate-operator@sha256:508859beb0e5b69169393dbb0039dc03a9d4ba05f16f6ff74f9b25e19d446214,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-sswjt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod designate-operator-controller-manager-5d87c9d997-2wbrm_openstack-operators(5cccaa46-1901-457b-b093-9edfb512b68f): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 09 09:21:33 crc kubenswrapper[4861]: E0309 09:21:33.664134 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-qqwld" podUID="73ae93db-3260-4af6-9724-52e8b97a0245" Mar 09 09:21:33 crc kubenswrapper[4861]: E0309 09:21:33.666190 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-2wbrm" podUID="5cccaa46-1901-457b-b093-9edfb512b68f" Mar 09 09:21:33 crc kubenswrapper[4861]: W0309 09:21:33.667629 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4cbd2609_0983_4da2_a0c7_fa66387e36ae.slice/crio-ab49c91b5e2029cf3aebb7bbb5b240445e5fbdbd2f1ef601826673173eeeb90e WatchSource:0}: Error finding container 
ab49c91b5e2029cf3aebb7bbb5b240445e5fbdbd2f1ef601826673173eeeb90e: Status 404 returned error can't find the container with id ab49c91b5e2029cf3aebb7bbb5b240445e5fbdbd2f1ef601826673173eeeb90e Mar 09 09:21:33 crc kubenswrapper[4861]: E0309 09:21:33.667621 4861 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-22pjg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-xnfpv_openstack-operators(e649bda4-59a3-47e6-92e2-910c01b2f7c2): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 09 09:21:33 crc kubenswrapper[4861]: I0309 09:21:33.667999 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-5d87c9d997-2wbrm"] Mar 09 09:21:33 crc kubenswrapper[4861]: E0309 09:21:33.668983 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xnfpv" podUID="e649bda4-59a3-47e6-92e2-910c01b2f7c2" Mar 09 09:21:33 crc kubenswrapper[4861]: E0309 09:21:33.672451 4861 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:06311600a491c689493552e7ff26e36df740fa4e7c143fca874bef19f24afb97,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4tmkt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-bccc79885-pmwxp_openstack-operators(4cbd2609-0983-4da2-a0c7-fa66387e36ae): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 09 09:21:33 crc kubenswrapper[4861]: E0309 09:21:33.675148 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-pmwxp" podUID="4cbd2609-0983-4da2-a0c7-fa66387e36ae" Mar 09 09:21:33 crc kubenswrapper[4861]: I0309 09:21:33.675564 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xnfpv"] Mar 09 09:21:33 crc kubenswrapper[4861]: I0309 09:21:33.736102 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/72b49679-1f56-42df-bafc-a899cd2da3cf-cert\") pod \"openstack-baremetal-operator-controller-manager-6f64bd8c75j8xx4\" (UID: \"72b49679-1f56-42df-bafc-a899cd2da3cf\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64bd8c75j8xx4" Mar 09 09:21:33 crc kubenswrapper[4861]: E0309 
09:21:33.738192 4861 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 09 09:21:33 crc kubenswrapper[4861]: E0309 09:21:33.738256 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/72b49679-1f56-42df-bafc-a899cd2da3cf-cert podName:72b49679-1f56-42df-bafc-a899cd2da3cf nodeName:}" failed. No retries permitted until 2026-03-09 09:21:35.738239676 +0000 UTC m=+938.823279077 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/72b49679-1f56-42df-bafc-a899cd2da3cf-cert") pod "openstack-baremetal-operator-controller-manager-6f64bd8c75j8xx4" (UID: "72b49679-1f56-42df-bafc-a899cd2da3cf") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 09 09:21:33 crc kubenswrapper[4861]: I0309 09:21:33.921134 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-54688575f-9dhrv" event={"ID":"091caccf-659b-42dd-b9cb-05aeea2548ce","Type":"ContainerStarted","Data":"19cf2d3b9f0a5e3bf784adf2dbeb853becf978d9abad259bccd05a1d09e8aafc"} Mar 09 09:21:33 crc kubenswrapper[4861]: I0309 09:21:33.922347 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-vlxrh" event={"ID":"fdd188d8-f434-493e-a8f5-3506031b0f83","Type":"ContainerStarted","Data":"701744523fa42c00d9a668e9c8cf3c72b15c4cd274648d3c94ad8b1736a9e77f"} Mar 09 09:21:33 crc kubenswrapper[4861]: I0309 09:21:33.924274 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xnfpv" event={"ID":"e649bda4-59a3-47e6-92e2-910c01b2f7c2","Type":"ContainerStarted","Data":"b7ee1cf85c1ee5575ef347fdc48cef0b837c604bc3866b919ab773d882cd68de"} Mar 09 09:21:33 crc kubenswrapper[4861]: I0309 09:21:33.925997 4861 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-lbhm2" event={"ID":"12c3b94d-baff-4b5d-864e-371f5b3857f5","Type":"ContainerStarted","Data":"8bf99c1ff3f58db8cf577abd17ed206fd840814dca524572bdfb0773437ca8e1"} Mar 09 09:21:33 crc kubenswrapper[4861]: I0309 09:21:33.926926 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-qqwld" event={"ID":"73ae93db-3260-4af6-9724-52e8b97a0245","Type":"ContainerStarted","Data":"797fbde92d447fc1a5c6a05de09029523929673ecf670761ed911d6465bf8d1c"} Mar 09 09:21:33 crc kubenswrapper[4861]: I0309 09:21:33.928543 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-bwxj2" event={"ID":"32ddb619-584a-4ff4-a988-63d565043353","Type":"ContainerStarted","Data":"4fbe7b1bbe4b93faaf2a21c4841637c4a937b2a9e1e9ecdf5a757bc2ebe02b07"} Mar 09 09:21:33 crc kubenswrapper[4861]: I0309 09:21:33.929194 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-2cb86" event={"ID":"eec501ad-33c8-4195-8817-3078202db97a","Type":"ContainerStarted","Data":"96457e7d5a425b9c137a8cb23e3361f793a3c07b6eff8b2537b1b207785f964f"} Mar 09 09:21:33 crc kubenswrapper[4861]: I0309 09:21:33.930298 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-hrnq8" event={"ID":"925518c8-3714-4180-ad1f-9bee534dd0dc","Type":"ContainerStarted","Data":"bf4b1058ecd4cc6fa99f87655b0a06b1b107b596a5cdffb214f8132890362112"} Mar 09 09:21:33 crc kubenswrapper[4861]: I0309 09:21:33.931918 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-2bqxz" 
event={"ID":"b386f8ad-7867-4d35-83f8-382a379e3c1e","Type":"ContainerStarted","Data":"0bb25cb5a82291b3f9a3afa12614ee99bf34b050fae9c6001927a888148ed8cd"} Mar 09 09:21:33 crc kubenswrapper[4861]: I0309 09:21:33.934360 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-lnm2d" event={"ID":"511b2722-0227-4a4f-931c-e69ad12e60de","Type":"ContainerStarted","Data":"02c9ce8f6f2468db237e8ac3faa3c018af4306af1f368ec32807d4a74fb1a285"} Mar 09 09:21:33 crc kubenswrapper[4861]: E0309 09:21:33.936934 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xnfpv" podUID="e649bda4-59a3-47e6-92e2-910c01b2f7c2" Mar 09 09:21:33 crc kubenswrapper[4861]: I0309 09:21:33.936985 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-89f2d" event={"ID":"861a14c0-5dcd-4126-b386-65467726a9dd","Type":"ContainerStarted","Data":"c2b1e32669d3d758405007695c73127b5c8cab903d5ed41814a7e17306c71523"} Mar 09 09:21:33 crc kubenswrapper[4861]: E0309 09:21:33.937124 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:9f73c84a9581b5739d8da333c7b64403d7b7ca284b22c624d0effe07f3d2819c\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-qqwld" podUID="73ae93db-3260-4af6-9724-52e8b97a0245" Mar 09 09:21:33 crc kubenswrapper[4861]: E0309 09:21:33.937393 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off 
pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:bb939885bd04593ad03af901adb77ee2a2d18529b328c23288c7cc7a2ba5282e\\\"\"" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-2bqxz" podUID="b386f8ad-7867-4d35-83f8-382a379e3c1e" Mar 09 09:21:33 crc kubenswrapper[4861]: I0309 09:21:33.939908 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-pktcs" event={"ID":"fbbd2d76-31fb-46d7-a422-af5f3e51baaf","Type":"ContainerStarted","Data":"288602403c97a0ccf2cfbdea9f40f3a042d26c81c01507d0f24bc7027178e6f2"} Mar 09 09:21:33 crc kubenswrapper[4861]: I0309 09:21:33.972615 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-pmwxp" event={"ID":"4cbd2609-0983-4da2-a0c7-fa66387e36ae","Type":"ContainerStarted","Data":"ab49c91b5e2029cf3aebb7bbb5b240445e5fbdbd2f1ef601826673173eeeb90e"} Mar 09 09:21:33 crc kubenswrapper[4861]: E0309 09:21:33.974257 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:06311600a491c689493552e7ff26e36df740fa4e7c143fca874bef19f24afb97\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-pmwxp" podUID="4cbd2609-0983-4da2-a0c7-fa66387e36ae" Mar 09 09:21:33 crc kubenswrapper[4861]: I0309 09:21:33.975455 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-67d996989d-8vnvb" event={"ID":"78b36bea-6c3d-4794-b38a-6b4a5b3e9f5d","Type":"ContainerStarted","Data":"dbb3318bbdf6012d1b4b43fce909032584b0b7299e17df663eaef519a4da7388"} Mar 09 09:21:33 crc kubenswrapper[4861]: I0309 09:21:33.976990 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-2wbrm" 
event={"ID":"5cccaa46-1901-457b-b093-9edfb512b68f","Type":"ContainerStarted","Data":"1ff71de76e959692b777920f8114583a3f16bba334c627b4e4ceb34fb909cdf2"}
Mar 09 09:21:33 crc kubenswrapper[4861]: I0309 09:21:33.978640 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-jw5nb" event={"ID":"23566122-1654-40c1-8dd6-577280d0dcec","Type":"ContainerStarted","Data":"cc8430f6a80bf0e092c7ef6502ed889e9fa302755671ba69d067a5417eef1e7a"}
Mar 09 09:21:33 crc kubenswrapper[4861]: E0309 09:21:33.978882 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/designate-operator@sha256:508859beb0e5b69169393dbb0039dc03a9d4ba05f16f6ff74f9b25e19d446214\\\"\"" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-2wbrm" podUID="5cccaa46-1901-457b-b093-9edfb512b68f"
Mar 09 09:21:34 crc kubenswrapper[4861]: I0309 09:21:34.005781 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-ztmwj" event={"ID":"175e8d6d-930a-484b-b0b3-d45f37da4239","Type":"ContainerStarted","Data":"3969e30e8a804f65f0584359f70445ba6a41cea492f182e0e2bbc05eb59a219f"}
Mar 09 09:21:34 crc kubenswrapper[4861]: E0309 09:21:34.007350 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:9d03f03aa9a460f1fcac8875064808c03e4ecd0388873bbfb9c7dc58331f3968\\\"\"" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-ztmwj" podUID="175e8d6d-930a-484b-b0b3-d45f37da4239"
Mar 09 09:21:34 crc kubenswrapper[4861]: I0309 09:21:34.008008 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-blz6f" event={"ID":"e9e766bf-fdea-451e-a58c-a8818fccf4b4","Type":"ContainerStarted","Data":"b34e5dbd3ae7272cb2d7e7914da8e86f5ea01b829b5217d1627d279d8c691e5b"}
Mar 09 09:21:34 crc kubenswrapper[4861]: I0309 09:21:34.164093 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ec436429-c762-4e15-8f82-19a10cdc7941-webhook-certs\") pod \"openstack-operator-controller-manager-59b6c9788f-lfprc\" (UID: \"ec436429-c762-4e15-8f82-19a10cdc7941\") " pod="openstack-operators/openstack-operator-controller-manager-59b6c9788f-lfprc"
Mar 09 09:21:34 crc kubenswrapper[4861]: I0309 09:21:34.164161 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ec436429-c762-4e15-8f82-19a10cdc7941-metrics-certs\") pod \"openstack-operator-controller-manager-59b6c9788f-lfprc\" (UID: \"ec436429-c762-4e15-8f82-19a10cdc7941\") " pod="openstack-operators/openstack-operator-controller-manager-59b6c9788f-lfprc"
Mar 09 09:21:34 crc kubenswrapper[4861]: E0309 09:21:34.164278 4861 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Mar 09 09:21:34 crc kubenswrapper[4861]: E0309 09:21:34.164329 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ec436429-c762-4e15-8f82-19a10cdc7941-metrics-certs podName:ec436429-c762-4e15-8f82-19a10cdc7941 nodeName:}" failed. No retries permitted until 2026-03-09 09:21:36.164311138 +0000 UTC m=+939.249350539 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ec436429-c762-4e15-8f82-19a10cdc7941-metrics-certs") pod "openstack-operator-controller-manager-59b6c9788f-lfprc" (UID: "ec436429-c762-4e15-8f82-19a10cdc7941") : secret "metrics-server-cert" not found
Mar 09 09:21:34 crc kubenswrapper[4861]: E0309 09:21:34.164655 4861 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Mar 09 09:21:34 crc kubenswrapper[4861]: E0309 09:21:34.164685 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ec436429-c762-4e15-8f82-19a10cdc7941-webhook-certs podName:ec436429-c762-4e15-8f82-19a10cdc7941 nodeName:}" failed. No retries permitted until 2026-03-09 09:21:36.164676708 +0000 UTC m=+939.249716109 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/ec436429-c762-4e15-8f82-19a10cdc7941-webhook-certs") pod "openstack-operator-controller-manager-59b6c9788f-lfprc" (UID: "ec436429-c762-4e15-8f82-19a10cdc7941") : secret "webhook-server-cert" not found
Mar 09 09:21:35 crc kubenswrapper[4861]: E0309 09:21:35.021411 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xnfpv" podUID="e649bda4-59a3-47e6-92e2-910c01b2f7c2"
Mar 09 09:21:35 crc kubenswrapper[4861]: E0309 09:21:35.021711 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:06311600a491c689493552e7ff26e36df740fa4e7c143fca874bef19f24afb97\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-pmwxp" podUID="4cbd2609-0983-4da2-a0c7-fa66387e36ae"
Mar 09 09:21:35 crc kubenswrapper[4861]: E0309 09:21:35.021748 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:9f73c84a9581b5739d8da333c7b64403d7b7ca284b22c624d0effe07f3d2819c\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-qqwld" podUID="73ae93db-3260-4af6-9724-52e8b97a0245"
Mar 09 09:21:35 crc kubenswrapper[4861]: E0309 09:21:35.021810 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:bb939885bd04593ad03af901adb77ee2a2d18529b328c23288c7cc7a2ba5282e\\\"\"" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-2bqxz" podUID="b386f8ad-7867-4d35-83f8-382a379e3c1e"
Mar 09 09:21:35 crc kubenswrapper[4861]: E0309 09:21:35.022098 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:9d03f03aa9a460f1fcac8875064808c03e4ecd0388873bbfb9c7dc58331f3968\\\"\"" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-ztmwj" podUID="175e8d6d-930a-484b-b0b3-d45f37da4239"
Mar 09 09:21:35 crc kubenswrapper[4861]: E0309 09:21:35.028678 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/designate-operator@sha256:508859beb0e5b69169393dbb0039dc03a9d4ba05f16f6ff74f9b25e19d446214\\\"\"" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-2wbrm" podUID="5cccaa46-1901-457b-b093-9edfb512b68f"
Mar 09 09:21:35 crc kubenswrapper[4861]: I0309 09:21:35.486120 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1b8226be-5eb4-4156-a168-f843edac34ce-cert\") pod \"infra-operator-controller-manager-f7fcc58b9-srw9z\" (UID: \"1b8226be-5eb4-4156-a168-f843edac34ce\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-srw9z"
Mar 09 09:21:35 crc kubenswrapper[4861]: E0309 09:21:35.486379 4861 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Mar 09 09:21:35 crc kubenswrapper[4861]: E0309 09:21:35.486510 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1b8226be-5eb4-4156-a168-f843edac34ce-cert podName:1b8226be-5eb4-4156-a168-f843edac34ce nodeName:}" failed. No retries permitted until 2026-03-09 09:21:39.486489752 +0000 UTC m=+942.571529153 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1b8226be-5eb4-4156-a168-f843edac34ce-cert") pod "infra-operator-controller-manager-f7fcc58b9-srw9z" (UID: "1b8226be-5eb4-4156-a168-f843edac34ce") : secret "infra-operator-webhook-server-cert" not found
Mar 09 09:21:35 crc kubenswrapper[4861]: I0309 09:21:35.790320 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/72b49679-1f56-42df-bafc-a899cd2da3cf-cert\") pod \"openstack-baremetal-operator-controller-manager-6f64bd8c75j8xx4\" (UID: \"72b49679-1f56-42df-bafc-a899cd2da3cf\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64bd8c75j8xx4"
Mar 09 09:21:35 crc kubenswrapper[4861]: E0309 09:21:35.790515 4861 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Mar 09 09:21:35 crc kubenswrapper[4861]: E0309 09:21:35.790563 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/72b49679-1f56-42df-bafc-a899cd2da3cf-cert podName:72b49679-1f56-42df-bafc-a899cd2da3cf nodeName:}" failed. No retries permitted until 2026-03-09 09:21:39.790549692 +0000 UTC m=+942.875589093 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/72b49679-1f56-42df-bafc-a899cd2da3cf-cert") pod "openstack-baremetal-operator-controller-manager-6f64bd8c75j8xx4" (UID: "72b49679-1f56-42df-bafc-a899cd2da3cf") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Mar 09 09:21:36 crc kubenswrapper[4861]: I0309 09:21:36.197435 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ec436429-c762-4e15-8f82-19a10cdc7941-webhook-certs\") pod \"openstack-operator-controller-manager-59b6c9788f-lfprc\" (UID: \"ec436429-c762-4e15-8f82-19a10cdc7941\") " pod="openstack-operators/openstack-operator-controller-manager-59b6c9788f-lfprc"
Mar 09 09:21:36 crc kubenswrapper[4861]: I0309 09:21:36.197802 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ec436429-c762-4e15-8f82-19a10cdc7941-metrics-certs\") pod \"openstack-operator-controller-manager-59b6c9788f-lfprc\" (UID: \"ec436429-c762-4e15-8f82-19a10cdc7941\") " pod="openstack-operators/openstack-operator-controller-manager-59b6c9788f-lfprc"
Mar 09 09:21:36 crc kubenswrapper[4861]: E0309 09:21:36.197625 4861 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Mar 09 09:21:36 crc kubenswrapper[4861]: E0309 09:21:36.197978 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ec436429-c762-4e15-8f82-19a10cdc7941-webhook-certs podName:ec436429-c762-4e15-8f82-19a10cdc7941 nodeName:}" failed. No retries permitted until 2026-03-09 09:21:40.197962081 +0000 UTC m=+943.283001482 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/ec436429-c762-4e15-8f82-19a10cdc7941-webhook-certs") pod "openstack-operator-controller-manager-59b6c9788f-lfprc" (UID: "ec436429-c762-4e15-8f82-19a10cdc7941") : secret "webhook-server-cert" not found
Mar 09 09:21:36 crc kubenswrapper[4861]: E0309 09:21:36.197927 4861 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Mar 09 09:21:36 crc kubenswrapper[4861]: E0309 09:21:36.198297 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ec436429-c762-4e15-8f82-19a10cdc7941-metrics-certs podName:ec436429-c762-4e15-8f82-19a10cdc7941 nodeName:}" failed. No retries permitted until 2026-03-09 09:21:40.198287962 +0000 UTC m=+943.283327363 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ec436429-c762-4e15-8f82-19a10cdc7941-metrics-certs") pod "openstack-operator-controller-manager-59b6c9788f-lfprc" (UID: "ec436429-c762-4e15-8f82-19a10cdc7941") : secret "metrics-server-cert" not found
Mar 09 09:21:39 crc kubenswrapper[4861]: I0309 09:21:39.545824 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1b8226be-5eb4-4156-a168-f843edac34ce-cert\") pod \"infra-operator-controller-manager-f7fcc58b9-srw9z\" (UID: \"1b8226be-5eb4-4156-a168-f843edac34ce\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-srw9z"
Mar 09 09:21:39 crc kubenswrapper[4861]: E0309 09:21:39.546047 4861 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Mar 09 09:21:39 crc kubenswrapper[4861]: E0309 09:21:39.546352 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1b8226be-5eb4-4156-a168-f843edac34ce-cert podName:1b8226be-5eb4-4156-a168-f843edac34ce nodeName:}" failed. No retries permitted until 2026-03-09 09:21:47.546330234 +0000 UTC m=+950.631369635 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1b8226be-5eb4-4156-a168-f843edac34ce-cert") pod "infra-operator-controller-manager-f7fcc58b9-srw9z" (UID: "1b8226be-5eb4-4156-a168-f843edac34ce") : secret "infra-operator-webhook-server-cert" not found
Mar 09 09:21:39 crc kubenswrapper[4861]: I0309 09:21:39.856033 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/72b49679-1f56-42df-bafc-a899cd2da3cf-cert\") pod \"openstack-baremetal-operator-controller-manager-6f64bd8c75j8xx4\" (UID: \"72b49679-1f56-42df-bafc-a899cd2da3cf\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64bd8c75j8xx4"
Mar 09 09:21:39 crc kubenswrapper[4861]: E0309 09:21:39.856310 4861 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Mar 09 09:21:39 crc kubenswrapper[4861]: E0309 09:21:39.856394 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/72b49679-1f56-42df-bafc-a899cd2da3cf-cert podName:72b49679-1f56-42df-bafc-a899cd2da3cf nodeName:}" failed. No retries permitted until 2026-03-09 09:21:47.856351176 +0000 UTC m=+950.941390587 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/72b49679-1f56-42df-bafc-a899cd2da3cf-cert") pod "openstack-baremetal-operator-controller-manager-6f64bd8c75j8xx4" (UID: "72b49679-1f56-42df-bafc-a899cd2da3cf") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Mar 09 09:21:40 crc kubenswrapper[4861]: I0309 09:21:40.261406 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ec436429-c762-4e15-8f82-19a10cdc7941-metrics-certs\") pod \"openstack-operator-controller-manager-59b6c9788f-lfprc\" (UID: \"ec436429-c762-4e15-8f82-19a10cdc7941\") " pod="openstack-operators/openstack-operator-controller-manager-59b6c9788f-lfprc"
Mar 09 09:21:40 crc kubenswrapper[4861]: I0309 09:21:40.261546 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ec436429-c762-4e15-8f82-19a10cdc7941-webhook-certs\") pod \"openstack-operator-controller-manager-59b6c9788f-lfprc\" (UID: \"ec436429-c762-4e15-8f82-19a10cdc7941\") " pod="openstack-operators/openstack-operator-controller-manager-59b6c9788f-lfprc"
Mar 09 09:21:40 crc kubenswrapper[4861]: E0309 09:21:40.261650 4861 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Mar 09 09:21:40 crc kubenswrapper[4861]: E0309 09:21:40.261684 4861 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Mar 09 09:21:40 crc kubenswrapper[4861]: E0309 09:21:40.261733 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ec436429-c762-4e15-8f82-19a10cdc7941-metrics-certs podName:ec436429-c762-4e15-8f82-19a10cdc7941 nodeName:}" failed. No retries permitted until 2026-03-09 09:21:48.261710088 +0000 UTC m=+951.346749489 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ec436429-c762-4e15-8f82-19a10cdc7941-metrics-certs") pod "openstack-operator-controller-manager-59b6c9788f-lfprc" (UID: "ec436429-c762-4e15-8f82-19a10cdc7941") : secret "metrics-server-cert" not found
Mar 09 09:21:40 crc kubenswrapper[4861]: E0309 09:21:40.261757 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ec436429-c762-4e15-8f82-19a10cdc7941-webhook-certs podName:ec436429-c762-4e15-8f82-19a10cdc7941 nodeName:}" failed. No retries permitted until 2026-03-09 09:21:48.261747029 +0000 UTC m=+951.346786570 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/ec436429-c762-4e15-8f82-19a10cdc7941-webhook-certs") pod "openstack-operator-controller-manager-59b6c9788f-lfprc" (UID: "ec436429-c762-4e15-8f82-19a10cdc7941") : secret "webhook-server-cert" not found
Mar 09 09:21:47 crc kubenswrapper[4861]: E0309 09:21:47.300664 4861 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:172f24bd4603ac3498536a8a2c8fffb07cf9113dd52bc132778ea0aa275c6b84"
Mar 09 09:21:47 crc kubenswrapper[4861]: E0309 09:21:47.301355 4861 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:172f24bd4603ac3498536a8a2c8fffb07cf9113dd52bc132778ea0aa275c6b84,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-kjn5d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-74b6b5dc96-lnm2d_openstack-operators(511b2722-0227-4a4f-931c-e69ad12e60de): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Mar 09 09:21:47 crc kubenswrapper[4861]: E0309 09:21:47.306201 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-lnm2d" podUID="511b2722-0227-4a4f-931c-e69ad12e60de"
Mar 09 09:21:47 crc kubenswrapper[4861]: I0309 09:21:47.576105 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1b8226be-5eb4-4156-a168-f843edac34ce-cert\") pod \"infra-operator-controller-manager-f7fcc58b9-srw9z\" (UID: \"1b8226be-5eb4-4156-a168-f843edac34ce\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-srw9z"
Mar 09 09:21:47 crc kubenswrapper[4861]: I0309 09:21:47.585725 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1b8226be-5eb4-4156-a168-f843edac34ce-cert\") pod \"infra-operator-controller-manager-f7fcc58b9-srw9z\" (UID: \"1b8226be-5eb4-4156-a168-f843edac34ce\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-srw9z"
Mar 09 09:21:47 crc kubenswrapper[4861]: I0309 09:21:47.632305 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-srw9z"
Mar 09 09:21:47 crc kubenswrapper[4861]: I0309 09:21:47.879746 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/72b49679-1f56-42df-bafc-a899cd2da3cf-cert\") pod \"openstack-baremetal-operator-controller-manager-6f64bd8c75j8xx4\" (UID: \"72b49679-1f56-42df-bafc-a899cd2da3cf\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64bd8c75j8xx4"
Mar 09 09:21:47 crc kubenswrapper[4861]: I0309 09:21:47.884458 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/72b49679-1f56-42df-bafc-a899cd2da3cf-cert\") pod \"openstack-baremetal-operator-controller-manager-6f64bd8c75j8xx4\" (UID: \"72b49679-1f56-42df-bafc-a899cd2da3cf\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64bd8c75j8xx4"
Mar 09 09:21:48 crc kubenswrapper[4861]: I0309 09:21:48.017197 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64bd8c75j8xx4"
Mar 09 09:21:48 crc kubenswrapper[4861]: I0309 09:21:48.136428 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-hrnq8" event={"ID":"925518c8-3714-4180-ad1f-9bee534dd0dc","Type":"ContainerStarted","Data":"f26e6c5d139300064a68780f86727c8be917e33819d28ac63f1135ea089e7196"}
Mar 09 09:21:48 crc kubenswrapper[4861]: I0309 09:21:48.136731 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-hrnq8"
Mar 09 09:21:48 crc kubenswrapper[4861]: I0309 09:21:48.144705 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-blz6f" event={"ID":"e9e766bf-fdea-451e-a58c-a8818fccf4b4","Type":"ContainerStarted","Data":"f3ea251c26157144233b8e844894edc338855140afcb8d207914e2e5f2d8afff"}
Mar 09 09:21:48 crc kubenswrapper[4861]: I0309 09:21:48.145066 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-blz6f"
Mar 09 09:21:48 crc kubenswrapper[4861]: I0309 09:21:48.155250 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-pktcs" event={"ID":"fbbd2d76-31fb-46d7-a422-af5f3e51baaf","Type":"ContainerStarted","Data":"9967a21b61f9a40eb3a0531a5b1638ae93808f0dd5f9f75e8d849e4a8b241741"}
Mar 09 09:21:48 crc kubenswrapper[4861]: I0309 09:21:48.155659 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-pktcs"
Mar 09 09:21:48 crc kubenswrapper[4861]: I0309 09:21:48.156967 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-67d996989d-8vnvb" event={"ID":"78b36bea-6c3d-4794-b38a-6b4a5b3e9f5d","Type":"ContainerStarted","Data":"9e348e2a7b29363219a3b43822241921dd7beb36a7e404b8792fae60036acd81"}
Mar 09 09:21:48 crc kubenswrapper[4861]: I0309 09:21:48.157092 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-67d996989d-8vnvb"
Mar 09 09:21:48 crc kubenswrapper[4861]: I0309 09:21:48.157789 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-s4tc6" event={"ID":"eaaa08cd-22f8-40a3-9cac-7e29137ea358","Type":"ContainerStarted","Data":"390845fbfecbdcb2db4b62a0d21acaf0ed5f889f97e5dffad7d7b0133ad884cb"}
Mar 09 09:21:48 crc kubenswrapper[4861]: I0309 09:21:48.157983 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-s4tc6"
Mar 09 09:21:48 crc kubenswrapper[4861]: I0309 09:21:48.169299 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-2cb86" event={"ID":"eec501ad-33c8-4195-8817-3078202db97a","Type":"ContainerStarted","Data":"72434fd0783f0d5f0afb2336759f4b28565f539af7b1ea269a5db16d244dec18"}
Mar 09 09:21:48 crc kubenswrapper[4861]: I0309 09:21:48.169654 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-2cb86"
Mar 09 09:21:48 crc kubenswrapper[4861]: I0309 09:21:48.172451 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-jw5nb" event={"ID":"23566122-1654-40c1-8dd6-577280d0dcec","Type":"ContainerStarted","Data":"1a28f1a29c1e5b0fe05c6f9f0d7e95dea65b8a705e4be2e007fef63fe2295d25"}
Mar 09 09:21:48 crc kubenswrapper[4861]: I0309 09:21:48.172603 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-jw5nb"
Mar 09 09:21:48 crc kubenswrapper[4861]: I0309 09:21:48.176761 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-89f2d" event={"ID":"861a14c0-5dcd-4126-b386-65467726a9dd","Type":"ContainerStarted","Data":"f9a023e7868d153b0666d5340933b147b7f271cf60dfd24082127c988f1405e4"}
Mar 09 09:21:48 crc kubenswrapper[4861]: I0309 09:21:48.177127 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-89f2d"
Mar 09 09:21:48 crc kubenswrapper[4861]: I0309 09:21:48.179155 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-hrnq8" podStartSLOduration=2.475446734 podStartE2EDuration="16.179140729s" podCreationTimestamp="2026-03-09 09:21:32 +0000 UTC" firstStartedPulling="2026-03-09 09:21:33.60546017 +0000 UTC m=+936.690499571" lastFinishedPulling="2026-03-09 09:21:47.309154165 +0000 UTC m=+950.394193566" observedRunningTime="2026-03-09 09:21:48.178570122 +0000 UTC m=+951.263609513" watchObservedRunningTime="2026-03-09 09:21:48.179140729 +0000 UTC m=+951.264180130"
Mar 09 09:21:48 crc kubenswrapper[4861]: I0309 09:21:48.181937 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-lbhm2" event={"ID":"12c3b94d-baff-4b5d-864e-371f5b3857f5","Type":"ContainerStarted","Data":"e299ba0161a241dc9e34b910bc515a0d591ead65caa192efe9e0e01a64fc4c6d"}
Mar 09 09:21:48 crc kubenswrapper[4861]: I0309 09:21:48.182643 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-lbhm2"
Mar 09 09:21:48 crc kubenswrapper[4861]: I0309 09:21:48.184293 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-bwxj2" event={"ID":"32ddb619-584a-4ff4-a988-63d565043353","Type":"ContainerStarted","Data":"4c4be88be327f22152d4be7c4775de330a9167b20fd00bd947fe3860a52c4cd6"}
Mar 09 09:21:48 crc kubenswrapper[4861]: I0309 09:21:48.184930 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-bwxj2"
Mar 09 09:21:48 crc kubenswrapper[4861]: I0309 09:21:48.186648 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-pdxgs" event={"ID":"8fb737b1-978f-4f1e-98db-f1c542ef77d9","Type":"ContainerStarted","Data":"b0e33f5d4bf8d315ebe96eafcb3919d2372393881b110655774d3c442026f193"}
Mar 09 09:21:48 crc kubenswrapper[4861]: I0309 09:21:48.187201 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-pdxgs"
Mar 09 09:21:48 crc kubenswrapper[4861]: I0309 09:21:48.202949 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-67d996989d-8vnvb" podStartSLOduration=3.084513311 podStartE2EDuration="17.20293583s" podCreationTimestamp="2026-03-09 09:21:31 +0000 UTC" firstStartedPulling="2026-03-09 09:21:33.187593106 +0000 UTC m=+936.272632507" lastFinishedPulling="2026-03-09 09:21:47.306015625 +0000 UTC m=+950.391055026" observedRunningTime="2026-03-09 09:21:48.200561001 +0000 UTC m=+951.285600392" watchObservedRunningTime="2026-03-09 09:21:48.20293583 +0000 UTC m=+951.287975231"
Mar 09 09:21:48 crc kubenswrapper[4861]: I0309 09:21:48.213735 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-54688575f-9dhrv" event={"ID":"091caccf-659b-42dd-b9cb-05aeea2548ce","Type":"ContainerStarted","Data":"d97cab5c9c3e44321bd142d3b6f897cc6cbebc63069a46cf1585d89dd789cf64"}
Mar 09 09:21:48 crc kubenswrapper[4861]: I0309 09:21:48.214119 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-54688575f-9dhrv"
Mar 09 09:21:48 crc kubenswrapper[4861]: I0309 09:21:48.230604 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-vlxrh" event={"ID":"fdd188d8-f434-493e-a8f5-3506031b0f83","Type":"ContainerStarted","Data":"2ebe5323d424b1982e1dcfddd8b40b5e50496b0d722b8ae460a8e614c0ed2cf4"}
Mar 09 09:21:48 crc kubenswrapper[4861]: I0309 09:21:48.230644 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-vlxrh"
Mar 09 09:21:48 crc kubenswrapper[4861]: E0309 09:21:48.233968 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:172f24bd4603ac3498536a8a2c8fffb07cf9113dd52bc132778ea0aa275c6b84\\\"\"" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-lnm2d" podUID="511b2722-0227-4a4f-931c-e69ad12e60de"
Mar 09 09:21:48 crc kubenswrapper[4861]: I0309 09:21:48.253610 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-jw5nb" podStartSLOduration=3.247057192 podStartE2EDuration="17.253591111s" podCreationTimestamp="2026-03-09 09:21:31 +0000 UTC" firstStartedPulling="2026-03-09 09:21:33.263079158 +0000 UTC m=+936.348118559" lastFinishedPulling="2026-03-09 09:21:47.269613067 +0000 UTC m=+950.354652478" observedRunningTime="2026-03-09 09:21:48.25355905 +0000 UTC m=+951.338598451" watchObservedRunningTime="2026-03-09 09:21:48.253591111 +0000 UTC m=+951.338630512"
Mar 09 09:21:48 crc kubenswrapper[4861]: I0309 09:21:48.289637 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ec436429-c762-4e15-8f82-19a10cdc7941-webhook-certs\") pod \"openstack-operator-controller-manager-59b6c9788f-lfprc\" (UID: \"ec436429-c762-4e15-8f82-19a10cdc7941\") " pod="openstack-operators/openstack-operator-controller-manager-59b6c9788f-lfprc"
Mar 09 09:21:48 crc kubenswrapper[4861]: I0309 09:21:48.289927 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ec436429-c762-4e15-8f82-19a10cdc7941-metrics-certs\") pod \"openstack-operator-controller-manager-59b6c9788f-lfprc\" (UID: \"ec436429-c762-4e15-8f82-19a10cdc7941\") " pod="openstack-operators/openstack-operator-controller-manager-59b6c9788f-lfprc"
Mar 09 09:21:48 crc kubenswrapper[4861]: E0309 09:21:48.290541 4861 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Mar 09 09:21:48 crc kubenswrapper[4861]: E0309 09:21:48.290668 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ec436429-c762-4e15-8f82-19a10cdc7941-webhook-certs podName:ec436429-c762-4e15-8f82-19a10cdc7941 nodeName:}" failed. No retries permitted until 2026-03-09 09:22:04.290643776 +0000 UTC m=+967.375683187 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/ec436429-c762-4e15-8f82-19a10cdc7941-webhook-certs") pod "openstack-operator-controller-manager-59b6c9788f-lfprc" (UID: "ec436429-c762-4e15-8f82-19a10cdc7941") : secret "webhook-server-cert" not found
Mar 09 09:21:48 crc kubenswrapper[4861]: E0309 09:21:48.291674 4861 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Mar 09 09:21:48 crc kubenswrapper[4861]: E0309 09:21:48.291807 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ec436429-c762-4e15-8f82-19a10cdc7941-metrics-certs podName:ec436429-c762-4e15-8f82-19a10cdc7941 nodeName:}" failed. No retries permitted until 2026-03-09 09:22:04.29179737 +0000 UTC m=+967.376836771 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ec436429-c762-4e15-8f82-19a10cdc7941-metrics-certs") pod "openstack-operator-controller-manager-59b6c9788f-lfprc" (UID: "ec436429-c762-4e15-8f82-19a10cdc7941") : secret "metrics-server-cert" not found
Mar 09 09:21:48 crc kubenswrapper[4861]: I0309 09:21:48.311947 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-s4tc6" podStartSLOduration=2.779451103 podStartE2EDuration="17.311933274s" podCreationTimestamp="2026-03-09 09:21:31 +0000 UTC" firstStartedPulling="2026-03-09 09:21:32.737167227 +0000 UTC m=+935.822206628" lastFinishedPulling="2026-03-09 09:21:47.269649398 +0000 UTC m=+950.354688799" observedRunningTime="2026-03-09 09:21:48.308690311 +0000 UTC m=+951.393729712" watchObservedRunningTime="2026-03-09 09:21:48.311933274 +0000 UTC m=+951.396972675"
Mar 09 09:21:48 crc kubenswrapper[4861]: I0309 09:21:48.364267 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-89f2d" podStartSLOduration=3.857471716 podStartE2EDuration="17.364252524s" podCreationTimestamp="2026-03-09 09:21:31 +0000 UTC" firstStartedPulling="2026-03-09 09:21:33.168517602 +0000 UTC m=+936.253557003" lastFinishedPulling="2026-03-09 09:21:46.67529841 +0000 UTC m=+949.760337811" observedRunningTime="2026-03-09 09:21:48.361757672 +0000 UTC m=+951.446797073" watchObservedRunningTime="2026-03-09 09:21:48.364252524 +0000 UTC m=+951.449291925"
Mar 09 09:21:48 crc kubenswrapper[4861]: I0309 09:21:48.385218 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-pktcs" podStartSLOduration=3.1529624800000002 podStartE2EDuration="17.385200213s" podCreationTimestamp="2026-03-09 09:21:31 +0000 UTC" firstStartedPulling="2026-03-09 09:21:33.160200311 +0000 UTC m=+936.245239712" lastFinishedPulling="2026-03-09 09:21:47.392438044 +0000 UTC m=+950.477477445" observedRunningTime="2026-03-09 09:21:48.379682482 +0000 UTC m=+951.464721873" watchObservedRunningTime="2026-03-09 09:21:48.385200213 +0000 UTC m=+951.470239614"
Mar 09 09:21:48 crc kubenswrapper[4861]: I0309 09:21:48.477942 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-blz6f" podStartSLOduration=3.819319519 podStartE2EDuration="17.477925455s" podCreationTimestamp="2026-03-09 09:21:31 +0000 UTC" firstStartedPulling="2026-03-09 09:21:33.610187057 +0000 UTC m=+936.695226458" lastFinishedPulling="2026-03-09 09:21:47.268792993 +0000 UTC m=+950.353832394" observedRunningTime="2026-03-09 09:21:48.459722157 +0000 UTC m=+951.544761548" watchObservedRunningTime="2026-03-09 09:21:48.477925455 +0000 UTC m=+951.562964856"
Mar 09 09:21:48 crc kubenswrapper[4861]: I0309 09:21:48.501393 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-2cb86" podStartSLOduration=3.502557001 podStartE2EDuration="17.501357966s" podCreationTimestamp="2026-03-09 09:21:31 +0000 UTC" firstStartedPulling="2026-03-09 09:21:33.272136231 +0000 UTC m=+936.357175632" lastFinishedPulling="2026-03-09 09:21:47.270937196 +0000 UTC m=+950.355976597" observedRunningTime="2026-03-09 09:21:48.482549149 +0000 UTC m=+951.567588550" watchObservedRunningTime="2026-03-09 09:21:48.501357966 +0000 UTC m=+951.586397367"
Mar 09 09:21:48 crc kubenswrapper[4861]: I0309 09:21:48.543538 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-vlxrh" podStartSLOduration=2.829600357 podStartE2EDuration="16.54352067s" podCreationTimestamp="2026-03-09 09:21:32 +0000 UTC" firstStartedPulling="2026-03-09 09:21:33.590237848 +0000 UTC m=+936.675277249" lastFinishedPulling="2026-03-09 09:21:47.304158161 +0000 UTC m=+950.389197562" observedRunningTime="2026-03-09 09:21:48.542932372 +0000 UTC m=+951.627971763" watchObservedRunningTime="2026-03-09 09:21:48.54352067 +0000 UTC m=+951.628560071"
Mar 09 09:21:48 crc kubenswrapper[4861]: I0309 09:21:48.544206 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-bwxj2" podStartSLOduration=3.443433294 podStartE2EDuration="17.544200009s" podCreationTimestamp="2026-03-09 09:21:31 +0000 UTC" firstStartedPulling="2026-03-09 09:21:33.168949445 +0000 UTC m=+936.253988846" lastFinishedPulling="2026-03-09 09:21:47.26971614 +0000 UTC m=+950.354755561" observedRunningTime="2026-03-09 09:21:48.516760143 +0000 UTC m=+951.601799544" watchObservedRunningTime="2026-03-09 09:21:48.544200009 +0000 UTC m=+951.629239410"
Mar 09 09:21:48 crc kubenswrapper[4861]: I0309 09:21:48.567794 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration"
pod="openstack-operators/neutron-operator-controller-manager-54688575f-9dhrv" podStartSLOduration=3.70744408 podStartE2EDuration="17.567774454s" podCreationTimestamp="2026-03-09 09:21:31 +0000 UTC" firstStartedPulling="2026-03-09 09:21:33.410455918 +0000 UTC m=+936.495495319" lastFinishedPulling="2026-03-09 09:21:47.270786292 +0000 UTC m=+950.355825693" observedRunningTime="2026-03-09 09:21:48.56624294 +0000 UTC m=+951.651282351" watchObservedRunningTime="2026-03-09 09:21:48.567774454 +0000 UTC m=+951.652813855" Mar 09 09:21:48 crc kubenswrapper[4861]: I0309 09:21:48.598730 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-lbhm2" podStartSLOduration=3.512293594 podStartE2EDuration="17.598710902s" podCreationTimestamp="2026-03-09 09:21:31 +0000 UTC" firstStartedPulling="2026-03-09 09:21:33.182360505 +0000 UTC m=+936.267399906" lastFinishedPulling="2026-03-09 09:21:47.268777793 +0000 UTC m=+950.353817214" observedRunningTime="2026-03-09 09:21:48.597895909 +0000 UTC m=+951.682935310" watchObservedRunningTime="2026-03-09 09:21:48.598710902 +0000 UTC m=+951.683750293" Mar 09 09:21:50 crc kubenswrapper[4861]: I0309 09:21:50.827347 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-pdxgs" podStartSLOduration=5.368679832 podStartE2EDuration="19.827326659s" podCreationTimestamp="2026-03-09 09:21:31 +0000 UTC" firstStartedPulling="2026-03-09 09:21:32.811449564 +0000 UTC m=+935.896488965" lastFinishedPulling="2026-03-09 09:21:47.270096391 +0000 UTC m=+950.355135792" observedRunningTime="2026-03-09 09:21:48.676163782 +0000 UTC m=+951.761203183" watchObservedRunningTime="2026-03-09 09:21:50.827326659 +0000 UTC m=+953.912366060" Mar 09 09:21:50 crc kubenswrapper[4861]: I0309 09:21:50.833225 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/infra-operator-controller-manager-f7fcc58b9-srw9z"] Mar 09 09:21:50 crc kubenswrapper[4861]: I0309 09:21:50.932515 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6f64bd8c75j8xx4"] Mar 09 09:21:51 crc kubenswrapper[4861]: I0309 09:21:51.254054 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-2bqxz" event={"ID":"b386f8ad-7867-4d35-83f8-382a379e3c1e","Type":"ContainerStarted","Data":"f727f384a0d189eb7436fdcc02fa4ff3eec5edce4383162d9735dc4295909e83"} Mar 09 09:21:51 crc kubenswrapper[4861]: I0309 09:21:51.254351 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-2bqxz" Mar 09 09:21:51 crc kubenswrapper[4861]: I0309 09:21:51.255874 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-srw9z" event={"ID":"1b8226be-5eb4-4156-a168-f843edac34ce","Type":"ContainerStarted","Data":"8fc34e1d8776ac46ec1232bea4ffbae002c19ec31043b0f6a333da7c472fdc2c"} Mar 09 09:21:51 crc kubenswrapper[4861]: I0309 09:21:51.258117 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64bd8c75j8xx4" event={"ID":"72b49679-1f56-42df-bafc-a899cd2da3cf","Type":"ContainerStarted","Data":"4384b7c6c56d2f0c2c3f33e85861ec60e91d991a7658fb61b811c14576fed490"} Mar 09 09:21:51 crc kubenswrapper[4861]: I0309 09:21:51.260395 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-qqwld" event={"ID":"73ae93db-3260-4af6-9724-52e8b97a0245","Type":"ContainerStarted","Data":"65c4532cac9ebbbf89f09f9f6a17647ab3d7f7aae69141cb2504f2adeb4dabd4"} Mar 09 09:21:51 crc kubenswrapper[4861]: I0309 09:21:51.260832 4861 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-qqwld" Mar 09 09:21:51 crc kubenswrapper[4861]: I0309 09:21:51.273119 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-2bqxz" podStartSLOduration=3.501391296 podStartE2EDuration="20.273105233s" podCreationTimestamp="2026-03-09 09:21:31 +0000 UTC" firstStartedPulling="2026-03-09 09:21:33.617215851 +0000 UTC m=+936.702255252" lastFinishedPulling="2026-03-09 09:21:50.388929778 +0000 UTC m=+953.473969189" observedRunningTime="2026-03-09 09:21:51.26781168 +0000 UTC m=+954.352851081" watchObservedRunningTime="2026-03-09 09:21:51.273105233 +0000 UTC m=+954.358144634" Mar 09 09:21:51 crc kubenswrapper[4861]: I0309 09:21:51.288173 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-qqwld" podStartSLOduration=3.562415048 podStartE2EDuration="20.28815224s" podCreationTimestamp="2026-03-09 09:21:31 +0000 UTC" firstStartedPulling="2026-03-09 09:21:33.662749664 +0000 UTC m=+936.747789065" lastFinishedPulling="2026-03-09 09:21:50.388486856 +0000 UTC m=+953.473526257" observedRunningTime="2026-03-09 09:21:51.280789457 +0000 UTC m=+954.365828878" watchObservedRunningTime="2026-03-09 09:21:51.28815224 +0000 UTC m=+954.373191641" Mar 09 09:21:52 crc kubenswrapper[4861]: I0309 09:21:52.020497 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-bwxj2" Mar 09 09:21:52 crc kubenswrapper[4861]: I0309 09:21:52.112266 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-pktcs" Mar 09 09:21:52 crc kubenswrapper[4861]: I0309 09:21:52.157873 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/manila-operator-controller-manager-67d996989d-8vnvb" Mar 09 09:21:52 crc kubenswrapper[4861]: I0309 09:21:52.236982 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-jw5nb" Mar 09 09:21:52 crc kubenswrapper[4861]: I0309 09:21:52.256700 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-lbhm2" Mar 09 09:21:52 crc kubenswrapper[4861]: I0309 09:21:52.300155 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-54688575f-9dhrv" Mar 09 09:21:52 crc kubenswrapper[4861]: I0309 09:21:52.391772 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-blz6f" Mar 09 09:21:52 crc kubenswrapper[4861]: I0309 09:21:52.402048 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-2cb86" Mar 09 09:21:52 crc kubenswrapper[4861]: I0309 09:21:52.530546 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-hrnq8" Mar 09 09:21:52 crc kubenswrapper[4861]: I0309 09:21:52.584107 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-vlxrh" Mar 09 09:21:54 crc kubenswrapper[4861]: I0309 09:21:54.606337 4861 patch_prober.go:28] interesting pod/machine-config-daemon-5g7gc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 09:21:54 crc kubenswrapper[4861]: I0309 09:21:54.606827 
4861 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 09:22:00 crc kubenswrapper[4861]: I0309 09:22:00.133925 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550802-7tqjn"] Mar 09 09:22:00 crc kubenswrapper[4861]: I0309 09:22:00.135428 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550802-7tqjn" Mar 09 09:22:00 crc kubenswrapper[4861]: I0309 09:22:00.137729 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 09:22:00 crc kubenswrapper[4861]: I0309 09:22:00.137802 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 09:22:00 crc kubenswrapper[4861]: I0309 09:22:00.137829 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k86l8" Mar 09 09:22:00 crc kubenswrapper[4861]: I0309 09:22:00.141164 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550802-7tqjn"] Mar 09 09:22:00 crc kubenswrapper[4861]: I0309 09:22:00.287352 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trcxg\" (UniqueName: \"kubernetes.io/projected/7d6f6548-9254-43a1-b236-d393886fb553-kube-api-access-trcxg\") pod \"auto-csr-approver-29550802-7tqjn\" (UID: \"7d6f6548-9254-43a1-b236-d393886fb553\") " pod="openshift-infra/auto-csr-approver-29550802-7tqjn" Mar 09 09:22:00 crc kubenswrapper[4861]: I0309 09:22:00.389151 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-trcxg\" (UniqueName: \"kubernetes.io/projected/7d6f6548-9254-43a1-b236-d393886fb553-kube-api-access-trcxg\") pod \"auto-csr-approver-29550802-7tqjn\" (UID: \"7d6f6548-9254-43a1-b236-d393886fb553\") " pod="openshift-infra/auto-csr-approver-29550802-7tqjn" Mar 09 09:22:00 crc kubenswrapper[4861]: I0309 09:22:00.407165 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trcxg\" (UniqueName: \"kubernetes.io/projected/7d6f6548-9254-43a1-b236-d393886fb553-kube-api-access-trcxg\") pod \"auto-csr-approver-29550802-7tqjn\" (UID: \"7d6f6548-9254-43a1-b236-d393886fb553\") " pod="openshift-infra/auto-csr-approver-29550802-7tqjn" Mar 09 09:22:00 crc kubenswrapper[4861]: I0309 09:22:00.460158 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550802-7tqjn" Mar 09 09:22:01 crc kubenswrapper[4861]: I0309 09:22:01.870303 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-s4tc6" Mar 09 09:22:01 crc kubenswrapper[4861]: I0309 09:22:01.897591 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-89f2d" Mar 09 09:22:01 crc kubenswrapper[4861]: I0309 09:22:01.937539 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-pdxgs" Mar 09 09:22:02 crc kubenswrapper[4861]: E0309 09:22:02.219725 4861 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/watcher-operator@sha256:06311600a491c689493552e7ff26e36df740fa4e7c143fca874bef19f24afb97" Mar 09 09:22:02 crc kubenswrapper[4861]: E0309 09:22:02.219944 4861 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:06311600a491c689493552e7ff26e36df740fa4e7c143fca874bef19f24afb97,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4tmkt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-bccc79885-pmwxp_openstack-operators(4cbd2609-0983-4da2-a0c7-fa66387e36ae): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 09 09:22:02 crc kubenswrapper[4861]: E0309 09:22:02.221162 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-pmwxp" podUID="4cbd2609-0983-4da2-a0c7-fa66387e36ae" Mar 09 09:22:02 crc kubenswrapper[4861]: I0309 09:22:02.478602 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-qqwld" Mar 09 09:22:02 crc kubenswrapper[4861]: I0309 09:22:02.501739 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-2bqxz" Mar 09 09:22:02 crc kubenswrapper[4861]: E0309 09:22:02.845820 4861 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = 
copying config: context canceled" image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2" Mar 09 09:22:02 crc kubenswrapper[4861]: E0309 09:22:02.846477 4861 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-22pjg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-xnfpv_openstack-operators(e649bda4-59a3-47e6-92e2-910c01b2f7c2): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 09 09:22:02 crc kubenswrapper[4861]: E0309 09:22:02.847836 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xnfpv" podUID="e649bda4-59a3-47e6-92e2-910c01b2f7c2" Mar 09 09:22:03 crc kubenswrapper[4861]: I0309 09:22:03.287104 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550802-7tqjn"] Mar 09 09:22:03 crc kubenswrapper[4861]: W0309 09:22:03.290911 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7d6f6548_9254_43a1_b236_d393886fb553.slice/crio-fc85d5f734989b6de5aa164c6c5302383035a2205b90fe546494d7be9a4c31c9 WatchSource:0}: Error 
finding container fc85d5f734989b6de5aa164c6c5302383035a2205b90fe546494d7be9a4c31c9: Status 404 returned error can't find the container with id fc85d5f734989b6de5aa164c6c5302383035a2205b90fe546494d7be9a4c31c9 Mar 09 09:22:03 crc kubenswrapper[4861]: I0309 09:22:03.364204 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64bd8c75j8xx4" event={"ID":"72b49679-1f56-42df-bafc-a899cd2da3cf","Type":"ContainerStarted","Data":"3a38199c78ccc0bb9a1be2ee6d695817be283f5a5fa2aa51ad0a4dab8afbbaa6"} Mar 09 09:22:03 crc kubenswrapper[4861]: I0309 09:22:03.364295 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64bd8c75j8xx4" Mar 09 09:22:03 crc kubenswrapper[4861]: I0309 09:22:03.365764 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-ztmwj" event={"ID":"175e8d6d-930a-484b-b0b3-d45f37da4239","Type":"ContainerStarted","Data":"adfa1aea6ddc7e749bd40e46b30a6944a665d496b9004b44260aa2c3e9049890"} Mar 09 09:22:03 crc kubenswrapper[4861]: I0309 09:22:03.365906 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-ztmwj" Mar 09 09:22:03 crc kubenswrapper[4861]: I0309 09:22:03.367214 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-2wbrm" event={"ID":"5cccaa46-1901-457b-b093-9edfb512b68f","Type":"ContainerStarted","Data":"627418663c4a751cdb30b928f9f461f805c09d92aadbd11b7b82a752c48cf289"} Mar 09 09:22:03 crc kubenswrapper[4861]: I0309 09:22:03.367460 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-2wbrm" Mar 09 09:22:03 crc kubenswrapper[4861]: I0309 09:22:03.369134 4861 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550802-7tqjn" event={"ID":"7d6f6548-9254-43a1-b236-d393886fb553","Type":"ContainerStarted","Data":"fc85d5f734989b6de5aa164c6c5302383035a2205b90fe546494d7be9a4c31c9"} Mar 09 09:22:03 crc kubenswrapper[4861]: I0309 09:22:03.370358 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-srw9z" event={"ID":"1b8226be-5eb4-4156-a168-f843edac34ce","Type":"ContainerStarted","Data":"20210e4f84e89b23646ef8d0cf0ea9cc65e866846d8615da6f69b29a2072c7f3"} Mar 09 09:22:03 crc kubenswrapper[4861]: I0309 09:22:03.370463 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-srw9z" Mar 09 09:22:03 crc kubenswrapper[4861]: I0309 09:22:03.398084 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64bd8c75j8xx4" podStartSLOduration=20.525419079 podStartE2EDuration="32.398066885s" podCreationTimestamp="2026-03-09 09:21:31 +0000 UTC" firstStartedPulling="2026-03-09 09:21:50.950817604 +0000 UTC m=+954.035857005" lastFinishedPulling="2026-03-09 09:22:02.82346541 +0000 UTC m=+965.908504811" observedRunningTime="2026-03-09 09:22:03.392806332 +0000 UTC m=+966.477845753" watchObservedRunningTime="2026-03-09 09:22:03.398066885 +0000 UTC m=+966.483106286" Mar 09 09:22:03 crc kubenswrapper[4861]: I0309 09:22:03.427350 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-2wbrm" podStartSLOduration=3.266493975 podStartE2EDuration="32.427330415s" podCreationTimestamp="2026-03-09 09:21:31 +0000 UTC" firstStartedPulling="2026-03-09 09:21:33.662816925 +0000 UTC m=+936.747856326" lastFinishedPulling="2026-03-09 09:22:02.823653365 +0000 UTC m=+965.908692766" observedRunningTime="2026-03-09 
09:22:03.423639377 +0000 UTC m=+966.508678788" watchObservedRunningTime="2026-03-09 09:22:03.427330415 +0000 UTC m=+966.512369816" Mar 09 09:22:03 crc kubenswrapper[4861]: I0309 09:22:03.444319 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-ztmwj" podStartSLOduration=2.259518302 podStartE2EDuration="31.444301927s" podCreationTimestamp="2026-03-09 09:21:32 +0000 UTC" firstStartedPulling="2026-03-09 09:21:33.649402325 +0000 UTC m=+936.734441726" lastFinishedPulling="2026-03-09 09:22:02.83418595 +0000 UTC m=+965.919225351" observedRunningTime="2026-03-09 09:22:03.442011121 +0000 UTC m=+966.527050542" watchObservedRunningTime="2026-03-09 09:22:03.444301927 +0000 UTC m=+966.529341338" Mar 09 09:22:03 crc kubenswrapper[4861]: I0309 09:22:03.458282 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-srw9z" podStartSLOduration=20.463806522 podStartE2EDuration="32.458267053s" podCreationTimestamp="2026-03-09 09:21:31 +0000 UTC" firstStartedPulling="2026-03-09 09:21:50.841033067 +0000 UTC m=+953.926072468" lastFinishedPulling="2026-03-09 09:22:02.835493598 +0000 UTC m=+965.920532999" observedRunningTime="2026-03-09 09:22:03.454844094 +0000 UTC m=+966.539883495" watchObservedRunningTime="2026-03-09 09:22:03.458267053 +0000 UTC m=+966.543306444" Mar 09 09:22:04 crc kubenswrapper[4861]: I0309 09:22:04.344855 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ec436429-c762-4e15-8f82-19a10cdc7941-webhook-certs\") pod \"openstack-operator-controller-manager-59b6c9788f-lfprc\" (UID: \"ec436429-c762-4e15-8f82-19a10cdc7941\") " pod="openstack-operators/openstack-operator-controller-manager-59b6c9788f-lfprc" Mar 09 09:22:04 crc kubenswrapper[4861]: I0309 09:22:04.345221 4861 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ec436429-c762-4e15-8f82-19a10cdc7941-metrics-certs\") pod \"openstack-operator-controller-manager-59b6c9788f-lfprc\" (UID: \"ec436429-c762-4e15-8f82-19a10cdc7941\") " pod="openstack-operators/openstack-operator-controller-manager-59b6c9788f-lfprc" Mar 09 09:22:04 crc kubenswrapper[4861]: I0309 09:22:04.350600 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ec436429-c762-4e15-8f82-19a10cdc7941-webhook-certs\") pod \"openstack-operator-controller-manager-59b6c9788f-lfprc\" (UID: \"ec436429-c762-4e15-8f82-19a10cdc7941\") " pod="openstack-operators/openstack-operator-controller-manager-59b6c9788f-lfprc" Mar 09 09:22:04 crc kubenswrapper[4861]: I0309 09:22:04.350754 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ec436429-c762-4e15-8f82-19a10cdc7941-metrics-certs\") pod \"openstack-operator-controller-manager-59b6c9788f-lfprc\" (UID: \"ec436429-c762-4e15-8f82-19a10cdc7941\") " pod="openstack-operators/openstack-operator-controller-manager-59b6c9788f-lfprc" Mar 09 09:22:04 crc kubenswrapper[4861]: I0309 09:22:04.381652 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-lnm2d" event={"ID":"511b2722-0227-4a4f-931c-e69ad12e60de","Type":"ContainerStarted","Data":"468b88d9c4da835ce0e03f3d83983ec8c716888ed9fa83b04ae2cd696806e8d2"} Mar 09 09:22:04 crc kubenswrapper[4861]: I0309 09:22:04.382083 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-lnm2d" Mar 09 09:22:04 crc kubenswrapper[4861]: I0309 09:22:04.399461 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-lnm2d" 
podStartSLOduration=2.593440031 podStartE2EDuration="33.399446723s" podCreationTimestamp="2026-03-09 09:21:31 +0000 UTC" firstStartedPulling="2026-03-09 09:21:33.260759561 +0000 UTC m=+936.345798962" lastFinishedPulling="2026-03-09 09:22:04.066766253 +0000 UTC m=+967.151805654" observedRunningTime="2026-03-09 09:22:04.394309065 +0000 UTC m=+967.479348466" watchObservedRunningTime="2026-03-09 09:22:04.399446723 +0000 UTC m=+967.484486124" Mar 09 09:22:04 crc kubenswrapper[4861]: I0309 09:22:04.590141 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-mmqjb" Mar 09 09:22:04 crc kubenswrapper[4861]: I0309 09:22:04.598463 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-59b6c9788f-lfprc" Mar 09 09:22:05 crc kubenswrapper[4861]: I0309 09:22:05.269232 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-59b6c9788f-lfprc"] Mar 09 09:22:05 crc kubenswrapper[4861]: W0309 09:22:05.276528 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec436429_c762_4e15_8f82_19a10cdc7941.slice/crio-fb1dadfbbf7448afe27158346d6aae00de8fc4a10221ac326e4617ed0b223902 WatchSource:0}: Error finding container fb1dadfbbf7448afe27158346d6aae00de8fc4a10221ac326e4617ed0b223902: Status 404 returned error can't find the container with id fb1dadfbbf7448afe27158346d6aae00de8fc4a10221ac326e4617ed0b223902 Mar 09 09:22:05 crc kubenswrapper[4861]: I0309 09:22:05.389669 4861 generic.go:334] "Generic (PLEG): container finished" podID="7d6f6548-9254-43a1-b236-d393886fb553" containerID="ff7c9025092c9f3c0f4abb7501ffbadeb30751bc3f3e130b40b0e00b37b51d64" exitCode=0 Mar 09 09:22:05 crc kubenswrapper[4861]: I0309 09:22:05.389743 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29550802-7tqjn" event={"ID":"7d6f6548-9254-43a1-b236-d393886fb553","Type":"ContainerDied","Data":"ff7c9025092c9f3c0f4abb7501ffbadeb30751bc3f3e130b40b0e00b37b51d64"} Mar 09 09:22:05 crc kubenswrapper[4861]: I0309 09:22:05.390939 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-59b6c9788f-lfprc" event={"ID":"ec436429-c762-4e15-8f82-19a10cdc7941","Type":"ContainerStarted","Data":"fb1dadfbbf7448afe27158346d6aae00de8fc4a10221ac326e4617ed0b223902"} Mar 09 09:22:06 crc kubenswrapper[4861]: I0309 09:22:06.400036 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-59b6c9788f-lfprc" event={"ID":"ec436429-c762-4e15-8f82-19a10cdc7941","Type":"ContainerStarted","Data":"bfd4d5b19bf5effdbf360c2351327d0666567f8a407a09960abd1fd09642d445"} Mar 09 09:22:06 crc kubenswrapper[4861]: I0309 09:22:06.400434 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-59b6c9788f-lfprc" Mar 09 09:22:06 crc kubenswrapper[4861]: I0309 09:22:06.446826 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-59b6c9788f-lfprc" podStartSLOduration=34.446796616 podStartE2EDuration="34.446796616s" podCreationTimestamp="2026-03-09 09:21:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:22:06.435596051 +0000 UTC m=+969.520635462" watchObservedRunningTime="2026-03-09 09:22:06.446796616 +0000 UTC m=+969.531836057" Mar 09 09:22:06 crc kubenswrapper[4861]: I0309 09:22:06.757931 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550802-7tqjn" Mar 09 09:22:06 crc kubenswrapper[4861]: I0309 09:22:06.879398 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-trcxg\" (UniqueName: \"kubernetes.io/projected/7d6f6548-9254-43a1-b236-d393886fb553-kube-api-access-trcxg\") pod \"7d6f6548-9254-43a1-b236-d393886fb553\" (UID: \"7d6f6548-9254-43a1-b236-d393886fb553\") " Mar 09 09:22:06 crc kubenswrapper[4861]: I0309 09:22:06.887601 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d6f6548-9254-43a1-b236-d393886fb553-kube-api-access-trcxg" (OuterVolumeSpecName: "kube-api-access-trcxg") pod "7d6f6548-9254-43a1-b236-d393886fb553" (UID: "7d6f6548-9254-43a1-b236-d393886fb553"). InnerVolumeSpecName "kube-api-access-trcxg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:22:06 crc kubenswrapper[4861]: I0309 09:22:06.980846 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-trcxg\" (UniqueName: \"kubernetes.io/projected/7d6f6548-9254-43a1-b236-d393886fb553-kube-api-access-trcxg\") on node \"crc\" DevicePath \"\"" Mar 09 09:22:07 crc kubenswrapper[4861]: I0309 09:22:07.411088 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550802-7tqjn" Mar 09 09:22:07 crc kubenswrapper[4861]: I0309 09:22:07.411084 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550802-7tqjn" event={"ID":"7d6f6548-9254-43a1-b236-d393886fb553","Type":"ContainerDied","Data":"fc85d5f734989b6de5aa164c6c5302383035a2205b90fe546494d7be9a4c31c9"} Mar 09 09:22:07 crc kubenswrapper[4861]: I0309 09:22:07.411639 4861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fc85d5f734989b6de5aa164c6c5302383035a2205b90fe546494d7be9a4c31c9" Mar 09 09:22:07 crc kubenswrapper[4861]: I0309 09:22:07.816750 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550796-fhd9r"] Mar 09 09:22:07 crc kubenswrapper[4861]: I0309 09:22:07.823051 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550796-fhd9r"] Mar 09 09:22:08 crc kubenswrapper[4861]: I0309 09:22:08.025252 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6f64bd8c75j8xx4" Mar 09 09:22:09 crc kubenswrapper[4861]: I0309 09:22:09.666224 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6640fac9-5944-4811-9b15-15741cc9d35c" path="/var/lib/kubelet/pods/6640fac9-5944-4811-9b15-15741cc9d35c/volumes" Mar 09 09:22:12 crc kubenswrapper[4861]: I0309 09:22:12.210912 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-2wbrm" Mar 09 09:22:12 crc kubenswrapper[4861]: I0309 09:22:12.340237 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-lnm2d" Mar 09 09:22:12 crc kubenswrapper[4861]: I0309 09:22:12.626515 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-ztmwj" Mar 09 09:22:13 crc kubenswrapper[4861]: E0309 09:22:13.660408 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:06311600a491c689493552e7ff26e36df740fa4e7c143fca874bef19f24afb97\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-pmwxp" podUID="4cbd2609-0983-4da2-a0c7-fa66387e36ae" Mar 09 09:22:14 crc kubenswrapper[4861]: I0309 09:22:14.607856 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-59b6c9788f-lfprc" Mar 09 09:22:15 crc kubenswrapper[4861]: E0309 09:22:15.660460 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xnfpv" podUID="e649bda4-59a3-47e6-92e2-910c01b2f7c2" Mar 09 09:22:17 crc kubenswrapper[4861]: I0309 09:22:17.646555 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-srw9z" Mar 09 09:22:24 crc kubenswrapper[4861]: I0309 09:22:24.605929 4861 patch_prober.go:28] interesting pod/machine-config-daemon-5g7gc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 09:22:24 crc kubenswrapper[4861]: I0309 09:22:24.606577 4861 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 09:22:27 crc kubenswrapper[4861]: I0309 09:22:27.558815 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xnfpv" event={"ID":"e649bda4-59a3-47e6-92e2-910c01b2f7c2","Type":"ContainerStarted","Data":"372c9797ef950232f28b6863d77b639d5da57734b6f3f0b96f371e7492d11e6d"} Mar 09 09:22:27 crc kubenswrapper[4861]: I0309 09:22:27.583589 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xnfpv" podStartSLOduration=2.05544504 podStartE2EDuration="55.583564863s" podCreationTimestamp="2026-03-09 09:21:32 +0000 UTC" firstStartedPulling="2026-03-09 09:21:33.667500171 +0000 UTC m=+936.752539572" lastFinishedPulling="2026-03-09 09:22:27.195619994 +0000 UTC m=+990.280659395" observedRunningTime="2026-03-09 09:22:27.574802241 +0000 UTC m=+990.659841672" watchObservedRunningTime="2026-03-09 09:22:27.583564863 +0000 UTC m=+990.668604284" Mar 09 09:22:28 crc kubenswrapper[4861]: I0309 09:22:28.566311 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-pmwxp" event={"ID":"4cbd2609-0983-4da2-a0c7-fa66387e36ae","Type":"ContainerStarted","Data":"166bc1498da5557d975bee70b837aacb52d7e1be0f2c94db5964268d3c64c018"} Mar 09 09:22:28 crc kubenswrapper[4861]: I0309 09:22:28.566847 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-pmwxp" Mar 09 09:22:28 crc kubenswrapper[4861]: I0309 09:22:28.587992 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/watcher-operator-controller-manager-bccc79885-pmwxp" podStartSLOduration=2.114313482 podStartE2EDuration="56.587975494s" podCreationTimestamp="2026-03-09 09:21:32 +0000 UTC" firstStartedPulling="2026-03-09 09:21:33.672299511 +0000 UTC m=+936.757345612" lastFinishedPulling="2026-03-09 09:22:28.145968213 +0000 UTC m=+991.231007624" observedRunningTime="2026-03-09 09:22:28.583423919 +0000 UTC m=+991.668463340" watchObservedRunningTime="2026-03-09 09:22:28.587975494 +0000 UTC m=+991.673014895" Mar 09 09:22:42 crc kubenswrapper[4861]: I0309 09:22:42.682717 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-pmwxp" Mar 09 09:22:54 crc kubenswrapper[4861]: I0309 09:22:54.606445 4861 patch_prober.go:28] interesting pod/machine-config-daemon-5g7gc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 09:22:54 crc kubenswrapper[4861]: I0309 09:22:54.607189 4861 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 09:22:54 crc kubenswrapper[4861]: I0309 09:22:54.607251 4861 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" Mar 09 09:22:54 crc kubenswrapper[4861]: I0309 09:22:54.608133 4861 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"496f42623ddb6e7fafcb7e74986e05b309e695c4366816fec0ffd01b5c0a1be9"} 
pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 09 09:22:54 crc kubenswrapper[4861]: I0309 09:22:54.608215 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" containerName="machine-config-daemon" containerID="cri-o://496f42623ddb6e7fafcb7e74986e05b309e695c4366816fec0ffd01b5c0a1be9" gracePeriod=600 Mar 09 09:22:54 crc kubenswrapper[4861]: I0309 09:22:54.802154 4861 generic.go:334] "Generic (PLEG): container finished" podID="6f7875e3-174f-4c67-8675-d878de74aa4f" containerID="496f42623ddb6e7fafcb7e74986e05b309e695c4366816fec0ffd01b5c0a1be9" exitCode=0 Mar 09 09:22:54 crc kubenswrapper[4861]: I0309 09:22:54.802200 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" event={"ID":"6f7875e3-174f-4c67-8675-d878de74aa4f","Type":"ContainerDied","Data":"496f42623ddb6e7fafcb7e74986e05b309e695c4366816fec0ffd01b5c0a1be9"} Mar 09 09:22:54 crc kubenswrapper[4861]: I0309 09:22:54.802295 4861 scope.go:117] "RemoveContainer" containerID="e0df4d92a9184d4707aae8be303bba70127fc6e2155c0877c558c19c847ce33b" Mar 09 09:22:55 crc kubenswrapper[4861]: I0309 09:22:55.811666 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" event={"ID":"6f7875e3-174f-4c67-8675-d878de74aa4f","Type":"ContainerStarted","Data":"925e707d587470ad152a1a9ef2490c9fccb36de6da22acc63f3054b647081cf1"} Mar 09 09:23:00 crc kubenswrapper[4861]: I0309 09:23:00.985088 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-589db6c89c-wzkz2"] Mar 09 09:23:00 crc kubenswrapper[4861]: E0309 09:23:00.985884 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d6f6548-9254-43a1-b236-d393886fb553" 
containerName="oc" Mar 09 09:23:00 crc kubenswrapper[4861]: I0309 09:23:00.985897 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d6f6548-9254-43a1-b236-d393886fb553" containerName="oc" Mar 09 09:23:00 crc kubenswrapper[4861]: I0309 09:23:00.986036 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d6f6548-9254-43a1-b236-d393886fb553" containerName="oc" Mar 09 09:23:00 crc kubenswrapper[4861]: I0309 09:23:00.986670 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-589db6c89c-wzkz2" Mar 09 09:23:00 crc kubenswrapper[4861]: I0309 09:23:00.990704 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Mar 09 09:23:00 crc kubenswrapper[4861]: I0309 09:23:00.990842 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Mar 09 09:23:00 crc kubenswrapper[4861]: I0309 09:23:00.990929 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Mar 09 09:23:00 crc kubenswrapper[4861]: I0309 09:23:00.991212 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-cqng9" Mar 09 09:23:01 crc kubenswrapper[4861]: I0309 09:23:01.003773 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-589db6c89c-wzkz2"] Mar 09 09:23:01 crc kubenswrapper[4861]: I0309 09:23:01.038256 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86bbd886cf-47h2z"] Mar 09 09:23:01 crc kubenswrapper[4861]: I0309 09:23:01.039623 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86bbd886cf-47h2z" Mar 09 09:23:01 crc kubenswrapper[4861]: I0309 09:23:01.042300 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Mar 09 09:23:01 crc kubenswrapper[4861]: I0309 09:23:01.050758 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86bbd886cf-47h2z"] Mar 09 09:23:01 crc kubenswrapper[4861]: I0309 09:23:01.078660 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-st6tz\" (UniqueName: \"kubernetes.io/projected/a3b1b7ac-3322-4520-892a-1c881ee10c50-kube-api-access-st6tz\") pod \"dnsmasq-dns-589db6c89c-wzkz2\" (UID: \"a3b1b7ac-3322-4520-892a-1c881ee10c50\") " pod="openstack/dnsmasq-dns-589db6c89c-wzkz2" Mar 09 09:23:01 crc kubenswrapper[4861]: I0309 09:23:01.078815 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3b1b7ac-3322-4520-892a-1c881ee10c50-config\") pod \"dnsmasq-dns-589db6c89c-wzkz2\" (UID: \"a3b1b7ac-3322-4520-892a-1c881ee10c50\") " pod="openstack/dnsmasq-dns-589db6c89c-wzkz2" Mar 09 09:23:01 crc kubenswrapper[4861]: I0309 09:23:01.180161 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-st6tz\" (UniqueName: \"kubernetes.io/projected/a3b1b7ac-3322-4520-892a-1c881ee10c50-kube-api-access-st6tz\") pod \"dnsmasq-dns-589db6c89c-wzkz2\" (UID: \"a3b1b7ac-3322-4520-892a-1c881ee10c50\") " pod="openstack/dnsmasq-dns-589db6c89c-wzkz2" Mar 09 09:23:01 crc kubenswrapper[4861]: I0309 09:23:01.180213 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhqxr\" (UniqueName: \"kubernetes.io/projected/aedcdae4-7043-4e90-a4cd-b8d58ceeafb3-kube-api-access-dhqxr\") pod \"dnsmasq-dns-86bbd886cf-47h2z\" (UID: \"aedcdae4-7043-4e90-a4cd-b8d58ceeafb3\") " 
pod="openstack/dnsmasq-dns-86bbd886cf-47h2z" Mar 09 09:23:01 crc kubenswrapper[4861]: I0309 09:23:01.180254 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aedcdae4-7043-4e90-a4cd-b8d58ceeafb3-config\") pod \"dnsmasq-dns-86bbd886cf-47h2z\" (UID: \"aedcdae4-7043-4e90-a4cd-b8d58ceeafb3\") " pod="openstack/dnsmasq-dns-86bbd886cf-47h2z" Mar 09 09:23:01 crc kubenswrapper[4861]: I0309 09:23:01.180307 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aedcdae4-7043-4e90-a4cd-b8d58ceeafb3-dns-svc\") pod \"dnsmasq-dns-86bbd886cf-47h2z\" (UID: \"aedcdae4-7043-4e90-a4cd-b8d58ceeafb3\") " pod="openstack/dnsmasq-dns-86bbd886cf-47h2z" Mar 09 09:23:01 crc kubenswrapper[4861]: I0309 09:23:01.180334 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3b1b7ac-3322-4520-892a-1c881ee10c50-config\") pod \"dnsmasq-dns-589db6c89c-wzkz2\" (UID: \"a3b1b7ac-3322-4520-892a-1c881ee10c50\") " pod="openstack/dnsmasq-dns-589db6c89c-wzkz2" Mar 09 09:23:01 crc kubenswrapper[4861]: I0309 09:23:01.181080 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3b1b7ac-3322-4520-892a-1c881ee10c50-config\") pod \"dnsmasq-dns-589db6c89c-wzkz2\" (UID: \"a3b1b7ac-3322-4520-892a-1c881ee10c50\") " pod="openstack/dnsmasq-dns-589db6c89c-wzkz2" Mar 09 09:23:01 crc kubenswrapper[4861]: I0309 09:23:01.197185 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-st6tz\" (UniqueName: \"kubernetes.io/projected/a3b1b7ac-3322-4520-892a-1c881ee10c50-kube-api-access-st6tz\") pod \"dnsmasq-dns-589db6c89c-wzkz2\" (UID: \"a3b1b7ac-3322-4520-892a-1c881ee10c50\") " pod="openstack/dnsmasq-dns-589db6c89c-wzkz2" Mar 09 09:23:01 crc 
kubenswrapper[4861]: I0309 09:23:01.281715 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aedcdae4-7043-4e90-a4cd-b8d58ceeafb3-dns-svc\") pod \"dnsmasq-dns-86bbd886cf-47h2z\" (UID: \"aedcdae4-7043-4e90-a4cd-b8d58ceeafb3\") " pod="openstack/dnsmasq-dns-86bbd886cf-47h2z" Mar 09 09:23:01 crc kubenswrapper[4861]: I0309 09:23:01.281793 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dhqxr\" (UniqueName: \"kubernetes.io/projected/aedcdae4-7043-4e90-a4cd-b8d58ceeafb3-kube-api-access-dhqxr\") pod \"dnsmasq-dns-86bbd886cf-47h2z\" (UID: \"aedcdae4-7043-4e90-a4cd-b8d58ceeafb3\") " pod="openstack/dnsmasq-dns-86bbd886cf-47h2z" Mar 09 09:23:01 crc kubenswrapper[4861]: I0309 09:23:01.281849 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aedcdae4-7043-4e90-a4cd-b8d58ceeafb3-config\") pod \"dnsmasq-dns-86bbd886cf-47h2z\" (UID: \"aedcdae4-7043-4e90-a4cd-b8d58ceeafb3\") " pod="openstack/dnsmasq-dns-86bbd886cf-47h2z" Mar 09 09:23:01 crc kubenswrapper[4861]: I0309 09:23:01.282813 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aedcdae4-7043-4e90-a4cd-b8d58ceeafb3-dns-svc\") pod \"dnsmasq-dns-86bbd886cf-47h2z\" (UID: \"aedcdae4-7043-4e90-a4cd-b8d58ceeafb3\") " pod="openstack/dnsmasq-dns-86bbd886cf-47h2z" Mar 09 09:23:01 crc kubenswrapper[4861]: I0309 09:23:01.283128 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aedcdae4-7043-4e90-a4cd-b8d58ceeafb3-config\") pod \"dnsmasq-dns-86bbd886cf-47h2z\" (UID: \"aedcdae4-7043-4e90-a4cd-b8d58ceeafb3\") " pod="openstack/dnsmasq-dns-86bbd886cf-47h2z" Mar 09 09:23:01 crc kubenswrapper[4861]: I0309 09:23:01.310753 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-dhqxr\" (UniqueName: \"kubernetes.io/projected/aedcdae4-7043-4e90-a4cd-b8d58ceeafb3-kube-api-access-dhqxr\") pod \"dnsmasq-dns-86bbd886cf-47h2z\" (UID: \"aedcdae4-7043-4e90-a4cd-b8d58ceeafb3\") " pod="openstack/dnsmasq-dns-86bbd886cf-47h2z" Mar 09 09:23:01 crc kubenswrapper[4861]: I0309 09:23:01.311022 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-589db6c89c-wzkz2" Mar 09 09:23:01 crc kubenswrapper[4861]: I0309 09:23:01.357931 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86bbd886cf-47h2z" Mar 09 09:23:01 crc kubenswrapper[4861]: I0309 09:23:01.774504 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-589db6c89c-wzkz2"] Mar 09 09:23:01 crc kubenswrapper[4861]: W0309 09:23:01.775306 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda3b1b7ac_3322_4520_892a_1c881ee10c50.slice/crio-dc28abf7115da75e366ee5a259691d0fe38b9f2f268c369846e5cce9e127694a WatchSource:0}: Error finding container dc28abf7115da75e366ee5a259691d0fe38b9f2f268c369846e5cce9e127694a: Status 404 returned error can't find the container with id dc28abf7115da75e366ee5a259691d0fe38b9f2f268c369846e5cce9e127694a Mar 09 09:23:01 crc kubenswrapper[4861]: I0309 09:23:01.857173 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-589db6c89c-wzkz2" event={"ID":"a3b1b7ac-3322-4520-892a-1c881ee10c50","Type":"ContainerStarted","Data":"dc28abf7115da75e366ee5a259691d0fe38b9f2f268c369846e5cce9e127694a"} Mar 09 09:23:01 crc kubenswrapper[4861]: I0309 09:23:01.861903 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86bbd886cf-47h2z"] Mar 09 09:23:01 crc kubenswrapper[4861]: W0309 09:23:01.866713 4861 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaedcdae4_7043_4e90_a4cd_b8d58ceeafb3.slice/crio-88a876c7e47434ebc46b1506439ebc3eab638a39c0f407cfbd77bda53dddbf00 WatchSource:0}: Error finding container 88a876c7e47434ebc46b1506439ebc3eab638a39c0f407cfbd77bda53dddbf00: Status 404 returned error can't find the container with id 88a876c7e47434ebc46b1506439ebc3eab638a39c0f407cfbd77bda53dddbf00 Mar 09 09:23:02 crc kubenswrapper[4861]: I0309 09:23:02.877616 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86bbd886cf-47h2z" event={"ID":"aedcdae4-7043-4e90-a4cd-b8d58ceeafb3","Type":"ContainerStarted","Data":"88a876c7e47434ebc46b1506439ebc3eab638a39c0f407cfbd77bda53dddbf00"} Mar 09 09:23:03 crc kubenswrapper[4861]: I0309 09:23:03.428512 4861 scope.go:117] "RemoveContainer" containerID="4c0c9c8b6e4389c0650f8bf6aa40f2831f9be11b143f5e71b3c423f8298e4e6e" Mar 09 09:23:03 crc kubenswrapper[4861]: I0309 09:23:03.748720 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-589db6c89c-wzkz2"] Mar 09 09:23:03 crc kubenswrapper[4861]: I0309 09:23:03.775318 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78cb4465c9-6bxvk"] Mar 09 09:23:03 crc kubenswrapper[4861]: I0309 09:23:03.778176 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78cb4465c9-6bxvk" Mar 09 09:23:03 crc kubenswrapper[4861]: I0309 09:23:03.787727 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78cb4465c9-6bxvk"] Mar 09 09:23:03 crc kubenswrapper[4861]: I0309 09:23:03.932025 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eba69073-9a91-4b38-a2d1-2880e6b6882f-dns-svc\") pod \"dnsmasq-dns-78cb4465c9-6bxvk\" (UID: \"eba69073-9a91-4b38-a2d1-2880e6b6882f\") " pod="openstack/dnsmasq-dns-78cb4465c9-6bxvk" Mar 09 09:23:03 crc kubenswrapper[4861]: I0309 09:23:03.932166 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xbl8\" (UniqueName: \"kubernetes.io/projected/eba69073-9a91-4b38-a2d1-2880e6b6882f-kube-api-access-5xbl8\") pod \"dnsmasq-dns-78cb4465c9-6bxvk\" (UID: \"eba69073-9a91-4b38-a2d1-2880e6b6882f\") " pod="openstack/dnsmasq-dns-78cb4465c9-6bxvk" Mar 09 09:23:03 crc kubenswrapper[4861]: I0309 09:23:03.932195 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eba69073-9a91-4b38-a2d1-2880e6b6882f-config\") pod \"dnsmasq-dns-78cb4465c9-6bxvk\" (UID: \"eba69073-9a91-4b38-a2d1-2880e6b6882f\") " pod="openstack/dnsmasq-dns-78cb4465c9-6bxvk" Mar 09 09:23:04 crc kubenswrapper[4861]: I0309 09:23:04.034863 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xbl8\" (UniqueName: \"kubernetes.io/projected/eba69073-9a91-4b38-a2d1-2880e6b6882f-kube-api-access-5xbl8\") pod \"dnsmasq-dns-78cb4465c9-6bxvk\" (UID: \"eba69073-9a91-4b38-a2d1-2880e6b6882f\") " pod="openstack/dnsmasq-dns-78cb4465c9-6bxvk" Mar 09 09:23:04 crc kubenswrapper[4861]: I0309 09:23:04.035150 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/eba69073-9a91-4b38-a2d1-2880e6b6882f-config\") pod \"dnsmasq-dns-78cb4465c9-6bxvk\" (UID: \"eba69073-9a91-4b38-a2d1-2880e6b6882f\") " pod="openstack/dnsmasq-dns-78cb4465c9-6bxvk" Mar 09 09:23:04 crc kubenswrapper[4861]: I0309 09:23:04.035213 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eba69073-9a91-4b38-a2d1-2880e6b6882f-dns-svc\") pod \"dnsmasq-dns-78cb4465c9-6bxvk\" (UID: \"eba69073-9a91-4b38-a2d1-2880e6b6882f\") " pod="openstack/dnsmasq-dns-78cb4465c9-6bxvk" Mar 09 09:23:04 crc kubenswrapper[4861]: I0309 09:23:04.036479 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eba69073-9a91-4b38-a2d1-2880e6b6882f-config\") pod \"dnsmasq-dns-78cb4465c9-6bxvk\" (UID: \"eba69073-9a91-4b38-a2d1-2880e6b6882f\") " pod="openstack/dnsmasq-dns-78cb4465c9-6bxvk" Mar 09 09:23:04 crc kubenswrapper[4861]: I0309 09:23:04.037795 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eba69073-9a91-4b38-a2d1-2880e6b6882f-dns-svc\") pod \"dnsmasq-dns-78cb4465c9-6bxvk\" (UID: \"eba69073-9a91-4b38-a2d1-2880e6b6882f\") " pod="openstack/dnsmasq-dns-78cb4465c9-6bxvk" Mar 09 09:23:04 crc kubenswrapper[4861]: I0309 09:23:04.045114 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86bbd886cf-47h2z"] Mar 09 09:23:04 crc kubenswrapper[4861]: I0309 09:23:04.065210 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xbl8\" (UniqueName: \"kubernetes.io/projected/eba69073-9a91-4b38-a2d1-2880e6b6882f-kube-api-access-5xbl8\") pod \"dnsmasq-dns-78cb4465c9-6bxvk\" (UID: \"eba69073-9a91-4b38-a2d1-2880e6b6882f\") " pod="openstack/dnsmasq-dns-78cb4465c9-6bxvk" Mar 09 09:23:04 crc kubenswrapper[4861]: I0309 09:23:04.083440 4861 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/dnsmasq-dns-7c47bcb9f9-zbnlj"] Mar 09 09:23:04 crc kubenswrapper[4861]: I0309 09:23:04.084923 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c47bcb9f9-zbnlj" Mar 09 09:23:04 crc kubenswrapper[4861]: I0309 09:23:04.088021 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7c47bcb9f9-zbnlj"] Mar 09 09:23:04 crc kubenswrapper[4861]: I0309 09:23:04.097709 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78cb4465c9-6bxvk" Mar 09 09:23:04 crc kubenswrapper[4861]: I0309 09:23:04.244306 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76d8l\" (UniqueName: \"kubernetes.io/projected/d4847dbd-e086-4995-b488-c7611173b6e8-kube-api-access-76d8l\") pod \"dnsmasq-dns-7c47bcb9f9-zbnlj\" (UID: \"d4847dbd-e086-4995-b488-c7611173b6e8\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-zbnlj" Mar 09 09:23:04 crc kubenswrapper[4861]: I0309 09:23:04.244473 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4847dbd-e086-4995-b488-c7611173b6e8-config\") pod \"dnsmasq-dns-7c47bcb9f9-zbnlj\" (UID: \"d4847dbd-e086-4995-b488-c7611173b6e8\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-zbnlj" Mar 09 09:23:04 crc kubenswrapper[4861]: I0309 09:23:04.244568 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d4847dbd-e086-4995-b488-c7611173b6e8-dns-svc\") pod \"dnsmasq-dns-7c47bcb9f9-zbnlj\" (UID: \"d4847dbd-e086-4995-b488-c7611173b6e8\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-zbnlj" Mar 09 09:23:04 crc kubenswrapper[4861]: I0309 09:23:04.346357 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/d4847dbd-e086-4995-b488-c7611173b6e8-dns-svc\") pod \"dnsmasq-dns-7c47bcb9f9-zbnlj\" (UID: \"d4847dbd-e086-4995-b488-c7611173b6e8\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-zbnlj" Mar 09 09:23:04 crc kubenswrapper[4861]: I0309 09:23:04.346780 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76d8l\" (UniqueName: \"kubernetes.io/projected/d4847dbd-e086-4995-b488-c7611173b6e8-kube-api-access-76d8l\") pod \"dnsmasq-dns-7c47bcb9f9-zbnlj\" (UID: \"d4847dbd-e086-4995-b488-c7611173b6e8\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-zbnlj" Mar 09 09:23:04 crc kubenswrapper[4861]: I0309 09:23:04.346859 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4847dbd-e086-4995-b488-c7611173b6e8-config\") pod \"dnsmasq-dns-7c47bcb9f9-zbnlj\" (UID: \"d4847dbd-e086-4995-b488-c7611173b6e8\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-zbnlj" Mar 09 09:23:04 crc kubenswrapper[4861]: I0309 09:23:04.347913 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4847dbd-e086-4995-b488-c7611173b6e8-config\") pod \"dnsmasq-dns-7c47bcb9f9-zbnlj\" (UID: \"d4847dbd-e086-4995-b488-c7611173b6e8\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-zbnlj" Mar 09 09:23:04 crc kubenswrapper[4861]: I0309 09:23:04.354613 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d4847dbd-e086-4995-b488-c7611173b6e8-dns-svc\") pod \"dnsmasq-dns-7c47bcb9f9-zbnlj\" (UID: \"d4847dbd-e086-4995-b488-c7611173b6e8\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-zbnlj" Mar 09 09:23:04 crc kubenswrapper[4861]: I0309 09:23:04.372989 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76d8l\" (UniqueName: \"kubernetes.io/projected/d4847dbd-e086-4995-b488-c7611173b6e8-kube-api-access-76d8l\") pod 
\"dnsmasq-dns-7c47bcb9f9-zbnlj\" (UID: \"d4847dbd-e086-4995-b488-c7611173b6e8\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-zbnlj" Mar 09 09:23:04 crc kubenswrapper[4861]: I0309 09:23:04.404896 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c47bcb9f9-zbnlj" Mar 09 09:23:04 crc kubenswrapper[4861]: I0309 09:23:04.654728 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78cb4465c9-6bxvk"] Mar 09 09:23:04 crc kubenswrapper[4861]: I0309 09:23:04.877352 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7c47bcb9f9-zbnlj"] Mar 09 09:23:04 crc kubenswrapper[4861]: I0309 09:23:04.907217 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78cb4465c9-6bxvk" event={"ID":"eba69073-9a91-4b38-a2d1-2880e6b6882f","Type":"ContainerStarted","Data":"4dd8779e44cab1535a198761db1d7878ebfc9dd40ef53862ee5e0f2d02f7ef1f"} Mar 09 09:23:04 crc kubenswrapper[4861]: I0309 09:23:04.932579 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 09 09:23:04 crc kubenswrapper[4861]: I0309 09:23:04.944776 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 09 09:23:04 crc kubenswrapper[4861]: I0309 09:23:04.944939 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:23:04 crc kubenswrapper[4861]: I0309 09:23:04.947088 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 09 09:23:04 crc kubenswrapper[4861]: I0309 09:23:04.947286 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 09 09:23:04 crc kubenswrapper[4861]: I0309 09:23:04.947487 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 09 09:23:04 crc kubenswrapper[4861]: I0309 09:23:04.947561 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 09 09:23:04 crc kubenswrapper[4861]: I0309 09:23:04.947685 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 09 09:23:04 crc kubenswrapper[4861]: I0309 09:23:04.947762 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-lv2dg" Mar 09 09:23:04 crc kubenswrapper[4861]: I0309 09:23:04.947803 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 09 09:23:05 crc kubenswrapper[4861]: I0309 09:23:05.058026 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/03452acf-c21f-4d68-a813-772c30604a60-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"03452acf-c21f-4d68-a813-772c30604a60\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:23:05 crc kubenswrapper[4861]: I0309 09:23:05.058496 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/03452acf-c21f-4d68-a813-772c30604a60-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"03452acf-c21f-4d68-a813-772c30604a60\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:23:05 crc kubenswrapper[4861]: I0309 09:23:05.058534 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/03452acf-c21f-4d68-a813-772c30604a60-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"03452acf-c21f-4d68-a813-772c30604a60\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:23:05 crc kubenswrapper[4861]: I0309 09:23:05.058587 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/03452acf-c21f-4d68-a813-772c30604a60-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"03452acf-c21f-4d68-a813-772c30604a60\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:23:05 crc kubenswrapper[4861]: I0309 09:23:05.058612 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/03452acf-c21f-4d68-a813-772c30604a60-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"03452acf-c21f-4d68-a813-772c30604a60\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:23:05 crc kubenswrapper[4861]: I0309 09:23:05.058638 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/03452acf-c21f-4d68-a813-772c30604a60-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"03452acf-c21f-4d68-a813-772c30604a60\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:23:05 crc kubenswrapper[4861]: I0309 09:23:05.058751 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/03452acf-c21f-4d68-a813-772c30604a60-rabbitmq-erlang-cookie\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"03452acf-c21f-4d68-a813-772c30604a60\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:23:05 crc kubenswrapper[4861]: I0309 09:23:05.058784 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/03452acf-c21f-4d68-a813-772c30604a60-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"03452acf-c21f-4d68-a813-772c30604a60\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:23:05 crc kubenswrapper[4861]: I0309 09:23:05.058845 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/03452acf-c21f-4d68-a813-772c30604a60-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"03452acf-c21f-4d68-a813-772c30604a60\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:23:05 crc kubenswrapper[4861]: I0309 09:23:05.058871 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttclt\" (UniqueName: \"kubernetes.io/projected/03452acf-c21f-4d68-a813-772c30604a60-kube-api-access-ttclt\") pod \"rabbitmq-cell1-server-0\" (UID: \"03452acf-c21f-4d68-a813-772c30604a60\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:23:05 crc kubenswrapper[4861]: I0309 09:23:05.058905 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"03452acf-c21f-4d68-a813-772c30604a60\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:23:05 crc kubenswrapper[4861]: I0309 09:23:05.160656 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/03452acf-c21f-4d68-a813-772c30604a60-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" 
(UID: \"03452acf-c21f-4d68-a813-772c30604a60\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:23:05 crc kubenswrapper[4861]: I0309 09:23:05.160699 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/03452acf-c21f-4d68-a813-772c30604a60-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"03452acf-c21f-4d68-a813-772c30604a60\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:23:05 crc kubenswrapper[4861]: I0309 09:23:05.160729 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/03452acf-c21f-4d68-a813-772c30604a60-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"03452acf-c21f-4d68-a813-772c30604a60\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:23:05 crc kubenswrapper[4861]: I0309 09:23:05.160750 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ttclt\" (UniqueName: \"kubernetes.io/projected/03452acf-c21f-4d68-a813-772c30604a60-kube-api-access-ttclt\") pod \"rabbitmq-cell1-server-0\" (UID: \"03452acf-c21f-4d68-a813-772c30604a60\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:23:05 crc kubenswrapper[4861]: I0309 09:23:05.160770 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"03452acf-c21f-4d68-a813-772c30604a60\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:23:05 crc kubenswrapper[4861]: I0309 09:23:05.160791 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/03452acf-c21f-4d68-a813-772c30604a60-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"03452acf-c21f-4d68-a813-772c30604a60\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:23:05 crc 
kubenswrapper[4861]: I0309 09:23:05.160823 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/03452acf-c21f-4d68-a813-772c30604a60-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"03452acf-c21f-4d68-a813-772c30604a60\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:23:05 crc kubenswrapper[4861]: I0309 09:23:05.160845 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/03452acf-c21f-4d68-a813-772c30604a60-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"03452acf-c21f-4d68-a813-772c30604a60\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:23:05 crc kubenswrapper[4861]: I0309 09:23:05.160858 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/03452acf-c21f-4d68-a813-772c30604a60-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"03452acf-c21f-4d68-a813-772c30604a60\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:23:05 crc kubenswrapper[4861]: I0309 09:23:05.160874 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/03452acf-c21f-4d68-a813-772c30604a60-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"03452acf-c21f-4d68-a813-772c30604a60\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:23:05 crc kubenswrapper[4861]: I0309 09:23:05.160890 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/03452acf-c21f-4d68-a813-772c30604a60-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"03452acf-c21f-4d68-a813-772c30604a60\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:23:05 crc kubenswrapper[4861]: I0309 09:23:05.161194 4861 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/03452acf-c21f-4d68-a813-772c30604a60-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"03452acf-c21f-4d68-a813-772c30604a60\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:23:05 crc kubenswrapper[4861]: I0309 09:23:05.162555 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/03452acf-c21f-4d68-a813-772c30604a60-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"03452acf-c21f-4d68-a813-772c30604a60\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:23:05 crc kubenswrapper[4861]: I0309 09:23:05.162967 4861 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"03452acf-c21f-4d68-a813-772c30604a60\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:23:05 crc kubenswrapper[4861]: I0309 09:23:05.163801 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/03452acf-c21f-4d68-a813-772c30604a60-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"03452acf-c21f-4d68-a813-772c30604a60\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:23:05 crc kubenswrapper[4861]: I0309 09:23:05.163876 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/03452acf-c21f-4d68-a813-772c30604a60-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"03452acf-c21f-4d68-a813-772c30604a60\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:23:05 crc kubenswrapper[4861]: I0309 09:23:05.164573 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/03452acf-c21f-4d68-a813-772c30604a60-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"03452acf-c21f-4d68-a813-772c30604a60\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:23:05 crc kubenswrapper[4861]: I0309 09:23:05.168449 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/03452acf-c21f-4d68-a813-772c30604a60-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"03452acf-c21f-4d68-a813-772c30604a60\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:23:05 crc kubenswrapper[4861]: I0309 09:23:05.168755 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/03452acf-c21f-4d68-a813-772c30604a60-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"03452acf-c21f-4d68-a813-772c30604a60\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:23:05 crc kubenswrapper[4861]: I0309 09:23:05.176180 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/03452acf-c21f-4d68-a813-772c30604a60-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"03452acf-c21f-4d68-a813-772c30604a60\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:23:05 crc kubenswrapper[4861]: I0309 09:23:05.183905 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 09 09:23:05 crc kubenswrapper[4861]: I0309 09:23:05.184261 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttclt\" (UniqueName: \"kubernetes.io/projected/03452acf-c21f-4d68-a813-772c30604a60-kube-api-access-ttclt\") pod \"rabbitmq-cell1-server-0\" (UID: \"03452acf-c21f-4d68-a813-772c30604a60\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:23:05 crc kubenswrapper[4861]: I0309 09:23:05.185190 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 09 09:23:05 crc kubenswrapper[4861]: I0309 09:23:05.185657 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"03452acf-c21f-4d68-a813-772c30604a60\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:23:05 crc kubenswrapper[4861]: I0309 09:23:05.188073 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 09 09:23:05 crc kubenswrapper[4861]: I0309 09:23:05.188139 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 09 09:23:05 crc kubenswrapper[4861]: I0309 09:23:05.188217 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/03452acf-c21f-4d68-a813-772c30604a60-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"03452acf-c21f-4d68-a813-772c30604a60\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:23:05 crc kubenswrapper[4861]: I0309 09:23:05.188424 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 09 09:23:05 crc kubenswrapper[4861]: I0309 09:23:05.188695 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 09 09:23:05 crc kubenswrapper[4861]: I0309 09:23:05.188710 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 09 09:23:05 crc kubenswrapper[4861]: I0309 09:23:05.188896 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-zmgq2" Mar 09 09:23:05 crc kubenswrapper[4861]: I0309 09:23:05.193695 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 09 09:23:05 crc kubenswrapper[4861]: I0309 
09:23:05.202717 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 09 09:23:05 crc kubenswrapper[4861]: I0309 09:23:05.262319 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b9b83355-ea40-4408-9b77-c717df91e1a9-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"b9b83355-ea40-4408-9b77-c717df91e1a9\") " pod="openstack/rabbitmq-server-0" Mar 09 09:23:05 crc kubenswrapper[4861]: I0309 09:23:05.262389 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b9b83355-ea40-4408-9b77-c717df91e1a9-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"b9b83355-ea40-4408-9b77-c717df91e1a9\") " pod="openstack/rabbitmq-server-0" Mar 09 09:23:05 crc kubenswrapper[4861]: I0309 09:23:05.262419 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r246d\" (UniqueName: \"kubernetes.io/projected/b9b83355-ea40-4408-9b77-c717df91e1a9-kube-api-access-r246d\") pod \"rabbitmq-server-0\" (UID: \"b9b83355-ea40-4408-9b77-c717df91e1a9\") " pod="openstack/rabbitmq-server-0" Mar 09 09:23:05 crc kubenswrapper[4861]: I0309 09:23:05.262524 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"b9b83355-ea40-4408-9b77-c717df91e1a9\") " pod="openstack/rabbitmq-server-0" Mar 09 09:23:05 crc kubenswrapper[4861]: I0309 09:23:05.262630 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b9b83355-ea40-4408-9b77-c717df91e1a9-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"b9b83355-ea40-4408-9b77-c717df91e1a9\") " 
pod="openstack/rabbitmq-server-0" Mar 09 09:23:05 crc kubenswrapper[4861]: I0309 09:23:05.262725 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b9b83355-ea40-4408-9b77-c717df91e1a9-config-data\") pod \"rabbitmq-server-0\" (UID: \"b9b83355-ea40-4408-9b77-c717df91e1a9\") " pod="openstack/rabbitmq-server-0" Mar 09 09:23:05 crc kubenswrapper[4861]: I0309 09:23:05.262765 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b9b83355-ea40-4408-9b77-c717df91e1a9-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"b9b83355-ea40-4408-9b77-c717df91e1a9\") " pod="openstack/rabbitmq-server-0" Mar 09 09:23:05 crc kubenswrapper[4861]: I0309 09:23:05.262851 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b9b83355-ea40-4408-9b77-c717df91e1a9-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"b9b83355-ea40-4408-9b77-c717df91e1a9\") " pod="openstack/rabbitmq-server-0" Mar 09 09:23:05 crc kubenswrapper[4861]: I0309 09:23:05.262873 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b9b83355-ea40-4408-9b77-c717df91e1a9-pod-info\") pod \"rabbitmq-server-0\" (UID: \"b9b83355-ea40-4408-9b77-c717df91e1a9\") " pod="openstack/rabbitmq-server-0" Mar 09 09:23:05 crc kubenswrapper[4861]: I0309 09:23:05.262903 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b9b83355-ea40-4408-9b77-c717df91e1a9-server-conf\") pod \"rabbitmq-server-0\" (UID: \"b9b83355-ea40-4408-9b77-c717df91e1a9\") " pod="openstack/rabbitmq-server-0" Mar 09 09:23:05 crc 
kubenswrapper[4861]: I0309 09:23:05.262931 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b9b83355-ea40-4408-9b77-c717df91e1a9-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"b9b83355-ea40-4408-9b77-c717df91e1a9\") " pod="openstack/rabbitmq-server-0" Mar 09 09:23:05 crc kubenswrapper[4861]: I0309 09:23:05.270995 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:23:05 crc kubenswrapper[4861]: I0309 09:23:05.364421 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b9b83355-ea40-4408-9b77-c717df91e1a9-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"b9b83355-ea40-4408-9b77-c717df91e1a9\") " pod="openstack/rabbitmq-server-0" Mar 09 09:23:05 crc kubenswrapper[4861]: I0309 09:23:05.364465 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b9b83355-ea40-4408-9b77-c717df91e1a9-pod-info\") pod \"rabbitmq-server-0\" (UID: \"b9b83355-ea40-4408-9b77-c717df91e1a9\") " pod="openstack/rabbitmq-server-0" Mar 09 09:23:05 crc kubenswrapper[4861]: I0309 09:23:05.364490 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b9b83355-ea40-4408-9b77-c717df91e1a9-server-conf\") pod \"rabbitmq-server-0\" (UID: \"b9b83355-ea40-4408-9b77-c717df91e1a9\") " pod="openstack/rabbitmq-server-0" Mar 09 09:23:05 crc kubenswrapper[4861]: I0309 09:23:05.364508 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b9b83355-ea40-4408-9b77-c717df91e1a9-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"b9b83355-ea40-4408-9b77-c717df91e1a9\") " 
pod="openstack/rabbitmq-server-0" Mar 09 09:23:05 crc kubenswrapper[4861]: I0309 09:23:05.364539 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b9b83355-ea40-4408-9b77-c717df91e1a9-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"b9b83355-ea40-4408-9b77-c717df91e1a9\") " pod="openstack/rabbitmq-server-0" Mar 09 09:23:05 crc kubenswrapper[4861]: I0309 09:23:05.364564 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b9b83355-ea40-4408-9b77-c717df91e1a9-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"b9b83355-ea40-4408-9b77-c717df91e1a9\") " pod="openstack/rabbitmq-server-0" Mar 09 09:23:05 crc kubenswrapper[4861]: I0309 09:23:05.364584 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r246d\" (UniqueName: \"kubernetes.io/projected/b9b83355-ea40-4408-9b77-c717df91e1a9-kube-api-access-r246d\") pod \"rabbitmq-server-0\" (UID: \"b9b83355-ea40-4408-9b77-c717df91e1a9\") " pod="openstack/rabbitmq-server-0" Mar 09 09:23:05 crc kubenswrapper[4861]: I0309 09:23:05.364617 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"b9b83355-ea40-4408-9b77-c717df91e1a9\") " pod="openstack/rabbitmq-server-0" Mar 09 09:23:05 crc kubenswrapper[4861]: I0309 09:23:05.364632 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b9b83355-ea40-4408-9b77-c717df91e1a9-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"b9b83355-ea40-4408-9b77-c717df91e1a9\") " pod="openstack/rabbitmq-server-0" Mar 09 09:23:05 crc kubenswrapper[4861]: I0309 09:23:05.364658 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/configmap/b9b83355-ea40-4408-9b77-c717df91e1a9-config-data\") pod \"rabbitmq-server-0\" (UID: \"b9b83355-ea40-4408-9b77-c717df91e1a9\") " pod="openstack/rabbitmq-server-0" Mar 09 09:23:05 crc kubenswrapper[4861]: I0309 09:23:05.364675 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b9b83355-ea40-4408-9b77-c717df91e1a9-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"b9b83355-ea40-4408-9b77-c717df91e1a9\") " pod="openstack/rabbitmq-server-0" Mar 09 09:23:05 crc kubenswrapper[4861]: I0309 09:23:05.365256 4861 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"b9b83355-ea40-4408-9b77-c717df91e1a9\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-server-0" Mar 09 09:23:05 crc kubenswrapper[4861]: I0309 09:23:05.365689 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b9b83355-ea40-4408-9b77-c717df91e1a9-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"b9b83355-ea40-4408-9b77-c717df91e1a9\") " pod="openstack/rabbitmq-server-0" Mar 09 09:23:05 crc kubenswrapper[4861]: I0309 09:23:05.365986 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b9b83355-ea40-4408-9b77-c717df91e1a9-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"b9b83355-ea40-4408-9b77-c717df91e1a9\") " pod="openstack/rabbitmq-server-0" Mar 09 09:23:05 crc kubenswrapper[4861]: I0309 09:23:05.366988 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b9b83355-ea40-4408-9b77-c717df91e1a9-plugins-conf\") pod \"rabbitmq-server-0\" (UID: 
\"b9b83355-ea40-4408-9b77-c717df91e1a9\") " pod="openstack/rabbitmq-server-0" Mar 09 09:23:05 crc kubenswrapper[4861]: I0309 09:23:05.367426 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b9b83355-ea40-4408-9b77-c717df91e1a9-server-conf\") pod \"rabbitmq-server-0\" (UID: \"b9b83355-ea40-4408-9b77-c717df91e1a9\") " pod="openstack/rabbitmq-server-0" Mar 09 09:23:05 crc kubenswrapper[4861]: I0309 09:23:05.371783 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b9b83355-ea40-4408-9b77-c717df91e1a9-pod-info\") pod \"rabbitmq-server-0\" (UID: \"b9b83355-ea40-4408-9b77-c717df91e1a9\") " pod="openstack/rabbitmq-server-0" Mar 09 09:23:05 crc kubenswrapper[4861]: I0309 09:23:05.371808 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b9b83355-ea40-4408-9b77-c717df91e1a9-config-data\") pod \"rabbitmq-server-0\" (UID: \"b9b83355-ea40-4408-9b77-c717df91e1a9\") " pod="openstack/rabbitmq-server-0" Mar 09 09:23:05 crc kubenswrapper[4861]: I0309 09:23:05.372057 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b9b83355-ea40-4408-9b77-c717df91e1a9-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"b9b83355-ea40-4408-9b77-c717df91e1a9\") " pod="openstack/rabbitmq-server-0" Mar 09 09:23:05 crc kubenswrapper[4861]: I0309 09:23:05.372194 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b9b83355-ea40-4408-9b77-c717df91e1a9-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"b9b83355-ea40-4408-9b77-c717df91e1a9\") " pod="openstack/rabbitmq-server-0" Mar 09 09:23:05 crc kubenswrapper[4861]: I0309 09:23:05.382681 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-r246d\" (UniqueName: \"kubernetes.io/projected/b9b83355-ea40-4408-9b77-c717df91e1a9-kube-api-access-r246d\") pod \"rabbitmq-server-0\" (UID: \"b9b83355-ea40-4408-9b77-c717df91e1a9\") " pod="openstack/rabbitmq-server-0" Mar 09 09:23:05 crc kubenswrapper[4861]: I0309 09:23:05.385169 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b9b83355-ea40-4408-9b77-c717df91e1a9-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"b9b83355-ea40-4408-9b77-c717df91e1a9\") " pod="openstack/rabbitmq-server-0" Mar 09 09:23:05 crc kubenswrapper[4861]: I0309 09:23:05.387642 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"b9b83355-ea40-4408-9b77-c717df91e1a9\") " pod="openstack/rabbitmq-server-0" Mar 09 09:23:05 crc kubenswrapper[4861]: I0309 09:23:05.558999 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 09 09:23:06 crc kubenswrapper[4861]: I0309 09:23:06.552435 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Mar 09 09:23:06 crc kubenswrapper[4861]: I0309 09:23:06.554227 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Mar 09 09:23:06 crc kubenswrapper[4861]: I0309 09:23:06.562503 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-xc5gk" Mar 09 09:23:06 crc kubenswrapper[4861]: I0309 09:23:06.562686 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Mar 09 09:23:06 crc kubenswrapper[4861]: I0309 09:23:06.563344 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Mar 09 09:23:06 crc kubenswrapper[4861]: I0309 09:23:06.566285 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Mar 09 09:23:06 crc kubenswrapper[4861]: I0309 09:23:06.566508 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Mar 09 09:23:06 crc kubenswrapper[4861]: I0309 09:23:06.608923 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 09 09:23:06 crc kubenswrapper[4861]: I0309 09:23:06.689184 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ab732e3-1122-4f45-a9af-b36eaa88c19e-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"0ab732e3-1122-4f45-a9af-b36eaa88c19e\") " pod="openstack/openstack-galera-0" Mar 09 09:23:06 crc kubenswrapper[4861]: I0309 09:23:06.689269 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/0ab732e3-1122-4f45-a9af-b36eaa88c19e-config-data-default\") pod \"openstack-galera-0\" (UID: \"0ab732e3-1122-4f45-a9af-b36eaa88c19e\") " pod="openstack/openstack-galera-0" Mar 09 09:23:06 crc kubenswrapper[4861]: I0309 09:23:06.689302 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-2944d\" (UniqueName: \"kubernetes.io/projected/0ab732e3-1122-4f45-a9af-b36eaa88c19e-kube-api-access-2944d\") pod \"openstack-galera-0\" (UID: \"0ab732e3-1122-4f45-a9af-b36eaa88c19e\") " pod="openstack/openstack-galera-0" Mar 09 09:23:06 crc kubenswrapper[4861]: I0309 09:23:06.689344 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0ab732e3-1122-4f45-a9af-b36eaa88c19e-operator-scripts\") pod \"openstack-galera-0\" (UID: \"0ab732e3-1122-4f45-a9af-b36eaa88c19e\") " pod="openstack/openstack-galera-0" Mar 09 09:23:06 crc kubenswrapper[4861]: I0309 09:23:06.689394 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ab732e3-1122-4f45-a9af-b36eaa88c19e-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"0ab732e3-1122-4f45-a9af-b36eaa88c19e\") " pod="openstack/openstack-galera-0" Mar 09 09:23:06 crc kubenswrapper[4861]: I0309 09:23:06.689418 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-galera-0\" (UID: \"0ab732e3-1122-4f45-a9af-b36eaa88c19e\") " pod="openstack/openstack-galera-0" Mar 09 09:23:06 crc kubenswrapper[4861]: I0309 09:23:06.689455 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0ab732e3-1122-4f45-a9af-b36eaa88c19e-kolla-config\") pod \"openstack-galera-0\" (UID: \"0ab732e3-1122-4f45-a9af-b36eaa88c19e\") " pod="openstack/openstack-galera-0" Mar 09 09:23:06 crc kubenswrapper[4861]: I0309 09:23:06.689496 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: 
\"kubernetes.io/empty-dir/0ab732e3-1122-4f45-a9af-b36eaa88c19e-config-data-generated\") pod \"openstack-galera-0\" (UID: \"0ab732e3-1122-4f45-a9af-b36eaa88c19e\") " pod="openstack/openstack-galera-0" Mar 09 09:23:06 crc kubenswrapper[4861]: I0309 09:23:06.791247 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ab732e3-1122-4f45-a9af-b36eaa88c19e-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"0ab732e3-1122-4f45-a9af-b36eaa88c19e\") " pod="openstack/openstack-galera-0" Mar 09 09:23:06 crc kubenswrapper[4861]: I0309 09:23:06.791314 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/0ab732e3-1122-4f45-a9af-b36eaa88c19e-config-data-default\") pod \"openstack-galera-0\" (UID: \"0ab732e3-1122-4f45-a9af-b36eaa88c19e\") " pod="openstack/openstack-galera-0" Mar 09 09:23:06 crc kubenswrapper[4861]: I0309 09:23:06.791336 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2944d\" (UniqueName: \"kubernetes.io/projected/0ab732e3-1122-4f45-a9af-b36eaa88c19e-kube-api-access-2944d\") pod \"openstack-galera-0\" (UID: \"0ab732e3-1122-4f45-a9af-b36eaa88c19e\") " pod="openstack/openstack-galera-0" Mar 09 09:23:06 crc kubenswrapper[4861]: I0309 09:23:06.791370 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0ab732e3-1122-4f45-a9af-b36eaa88c19e-operator-scripts\") pod \"openstack-galera-0\" (UID: \"0ab732e3-1122-4f45-a9af-b36eaa88c19e\") " pod="openstack/openstack-galera-0" Mar 09 09:23:06 crc kubenswrapper[4861]: I0309 09:23:06.791408 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ab732e3-1122-4f45-a9af-b36eaa88c19e-galera-tls-certs\") pod \"openstack-galera-0\" (UID: 
\"0ab732e3-1122-4f45-a9af-b36eaa88c19e\") " pod="openstack/openstack-galera-0" Mar 09 09:23:06 crc kubenswrapper[4861]: I0309 09:23:06.791430 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-galera-0\" (UID: \"0ab732e3-1122-4f45-a9af-b36eaa88c19e\") " pod="openstack/openstack-galera-0" Mar 09 09:23:06 crc kubenswrapper[4861]: I0309 09:23:06.791462 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0ab732e3-1122-4f45-a9af-b36eaa88c19e-kolla-config\") pod \"openstack-galera-0\" (UID: \"0ab732e3-1122-4f45-a9af-b36eaa88c19e\") " pod="openstack/openstack-galera-0" Mar 09 09:23:06 crc kubenswrapper[4861]: I0309 09:23:06.791491 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/0ab732e3-1122-4f45-a9af-b36eaa88c19e-config-data-generated\") pod \"openstack-galera-0\" (UID: \"0ab732e3-1122-4f45-a9af-b36eaa88c19e\") " pod="openstack/openstack-galera-0" Mar 09 09:23:06 crc kubenswrapper[4861]: I0309 09:23:06.791722 4861 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-galera-0\" (UID: \"0ab732e3-1122-4f45-a9af-b36eaa88c19e\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/openstack-galera-0" Mar 09 09:23:06 crc kubenswrapper[4861]: I0309 09:23:06.792138 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/0ab732e3-1122-4f45-a9af-b36eaa88c19e-config-data-generated\") pod \"openstack-galera-0\" (UID: \"0ab732e3-1122-4f45-a9af-b36eaa88c19e\") " pod="openstack/openstack-galera-0" Mar 09 09:23:06 crc kubenswrapper[4861]: I0309 09:23:06.792550 4861 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0ab732e3-1122-4f45-a9af-b36eaa88c19e-kolla-config\") pod \"openstack-galera-0\" (UID: \"0ab732e3-1122-4f45-a9af-b36eaa88c19e\") " pod="openstack/openstack-galera-0" Mar 09 09:23:06 crc kubenswrapper[4861]: I0309 09:23:06.792724 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/0ab732e3-1122-4f45-a9af-b36eaa88c19e-config-data-default\") pod \"openstack-galera-0\" (UID: \"0ab732e3-1122-4f45-a9af-b36eaa88c19e\") " pod="openstack/openstack-galera-0" Mar 09 09:23:06 crc kubenswrapper[4861]: I0309 09:23:06.793358 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0ab732e3-1122-4f45-a9af-b36eaa88c19e-operator-scripts\") pod \"openstack-galera-0\" (UID: \"0ab732e3-1122-4f45-a9af-b36eaa88c19e\") " pod="openstack/openstack-galera-0" Mar 09 09:23:06 crc kubenswrapper[4861]: I0309 09:23:06.795750 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/0ab732e3-1122-4f45-a9af-b36eaa88c19e-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"0ab732e3-1122-4f45-a9af-b36eaa88c19e\") " pod="openstack/openstack-galera-0" Mar 09 09:23:06 crc kubenswrapper[4861]: I0309 09:23:06.796510 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ab732e3-1122-4f45-a9af-b36eaa88c19e-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"0ab732e3-1122-4f45-a9af-b36eaa88c19e\") " pod="openstack/openstack-galera-0" Mar 09 09:23:06 crc kubenswrapper[4861]: I0309 09:23:06.811617 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2944d\" (UniqueName: 
\"kubernetes.io/projected/0ab732e3-1122-4f45-a9af-b36eaa88c19e-kube-api-access-2944d\") pod \"openstack-galera-0\" (UID: \"0ab732e3-1122-4f45-a9af-b36eaa88c19e\") " pod="openstack/openstack-galera-0" Mar 09 09:23:06 crc kubenswrapper[4861]: I0309 09:23:06.812483 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-galera-0\" (UID: \"0ab732e3-1122-4f45-a9af-b36eaa88c19e\") " pod="openstack/openstack-galera-0" Mar 09 09:23:06 crc kubenswrapper[4861]: I0309 09:23:06.923223 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 09 09:23:07 crc kubenswrapper[4861]: W0309 09:23:07.450476 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd4847dbd_e086_4995_b488_c7611173b6e8.slice/crio-3d4e3b249f8679d3df8a380646926af0a3cfdd7bee76416d14e5f3488415e2c1 WatchSource:0}: Error finding container 3d4e3b249f8679d3df8a380646926af0a3cfdd7bee76416d14e5f3488415e2c1: Status 404 returned error can't find the container with id 3d4e3b249f8679d3df8a380646926af0a3cfdd7bee76416d14e5f3488415e2c1 Mar 09 09:23:07 crc kubenswrapper[4861]: I0309 09:23:07.914128 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 09 09:23:07 crc kubenswrapper[4861]: I0309 09:23:07.916010 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 09 09:23:07 crc kubenswrapper[4861]: I0309 09:23:07.921545 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-gb5l9" Mar 09 09:23:07 crc kubenswrapper[4861]: I0309 09:23:07.921679 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Mar 09 09:23:07 crc kubenswrapper[4861]: I0309 09:23:07.921774 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Mar 09 09:23:07 crc kubenswrapper[4861]: I0309 09:23:07.922161 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Mar 09 09:23:07 crc kubenswrapper[4861]: I0309 09:23:07.926536 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 09 09:23:07 crc kubenswrapper[4861]: I0309 09:23:07.928871 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c47bcb9f9-zbnlj" event={"ID":"d4847dbd-e086-4995-b488-c7611173b6e8","Type":"ContainerStarted","Data":"3d4e3b249f8679d3df8a380646926af0a3cfdd7bee76416d14e5f3488415e2c1"} Mar 09 09:23:08 crc kubenswrapper[4861]: I0309 09:23:08.009722 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f095ca7b-1959-4cda-bde8-40ca6446e34d-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"f095ca7b-1959-4cda-bde8-40ca6446e34d\") " pod="openstack/openstack-cell1-galera-0" Mar 09 09:23:08 crc kubenswrapper[4861]: I0309 09:23:08.009809 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f095ca7b-1959-4cda-bde8-40ca6446e34d-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: 
\"f095ca7b-1959-4cda-bde8-40ca6446e34d\") " pod="openstack/openstack-cell1-galera-0" Mar 09 09:23:08 crc kubenswrapper[4861]: I0309 09:23:08.009844 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f095ca7b-1959-4cda-bde8-40ca6446e34d-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"f095ca7b-1959-4cda-bde8-40ca6446e34d\") " pod="openstack/openstack-cell1-galera-0" Mar 09 09:23:08 crc kubenswrapper[4861]: I0309 09:23:08.009866 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-cell1-galera-0\" (UID: \"f095ca7b-1959-4cda-bde8-40ca6446e34d\") " pod="openstack/openstack-cell1-galera-0" Mar 09 09:23:08 crc kubenswrapper[4861]: I0309 09:23:08.009887 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f095ca7b-1959-4cda-bde8-40ca6446e34d-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"f095ca7b-1959-4cda-bde8-40ca6446e34d\") " pod="openstack/openstack-cell1-galera-0" Mar 09 09:23:08 crc kubenswrapper[4861]: I0309 09:23:08.009909 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vgwf\" (UniqueName: \"kubernetes.io/projected/f095ca7b-1959-4cda-bde8-40ca6446e34d-kube-api-access-6vgwf\") pod \"openstack-cell1-galera-0\" (UID: \"f095ca7b-1959-4cda-bde8-40ca6446e34d\") " pod="openstack/openstack-cell1-galera-0" Mar 09 09:23:08 crc kubenswrapper[4861]: I0309 09:23:08.009935 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f095ca7b-1959-4cda-bde8-40ca6446e34d-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: 
\"f095ca7b-1959-4cda-bde8-40ca6446e34d\") " pod="openstack/openstack-cell1-galera-0" Mar 09 09:23:08 crc kubenswrapper[4861]: I0309 09:23:08.009970 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f095ca7b-1959-4cda-bde8-40ca6446e34d-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"f095ca7b-1959-4cda-bde8-40ca6446e34d\") " pod="openstack/openstack-cell1-galera-0" Mar 09 09:23:08 crc kubenswrapper[4861]: I0309 09:23:08.048797 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Mar 09 09:23:08 crc kubenswrapper[4861]: I0309 09:23:08.054800 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 09 09:23:08 crc kubenswrapper[4861]: I0309 09:23:08.056945 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 09 09:23:08 crc kubenswrapper[4861]: I0309 09:23:08.058081 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-wx9nc" Mar 09 09:23:08 crc kubenswrapper[4861]: I0309 09:23:08.058393 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Mar 09 09:23:08 crc kubenswrapper[4861]: I0309 09:23:08.058451 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Mar 09 09:23:08 crc kubenswrapper[4861]: I0309 09:23:08.110992 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f095ca7b-1959-4cda-bde8-40ca6446e34d-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"f095ca7b-1959-4cda-bde8-40ca6446e34d\") " pod="openstack/openstack-cell1-galera-0" Mar 09 09:23:08 crc kubenswrapper[4861]: I0309 09:23:08.111052 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f095ca7b-1959-4cda-bde8-40ca6446e34d-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"f095ca7b-1959-4cda-bde8-40ca6446e34d\") " pod="openstack/openstack-cell1-galera-0" Mar 09 09:23:08 crc kubenswrapper[4861]: I0309 09:23:08.111070 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-cell1-galera-0\" (UID: \"f095ca7b-1959-4cda-bde8-40ca6446e34d\") " pod="openstack/openstack-cell1-galera-0" Mar 09 09:23:08 crc kubenswrapper[4861]: I0309 09:23:08.111091 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f095ca7b-1959-4cda-bde8-40ca6446e34d-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"f095ca7b-1959-4cda-bde8-40ca6446e34d\") " pod="openstack/openstack-cell1-galera-0" Mar 09 09:23:08 crc kubenswrapper[4861]: I0309 09:23:08.111110 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vgwf\" (UniqueName: \"kubernetes.io/projected/f095ca7b-1959-4cda-bde8-40ca6446e34d-kube-api-access-6vgwf\") pod \"openstack-cell1-galera-0\" (UID: \"f095ca7b-1959-4cda-bde8-40ca6446e34d\") " pod="openstack/openstack-cell1-galera-0" Mar 09 09:23:08 crc kubenswrapper[4861]: I0309 09:23:08.111137 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f095ca7b-1959-4cda-bde8-40ca6446e34d-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"f095ca7b-1959-4cda-bde8-40ca6446e34d\") " pod="openstack/openstack-cell1-galera-0" Mar 09 09:23:08 crc kubenswrapper[4861]: I0309 09:23:08.111167 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/a3671a10-52be-44e3-9c3d-11ba14e8e449-memcached-tls-certs\") pod \"memcached-0\" (UID: \"a3671a10-52be-44e3-9c3d-11ba14e8e449\") " pod="openstack/memcached-0" Mar 09 09:23:08 crc kubenswrapper[4861]: I0309 09:23:08.111193 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f095ca7b-1959-4cda-bde8-40ca6446e34d-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"f095ca7b-1959-4cda-bde8-40ca6446e34d\") " pod="openstack/openstack-cell1-galera-0" Mar 09 09:23:08 crc kubenswrapper[4861]: I0309 09:23:08.111216 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a3671a10-52be-44e3-9c3d-11ba14e8e449-config-data\") pod \"memcached-0\" (UID: \"a3671a10-52be-44e3-9c3d-11ba14e8e449\") " pod="openstack/memcached-0" Mar 09 09:23:08 crc kubenswrapper[4861]: I0309 09:23:08.111234 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a3671a10-52be-44e3-9c3d-11ba14e8e449-kolla-config\") pod \"memcached-0\" (UID: \"a3671a10-52be-44e3-9c3d-11ba14e8e449\") " pod="openstack/memcached-0" Mar 09 09:23:08 crc kubenswrapper[4861]: I0309 09:23:08.111283 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3671a10-52be-44e3-9c3d-11ba14e8e449-combined-ca-bundle\") pod \"memcached-0\" (UID: \"a3671a10-52be-44e3-9c3d-11ba14e8e449\") " pod="openstack/memcached-0" Mar 09 09:23:08 crc kubenswrapper[4861]: I0309 09:23:08.111308 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2666v\" (UniqueName: \"kubernetes.io/projected/a3671a10-52be-44e3-9c3d-11ba14e8e449-kube-api-access-2666v\") pod 
\"memcached-0\" (UID: \"a3671a10-52be-44e3-9c3d-11ba14e8e449\") " pod="openstack/memcached-0" Mar 09 09:23:08 crc kubenswrapper[4861]: I0309 09:23:08.111335 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f095ca7b-1959-4cda-bde8-40ca6446e34d-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"f095ca7b-1959-4cda-bde8-40ca6446e34d\") " pod="openstack/openstack-cell1-galera-0" Mar 09 09:23:08 crc kubenswrapper[4861]: I0309 09:23:08.111394 4861 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-cell1-galera-0\" (UID: \"f095ca7b-1959-4cda-bde8-40ca6446e34d\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/openstack-cell1-galera-0" Mar 09 09:23:08 crc kubenswrapper[4861]: I0309 09:23:08.112272 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f095ca7b-1959-4cda-bde8-40ca6446e34d-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"f095ca7b-1959-4cda-bde8-40ca6446e34d\") " pod="openstack/openstack-cell1-galera-0" Mar 09 09:23:08 crc kubenswrapper[4861]: I0309 09:23:08.113196 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f095ca7b-1959-4cda-bde8-40ca6446e34d-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"f095ca7b-1959-4cda-bde8-40ca6446e34d\") " pod="openstack/openstack-cell1-galera-0" Mar 09 09:23:08 crc kubenswrapper[4861]: I0309 09:23:08.114339 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f095ca7b-1959-4cda-bde8-40ca6446e34d-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"f095ca7b-1959-4cda-bde8-40ca6446e34d\") " 
pod="openstack/openstack-cell1-galera-0" Mar 09 09:23:08 crc kubenswrapper[4861]: I0309 09:23:08.115115 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f095ca7b-1959-4cda-bde8-40ca6446e34d-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"f095ca7b-1959-4cda-bde8-40ca6446e34d\") " pod="openstack/openstack-cell1-galera-0" Mar 09 09:23:08 crc kubenswrapper[4861]: I0309 09:23:08.115738 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f095ca7b-1959-4cda-bde8-40ca6446e34d-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"f095ca7b-1959-4cda-bde8-40ca6446e34d\") " pod="openstack/openstack-cell1-galera-0" Mar 09 09:23:08 crc kubenswrapper[4861]: I0309 09:23:08.115807 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f095ca7b-1959-4cda-bde8-40ca6446e34d-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"f095ca7b-1959-4cda-bde8-40ca6446e34d\") " pod="openstack/openstack-cell1-galera-0" Mar 09 09:23:08 crc kubenswrapper[4861]: I0309 09:23:08.131668 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-cell1-galera-0\" (UID: \"f095ca7b-1959-4cda-bde8-40ca6446e34d\") " pod="openstack/openstack-cell1-galera-0" Mar 09 09:23:08 crc kubenswrapper[4861]: I0309 09:23:08.132096 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vgwf\" (UniqueName: \"kubernetes.io/projected/f095ca7b-1959-4cda-bde8-40ca6446e34d-kube-api-access-6vgwf\") pod \"openstack-cell1-galera-0\" (UID: \"f095ca7b-1959-4cda-bde8-40ca6446e34d\") " pod="openstack/openstack-cell1-galera-0" Mar 09 09:23:08 crc kubenswrapper[4861]: I0309 09:23:08.212704 4861 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a3671a10-52be-44e3-9c3d-11ba14e8e449-config-data\") pod \"memcached-0\" (UID: \"a3671a10-52be-44e3-9c3d-11ba14e8e449\") " pod="openstack/memcached-0" Mar 09 09:23:08 crc kubenswrapper[4861]: I0309 09:23:08.212757 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a3671a10-52be-44e3-9c3d-11ba14e8e449-kolla-config\") pod \"memcached-0\" (UID: \"a3671a10-52be-44e3-9c3d-11ba14e8e449\") " pod="openstack/memcached-0" Mar 09 09:23:08 crc kubenswrapper[4861]: I0309 09:23:08.212803 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3671a10-52be-44e3-9c3d-11ba14e8e449-combined-ca-bundle\") pod \"memcached-0\" (UID: \"a3671a10-52be-44e3-9c3d-11ba14e8e449\") " pod="openstack/memcached-0" Mar 09 09:23:08 crc kubenswrapper[4861]: I0309 09:23:08.212836 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2666v\" (UniqueName: \"kubernetes.io/projected/a3671a10-52be-44e3-9c3d-11ba14e8e449-kube-api-access-2666v\") pod \"memcached-0\" (UID: \"a3671a10-52be-44e3-9c3d-11ba14e8e449\") " pod="openstack/memcached-0" Mar 09 09:23:08 crc kubenswrapper[4861]: I0309 09:23:08.213114 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/a3671a10-52be-44e3-9c3d-11ba14e8e449-memcached-tls-certs\") pod \"memcached-0\" (UID: \"a3671a10-52be-44e3-9c3d-11ba14e8e449\") " pod="openstack/memcached-0" Mar 09 09:23:08 crc kubenswrapper[4861]: I0309 09:23:08.214836 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a3671a10-52be-44e3-9c3d-11ba14e8e449-kolla-config\") pod \"memcached-0\" (UID: 
\"a3671a10-52be-44e3-9c3d-11ba14e8e449\") " pod="openstack/memcached-0" Mar 09 09:23:08 crc kubenswrapper[4861]: I0309 09:23:08.219773 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a3671a10-52be-44e3-9c3d-11ba14e8e449-config-data\") pod \"memcached-0\" (UID: \"a3671a10-52be-44e3-9c3d-11ba14e8e449\") " pod="openstack/memcached-0" Mar 09 09:23:08 crc kubenswrapper[4861]: I0309 09:23:08.222433 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3671a10-52be-44e3-9c3d-11ba14e8e449-combined-ca-bundle\") pod \"memcached-0\" (UID: \"a3671a10-52be-44e3-9c3d-11ba14e8e449\") " pod="openstack/memcached-0" Mar 09 09:23:08 crc kubenswrapper[4861]: I0309 09:23:08.222548 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/a3671a10-52be-44e3-9c3d-11ba14e8e449-memcached-tls-certs\") pod \"memcached-0\" (UID: \"a3671a10-52be-44e3-9c3d-11ba14e8e449\") " pod="openstack/memcached-0" Mar 09 09:23:08 crc kubenswrapper[4861]: I0309 09:23:08.228764 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2666v\" (UniqueName: \"kubernetes.io/projected/a3671a10-52be-44e3-9c3d-11ba14e8e449-kube-api-access-2666v\") pod \"memcached-0\" (UID: \"a3671a10-52be-44e3-9c3d-11ba14e8e449\") " pod="openstack/memcached-0" Mar 09 09:23:08 crc kubenswrapper[4861]: I0309 09:23:08.240932 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 09 09:23:08 crc kubenswrapper[4861]: I0309 09:23:08.373720 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Mar 09 09:23:10 crc kubenswrapper[4861]: I0309 09:23:10.295490 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 09 09:23:10 crc kubenswrapper[4861]: I0309 09:23:10.296892 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 09 09:23:10 crc kubenswrapper[4861]: I0309 09:23:10.308030 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 09 09:23:10 crc kubenswrapper[4861]: I0309 09:23:10.312679 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-8bpl5" Mar 09 09:23:10 crc kubenswrapper[4861]: I0309 09:23:10.343959 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tl7ph\" (UniqueName: \"kubernetes.io/projected/8f114ae0-31de-4f17-9bad-1bc0b895d006-kube-api-access-tl7ph\") pod \"kube-state-metrics-0\" (UID: \"8f114ae0-31de-4f17-9bad-1bc0b895d006\") " pod="openstack/kube-state-metrics-0" Mar 09 09:23:10 crc kubenswrapper[4861]: I0309 09:23:10.444807 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tl7ph\" (UniqueName: \"kubernetes.io/projected/8f114ae0-31de-4f17-9bad-1bc0b895d006-kube-api-access-tl7ph\") pod \"kube-state-metrics-0\" (UID: \"8f114ae0-31de-4f17-9bad-1bc0b895d006\") " pod="openstack/kube-state-metrics-0" Mar 09 09:23:10 crc kubenswrapper[4861]: I0309 09:23:10.468839 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tl7ph\" (UniqueName: \"kubernetes.io/projected/8f114ae0-31de-4f17-9bad-1bc0b895d006-kube-api-access-tl7ph\") pod \"kube-state-metrics-0\" (UID: \"8f114ae0-31de-4f17-9bad-1bc0b895d006\") " pod="openstack/kube-state-metrics-0" Mar 09 09:23:10 crc kubenswrapper[4861]: I0309 09:23:10.613231 4861 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 09 09:23:13 crc kubenswrapper[4861]: I0309 09:23:13.437820 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-s7nq5"] Mar 09 09:23:13 crc kubenswrapper[4861]: I0309 09:23:13.439277 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-s7nq5" Mar 09 09:23:13 crc kubenswrapper[4861]: I0309 09:23:13.441104 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Mar 09 09:23:13 crc kubenswrapper[4861]: I0309 09:23:13.441152 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-k4bbx" Mar 09 09:23:13 crc kubenswrapper[4861]: I0309 09:23:13.442088 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Mar 09 09:23:13 crc kubenswrapper[4861]: I0309 09:23:13.452746 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-hmb5w"] Mar 09 09:23:13 crc kubenswrapper[4861]: I0309 09:23:13.454388 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-hmb5w"
Mar 09 09:23:13 crc kubenswrapper[4861]: I0309 09:23:13.459533 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-s7nq5"]
Mar 09 09:23:13 crc kubenswrapper[4861]: I0309 09:23:13.476575 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-hmb5w"]
Mar 09 09:23:13 crc kubenswrapper[4861]: I0309 09:23:13.606889 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/cafa7cbf-ac96-4bb5-a33e-90d69df5d797-var-run\") pod \"ovn-controller-ovs-hmb5w\" (UID: \"cafa7cbf-ac96-4bb5-a33e-90d69df5d797\") " pod="openstack/ovn-controller-ovs-hmb5w"
Mar 09 09:23:13 crc kubenswrapper[4861]: I0309 09:23:13.606938 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pf29b\" (UniqueName: \"kubernetes.io/projected/4e354b06-2ae2-41af-b5d7-2909bca8cff6-kube-api-access-pf29b\") pod \"ovn-controller-s7nq5\" (UID: \"4e354b06-2ae2-41af-b5d7-2909bca8cff6\") " pod="openstack/ovn-controller-s7nq5"
Mar 09 09:23:13 crc kubenswrapper[4861]: I0309 09:23:13.606959 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4e354b06-2ae2-41af-b5d7-2909bca8cff6-scripts\") pod \"ovn-controller-s7nq5\" (UID: \"4e354b06-2ae2-41af-b5d7-2909bca8cff6\") " pod="openstack/ovn-controller-s7nq5"
Mar 09 09:23:13 crc kubenswrapper[4861]: I0309 09:23:13.606991 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/cafa7cbf-ac96-4bb5-a33e-90d69df5d797-var-lib\") pod \"ovn-controller-ovs-hmb5w\" (UID: \"cafa7cbf-ac96-4bb5-a33e-90d69df5d797\") " pod="openstack/ovn-controller-ovs-hmb5w"
Mar 09 09:23:13 crc kubenswrapper[4861]: I0309 09:23:13.607500 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e354b06-2ae2-41af-b5d7-2909bca8cff6-ovn-controller-tls-certs\") pod \"ovn-controller-s7nq5\" (UID: \"4e354b06-2ae2-41af-b5d7-2909bca8cff6\") " pod="openstack/ovn-controller-s7nq5"
Mar 09 09:23:13 crc kubenswrapper[4861]: I0309 09:23:13.607644 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4e354b06-2ae2-41af-b5d7-2909bca8cff6-var-run\") pod \"ovn-controller-s7nq5\" (UID: \"4e354b06-2ae2-41af-b5d7-2909bca8cff6\") " pod="openstack/ovn-controller-s7nq5"
Mar 09 09:23:13 crc kubenswrapper[4861]: I0309 09:23:13.607670 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/cafa7cbf-ac96-4bb5-a33e-90d69df5d797-var-log\") pod \"ovn-controller-ovs-hmb5w\" (UID: \"cafa7cbf-ac96-4bb5-a33e-90d69df5d797\") " pod="openstack/ovn-controller-ovs-hmb5w"
Mar 09 09:23:13 crc kubenswrapper[4861]: I0309 09:23:13.607688 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqf76\" (UniqueName: \"kubernetes.io/projected/cafa7cbf-ac96-4bb5-a33e-90d69df5d797-kube-api-access-xqf76\") pod \"ovn-controller-ovs-hmb5w\" (UID: \"cafa7cbf-ac96-4bb5-a33e-90d69df5d797\") " pod="openstack/ovn-controller-ovs-hmb5w"
Mar 09 09:23:13 crc kubenswrapper[4861]: I0309 09:23:13.607711 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cafa7cbf-ac96-4bb5-a33e-90d69df5d797-scripts\") pod \"ovn-controller-ovs-hmb5w\" (UID: \"cafa7cbf-ac96-4bb5-a33e-90d69df5d797\") " pod="openstack/ovn-controller-ovs-hmb5w"
Mar 09 09:23:13 crc kubenswrapper[4861]: I0309 09:23:13.607754 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/cafa7cbf-ac96-4bb5-a33e-90d69df5d797-etc-ovs\") pod \"ovn-controller-ovs-hmb5w\" (UID: \"cafa7cbf-ac96-4bb5-a33e-90d69df5d797\") " pod="openstack/ovn-controller-ovs-hmb5w"
Mar 09 09:23:13 crc kubenswrapper[4861]: I0309 09:23:13.607789 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/4e354b06-2ae2-41af-b5d7-2909bca8cff6-var-log-ovn\") pod \"ovn-controller-s7nq5\" (UID: \"4e354b06-2ae2-41af-b5d7-2909bca8cff6\") " pod="openstack/ovn-controller-s7nq5"
Mar 09 09:23:13 crc kubenswrapper[4861]: I0309 09:23:13.607805 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e354b06-2ae2-41af-b5d7-2909bca8cff6-combined-ca-bundle\") pod \"ovn-controller-s7nq5\" (UID: \"4e354b06-2ae2-41af-b5d7-2909bca8cff6\") " pod="openstack/ovn-controller-s7nq5"
Mar 09 09:23:13 crc kubenswrapper[4861]: I0309 09:23:13.607830 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/4e354b06-2ae2-41af-b5d7-2909bca8cff6-var-run-ovn\") pod \"ovn-controller-s7nq5\" (UID: \"4e354b06-2ae2-41af-b5d7-2909bca8cff6\") " pod="openstack/ovn-controller-s7nq5"
Mar 09 09:23:13 crc kubenswrapper[4861]: I0309 09:23:13.709670 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e354b06-2ae2-41af-b5d7-2909bca8cff6-ovn-controller-tls-certs\") pod \"ovn-controller-s7nq5\" (UID: \"4e354b06-2ae2-41af-b5d7-2909bca8cff6\") " pod="openstack/ovn-controller-s7nq5"
Mar 09 09:23:13 crc kubenswrapper[4861]: I0309 09:23:13.709795 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4e354b06-2ae2-41af-b5d7-2909bca8cff6-var-run\") pod \"ovn-controller-s7nq5\" (UID: \"4e354b06-2ae2-41af-b5d7-2909bca8cff6\") " pod="openstack/ovn-controller-s7nq5"
Mar 09 09:23:13 crc kubenswrapper[4861]: I0309 09:23:13.709830 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/cafa7cbf-ac96-4bb5-a33e-90d69df5d797-var-log\") pod \"ovn-controller-ovs-hmb5w\" (UID: \"cafa7cbf-ac96-4bb5-a33e-90d69df5d797\") " pod="openstack/ovn-controller-ovs-hmb5w"
Mar 09 09:23:13 crc kubenswrapper[4861]: I0309 09:23:13.709847 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqf76\" (UniqueName: \"kubernetes.io/projected/cafa7cbf-ac96-4bb5-a33e-90d69df5d797-kube-api-access-xqf76\") pod \"ovn-controller-ovs-hmb5w\" (UID: \"cafa7cbf-ac96-4bb5-a33e-90d69df5d797\") " pod="openstack/ovn-controller-ovs-hmb5w"
Mar 09 09:23:13 crc kubenswrapper[4861]: I0309 09:23:13.709865 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cafa7cbf-ac96-4bb5-a33e-90d69df5d797-scripts\") pod \"ovn-controller-ovs-hmb5w\" (UID: \"cafa7cbf-ac96-4bb5-a33e-90d69df5d797\") " pod="openstack/ovn-controller-ovs-hmb5w"
Mar 09 09:23:13 crc kubenswrapper[4861]: I0309 09:23:13.709892 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/cafa7cbf-ac96-4bb5-a33e-90d69df5d797-etc-ovs\") pod \"ovn-controller-ovs-hmb5w\" (UID: \"cafa7cbf-ac96-4bb5-a33e-90d69df5d797\") " pod="openstack/ovn-controller-ovs-hmb5w"
Mar 09 09:23:13 crc kubenswrapper[4861]: I0309 09:23:13.709912 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/4e354b06-2ae2-41af-b5d7-2909bca8cff6-var-log-ovn\") pod \"ovn-controller-s7nq5\" (UID: \"4e354b06-2ae2-41af-b5d7-2909bca8cff6\") " pod="openstack/ovn-controller-s7nq5"
Mar 09 09:23:13 crc kubenswrapper[4861]: I0309 09:23:13.709927 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e354b06-2ae2-41af-b5d7-2909bca8cff6-combined-ca-bundle\") pod \"ovn-controller-s7nq5\" (UID: \"4e354b06-2ae2-41af-b5d7-2909bca8cff6\") " pod="openstack/ovn-controller-s7nq5"
Mar 09 09:23:13 crc kubenswrapper[4861]: I0309 09:23:13.709957 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/4e354b06-2ae2-41af-b5d7-2909bca8cff6-var-run-ovn\") pod \"ovn-controller-s7nq5\" (UID: \"4e354b06-2ae2-41af-b5d7-2909bca8cff6\") " pod="openstack/ovn-controller-s7nq5"
Mar 09 09:23:13 crc kubenswrapper[4861]: I0309 09:23:13.710008 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/cafa7cbf-ac96-4bb5-a33e-90d69df5d797-var-run\") pod \"ovn-controller-ovs-hmb5w\" (UID: \"cafa7cbf-ac96-4bb5-a33e-90d69df5d797\") " pod="openstack/ovn-controller-ovs-hmb5w"
Mar 09 09:23:13 crc kubenswrapper[4861]: I0309 09:23:13.710036 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pf29b\" (UniqueName: \"kubernetes.io/projected/4e354b06-2ae2-41af-b5d7-2909bca8cff6-kube-api-access-pf29b\") pod \"ovn-controller-s7nq5\" (UID: \"4e354b06-2ae2-41af-b5d7-2909bca8cff6\") " pod="openstack/ovn-controller-s7nq5"
Mar 09 09:23:13 crc kubenswrapper[4861]: I0309 09:23:13.710058 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4e354b06-2ae2-41af-b5d7-2909bca8cff6-scripts\") pod \"ovn-controller-s7nq5\" (UID: \"4e354b06-2ae2-41af-b5d7-2909bca8cff6\") " pod="openstack/ovn-controller-s7nq5"
Mar 09 09:23:13 crc kubenswrapper[4861]: I0309 09:23:13.710114 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/cafa7cbf-ac96-4bb5-a33e-90d69df5d797-var-lib\") pod \"ovn-controller-ovs-hmb5w\" (UID: \"cafa7cbf-ac96-4bb5-a33e-90d69df5d797\") " pod="openstack/ovn-controller-ovs-hmb5w"
Mar 09 09:23:13 crc kubenswrapper[4861]: I0309 09:23:13.710523 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/cafa7cbf-ac96-4bb5-a33e-90d69df5d797-var-log\") pod \"ovn-controller-ovs-hmb5w\" (UID: \"cafa7cbf-ac96-4bb5-a33e-90d69df5d797\") " pod="openstack/ovn-controller-ovs-hmb5w"
Mar 09 09:23:13 crc kubenswrapper[4861]: I0309 09:23:13.710549 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/cafa7cbf-ac96-4bb5-a33e-90d69df5d797-etc-ovs\") pod \"ovn-controller-ovs-hmb5w\" (UID: \"cafa7cbf-ac96-4bb5-a33e-90d69df5d797\") " pod="openstack/ovn-controller-ovs-hmb5w"
Mar 09 09:23:13 crc kubenswrapper[4861]: I0309 09:23:13.710577 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4e354b06-2ae2-41af-b5d7-2909bca8cff6-var-run\") pod \"ovn-controller-s7nq5\" (UID: \"4e354b06-2ae2-41af-b5d7-2909bca8cff6\") " pod="openstack/ovn-controller-s7nq5"
Mar 09 09:23:13 crc kubenswrapper[4861]: I0309 09:23:13.710621 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/4e354b06-2ae2-41af-b5d7-2909bca8cff6-var-log-ovn\") pod \"ovn-controller-s7nq5\" (UID: \"4e354b06-2ae2-41af-b5d7-2909bca8cff6\") " pod="openstack/ovn-controller-s7nq5"
Mar 09 09:23:13 crc kubenswrapper[4861]: I0309 09:23:13.710727 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/4e354b06-2ae2-41af-b5d7-2909bca8cff6-var-run-ovn\") pod \"ovn-controller-s7nq5\" (UID: \"4e354b06-2ae2-41af-b5d7-2909bca8cff6\") " pod="openstack/ovn-controller-s7nq5"
Mar 09 09:23:13 crc kubenswrapper[4861]: I0309 09:23:13.710771 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/cafa7cbf-ac96-4bb5-a33e-90d69df5d797-var-run\") pod \"ovn-controller-ovs-hmb5w\" (UID: \"cafa7cbf-ac96-4bb5-a33e-90d69df5d797\") " pod="openstack/ovn-controller-ovs-hmb5w"
Mar 09 09:23:13 crc kubenswrapper[4861]: I0309 09:23:13.711842 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cafa7cbf-ac96-4bb5-a33e-90d69df5d797-scripts\") pod \"ovn-controller-ovs-hmb5w\" (UID: \"cafa7cbf-ac96-4bb5-a33e-90d69df5d797\") " pod="openstack/ovn-controller-ovs-hmb5w"
Mar 09 09:23:13 crc kubenswrapper[4861]: I0309 09:23:13.714522 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4e354b06-2ae2-41af-b5d7-2909bca8cff6-scripts\") pod \"ovn-controller-s7nq5\" (UID: \"4e354b06-2ae2-41af-b5d7-2909bca8cff6\") " pod="openstack/ovn-controller-s7nq5"
Mar 09 09:23:13 crc kubenswrapper[4861]: I0309 09:23:13.714699 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/cafa7cbf-ac96-4bb5-a33e-90d69df5d797-var-lib\") pod \"ovn-controller-ovs-hmb5w\" (UID: \"cafa7cbf-ac96-4bb5-a33e-90d69df5d797\") " pod="openstack/ovn-controller-ovs-hmb5w"
Mar 09 09:23:13 crc kubenswrapper[4861]: I0309 09:23:13.724843 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqf76\" (UniqueName: \"kubernetes.io/projected/cafa7cbf-ac96-4bb5-a33e-90d69df5d797-kube-api-access-xqf76\") pod \"ovn-controller-ovs-hmb5w\" (UID: \"cafa7cbf-ac96-4bb5-a33e-90d69df5d797\") " pod="openstack/ovn-controller-ovs-hmb5w"
Mar 09 09:23:13 crc kubenswrapper[4861]: I0309 09:23:13.725072 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/4e354b06-2ae2-41af-b5d7-2909bca8cff6-ovn-controller-tls-certs\") pod \"ovn-controller-s7nq5\" (UID: \"4e354b06-2ae2-41af-b5d7-2909bca8cff6\") " pod="openstack/ovn-controller-s7nq5"
Mar 09 09:23:13 crc kubenswrapper[4861]: I0309 09:23:13.727928 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e354b06-2ae2-41af-b5d7-2909bca8cff6-combined-ca-bundle\") pod \"ovn-controller-s7nq5\" (UID: \"4e354b06-2ae2-41af-b5d7-2909bca8cff6\") " pod="openstack/ovn-controller-s7nq5"
Mar 09 09:23:13 crc kubenswrapper[4861]: I0309 09:23:13.731752 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pf29b\" (UniqueName: \"kubernetes.io/projected/4e354b06-2ae2-41af-b5d7-2909bca8cff6-kube-api-access-pf29b\") pod \"ovn-controller-s7nq5\" (UID: \"4e354b06-2ae2-41af-b5d7-2909bca8cff6\") " pod="openstack/ovn-controller-s7nq5"
Mar 09 09:23:13 crc kubenswrapper[4861]: I0309 09:23:13.757583 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-s7nq5"
Mar 09 09:23:13 crc kubenswrapper[4861]: I0309 09:23:13.780716 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-hmb5w"
Mar 09 09:23:14 crc kubenswrapper[4861]: I0309 09:23:14.302862 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"]
Mar 09 09:23:14 crc kubenswrapper[4861]: I0309 09:23:14.304311 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0"
Mar 09 09:23:14 crc kubenswrapper[4861]: I0309 09:23:14.308305 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts"
Mar 09 09:23:14 crc kubenswrapper[4861]: I0309 09:23:14.308543 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config"
Mar 09 09:23:14 crc kubenswrapper[4861]: I0309 09:23:14.308694 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-k9pr2"
Mar 09 09:23:14 crc kubenswrapper[4861]: I0309 09:23:14.309369 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs"
Mar 09 09:23:14 crc kubenswrapper[4861]: I0309 09:23:14.317820 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Mar 09 09:23:14 crc kubenswrapper[4861]: I0309 09:23:14.343634 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics"
Mar 09 09:23:14 crc kubenswrapper[4861]: I0309 09:23:14.422455 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57sf7\" (UniqueName: \"kubernetes.io/projected/1d597158-3a33-4518-a0b9-37cf5b309a28-kube-api-access-57sf7\") pod \"ovsdbserver-nb-0\" (UID: \"1d597158-3a33-4518-a0b9-37cf5b309a28\") " pod="openstack/ovsdbserver-nb-0"
Mar 09 09:23:14 crc kubenswrapper[4861]: I0309 09:23:14.422518 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d597158-3a33-4518-a0b9-37cf5b309a28-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"1d597158-3a33-4518-a0b9-37cf5b309a28\") " pod="openstack/ovsdbserver-nb-0"
Mar 09 09:23:14 crc kubenswrapper[4861]: I0309 09:23:14.422585 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d597158-3a33-4518-a0b9-37cf5b309a28-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"1d597158-3a33-4518-a0b9-37cf5b309a28\") " pod="openstack/ovsdbserver-nb-0"
Mar 09 09:23:14 crc kubenswrapper[4861]: I0309 09:23:14.422625 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: \"1d597158-3a33-4518-a0b9-37cf5b309a28\") " pod="openstack/ovsdbserver-nb-0"
Mar 09 09:23:14 crc kubenswrapper[4861]: I0309 09:23:14.422673 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d597158-3a33-4518-a0b9-37cf5b309a28-config\") pod \"ovsdbserver-nb-0\" (UID: \"1d597158-3a33-4518-a0b9-37cf5b309a28\") " pod="openstack/ovsdbserver-nb-0"
Mar 09 09:23:14 crc kubenswrapper[4861]: I0309 09:23:14.422710 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1d597158-3a33-4518-a0b9-37cf5b309a28-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"1d597158-3a33-4518-a0b9-37cf5b309a28\") " pod="openstack/ovsdbserver-nb-0"
Mar 09 09:23:14 crc kubenswrapper[4861]: I0309 09:23:14.422730 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1d597158-3a33-4518-a0b9-37cf5b309a28-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"1d597158-3a33-4518-a0b9-37cf5b309a28\") " pod="openstack/ovsdbserver-nb-0"
Mar 09 09:23:14 crc kubenswrapper[4861]: I0309 09:23:14.422749 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d597158-3a33-4518-a0b9-37cf5b309a28-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"1d597158-3a33-4518-a0b9-37cf5b309a28\") " pod="openstack/ovsdbserver-nb-0"
Mar 09 09:23:14 crc kubenswrapper[4861]: I0309 09:23:14.523894 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1d597158-3a33-4518-a0b9-37cf5b309a28-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"1d597158-3a33-4518-a0b9-37cf5b309a28\") " pod="openstack/ovsdbserver-nb-0"
Mar 09 09:23:14 crc kubenswrapper[4861]: I0309 09:23:14.523973 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1d597158-3a33-4518-a0b9-37cf5b309a28-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"1d597158-3a33-4518-a0b9-37cf5b309a28\") " pod="openstack/ovsdbserver-nb-0"
Mar 09 09:23:14 crc kubenswrapper[4861]: I0309 09:23:14.524004 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d597158-3a33-4518-a0b9-37cf5b309a28-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"1d597158-3a33-4518-a0b9-37cf5b309a28\") " pod="openstack/ovsdbserver-nb-0"
Mar 09 09:23:14 crc kubenswrapper[4861]: I0309 09:23:14.524044 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57sf7\" (UniqueName: \"kubernetes.io/projected/1d597158-3a33-4518-a0b9-37cf5b309a28-kube-api-access-57sf7\") pod \"ovsdbserver-nb-0\" (UID: \"1d597158-3a33-4518-a0b9-37cf5b309a28\") " pod="openstack/ovsdbserver-nb-0"
Mar 09 09:23:14 crc kubenswrapper[4861]: I0309 09:23:14.524077 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d597158-3a33-4518-a0b9-37cf5b309a28-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"1d597158-3a33-4518-a0b9-37cf5b309a28\") " pod="openstack/ovsdbserver-nb-0"
Mar 09 09:23:14 crc kubenswrapper[4861]: I0309 09:23:14.524113 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d597158-3a33-4518-a0b9-37cf5b309a28-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"1d597158-3a33-4518-a0b9-37cf5b309a28\") " pod="openstack/ovsdbserver-nb-0"
Mar 09 09:23:14 crc kubenswrapper[4861]: I0309 09:23:14.524154 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: \"1d597158-3a33-4518-a0b9-37cf5b309a28\") " pod="openstack/ovsdbserver-nb-0"
Mar 09 09:23:14 crc kubenswrapper[4861]: I0309 09:23:14.524222 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d597158-3a33-4518-a0b9-37cf5b309a28-config\") pod \"ovsdbserver-nb-0\" (UID: \"1d597158-3a33-4518-a0b9-37cf5b309a28\") " pod="openstack/ovsdbserver-nb-0"
Mar 09 09:23:14 crc kubenswrapper[4861]: I0309 09:23:14.524604 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1d597158-3a33-4518-a0b9-37cf5b309a28-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"1d597158-3a33-4518-a0b9-37cf5b309a28\") " pod="openstack/ovsdbserver-nb-0"
Mar 09 09:23:14 crc kubenswrapper[4861]: I0309 09:23:14.525073 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d597158-3a33-4518-a0b9-37cf5b309a28-config\") pod \"ovsdbserver-nb-0\" (UID: \"1d597158-3a33-4518-a0b9-37cf5b309a28\") " pod="openstack/ovsdbserver-nb-0"
Mar 09 09:23:14 crc kubenswrapper[4861]: I0309 09:23:14.525322 4861 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: \"1d597158-3a33-4518-a0b9-37cf5b309a28\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/ovsdbserver-nb-0"
Mar 09 09:23:14 crc kubenswrapper[4861]: I0309 09:23:14.525631 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1d597158-3a33-4518-a0b9-37cf5b309a28-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"1d597158-3a33-4518-a0b9-37cf5b309a28\") " pod="openstack/ovsdbserver-nb-0"
Mar 09 09:23:14 crc kubenswrapper[4861]: I0309 09:23:14.528511 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d597158-3a33-4518-a0b9-37cf5b309a28-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"1d597158-3a33-4518-a0b9-37cf5b309a28\") " pod="openstack/ovsdbserver-nb-0"
Mar 09 09:23:14 crc kubenswrapper[4861]: I0309 09:23:14.529797 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d597158-3a33-4518-a0b9-37cf5b309a28-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"1d597158-3a33-4518-a0b9-37cf5b309a28\") " pod="openstack/ovsdbserver-nb-0"
Mar 09 09:23:14 crc kubenswrapper[4861]: I0309 09:23:14.531049 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d597158-3a33-4518-a0b9-37cf5b309a28-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"1d597158-3a33-4518-a0b9-37cf5b309a28\") " pod="openstack/ovsdbserver-nb-0"
Mar 09 09:23:14 crc kubenswrapper[4861]: I0309 09:23:14.545686 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: \"1d597158-3a33-4518-a0b9-37cf5b309a28\") " pod="openstack/ovsdbserver-nb-0"
Mar 09 09:23:14 crc kubenswrapper[4861]: I0309 09:23:14.546564 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-57sf7\" (UniqueName: \"kubernetes.io/projected/1d597158-3a33-4518-a0b9-37cf5b309a28-kube-api-access-57sf7\") pod \"ovsdbserver-nb-0\" (UID: \"1d597158-3a33-4518-a0b9-37cf5b309a28\") " pod="openstack/ovsdbserver-nb-0"
Mar 09 09:23:14 crc kubenswrapper[4861]: I0309 09:23:14.659950 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0"
Mar 09 09:23:17 crc kubenswrapper[4861]: I0309 09:23:17.241609 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"]
Mar 09 09:23:17 crc kubenswrapper[4861]: I0309 09:23:17.244280 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0"
Mar 09 09:23:17 crc kubenswrapper[4861]: I0309 09:23:17.246350 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs"
Mar 09 09:23:17 crc kubenswrapper[4861]: I0309 09:23:17.246662 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config"
Mar 09 09:23:17 crc kubenswrapper[4861]: I0309 09:23:17.246901 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-9x5jt"
Mar 09 09:23:17 crc kubenswrapper[4861]: I0309 09:23:17.247129 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts"
Mar 09 09:23:17 crc kubenswrapper[4861]: I0309 09:23:17.272710 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"]
Mar 09 09:23:17 crc kubenswrapper[4861]: I0309 09:23:17.387473 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-sb-0\" (UID: \"fc895133-add5-4388-8e97-1b0d16306648\") " pod="openstack/ovsdbserver-sb-0"
Mar 09 09:23:17 crc kubenswrapper[4861]: I0309 09:23:17.387596 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc895133-add5-4388-8e97-1b0d16306648-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"fc895133-add5-4388-8e97-1b0d16306648\") " pod="openstack/ovsdbserver-sb-0"
Mar 09 09:23:17 crc kubenswrapper[4861]: I0309 09:23:17.387632 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc895133-add5-4388-8e97-1b0d16306648-config\") pod \"ovsdbserver-sb-0\" (UID: \"fc895133-add5-4388-8e97-1b0d16306648\") " pod="openstack/ovsdbserver-sb-0"
Mar 09 09:23:17 crc kubenswrapper[4861]: I0309 09:23:17.387667 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxgnv\" (UniqueName: \"kubernetes.io/projected/fc895133-add5-4388-8e97-1b0d16306648-kube-api-access-fxgnv\") pod \"ovsdbserver-sb-0\" (UID: \"fc895133-add5-4388-8e97-1b0d16306648\") " pod="openstack/ovsdbserver-sb-0"
Mar 09 09:23:17 crc kubenswrapper[4861]: I0309 09:23:17.387826 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc895133-add5-4388-8e97-1b0d16306648-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"fc895133-add5-4388-8e97-1b0d16306648\") " pod="openstack/ovsdbserver-sb-0"
Mar 09 09:23:17 crc kubenswrapper[4861]: I0309 09:23:17.387963 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fc895133-add5-4388-8e97-1b0d16306648-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"fc895133-add5-4388-8e97-1b0d16306648\") " pod="openstack/ovsdbserver-sb-0"
Mar 09 09:23:17 crc kubenswrapper[4861]: I0309 09:23:17.388013 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc895133-add5-4388-8e97-1b0d16306648-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"fc895133-add5-4388-8e97-1b0d16306648\") " pod="openstack/ovsdbserver-sb-0"
Mar 09 09:23:17 crc kubenswrapper[4861]: I0309 09:23:17.388212 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/fc895133-add5-4388-8e97-1b0d16306648-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"fc895133-add5-4388-8e97-1b0d16306648\") " pod="openstack/ovsdbserver-sb-0"
Mar 09 09:23:17 crc kubenswrapper[4861]: I0309 09:23:17.490092 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc895133-add5-4388-8e97-1b0d16306648-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"fc895133-add5-4388-8e97-1b0d16306648\") " pod="openstack/ovsdbserver-sb-0"
Mar 09 09:23:17 crc kubenswrapper[4861]: I0309 09:23:17.490178 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc895133-add5-4388-8e97-1b0d16306648-config\") pod \"ovsdbserver-sb-0\" (UID: \"fc895133-add5-4388-8e97-1b0d16306648\") " pod="openstack/ovsdbserver-sb-0"
Mar 09 09:23:17 crc kubenswrapper[4861]: I0309 09:23:17.490227 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxgnv\" (UniqueName: \"kubernetes.io/projected/fc895133-add5-4388-8e97-1b0d16306648-kube-api-access-fxgnv\") pod \"ovsdbserver-sb-0\" (UID: \"fc895133-add5-4388-8e97-1b0d16306648\") " pod="openstack/ovsdbserver-sb-0"
Mar 09 09:23:17 crc kubenswrapper[4861]: I0309 09:23:17.490275 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc895133-add5-4388-8e97-1b0d16306648-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"fc895133-add5-4388-8e97-1b0d16306648\") " pod="openstack/ovsdbserver-sb-0"
Mar 09 09:23:17 crc kubenswrapper[4861]: I0309 09:23:17.490334 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fc895133-add5-4388-8e97-1b0d16306648-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"fc895133-add5-4388-8e97-1b0d16306648\") " pod="openstack/ovsdbserver-sb-0"
Mar 09 09:23:17 crc kubenswrapper[4861]: I0309 09:23:17.490397 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc895133-add5-4388-8e97-1b0d16306648-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"fc895133-add5-4388-8e97-1b0d16306648\") " pod="openstack/ovsdbserver-sb-0"
Mar 09 09:23:17 crc kubenswrapper[4861]: I0309 09:23:17.490441 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/fc895133-add5-4388-8e97-1b0d16306648-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"fc895133-add5-4388-8e97-1b0d16306648\") " pod="openstack/ovsdbserver-sb-0"
Mar 09 09:23:17 crc kubenswrapper[4861]: I0309 09:23:17.490549 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-sb-0\" (UID: \"fc895133-add5-4388-8e97-1b0d16306648\") " pod="openstack/ovsdbserver-sb-0"
Mar 09 09:23:17 crc kubenswrapper[4861]: I0309 09:23:17.491066 4861 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-sb-0\" (UID: \"fc895133-add5-4388-8e97-1b0d16306648\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/ovsdbserver-sb-0"
Mar 09 09:23:17 crc kubenswrapper[4861]: I0309 09:23:17.491350 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/fc895133-add5-4388-8e97-1b0d16306648-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"fc895133-add5-4388-8e97-1b0d16306648\") " pod="openstack/ovsdbserver-sb-0"
Mar 09 09:23:17 crc kubenswrapper[4861]: I0309 09:23:17.491568 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc895133-add5-4388-8e97-1b0d16306648-config\") pod \"ovsdbserver-sb-0\" (UID: \"fc895133-add5-4388-8e97-1b0d16306648\") " pod="openstack/ovsdbserver-sb-0"
Mar 09 09:23:17 crc kubenswrapper[4861]: I0309 09:23:17.493089 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fc895133-add5-4388-8e97-1b0d16306648-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"fc895133-add5-4388-8e97-1b0d16306648\") " pod="openstack/ovsdbserver-sb-0"
Mar 09 09:23:17 crc kubenswrapper[4861]: I0309 09:23:17.497338 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc895133-add5-4388-8e97-1b0d16306648-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"fc895133-add5-4388-8e97-1b0d16306648\") " pod="openstack/ovsdbserver-sb-0"
Mar 09 09:23:17 crc kubenswrapper[4861]: I0309 09:23:17.499260 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc895133-add5-4388-8e97-1b0d16306648-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"fc895133-add5-4388-8e97-1b0d16306648\") " pod="openstack/ovsdbserver-sb-0"
Mar 09 09:23:17 crc kubenswrapper[4861]: I0309 09:23:17.504157 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc895133-add5-4388-8e97-1b0d16306648-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"fc895133-add5-4388-8e97-1b0d16306648\") " pod="openstack/ovsdbserver-sb-0"
Mar 09 09:23:17 crc kubenswrapper[4861]: I0309 09:23:17.517030 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxgnv\" (UniqueName: \"kubernetes.io/projected/fc895133-add5-4388-8e97-1b0d16306648-kube-api-access-fxgnv\") pod \"ovsdbserver-sb-0\" (UID: \"fc895133-add5-4388-8e97-1b0d16306648\") " pod="openstack/ovsdbserver-sb-0"
Mar 09 09:23:17 crc kubenswrapper[4861]: I0309 09:23:17.526000 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-sb-0\" (UID: \"fc895133-add5-4388-8e97-1b0d16306648\") " pod="openstack/ovsdbserver-sb-0"
Mar 09 09:23:17 crc kubenswrapper[4861]: I0309 09:23:17.567979 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0"
Mar 09 09:23:18 crc kubenswrapper[4861]: E0309 09:23:18.918505 4861 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:944833b50342d462c10637342bc85197a8cf099a3650df12e23854dde99af514"
Mar 09 09:23:18 crc kubenswrapper[4861]: E0309 09:23:18.918729 4861 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:944833b50342d462c10637342bc85197a8cf099a3650df12e23854dde99af514,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-st6tz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities
{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-589db6c89c-wzkz2_openstack(a3b1b7ac-3322-4520-892a-1c881ee10c50): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 09 09:23:18 crc kubenswrapper[4861]: E0309 09:23:18.921563 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-589db6c89c-wzkz2" podUID="a3b1b7ac-3322-4520-892a-1c881ee10c50" Mar 09 09:23:18 crc kubenswrapper[4861]: E0309 09:23:18.967818 4861 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:944833b50342d462c10637342bc85197a8cf099a3650df12e23854dde99af514" Mar 09 09:23:18 crc kubenswrapper[4861]: E0309 09:23:18.968052 4861 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:944833b50342d462c10637342bc85197a8cf099a3650df12e23854dde99af514,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dhqxr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-86bbd886cf-47h2z_openstack(aedcdae4-7043-4e90-a4cd-b8d58ceeafb3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 09 09:23:18 crc kubenswrapper[4861]: E0309 09:23:18.969665 4861 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-86bbd886cf-47h2z" podUID="aedcdae4-7043-4e90-a4cd-b8d58ceeafb3" Mar 09 09:23:19 crc kubenswrapper[4861]: I0309 09:23:19.547711 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86bbd886cf-47h2z" Mar 09 09:23:19 crc kubenswrapper[4861]: I0309 09:23:19.557522 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-589db6c89c-wzkz2" Mar 09 09:23:19 crc kubenswrapper[4861]: I0309 09:23:19.735647 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-st6tz\" (UniqueName: \"kubernetes.io/projected/a3b1b7ac-3322-4520-892a-1c881ee10c50-kube-api-access-st6tz\") pod \"a3b1b7ac-3322-4520-892a-1c881ee10c50\" (UID: \"a3b1b7ac-3322-4520-892a-1c881ee10c50\") " Mar 09 09:23:19 crc kubenswrapper[4861]: I0309 09:23:19.735715 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aedcdae4-7043-4e90-a4cd-b8d58ceeafb3-dns-svc\") pod \"aedcdae4-7043-4e90-a4cd-b8d58ceeafb3\" (UID: \"aedcdae4-7043-4e90-a4cd-b8d58ceeafb3\") " Mar 09 09:23:19 crc kubenswrapper[4861]: I0309 09:23:19.735805 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aedcdae4-7043-4e90-a4cd-b8d58ceeafb3-config\") pod \"aedcdae4-7043-4e90-a4cd-b8d58ceeafb3\" (UID: \"aedcdae4-7043-4e90-a4cd-b8d58ceeafb3\") " Mar 09 09:23:19 crc kubenswrapper[4861]: I0309 09:23:19.735830 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dhqxr\" (UniqueName: \"kubernetes.io/projected/aedcdae4-7043-4e90-a4cd-b8d58ceeafb3-kube-api-access-dhqxr\") pod \"aedcdae4-7043-4e90-a4cd-b8d58ceeafb3\" 
(UID: \"aedcdae4-7043-4e90-a4cd-b8d58ceeafb3\") " Mar 09 09:23:19 crc kubenswrapper[4861]: I0309 09:23:19.735888 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3b1b7ac-3322-4520-892a-1c881ee10c50-config\") pod \"a3b1b7ac-3322-4520-892a-1c881ee10c50\" (UID: \"a3b1b7ac-3322-4520-892a-1c881ee10c50\") " Mar 09 09:23:19 crc kubenswrapper[4861]: I0309 09:23:19.736462 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aedcdae4-7043-4e90-a4cd-b8d58ceeafb3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "aedcdae4-7043-4e90-a4cd-b8d58ceeafb3" (UID: "aedcdae4-7043-4e90-a4cd-b8d58ceeafb3"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:23:19 crc kubenswrapper[4861]: I0309 09:23:19.736483 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aedcdae4-7043-4e90-a4cd-b8d58ceeafb3-config" (OuterVolumeSpecName: "config") pod "aedcdae4-7043-4e90-a4cd-b8d58ceeafb3" (UID: "aedcdae4-7043-4e90-a4cd-b8d58ceeafb3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:23:19 crc kubenswrapper[4861]: I0309 09:23:19.736480 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3b1b7ac-3322-4520-892a-1c881ee10c50-config" (OuterVolumeSpecName: "config") pod "a3b1b7ac-3322-4520-892a-1c881ee10c50" (UID: "a3b1b7ac-3322-4520-892a-1c881ee10c50"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:23:19 crc kubenswrapper[4861]: I0309 09:23:19.740394 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3b1b7ac-3322-4520-892a-1c881ee10c50-kube-api-access-st6tz" (OuterVolumeSpecName: "kube-api-access-st6tz") pod "a3b1b7ac-3322-4520-892a-1c881ee10c50" (UID: "a3b1b7ac-3322-4520-892a-1c881ee10c50"). InnerVolumeSpecName "kube-api-access-st6tz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:23:19 crc kubenswrapper[4861]: I0309 09:23:19.740999 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aedcdae4-7043-4e90-a4cd-b8d58ceeafb3-kube-api-access-dhqxr" (OuterVolumeSpecName: "kube-api-access-dhqxr") pod "aedcdae4-7043-4e90-a4cd-b8d58ceeafb3" (UID: "aedcdae4-7043-4e90-a4cd-b8d58ceeafb3"). InnerVolumeSpecName "kube-api-access-dhqxr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:23:19 crc kubenswrapper[4861]: I0309 09:23:19.843906 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-st6tz\" (UniqueName: \"kubernetes.io/projected/a3b1b7ac-3322-4520-892a-1c881ee10c50-kube-api-access-st6tz\") on node \"crc\" DevicePath \"\"" Mar 09 09:23:19 crc kubenswrapper[4861]: I0309 09:23:19.844018 4861 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aedcdae4-7043-4e90-a4cd-b8d58ceeafb3-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 09 09:23:19 crc kubenswrapper[4861]: I0309 09:23:19.844061 4861 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aedcdae4-7043-4e90-a4cd-b8d58ceeafb3-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:23:19 crc kubenswrapper[4861]: I0309 09:23:19.844073 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dhqxr\" (UniqueName: 
\"kubernetes.io/projected/aedcdae4-7043-4e90-a4cd-b8d58ceeafb3-kube-api-access-dhqxr\") on node \"crc\" DevicePath \"\"" Mar 09 09:23:19 crc kubenswrapper[4861]: I0309 09:23:19.844085 4861 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3b1b7ac-3322-4520-892a-1c881ee10c50-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:23:20 crc kubenswrapper[4861]: I0309 09:23:20.019294 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-589db6c89c-wzkz2" event={"ID":"a3b1b7ac-3322-4520-892a-1c881ee10c50","Type":"ContainerDied","Data":"dc28abf7115da75e366ee5a259691d0fe38b9f2f268c369846e5cce9e127694a"} Mar 09 09:23:20 crc kubenswrapper[4861]: I0309 09:23:20.019334 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-589db6c89c-wzkz2" Mar 09 09:23:20 crc kubenswrapper[4861]: I0309 09:23:20.022513 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86bbd886cf-47h2z" event={"ID":"aedcdae4-7043-4e90-a4cd-b8d58ceeafb3","Type":"ContainerDied","Data":"88a876c7e47434ebc46b1506439ebc3eab638a39c0f407cfbd77bda53dddbf00"} Mar 09 09:23:20 crc kubenswrapper[4861]: I0309 09:23:20.022600 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86bbd886cf-47h2z" Mar 09 09:23:20 crc kubenswrapper[4861]: I0309 09:23:20.025456 4861 generic.go:334] "Generic (PLEG): container finished" podID="d4847dbd-e086-4995-b488-c7611173b6e8" containerID="6ddbfec8faee2cdbe27eb2e4243165e64f47f0d6cb0889ca6757e48b1a09f1d2" exitCode=0 Mar 09 09:23:20 crc kubenswrapper[4861]: I0309 09:23:20.026217 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c47bcb9f9-zbnlj" event={"ID":"d4847dbd-e086-4995-b488-c7611173b6e8","Type":"ContainerDied","Data":"6ddbfec8faee2cdbe27eb2e4243165e64f47f0d6cb0889ca6757e48b1a09f1d2"} Mar 09 09:23:20 crc kubenswrapper[4861]: I0309 09:23:20.029925 4861 generic.go:334] "Generic (PLEG): container finished" podID="eba69073-9a91-4b38-a2d1-2880e6b6882f" containerID="402c852b03159020990d5dd416f13c035f2d92f977112589f886aed9289e2b2f" exitCode=0 Mar 09 09:23:20 crc kubenswrapper[4861]: I0309 09:23:20.030009 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78cb4465c9-6bxvk" event={"ID":"eba69073-9a91-4b38-a2d1-2880e6b6882f","Type":"ContainerDied","Data":"402c852b03159020990d5dd416f13c035f2d92f977112589f886aed9289e2b2f"} Mar 09 09:23:20 crc kubenswrapper[4861]: I0309 09:23:20.140882 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86bbd886cf-47h2z"] Mar 09 09:23:20 crc kubenswrapper[4861]: I0309 09:23:20.147792 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86bbd886cf-47h2z"] Mar 09 09:23:20 crc kubenswrapper[4861]: I0309 09:23:20.175035 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 09 09:23:20 crc kubenswrapper[4861]: I0309 09:23:20.187020 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-s7nq5"] Mar 09 09:23:20 crc kubenswrapper[4861]: I0309 09:23:20.197791 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 09 
09:23:20 crc kubenswrapper[4861]: I0309 09:23:20.205509 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 09 09:23:20 crc kubenswrapper[4861]: I0309 09:23:20.215015 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 09 09:23:20 crc kubenswrapper[4861]: W0309 09:23:20.241246 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4e354b06_2ae2_41af_b5d7_2909bca8cff6.slice/crio-c0e585adcb8dce9c5ba3fb2326c73db93c834b8baba348a619970a02fe4e621d WatchSource:0}: Error finding container c0e585adcb8dce9c5ba3fb2326c73db93c834b8baba348a619970a02fe4e621d: Status 404 returned error can't find the container with id c0e585adcb8dce9c5ba3fb2326c73db93c834b8baba348a619970a02fe4e621d Mar 09 09:23:20 crc kubenswrapper[4861]: I0309 09:23:20.254723 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 09 09:23:20 crc kubenswrapper[4861]: I0309 09:23:20.263035 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 09 09:23:20 crc kubenswrapper[4861]: I0309 09:23:20.303221 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-589db6c89c-wzkz2"] Mar 09 09:23:20 crc kubenswrapper[4861]: I0309 09:23:20.311722 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-589db6c89c-wzkz2"] Mar 09 09:23:20 crc kubenswrapper[4861]: I0309 09:23:20.321583 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 09 09:23:20 crc kubenswrapper[4861]: I0309 09:23:20.395431 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-hmb5w"] Mar 09 09:23:20 crc kubenswrapper[4861]: W0309 09:23:20.403777 4861 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcafa7cbf_ac96_4bb5_a33e_90d69df5d797.slice/crio-f753741c4f5c506b3e004ff252dfa977da62d8799fd58de8d39b095c9745a684 WatchSource:0}: Error finding container f753741c4f5c506b3e004ff252dfa977da62d8799fd58de8d39b095c9745a684: Status 404 returned error can't find the container with id f753741c4f5c506b3e004ff252dfa977da62d8799fd58de8d39b095c9745a684 Mar 09 09:23:21 crc kubenswrapper[4861]: I0309 09:23:21.039606 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"03452acf-c21f-4d68-a813-772c30604a60","Type":"ContainerStarted","Data":"55cc8a611264a1d0d865a6150bf4e505782ad035c68ffdad302f5ad95fd38caf"} Mar 09 09:23:21 crc kubenswrapper[4861]: I0309 09:23:21.042478 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78cb4465c9-6bxvk" event={"ID":"eba69073-9a91-4b38-a2d1-2880e6b6882f","Type":"ContainerStarted","Data":"c993abb3ee0966f09548c9a4ccdcbb056dcc7bb5eb5f40057033ac0a88b61cfa"} Mar 09 09:23:21 crc kubenswrapper[4861]: I0309 09:23:21.043446 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-78cb4465c9-6bxvk" Mar 09 09:23:21 crc kubenswrapper[4861]: I0309 09:23:21.046919 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c47bcb9f9-zbnlj" event={"ID":"d4847dbd-e086-4995-b488-c7611173b6e8","Type":"ContainerStarted","Data":"c4af947936706a88621a7b2d5ec3639ed298559375ad0f3c182cb888cf1d8117"} Mar 09 09:23:21 crc kubenswrapper[4861]: I0309 09:23:21.047438 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7c47bcb9f9-zbnlj" Mar 09 09:23:21 crc kubenswrapper[4861]: I0309 09:23:21.048658 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" 
event={"ID":"8f114ae0-31de-4f17-9bad-1bc0b895d006","Type":"ContainerStarted","Data":"2cd6331216667402a7eb60f0f498e5b4faa400327cae2822314b04d94b25a544"} Mar 09 09:23:21 crc kubenswrapper[4861]: I0309 09:23:21.050000 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"a3671a10-52be-44e3-9c3d-11ba14e8e449","Type":"ContainerStarted","Data":"dcc61e70fce6cd33c97e23f423e9e819a077e83e28f39d5dbcb4a52ccdbeb408"} Mar 09 09:23:21 crc kubenswrapper[4861]: I0309 09:23:21.051204 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"b9b83355-ea40-4408-9b77-c717df91e1a9","Type":"ContainerStarted","Data":"aa8b234a3c997c263789e227094be28f7940c63c80e3a887eaa0d8442315f90d"} Mar 09 09:23:21 crc kubenswrapper[4861]: I0309 09:23:21.052501 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-hmb5w" event={"ID":"cafa7cbf-ac96-4bb5-a33e-90d69df5d797","Type":"ContainerStarted","Data":"f753741c4f5c506b3e004ff252dfa977da62d8799fd58de8d39b095c9745a684"} Mar 09 09:23:21 crc kubenswrapper[4861]: I0309 09:23:21.056772 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"1d597158-3a33-4518-a0b9-37cf5b309a28","Type":"ContainerStarted","Data":"5e9d34eac5f749ac91a1662ff53b76d7a37f58ab21ab46dfc885fc4780fbb7a5"} Mar 09 09:23:21 crc kubenswrapper[4861]: I0309 09:23:21.061027 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-s7nq5" event={"ID":"4e354b06-2ae2-41af-b5d7-2909bca8cff6","Type":"ContainerStarted","Data":"c0e585adcb8dce9c5ba3fb2326c73db93c834b8baba348a619970a02fe4e621d"} Mar 09 09:23:21 crc kubenswrapper[4861]: I0309 09:23:21.062790 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-78cb4465c9-6bxvk" podStartSLOduration=3.660527053 podStartE2EDuration="18.062777984s" podCreationTimestamp="2026-03-09 09:23:03 +0000 UTC" 
firstStartedPulling="2026-03-09 09:23:04.658637365 +0000 UTC m=+1027.743676766" lastFinishedPulling="2026-03-09 09:23:19.060888286 +0000 UTC m=+1042.145927697" observedRunningTime="2026-03-09 09:23:21.058722513 +0000 UTC m=+1044.143761914" watchObservedRunningTime="2026-03-09 09:23:21.062777984 +0000 UTC m=+1044.147817385" Mar 09 09:23:21 crc kubenswrapper[4861]: I0309 09:23:21.064158 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"f095ca7b-1959-4cda-bde8-40ca6446e34d","Type":"ContainerStarted","Data":"2ac88f41b13dcd66db618e54e1b7a0517a285f180006948fd9953aae67312281"} Mar 09 09:23:21 crc kubenswrapper[4861]: I0309 09:23:21.065534 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"0ab732e3-1122-4f45-a9af-b36eaa88c19e","Type":"ContainerStarted","Data":"a8c9c86666b805adc00881ac9cf4f014e28b2173a30dc8653163a7a2534422d3"} Mar 09 09:23:21 crc kubenswrapper[4861]: I0309 09:23:21.076212 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7c47bcb9f9-zbnlj" podStartSLOduration=5.37269651 podStartE2EDuration="17.076196104s" podCreationTimestamp="2026-03-09 09:23:04 +0000 UTC" firstStartedPulling="2026-03-09 09:23:07.452962153 +0000 UTC m=+1030.538001554" lastFinishedPulling="2026-03-09 09:23:19.156461747 +0000 UTC m=+1042.241501148" observedRunningTime="2026-03-09 09:23:21.075500813 +0000 UTC m=+1044.160540214" watchObservedRunningTime="2026-03-09 09:23:21.076196104 +0000 UTC m=+1044.161235505" Mar 09 09:23:21 crc kubenswrapper[4861]: I0309 09:23:21.127749 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 09 09:23:21 crc kubenswrapper[4861]: W0309 09:23:21.145080 4861 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfc895133_add5_4388_8e97_1b0d16306648.slice/crio-c53b78011276c5f5020a9d56e155b4a25d6c74270c5dee43b7b84206b2effb7a WatchSource:0}: Error finding container c53b78011276c5f5020a9d56e155b4a25d6c74270c5dee43b7b84206b2effb7a: Status 404 returned error can't find the container with id c53b78011276c5f5020a9d56e155b4a25d6c74270c5dee43b7b84206b2effb7a Mar 09 09:23:21 crc kubenswrapper[4861]: I0309 09:23:21.687235 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3b1b7ac-3322-4520-892a-1c881ee10c50" path="/var/lib/kubelet/pods/a3b1b7ac-3322-4520-892a-1c881ee10c50/volumes" Mar 09 09:23:21 crc kubenswrapper[4861]: I0309 09:23:21.688258 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aedcdae4-7043-4e90-a4cd-b8d58ceeafb3" path="/var/lib/kubelet/pods/aedcdae4-7043-4e90-a4cd-b8d58ceeafb3/volumes" Mar 09 09:23:22 crc kubenswrapper[4861]: I0309 09:23:22.072073 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"fc895133-add5-4388-8e97-1b0d16306648","Type":"ContainerStarted","Data":"c53b78011276c5f5020a9d56e155b4a25d6c74270c5dee43b7b84206b2effb7a"} Mar 09 09:23:28 crc kubenswrapper[4861]: I0309 09:23:28.126842 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"a3671a10-52be-44e3-9c3d-11ba14e8e449","Type":"ContainerStarted","Data":"2975d09b04bd5c9f69cd314b121706a09d8421c8665d88f7289fc5fb6519738d"} Mar 09 09:23:28 crc kubenswrapper[4861]: I0309 09:23:28.127656 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Mar 09 09:23:28 crc kubenswrapper[4861]: I0309 09:23:28.147514 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=13.194727319 podStartE2EDuration="20.147499033s" podCreationTimestamp="2026-03-09 09:23:08 +0000 UTC" firstStartedPulling="2026-03-09 
09:23:20.160935511 +0000 UTC m=+1043.245974912" lastFinishedPulling="2026-03-09 09:23:27.113707225 +0000 UTC m=+1050.198746626" observedRunningTime="2026-03-09 09:23:28.142026899 +0000 UTC m=+1051.227066320" watchObservedRunningTime="2026-03-09 09:23:28.147499033 +0000 UTC m=+1051.232538434" Mar 09 09:23:29 crc kubenswrapper[4861]: I0309 09:23:29.099586 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-78cb4465c9-6bxvk" Mar 09 09:23:29 crc kubenswrapper[4861]: I0309 09:23:29.141650 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"8f114ae0-31de-4f17-9bad-1bc0b895d006","Type":"ContainerStarted","Data":"786aa50f7103d9a4ad3e60d0286ae85151dd2f5fb70f4204d1465f5132daf446"} Mar 09 09:23:29 crc kubenswrapper[4861]: I0309 09:23:29.142254 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 09 09:23:29 crc kubenswrapper[4861]: I0309 09:23:29.152522 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-hmb5w" event={"ID":"cafa7cbf-ac96-4bb5-a33e-90d69df5d797","Type":"ContainerStarted","Data":"6f2db70212be1dc1e9795a0d7f535b5a4bcea5b2ca97f98dd6baa2173f5fb4ed"} Mar 09 09:23:29 crc kubenswrapper[4861]: I0309 09:23:29.162524 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"1d597158-3a33-4518-a0b9-37cf5b309a28","Type":"ContainerStarted","Data":"eb65d940429679b23ea53b6ab363d0a33219f1b14b445812e723f802d914559a"} Mar 09 09:23:29 crc kubenswrapper[4861]: I0309 09:23:29.164650 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=11.445477126 podStartE2EDuration="19.164627303s" podCreationTimestamp="2026-03-09 09:23:10 +0000 UTC" firstStartedPulling="2026-03-09 09:23:20.248308436 +0000 UTC m=+1043.333347837" lastFinishedPulling="2026-03-09 09:23:27.967458613 +0000 
UTC m=+1051.052498014" observedRunningTime="2026-03-09 09:23:29.157746339 +0000 UTC m=+1052.242785740" watchObservedRunningTime="2026-03-09 09:23:29.164627303 +0000 UTC m=+1052.249666704" Mar 09 09:23:29 crc kubenswrapper[4861]: I0309 09:23:29.168008 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-s7nq5" event={"ID":"4e354b06-2ae2-41af-b5d7-2909bca8cff6","Type":"ContainerStarted","Data":"a6eb17a021bb7d3977462605d98f78a703b592a765acfa03d8ab8213d12a59b4"} Mar 09 09:23:29 crc kubenswrapper[4861]: I0309 09:23:29.168139 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-s7nq5" Mar 09 09:23:29 crc kubenswrapper[4861]: I0309 09:23:29.169758 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"fc895133-add5-4388-8e97-1b0d16306648","Type":"ContainerStarted","Data":"f918c74ecae8f725a57e78a3dba72ca743c381af306a3b52415eac1b4165c237"} Mar 09 09:23:29 crc kubenswrapper[4861]: I0309 09:23:29.173441 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"f095ca7b-1959-4cda-bde8-40ca6446e34d","Type":"ContainerStarted","Data":"5355f4f86bb8a2c94ddd09331b4109dcd3a400a1e17d1c951ec47a1d786542ec"} Mar 09 09:23:29 crc kubenswrapper[4861]: I0309 09:23:29.186467 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"0ab732e3-1122-4f45-a9af-b36eaa88c19e","Type":"ContainerStarted","Data":"c8b4ef503d4a8edf179b00c721a7afafb6d7d3d7e42df7e3e52d63bc1ce15be8"} Mar 09 09:23:29 crc kubenswrapper[4861]: I0309 09:23:29.206330 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-s7nq5" podStartSLOduration=8.936859668 podStartE2EDuration="16.206309136s" podCreationTimestamp="2026-03-09 09:23:13 +0000 UTC" firstStartedPulling="2026-03-09 09:23:20.247023257 +0000 UTC m=+1043.332062658" lastFinishedPulling="2026-03-09 
09:23:27.516472725 +0000 UTC m=+1050.601512126" observedRunningTime="2026-03-09 09:23:29.202269036 +0000 UTC m=+1052.287308457" watchObservedRunningTime="2026-03-09 09:23:29.206309136 +0000 UTC m=+1052.291348537" Mar 09 09:23:29 crc kubenswrapper[4861]: I0309 09:23:29.408531 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7c47bcb9f9-zbnlj" Mar 09 09:23:29 crc kubenswrapper[4861]: I0309 09:23:29.463715 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78cb4465c9-6bxvk"] Mar 09 09:23:29 crc kubenswrapper[4861]: I0309 09:23:29.463941 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-78cb4465c9-6bxvk" podUID="eba69073-9a91-4b38-a2d1-2880e6b6882f" containerName="dnsmasq-dns" containerID="cri-o://c993abb3ee0966f09548c9a4ccdcbb056dcc7bb5eb5f40057033ac0a88b61cfa" gracePeriod=10 Mar 09 09:23:30 crc kubenswrapper[4861]: I0309 09:23:30.024728 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78cb4465c9-6bxvk"
Mar 09 09:23:30 crc kubenswrapper[4861]: I0309 09:23:30.119209 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eba69073-9a91-4b38-a2d1-2880e6b6882f-config\") pod \"eba69073-9a91-4b38-a2d1-2880e6b6882f\" (UID: \"eba69073-9a91-4b38-a2d1-2880e6b6882f\") "
Mar 09 09:23:30 crc kubenswrapper[4861]: I0309 09:23:30.119612 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5xbl8\" (UniqueName: \"kubernetes.io/projected/eba69073-9a91-4b38-a2d1-2880e6b6882f-kube-api-access-5xbl8\") pod \"eba69073-9a91-4b38-a2d1-2880e6b6882f\" (UID: \"eba69073-9a91-4b38-a2d1-2880e6b6882f\") "
Mar 09 09:23:30 crc kubenswrapper[4861]: I0309 09:23:30.119839 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eba69073-9a91-4b38-a2d1-2880e6b6882f-dns-svc\") pod \"eba69073-9a91-4b38-a2d1-2880e6b6882f\" (UID: \"eba69073-9a91-4b38-a2d1-2880e6b6882f\") "
Mar 09 09:23:30 crc kubenswrapper[4861]: I0309 09:23:30.130629 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eba69073-9a91-4b38-a2d1-2880e6b6882f-kube-api-access-5xbl8" (OuterVolumeSpecName: "kube-api-access-5xbl8") pod "eba69073-9a91-4b38-a2d1-2880e6b6882f" (UID: "eba69073-9a91-4b38-a2d1-2880e6b6882f"). InnerVolumeSpecName "kube-api-access-5xbl8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:23:30 crc kubenswrapper[4861]: I0309 09:23:30.170687 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eba69073-9a91-4b38-a2d1-2880e6b6882f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "eba69073-9a91-4b38-a2d1-2880e6b6882f" (UID: "eba69073-9a91-4b38-a2d1-2880e6b6882f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:23:30 crc kubenswrapper[4861]: I0309 09:23:30.173062 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eba69073-9a91-4b38-a2d1-2880e6b6882f-config" (OuterVolumeSpecName: "config") pod "eba69073-9a91-4b38-a2d1-2880e6b6882f" (UID: "eba69073-9a91-4b38-a2d1-2880e6b6882f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:23:30 crc kubenswrapper[4861]: I0309 09:23:30.201467 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"b9b83355-ea40-4408-9b77-c717df91e1a9","Type":"ContainerStarted","Data":"31586b8681c909460b5f41e700f0e45d5675654a1c0fc223423f5e764a90eb87"}
Mar 09 09:23:30 crc kubenswrapper[4861]: I0309 09:23:30.211295 4861 generic.go:334] "Generic (PLEG): container finished" podID="cafa7cbf-ac96-4bb5-a33e-90d69df5d797" containerID="6f2db70212be1dc1e9795a0d7f535b5a4bcea5b2ca97f98dd6baa2173f5fb4ed" exitCode=0
Mar 09 09:23:30 crc kubenswrapper[4861]: I0309 09:23:30.211663 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-hmb5w" event={"ID":"cafa7cbf-ac96-4bb5-a33e-90d69df5d797","Type":"ContainerDied","Data":"6f2db70212be1dc1e9795a0d7f535b5a4bcea5b2ca97f98dd6baa2173f5fb4ed"}
Mar 09 09:23:30 crc kubenswrapper[4861]: I0309 09:23:30.214575 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"03452acf-c21f-4d68-a813-772c30604a60","Type":"ContainerStarted","Data":"a9f20eea0867df7623d564d9ee258fb8c480222ebfae15bc43c9f672e31b8bfc"}
Mar 09 09:23:30 crc kubenswrapper[4861]: I0309 09:23:30.217715 4861 generic.go:334] "Generic (PLEG): container finished" podID="eba69073-9a91-4b38-a2d1-2880e6b6882f" containerID="c993abb3ee0966f09548c9a4ccdcbb056dcc7bb5eb5f40057033ac0a88b61cfa" exitCode=0
Mar 09 09:23:30 crc kubenswrapper[4861]: I0309 09:23:30.217747 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78cb4465c9-6bxvk" event={"ID":"eba69073-9a91-4b38-a2d1-2880e6b6882f","Type":"ContainerDied","Data":"c993abb3ee0966f09548c9a4ccdcbb056dcc7bb5eb5f40057033ac0a88b61cfa"}
Mar 09 09:23:30 crc kubenswrapper[4861]: I0309 09:23:30.217785 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78cb4465c9-6bxvk" event={"ID":"eba69073-9a91-4b38-a2d1-2880e6b6882f","Type":"ContainerDied","Data":"4dd8779e44cab1535a198761db1d7878ebfc9dd40ef53862ee5e0f2d02f7ef1f"}
Mar 09 09:23:30 crc kubenswrapper[4861]: I0309 09:23:30.217805 4861 scope.go:117] "RemoveContainer" containerID="c993abb3ee0966f09548c9a4ccdcbb056dcc7bb5eb5f40057033ac0a88b61cfa"
Mar 09 09:23:30 crc kubenswrapper[4861]: I0309 09:23:30.217807 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78cb4465c9-6bxvk"
Mar 09 09:23:30 crc kubenswrapper[4861]: I0309 09:23:30.221179 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5xbl8\" (UniqueName: \"kubernetes.io/projected/eba69073-9a91-4b38-a2d1-2880e6b6882f-kube-api-access-5xbl8\") on node \"crc\" DevicePath \"\""
Mar 09 09:23:30 crc kubenswrapper[4861]: I0309 09:23:30.221207 4861 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eba69073-9a91-4b38-a2d1-2880e6b6882f-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 09 09:23:30 crc kubenswrapper[4861]: I0309 09:23:30.221216 4861 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eba69073-9a91-4b38-a2d1-2880e6b6882f-config\") on node \"crc\" DevicePath \"\""
Mar 09 09:23:30 crc kubenswrapper[4861]: I0309 09:23:30.247821 4861 scope.go:117] "RemoveContainer" containerID="402c852b03159020990d5dd416f13c035f2d92f977112589f886aed9289e2b2f"
Mar 09 09:23:30 crc kubenswrapper[4861]: I0309 09:23:30.291839 4861 scope.go:117] "RemoveContainer" containerID="c993abb3ee0966f09548c9a4ccdcbb056dcc7bb5eb5f40057033ac0a88b61cfa"
Mar 09 09:23:30 crc kubenswrapper[4861]: I0309 09:23:30.292313 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78cb4465c9-6bxvk"]
Mar 09 09:23:30 crc kubenswrapper[4861]: E0309 09:23:30.292356 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c993abb3ee0966f09548c9a4ccdcbb056dcc7bb5eb5f40057033ac0a88b61cfa\": container with ID starting with c993abb3ee0966f09548c9a4ccdcbb056dcc7bb5eb5f40057033ac0a88b61cfa not found: ID does not exist" containerID="c993abb3ee0966f09548c9a4ccdcbb056dcc7bb5eb5f40057033ac0a88b61cfa"
Mar 09 09:23:30 crc kubenswrapper[4861]: I0309 09:23:30.292423 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c993abb3ee0966f09548c9a4ccdcbb056dcc7bb5eb5f40057033ac0a88b61cfa"} err="failed to get container status \"c993abb3ee0966f09548c9a4ccdcbb056dcc7bb5eb5f40057033ac0a88b61cfa\": rpc error: code = NotFound desc = could not find container \"c993abb3ee0966f09548c9a4ccdcbb056dcc7bb5eb5f40057033ac0a88b61cfa\": container with ID starting with c993abb3ee0966f09548c9a4ccdcbb056dcc7bb5eb5f40057033ac0a88b61cfa not found: ID does not exist"
Mar 09 09:23:30 crc kubenswrapper[4861]: I0309 09:23:30.292449 4861 scope.go:117] "RemoveContainer" containerID="402c852b03159020990d5dd416f13c035f2d92f977112589f886aed9289e2b2f"
Mar 09 09:23:30 crc kubenswrapper[4861]: E0309 09:23:30.292985 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"402c852b03159020990d5dd416f13c035f2d92f977112589f886aed9289e2b2f\": container with ID starting with 402c852b03159020990d5dd416f13c035f2d92f977112589f886aed9289e2b2f not found: ID does not exist" containerID="402c852b03159020990d5dd416f13c035f2d92f977112589f886aed9289e2b2f"
Mar 09 09:23:30 crc kubenswrapper[4861]: I0309 09:23:30.293019 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"402c852b03159020990d5dd416f13c035f2d92f977112589f886aed9289e2b2f"} err="failed to get container status \"402c852b03159020990d5dd416f13c035f2d92f977112589f886aed9289e2b2f\": rpc error: code = NotFound desc = could not find container \"402c852b03159020990d5dd416f13c035f2d92f977112589f886aed9289e2b2f\": container with ID starting with 402c852b03159020990d5dd416f13c035f2d92f977112589f886aed9289e2b2f not found: ID does not exist"
Mar 09 09:23:30 crc kubenswrapper[4861]: I0309 09:23:30.300575 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78cb4465c9-6bxvk"]
Mar 09 09:23:31 crc kubenswrapper[4861]: I0309 09:23:31.226568 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-hmb5w" event={"ID":"cafa7cbf-ac96-4bb5-a33e-90d69df5d797","Type":"ContainerStarted","Data":"cded6d55ea8f0357ed9d8568ce7be69a9274560518513e7df836dc7d11899943"}
Mar 09 09:23:31 crc kubenswrapper[4861]: I0309 09:23:31.227134 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-hmb5w" event={"ID":"cafa7cbf-ac96-4bb5-a33e-90d69df5d797","Type":"ContainerStarted","Data":"daf78efd69028adba2d4bafdf7242240e59bb89d9c7fda8af9885d73e2587220"}
Mar 09 09:23:31 crc kubenswrapper[4861]: I0309 09:23:31.227155 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-hmb5w"
Mar 09 09:23:31 crc kubenswrapper[4861]: I0309 09:23:31.227168 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-hmb5w"
Mar 09 09:23:31 crc kubenswrapper[4861]: I0309 09:23:31.247812 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-hmb5w" podStartSLOduration=11.34801146 podStartE2EDuration="18.247795814s" podCreationTimestamp="2026-03-09 09:23:13 +0000 UTC" firstStartedPulling="2026-03-09 09:23:20.406293467 +0000 UTC m=+1043.491332868" lastFinishedPulling="2026-03-09 09:23:27.306077821 +0000 UTC m=+1050.391117222" observedRunningTime="2026-03-09 09:23:31.245488546 +0000 UTC m=+1054.330527957" watchObservedRunningTime="2026-03-09 09:23:31.247795814 +0000 UTC m=+1054.332835215"
Mar 09 09:23:31 crc kubenswrapper[4861]: I0309 09:23:31.668000 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eba69073-9a91-4b38-a2d1-2880e6b6882f" path="/var/lib/kubelet/pods/eba69073-9a91-4b38-a2d1-2880e6b6882f/volumes"
Mar 09 09:23:33 crc kubenswrapper[4861]: I0309 09:23:33.243992 4861 generic.go:334] "Generic (PLEG): container finished" podID="f095ca7b-1959-4cda-bde8-40ca6446e34d" containerID="5355f4f86bb8a2c94ddd09331b4109dcd3a400a1e17d1c951ec47a1d786542ec" exitCode=0
Mar 09 09:23:33 crc kubenswrapper[4861]: I0309 09:23:33.244176 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"f095ca7b-1959-4cda-bde8-40ca6446e34d","Type":"ContainerDied","Data":"5355f4f86bb8a2c94ddd09331b4109dcd3a400a1e17d1c951ec47a1d786542ec"}
Mar 09 09:23:33 crc kubenswrapper[4861]: I0309 09:23:33.246690 4861 generic.go:334] "Generic (PLEG): container finished" podID="0ab732e3-1122-4f45-a9af-b36eaa88c19e" containerID="c8b4ef503d4a8edf179b00c721a7afafb6d7d3d7e42df7e3e52d63bc1ce15be8" exitCode=0
Mar 09 09:23:33 crc kubenswrapper[4861]: I0309 09:23:33.246819 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"0ab732e3-1122-4f45-a9af-b36eaa88c19e","Type":"ContainerDied","Data":"c8b4ef503d4a8edf179b00c721a7afafb6d7d3d7e42df7e3e52d63bc1ce15be8"}
Mar 09 09:23:33 crc kubenswrapper[4861]: I0309 09:23:33.249846 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"1d597158-3a33-4518-a0b9-37cf5b309a28","Type":"ContainerStarted","Data":"a065792ed58fa06207dbdce3f558c00a66ecfbfd980396d28086b045f9ff3e96"}
Mar 09 09:23:33 crc kubenswrapper[4861]: I0309 09:23:33.274858 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"fc895133-add5-4388-8e97-1b0d16306648","Type":"ContainerStarted","Data":"3530040610eb72dd7099e522b1a6abbf54570a0e04b9f8c4fce9716e46923074"}
Mar 09 09:23:33 crc kubenswrapper[4861]: I0309 09:23:33.335268 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=8.070316839 podStartE2EDuration="20.335243823s" podCreationTimestamp="2026-03-09 09:23:13 +0000 UTC" firstStartedPulling="2026-03-09 09:23:20.32891706 +0000 UTC m=+1043.413956461" lastFinishedPulling="2026-03-09 09:23:32.593844044 +0000 UTC m=+1055.678883445" observedRunningTime="2026-03-09 09:23:33.332773919 +0000 UTC m=+1056.417813320" watchObservedRunningTime="2026-03-09 09:23:33.335243823 +0000 UTC m=+1056.420283224"
Mar 09 09:23:33 crc kubenswrapper[4861]: I0309 09:23:33.369625 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=5.936267842 podStartE2EDuration="17.369605348s" podCreationTimestamp="2026-03-09 09:23:16 +0000 UTC" firstStartedPulling="2026-03-09 09:23:21.147065477 +0000 UTC m=+1044.232104878" lastFinishedPulling="2026-03-09 09:23:32.580402983 +0000 UTC m=+1055.665442384" observedRunningTime="2026-03-09 09:23:33.353540518 +0000 UTC m=+1056.438579929" watchObservedRunningTime="2026-03-09 09:23:33.369605348 +0000 UTC m=+1056.454644759"
Mar 09 09:23:33 crc kubenswrapper[4861]: I0309 09:23:33.375537 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0"
Mar 09 09:23:34 crc kubenswrapper[4861]: I0309 09:23:34.283246 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"f095ca7b-1959-4cda-bde8-40ca6446e34d","Type":"ContainerStarted","Data":"6deebb651b98f783c48110999f9f926730995476e863c1618d5b1718e4573c58"}
Mar 09 09:23:34 crc kubenswrapper[4861]: I0309 09:23:34.285296 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"0ab732e3-1122-4f45-a9af-b36eaa88c19e","Type":"ContainerStarted","Data":"cee746123b6f12cd37c2f04b711cef1e6255d334859930c476800cf3b89c1282"}
Mar 09 09:23:34 crc kubenswrapper[4861]: I0309 09:23:34.308721 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=21.177353033 podStartE2EDuration="28.308704492s" podCreationTimestamp="2026-03-09 09:23:06 +0000 UTC" firstStartedPulling="2026-03-09 09:23:20.263519199 +0000 UTC m=+1043.348558600" lastFinishedPulling="2026-03-09 09:23:27.394870658 +0000 UTC m=+1050.479910059" observedRunningTime="2026-03-09 09:23:34.300092695 +0000 UTC m=+1057.385132106" watchObservedRunningTime="2026-03-09 09:23:34.308704492 +0000 UTC m=+1057.393743883"
Mar 09 09:23:34 crc kubenswrapper[4861]: I0309 09:23:34.331629 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=21.697474472 podStartE2EDuration="29.331610405s" podCreationTimestamp="2026-03-09 09:23:05 +0000 UTC" firstStartedPulling="2026-03-09 09:23:20.225806065 +0000 UTC m=+1043.310845466" lastFinishedPulling="2026-03-09 09:23:27.859941988 +0000 UTC m=+1050.944981399" observedRunningTime="2026-03-09 09:23:34.326652647 +0000 UTC m=+1057.411692068" watchObservedRunningTime="2026-03-09 09:23:34.331610405 +0000 UTC m=+1057.416649806"
Mar 09 09:23:34 crc kubenswrapper[4861]: I0309 09:23:34.660958 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0"
Mar 09 09:23:35 crc kubenswrapper[4861]: I0309 09:23:35.569064 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0"
Mar 09 09:23:35 crc kubenswrapper[4861]: I0309 09:23:35.645579 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0"
Mar 09 09:23:35 crc kubenswrapper[4861]: I0309 09:23:35.676749 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0"
Mar 09 09:23:35 crc kubenswrapper[4861]: I0309 09:23:35.711159 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0"
Mar 09 09:23:36 crc kubenswrapper[4861]: I0309 09:23:36.298509 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0"
Mar 09 09:23:36 crc kubenswrapper[4861]: I0309 09:23:36.339789 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0"
Mar 09 09:23:36 crc kubenswrapper[4861]: I0309 09:23:36.340438 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0"
Mar 09 09:23:36 crc kubenswrapper[4861]: I0309 09:23:36.596934 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6444958b7f-68zjz"]
Mar 09 09:23:36 crc kubenswrapper[4861]: E0309 09:23:36.597307 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eba69073-9a91-4b38-a2d1-2880e6b6882f" containerName="init"
Mar 09 09:23:36 crc kubenswrapper[4861]: I0309 09:23:36.597326 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="eba69073-9a91-4b38-a2d1-2880e6b6882f" containerName="init"
Mar 09 09:23:36 crc kubenswrapper[4861]: E0309 09:23:36.597356 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eba69073-9a91-4b38-a2d1-2880e6b6882f" containerName="dnsmasq-dns"
Mar 09 09:23:36 crc kubenswrapper[4861]: I0309 09:23:36.597364 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="eba69073-9a91-4b38-a2d1-2880e6b6882f" containerName="dnsmasq-dns"
Mar 09 09:23:36 crc kubenswrapper[4861]: I0309 09:23:36.597555 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="eba69073-9a91-4b38-a2d1-2880e6b6882f" containerName="dnsmasq-dns"
Mar 09 09:23:36 crc kubenswrapper[4861]: I0309 09:23:36.598604 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6444958b7f-68zjz"
Mar 09 09:23:36 crc kubenswrapper[4861]: I0309 09:23:36.601747 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb"
Mar 09 09:23:36 crc kubenswrapper[4861]: I0309 09:23:36.609533 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6444958b7f-68zjz"]
Mar 09 09:23:36 crc kubenswrapper[4861]: I0309 09:23:36.658110 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-xvlmj"]
Mar 09 09:23:36 crc kubenswrapper[4861]: I0309 09:23:36.659462 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-xvlmj"
Mar 09 09:23:36 crc kubenswrapper[4861]: I0309 09:23:36.661042 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d44ecacc-e8b0-4ffd-84fd-bd3e76203d47-ovsdbserver-nb\") pod \"dnsmasq-dns-6444958b7f-68zjz\" (UID: \"d44ecacc-e8b0-4ffd-84fd-bd3e76203d47\") " pod="openstack/dnsmasq-dns-6444958b7f-68zjz"
Mar 09 09:23:36 crc kubenswrapper[4861]: I0309 09:23:36.664684 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbqmb\" (UniqueName: \"kubernetes.io/projected/d44ecacc-e8b0-4ffd-84fd-bd3e76203d47-kube-api-access-sbqmb\") pod \"dnsmasq-dns-6444958b7f-68zjz\" (UID: \"d44ecacc-e8b0-4ffd-84fd-bd3e76203d47\") " pod="openstack/dnsmasq-dns-6444958b7f-68zjz"
Mar 09 09:23:36 crc kubenswrapper[4861]: I0309 09:23:36.664952 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d44ecacc-e8b0-4ffd-84fd-bd3e76203d47-config\") pod \"dnsmasq-dns-6444958b7f-68zjz\" (UID: \"d44ecacc-e8b0-4ffd-84fd-bd3e76203d47\") " pod="openstack/dnsmasq-dns-6444958b7f-68zjz"
Mar 09 09:23:36 crc kubenswrapper[4861]: I0309 09:23:36.665191 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d44ecacc-e8b0-4ffd-84fd-bd3e76203d47-dns-svc\") pod \"dnsmasq-dns-6444958b7f-68zjz\" (UID: \"d44ecacc-e8b0-4ffd-84fd-bd3e76203d47\") " pod="openstack/dnsmasq-dns-6444958b7f-68zjz"
Mar 09 09:23:36 crc kubenswrapper[4861]: I0309 09:23:36.665498 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config"
Mar 09 09:23:36 crc kubenswrapper[4861]: I0309 09:23:36.670927 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-xvlmj"]
Mar 09 09:23:36 crc kubenswrapper[4861]: I0309 09:23:36.724962 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"]
Mar 09 09:23:36 crc kubenswrapper[4861]: I0309 09:23:36.726176 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0"
Mar 09 09:23:36 crc kubenswrapper[4861]: I0309 09:23:36.729801 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts"
Mar 09 09:23:36 crc kubenswrapper[4861]: I0309 09:23:36.729844 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs"
Mar 09 09:23:36 crc kubenswrapper[4861]: I0309 09:23:36.729982 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config"
Mar 09 09:23:36 crc kubenswrapper[4861]: I0309 09:23:36.729801 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-cfxwc"
Mar 09 09:23:36 crc kubenswrapper[4861]: I0309 09:23:36.753805 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"]
Mar 09 09:23:36 crc kubenswrapper[4861]: I0309 09:23:36.766101 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82770fbe-3052-4367-9c2d-a19a11d3a695-config\") pod \"ovn-controller-metrics-xvlmj\" (UID: \"82770fbe-3052-4367-9c2d-a19a11d3a695\") " pod="openstack/ovn-controller-metrics-xvlmj"
Mar 09 09:23:36 crc kubenswrapper[4861]: I0309 09:23:36.766140 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/82770fbe-3052-4367-9c2d-a19a11d3a695-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-xvlmj\" (UID: \"82770fbe-3052-4367-9c2d-a19a11d3a695\") " pod="openstack/ovn-controller-metrics-xvlmj"
Mar 09 09:23:36 crc kubenswrapper[4861]: I0309 09:23:36.766174 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbqmb\" (UniqueName: \"kubernetes.io/projected/d44ecacc-e8b0-4ffd-84fd-bd3e76203d47-kube-api-access-sbqmb\") pod \"dnsmasq-dns-6444958b7f-68zjz\" (UID: \"d44ecacc-e8b0-4ffd-84fd-bd3e76203d47\") " pod="openstack/dnsmasq-dns-6444958b7f-68zjz"
Mar 09 09:23:36 crc kubenswrapper[4861]: I0309 09:23:36.766260 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d44ecacc-e8b0-4ffd-84fd-bd3e76203d47-config\") pod \"dnsmasq-dns-6444958b7f-68zjz\" (UID: \"d44ecacc-e8b0-4ffd-84fd-bd3e76203d47\") " pod="openstack/dnsmasq-dns-6444958b7f-68zjz"
Mar 09 09:23:36 crc kubenswrapper[4861]: I0309 09:23:36.766313 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82770fbe-3052-4367-9c2d-a19a11d3a695-combined-ca-bundle\") pod \"ovn-controller-metrics-xvlmj\" (UID: \"82770fbe-3052-4367-9c2d-a19a11d3a695\") " pod="openstack/ovn-controller-metrics-xvlmj"
Mar 09 09:23:36 crc kubenswrapper[4861]: I0309 09:23:36.766350 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/82770fbe-3052-4367-9c2d-a19a11d3a695-ovs-rundir\") pod \"ovn-controller-metrics-xvlmj\" (UID: \"82770fbe-3052-4367-9c2d-a19a11d3a695\") " pod="openstack/ovn-controller-metrics-xvlmj"
Mar 09 09:23:36 crc kubenswrapper[4861]: I0309 09:23:36.769000 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d44ecacc-e8b0-4ffd-84fd-bd3e76203d47-dns-svc\") pod \"dnsmasq-dns-6444958b7f-68zjz\" (UID: \"d44ecacc-e8b0-4ffd-84fd-bd3e76203d47\") " pod="openstack/dnsmasq-dns-6444958b7f-68zjz"
Mar 09 09:23:36 crc kubenswrapper[4861]: I0309 09:23:36.769046 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/82770fbe-3052-4367-9c2d-a19a11d3a695-ovn-rundir\") pod \"ovn-controller-metrics-xvlmj\" (UID: \"82770fbe-3052-4367-9c2d-a19a11d3a695\") " pod="openstack/ovn-controller-metrics-xvlmj"
Mar 09 09:23:36 crc kubenswrapper[4861]: I0309 09:23:36.769150 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7996k\" (UniqueName: \"kubernetes.io/projected/82770fbe-3052-4367-9c2d-a19a11d3a695-kube-api-access-7996k\") pod \"ovn-controller-metrics-xvlmj\" (UID: \"82770fbe-3052-4367-9c2d-a19a11d3a695\") " pod="openstack/ovn-controller-metrics-xvlmj"
Mar 09 09:23:36 crc kubenswrapper[4861]: I0309 09:23:36.769186 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d44ecacc-e8b0-4ffd-84fd-bd3e76203d47-ovsdbserver-nb\") pod \"dnsmasq-dns-6444958b7f-68zjz\" (UID: \"d44ecacc-e8b0-4ffd-84fd-bd3e76203d47\") " pod="openstack/dnsmasq-dns-6444958b7f-68zjz"
Mar 09 09:23:36 crc kubenswrapper[4861]: I0309 09:23:36.770101 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d44ecacc-e8b0-4ffd-84fd-bd3e76203d47-ovsdbserver-nb\") pod \"dnsmasq-dns-6444958b7f-68zjz\" (UID: \"d44ecacc-e8b0-4ffd-84fd-bd3e76203d47\") " pod="openstack/dnsmasq-dns-6444958b7f-68zjz"
Mar 09 09:23:36 crc kubenswrapper[4861]: I0309 09:23:36.770704 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d44ecacc-e8b0-4ffd-84fd-bd3e76203d47-config\") pod \"dnsmasq-dns-6444958b7f-68zjz\" (UID: \"d44ecacc-e8b0-4ffd-84fd-bd3e76203d47\") " pod="openstack/dnsmasq-dns-6444958b7f-68zjz"
Mar 09 09:23:36 crc kubenswrapper[4861]: I0309 09:23:36.766143 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6444958b7f-68zjz"]
Mar 09 09:23:36 crc kubenswrapper[4861]: I0309 09:23:36.770908 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d44ecacc-e8b0-4ffd-84fd-bd3e76203d47-dns-svc\") pod \"dnsmasq-dns-6444958b7f-68zjz\" (UID: \"d44ecacc-e8b0-4ffd-84fd-bd3e76203d47\") " pod="openstack/dnsmasq-dns-6444958b7f-68zjz"
Mar 09 09:23:36 crc kubenswrapper[4861]: E0309 09:23:36.771210 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dns-svc kube-api-access-sbqmb ovsdbserver-nb], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-6444958b7f-68zjz" podUID="d44ecacc-e8b0-4ffd-84fd-bd3e76203d47"
Mar 09 09:23:36 crc kubenswrapper[4861]: I0309 09:23:36.803730 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbqmb\" (UniqueName: \"kubernetes.io/projected/d44ecacc-e8b0-4ffd-84fd-bd3e76203d47-kube-api-access-sbqmb\") pod \"dnsmasq-dns-6444958b7f-68zjz\" (UID: \"d44ecacc-e8b0-4ffd-84fd-bd3e76203d47\") " pod="openstack/dnsmasq-dns-6444958b7f-68zjz"
Mar 09 09:23:36 crc kubenswrapper[4861]: I0309 09:23:36.810423 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7b57d9888c-8wmfs"]
Mar 09 09:23:36 crc kubenswrapper[4861]: I0309 09:23:36.811758 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b57d9888c-8wmfs"
Mar 09 09:23:36 crc kubenswrapper[4861]: I0309 09:23:36.814053 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb"
Mar 09 09:23:36 crc kubenswrapper[4861]: I0309 09:23:36.826768 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b57d9888c-8wmfs"]
Mar 09 09:23:36 crc kubenswrapper[4861]: I0309 09:23:36.870872 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dea6c69a-803b-498c-b7e2-7d76629de3dc-config\") pod \"ovn-northd-0\" (UID: \"dea6c69a-803b-498c-b7e2-7d76629de3dc\") " pod="openstack/ovn-northd-0"
Mar 09 09:23:36 crc kubenswrapper[4861]: I0309 09:23:36.870942 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pj72k\" (UniqueName: \"kubernetes.io/projected/dea6c69a-803b-498c-b7e2-7d76629de3dc-kube-api-access-pj72k\") pod \"ovn-northd-0\" (UID: \"dea6c69a-803b-498c-b7e2-7d76629de3dc\") " pod="openstack/ovn-northd-0"
Mar 09 09:23:36 crc kubenswrapper[4861]: I0309 09:23:36.870980 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82770fbe-3052-4367-9c2d-a19a11d3a695-config\") pod \"ovn-controller-metrics-xvlmj\" (UID: \"82770fbe-3052-4367-9c2d-a19a11d3a695\") " pod="openstack/ovn-controller-metrics-xvlmj"
Mar 09 09:23:36 crc kubenswrapper[4861]: I0309 09:23:36.870999 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/dea6c69a-803b-498c-b7e2-7d76629de3dc-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"dea6c69a-803b-498c-b7e2-7d76629de3dc\") " pod="openstack/ovn-northd-0"
Mar 09 09:23:36 crc kubenswrapper[4861]: I0309 09:23:36.871017 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/82770fbe-3052-4367-9c2d-a19a11d3a695-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-xvlmj\" (UID: \"82770fbe-3052-4367-9c2d-a19a11d3a695\") " pod="openstack/ovn-controller-metrics-xvlmj"
Mar 09 09:23:36 crc kubenswrapper[4861]: I0309 09:23:36.871046 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82770fbe-3052-4367-9c2d-a19a11d3a695-combined-ca-bundle\") pod \"ovn-controller-metrics-xvlmj\" (UID: \"82770fbe-3052-4367-9c2d-a19a11d3a695\") " pod="openstack/ovn-controller-metrics-xvlmj"
Mar 09 09:23:36 crc kubenswrapper[4861]: I0309 09:23:36.871065 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/dea6c69a-803b-498c-b7e2-7d76629de3dc-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"dea6c69a-803b-498c-b7e2-7d76629de3dc\") " pod="openstack/ovn-northd-0"
Mar 09 09:23:36 crc kubenswrapper[4861]: I0309 09:23:36.871080 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/82770fbe-3052-4367-9c2d-a19a11d3a695-ovs-rundir\") pod \"ovn-controller-metrics-xvlmj\" (UID: \"82770fbe-3052-4367-9c2d-a19a11d3a695\") " pod="openstack/ovn-controller-metrics-xvlmj"
Mar 09 09:23:36 crc kubenswrapper[4861]: I0309 09:23:36.871118 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dea6c69a-803b-498c-b7e2-7d76629de3dc-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"dea6c69a-803b-498c-b7e2-7d76629de3dc\") " pod="openstack/ovn-northd-0"
Mar 09 09:23:36 crc kubenswrapper[4861]: I0309 09:23:36.872090 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82770fbe-3052-4367-9c2d-a19a11d3a695-config\") pod \"ovn-controller-metrics-xvlmj\" (UID: \"82770fbe-3052-4367-9c2d-a19a11d3a695\") " pod="openstack/ovn-controller-metrics-xvlmj"
Mar 09 09:23:36 crc kubenswrapper[4861]: I0309 09:23:36.872444 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/dea6c69a-803b-498c-b7e2-7d76629de3dc-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"dea6c69a-803b-498c-b7e2-7d76629de3dc\") " pod="openstack/ovn-northd-0"
Mar 09 09:23:36 crc kubenswrapper[4861]: I0309 09:23:36.872492 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/82770fbe-3052-4367-9c2d-a19a11d3a695-ovn-rundir\") pod \"ovn-controller-metrics-xvlmj\" (UID: \"82770fbe-3052-4367-9c2d-a19a11d3a695\") " pod="openstack/ovn-controller-metrics-xvlmj"
Mar 09 09:23:36 crc kubenswrapper[4861]: I0309 09:23:36.872539 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dea6c69a-803b-498c-b7e2-7d76629de3dc-scripts\") pod \"ovn-northd-0\" (UID: \"dea6c69a-803b-498c-b7e2-7d76629de3dc\") " pod="openstack/ovn-northd-0"
Mar 09 09:23:36 crc kubenswrapper[4861]: I0309 09:23:36.872535 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/82770fbe-3052-4367-9c2d-a19a11d3a695-ovs-rundir\") pod \"ovn-controller-metrics-xvlmj\" (UID: \"82770fbe-3052-4367-9c2d-a19a11d3a695\") " pod="openstack/ovn-controller-metrics-xvlmj"
Mar 09 09:23:36 crc kubenswrapper[4861]: I0309 09:23:36.872581 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7996k\" (UniqueName: \"kubernetes.io/projected/82770fbe-3052-4367-9c2d-a19a11d3a695-kube-api-access-7996k\") pod \"ovn-controller-metrics-xvlmj\" (UID: \"82770fbe-3052-4367-9c2d-a19a11d3a695\") " pod="openstack/ovn-controller-metrics-xvlmj"
Mar 09 09:23:36 crc kubenswrapper[4861]: I0309 09:23:36.872622 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/82770fbe-3052-4367-9c2d-a19a11d3a695-ovn-rundir\") pod \"ovn-controller-metrics-xvlmj\" (UID: \"82770fbe-3052-4367-9c2d-a19a11d3a695\") " pod="openstack/ovn-controller-metrics-xvlmj"
Mar 09 09:23:36 crc kubenswrapper[4861]: I0309 09:23:36.874088 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/82770fbe-3052-4367-9c2d-a19a11d3a695-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-xvlmj\" (UID: \"82770fbe-3052-4367-9c2d-a19a11d3a695\") " pod="openstack/ovn-controller-metrics-xvlmj"
Mar 09 09:23:36 crc kubenswrapper[4861]: I0309 09:23:36.875342 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82770fbe-3052-4367-9c2d-a19a11d3a695-combined-ca-bundle\") pod \"ovn-controller-metrics-xvlmj\" (UID: \"82770fbe-3052-4367-9c2d-a19a11d3a695\") " pod="openstack/ovn-controller-metrics-xvlmj"
Mar 09 09:23:36 crc kubenswrapper[4861]: I0309 09:23:36.890098 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7996k\" (UniqueName: \"kubernetes.io/projected/82770fbe-3052-4367-9c2d-a19a11d3a695-kube-api-access-7996k\") pod \"ovn-controller-metrics-xvlmj\" (UID: \"82770fbe-3052-4367-9c2d-a19a11d3a695\") " pod="openstack/ovn-controller-metrics-xvlmj"
Mar 09 09:23:36 crc kubenswrapper[4861]: I0309 09:23:36.923733 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0"
Mar 09 09:23:36 crc kubenswrapper[4861]: I0309 09:23:36.923781 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0"
Mar 09 09:23:36 crc kubenswrapper[4861]: I0309 09:23:36.973817 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dea6c69a-803b-498c-b7e2-7d76629de3dc-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"dea6c69a-803b-498c-b7e2-7d76629de3dc\") " pod="openstack/ovn-northd-0"
Mar 09 09:23:36 crc kubenswrapper[4861]: I0309 09:23:36.973866 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/dea6c69a-803b-498c-b7e2-7d76629de3dc-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"dea6c69a-803b-498c-b7e2-7d76629de3dc\") " pod="openstack/ovn-northd-0"
Mar 09 09:23:36 crc kubenswrapper[4861]: I0309 09:23:36.973919 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dea6c69a-803b-498c-b7e2-7d76629de3dc-scripts\") pod \"ovn-northd-0\" (UID: \"dea6c69a-803b-498c-b7e2-7d76629de3dc\") " pod="openstack/ovn-northd-0"
Mar 09 09:23:36 crc kubenswrapper[4861]: I0309 09:23:36.973954 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dea6c69a-803b-498c-b7e2-7d76629de3dc-config\") pod \"ovn-northd-0\" (UID: \"dea6c69a-803b-498c-b7e2-7d76629de3dc\") " pod="openstack/ovn-northd-0"
Mar 09 09:23:36 crc kubenswrapper[4861]: I0309 09:23:36.973983 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pj72k\" (UniqueName: \"kubernetes.io/projected/dea6c69a-803b-498c-b7e2-7d76629de3dc-kube-api-access-pj72k\") pod \"ovn-northd-0\" (UID: \"dea6c69a-803b-498c-b7e2-7d76629de3dc\") " pod="openstack/ovn-northd-0"
Mar 09 09:23:36 crc kubenswrapper[4861]: I0309 09:23:36.974000 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/dea6c69a-803b-498c-b7e2-7d76629de3dc-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"dea6c69a-803b-498c-b7e2-7d76629de3dc\") " pod="openstack/ovn-northd-0"
Mar 09 09:23:36 crc kubenswrapper[4861]: I0309 09:23:36.974019 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c8a795fa-6d2f-4387-92c9-e2e5e4b9937b-ovsdbserver-nb\") pod \"dnsmasq-dns-7b57d9888c-8wmfs\" (UID: \"c8a795fa-6d2f-4387-92c9-e2e5e4b9937b\") " pod="openstack/dnsmasq-dns-7b57d9888c-8wmfs"
Mar 09 09:23:36 crc kubenswrapper[4861]: I0309 09:23:36.974036 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngncj\" (UniqueName: \"kubernetes.io/projected/c8a795fa-6d2f-4387-92c9-e2e5e4b9937b-kube-api-access-ngncj\") pod \"dnsmasq-dns-7b57d9888c-8wmfs\" (UID: \"c8a795fa-6d2f-4387-92c9-e2e5e4b9937b\") " pod="openstack/dnsmasq-dns-7b57d9888c-8wmfs"
Mar 09 09:23:36 crc kubenswrapper[4861]: I0309 09:23:36.974059 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c8a795fa-6d2f-4387-92c9-e2e5e4b9937b-ovsdbserver-sb\") pod \"dnsmasq-dns-7b57d9888c-8wmfs\" (UID: \"c8a795fa-6d2f-4387-92c9-e2e5e4b9937b\") " pod="openstack/dnsmasq-dns-7b57d9888c-8wmfs"
Mar 09 09:23:36 crc kubenswrapper[4861]: I0309 09:23:36.974078 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8a795fa-6d2f-4387-92c9-e2e5e4b9937b-config\") pod \"dnsmasq-dns-7b57d9888c-8wmfs\" (UID: \"c8a795fa-6d2f-4387-92c9-e2e5e4b9937b\") " pod="openstack/dnsmasq-dns-7b57d9888c-8wmfs"
Mar 09 09:23:36 crc kubenswrapper[4861]: I0309 09:23:36.975254 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName:
\"kubernetes.io/configmap/dea6c69a-803b-498c-b7e2-7d76629de3dc-scripts\") pod \"ovn-northd-0\" (UID: \"dea6c69a-803b-498c-b7e2-7d76629de3dc\") " pod="openstack/ovn-northd-0" Mar 09 09:23:36 crc kubenswrapper[4861]: I0309 09:23:36.975302 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/dea6c69a-803b-498c-b7e2-7d76629de3dc-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"dea6c69a-803b-498c-b7e2-7d76629de3dc\") " pod="openstack/ovn-northd-0" Mar 09 09:23:36 crc kubenswrapper[4861]: I0309 09:23:36.975395 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c8a795fa-6d2f-4387-92c9-e2e5e4b9937b-dns-svc\") pod \"dnsmasq-dns-7b57d9888c-8wmfs\" (UID: \"c8a795fa-6d2f-4387-92c9-e2e5e4b9937b\") " pod="openstack/dnsmasq-dns-7b57d9888c-8wmfs" Mar 09 09:23:36 crc kubenswrapper[4861]: I0309 09:23:36.976562 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/dea6c69a-803b-498c-b7e2-7d76629de3dc-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"dea6c69a-803b-498c-b7e2-7d76629de3dc\") " pod="openstack/ovn-northd-0" Mar 09 09:23:36 crc kubenswrapper[4861]: I0309 09:23:36.977551 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dea6c69a-803b-498c-b7e2-7d76629de3dc-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"dea6c69a-803b-498c-b7e2-7d76629de3dc\") " pod="openstack/ovn-northd-0" Mar 09 09:23:36 crc kubenswrapper[4861]: I0309 09:23:36.977630 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/dea6c69a-803b-498c-b7e2-7d76629de3dc-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"dea6c69a-803b-498c-b7e2-7d76629de3dc\") " pod="openstack/ovn-northd-0" Mar 09 09:23:36 crc 
kubenswrapper[4861]: I0309 09:23:36.977665 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dea6c69a-803b-498c-b7e2-7d76629de3dc-config\") pod \"ovn-northd-0\" (UID: \"dea6c69a-803b-498c-b7e2-7d76629de3dc\") " pod="openstack/ovn-northd-0" Mar 09 09:23:36 crc kubenswrapper[4861]: I0309 09:23:36.978095 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/dea6c69a-803b-498c-b7e2-7d76629de3dc-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"dea6c69a-803b-498c-b7e2-7d76629de3dc\") " pod="openstack/ovn-northd-0" Mar 09 09:23:36 crc kubenswrapper[4861]: I0309 09:23:36.980586 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-xvlmj" Mar 09 09:23:36 crc kubenswrapper[4861]: I0309 09:23:36.995073 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pj72k\" (UniqueName: \"kubernetes.io/projected/dea6c69a-803b-498c-b7e2-7d76629de3dc-kube-api-access-pj72k\") pod \"ovn-northd-0\" (UID: \"dea6c69a-803b-498c-b7e2-7d76629de3dc\") " pod="openstack/ovn-northd-0" Mar 09 09:23:37 crc kubenswrapper[4861]: I0309 09:23:37.048663 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Mar 09 09:23:37 crc kubenswrapper[4861]: I0309 09:23:37.076566 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c8a795fa-6d2f-4387-92c9-e2e5e4b9937b-ovsdbserver-nb\") pod \"dnsmasq-dns-7b57d9888c-8wmfs\" (UID: \"c8a795fa-6d2f-4387-92c9-e2e5e4b9937b\") " pod="openstack/dnsmasq-dns-7b57d9888c-8wmfs" Mar 09 09:23:37 crc kubenswrapper[4861]: I0309 09:23:37.076611 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngncj\" (UniqueName: \"kubernetes.io/projected/c8a795fa-6d2f-4387-92c9-e2e5e4b9937b-kube-api-access-ngncj\") pod \"dnsmasq-dns-7b57d9888c-8wmfs\" (UID: \"c8a795fa-6d2f-4387-92c9-e2e5e4b9937b\") " pod="openstack/dnsmasq-dns-7b57d9888c-8wmfs" Mar 09 09:23:37 crc kubenswrapper[4861]: I0309 09:23:37.076643 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c8a795fa-6d2f-4387-92c9-e2e5e4b9937b-ovsdbserver-sb\") pod \"dnsmasq-dns-7b57d9888c-8wmfs\" (UID: \"c8a795fa-6d2f-4387-92c9-e2e5e4b9937b\") " pod="openstack/dnsmasq-dns-7b57d9888c-8wmfs" Mar 09 09:23:37 crc kubenswrapper[4861]: I0309 09:23:37.076671 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8a795fa-6d2f-4387-92c9-e2e5e4b9937b-config\") pod \"dnsmasq-dns-7b57d9888c-8wmfs\" (UID: \"c8a795fa-6d2f-4387-92c9-e2e5e4b9937b\") " pod="openstack/dnsmasq-dns-7b57d9888c-8wmfs" Mar 09 09:23:37 crc kubenswrapper[4861]: I0309 09:23:37.076720 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c8a795fa-6d2f-4387-92c9-e2e5e4b9937b-dns-svc\") pod \"dnsmasq-dns-7b57d9888c-8wmfs\" (UID: \"c8a795fa-6d2f-4387-92c9-e2e5e4b9937b\") " pod="openstack/dnsmasq-dns-7b57d9888c-8wmfs" Mar 09 09:23:37 
crc kubenswrapper[4861]: I0309 09:23:37.077891 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c8a795fa-6d2f-4387-92c9-e2e5e4b9937b-ovsdbserver-nb\") pod \"dnsmasq-dns-7b57d9888c-8wmfs\" (UID: \"c8a795fa-6d2f-4387-92c9-e2e5e4b9937b\") " pod="openstack/dnsmasq-dns-7b57d9888c-8wmfs" Mar 09 09:23:37 crc kubenswrapper[4861]: I0309 09:23:37.078518 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c8a795fa-6d2f-4387-92c9-e2e5e4b9937b-dns-svc\") pod \"dnsmasq-dns-7b57d9888c-8wmfs\" (UID: \"c8a795fa-6d2f-4387-92c9-e2e5e4b9937b\") " pod="openstack/dnsmasq-dns-7b57d9888c-8wmfs" Mar 09 09:23:37 crc kubenswrapper[4861]: I0309 09:23:37.078532 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8a795fa-6d2f-4387-92c9-e2e5e4b9937b-config\") pod \"dnsmasq-dns-7b57d9888c-8wmfs\" (UID: \"c8a795fa-6d2f-4387-92c9-e2e5e4b9937b\") " pod="openstack/dnsmasq-dns-7b57d9888c-8wmfs" Mar 09 09:23:37 crc kubenswrapper[4861]: I0309 09:23:37.079622 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c8a795fa-6d2f-4387-92c9-e2e5e4b9937b-ovsdbserver-sb\") pod \"dnsmasq-dns-7b57d9888c-8wmfs\" (UID: \"c8a795fa-6d2f-4387-92c9-e2e5e4b9937b\") " pod="openstack/dnsmasq-dns-7b57d9888c-8wmfs" Mar 09 09:23:37 crc kubenswrapper[4861]: I0309 09:23:37.094241 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngncj\" (UniqueName: \"kubernetes.io/projected/c8a795fa-6d2f-4387-92c9-e2e5e4b9937b-kube-api-access-ngncj\") pod \"dnsmasq-dns-7b57d9888c-8wmfs\" (UID: \"c8a795fa-6d2f-4387-92c9-e2e5e4b9937b\") " pod="openstack/dnsmasq-dns-7b57d9888c-8wmfs" Mar 09 09:23:37 crc kubenswrapper[4861]: I0309 09:23:37.150804 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7b57d9888c-8wmfs" Mar 09 09:23:37 crc kubenswrapper[4861]: I0309 09:23:37.306356 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6444958b7f-68zjz" Mar 09 09:23:37 crc kubenswrapper[4861]: I0309 09:23:37.355971 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6444958b7f-68zjz" Mar 09 09:23:37 crc kubenswrapper[4861]: W0309 09:23:37.425974 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod82770fbe_3052_4367_9c2d_a19a11d3a695.slice/crio-c57fb19035b4e5333bdb875f1ad9d06ef30f80e0e9f3a7cec61524cccd23efd0 WatchSource:0}: Error finding container c57fb19035b4e5333bdb875f1ad9d06ef30f80e0e9f3a7cec61524cccd23efd0: Status 404 returned error can't find the container with id c57fb19035b4e5333bdb875f1ad9d06ef30f80e0e9f3a7cec61524cccd23efd0 Mar 09 09:23:37 crc kubenswrapper[4861]: I0309 09:23:37.430731 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-xvlmj"] Mar 09 09:23:37 crc kubenswrapper[4861]: I0309 09:23:37.481342 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sbqmb\" (UniqueName: \"kubernetes.io/projected/d44ecacc-e8b0-4ffd-84fd-bd3e76203d47-kube-api-access-sbqmb\") pod \"d44ecacc-e8b0-4ffd-84fd-bd3e76203d47\" (UID: \"d44ecacc-e8b0-4ffd-84fd-bd3e76203d47\") " Mar 09 09:23:37 crc kubenswrapper[4861]: I0309 09:23:37.481421 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d44ecacc-e8b0-4ffd-84fd-bd3e76203d47-dns-svc\") pod \"d44ecacc-e8b0-4ffd-84fd-bd3e76203d47\" (UID: \"d44ecacc-e8b0-4ffd-84fd-bd3e76203d47\") " Mar 09 09:23:37 crc kubenswrapper[4861]: I0309 09:23:37.481540 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d44ecacc-e8b0-4ffd-84fd-bd3e76203d47-ovsdbserver-nb\") pod \"d44ecacc-e8b0-4ffd-84fd-bd3e76203d47\" (UID: \"d44ecacc-e8b0-4ffd-84fd-bd3e76203d47\") " Mar 09 09:23:37 crc kubenswrapper[4861]: I0309 09:23:37.481588 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d44ecacc-e8b0-4ffd-84fd-bd3e76203d47-config\") pod \"d44ecacc-e8b0-4ffd-84fd-bd3e76203d47\" (UID: \"d44ecacc-e8b0-4ffd-84fd-bd3e76203d47\") " Mar 09 09:23:37 crc kubenswrapper[4861]: I0309 09:23:37.483577 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d44ecacc-e8b0-4ffd-84fd-bd3e76203d47-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d44ecacc-e8b0-4ffd-84fd-bd3e76203d47" (UID: "d44ecacc-e8b0-4ffd-84fd-bd3e76203d47"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:23:37 crc kubenswrapper[4861]: I0309 09:23:37.483711 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d44ecacc-e8b0-4ffd-84fd-bd3e76203d47-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d44ecacc-e8b0-4ffd-84fd-bd3e76203d47" (UID: "d44ecacc-e8b0-4ffd-84fd-bd3e76203d47"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:23:37 crc kubenswrapper[4861]: I0309 09:23:37.483934 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d44ecacc-e8b0-4ffd-84fd-bd3e76203d47-config" (OuterVolumeSpecName: "config") pod "d44ecacc-e8b0-4ffd-84fd-bd3e76203d47" (UID: "d44ecacc-e8b0-4ffd-84fd-bd3e76203d47"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:23:37 crc kubenswrapper[4861]: I0309 09:23:37.485865 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d44ecacc-e8b0-4ffd-84fd-bd3e76203d47-kube-api-access-sbqmb" (OuterVolumeSpecName: "kube-api-access-sbqmb") pod "d44ecacc-e8b0-4ffd-84fd-bd3e76203d47" (UID: "d44ecacc-e8b0-4ffd-84fd-bd3e76203d47"). InnerVolumeSpecName "kube-api-access-sbqmb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:23:37 crc kubenswrapper[4861]: I0309 09:23:37.515138 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 09 09:23:37 crc kubenswrapper[4861]: W0309 09:23:37.524683 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddea6c69a_803b_498c_b7e2_7d76629de3dc.slice/crio-0b4ef7be45607577d522fdeb3c3feb248a29c800f7dc263fa391da1f45e7daf6 WatchSource:0}: Error finding container 0b4ef7be45607577d522fdeb3c3feb248a29c800f7dc263fa391da1f45e7daf6: Status 404 returned error can't find the container with id 0b4ef7be45607577d522fdeb3c3feb248a29c800f7dc263fa391da1f45e7daf6 Mar 09 09:23:37 crc kubenswrapper[4861]: I0309 09:23:37.583181 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sbqmb\" (UniqueName: \"kubernetes.io/projected/d44ecacc-e8b0-4ffd-84fd-bd3e76203d47-kube-api-access-sbqmb\") on node \"crc\" DevicePath \"\"" Mar 09 09:23:37 crc kubenswrapper[4861]: I0309 09:23:37.583210 4861 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d44ecacc-e8b0-4ffd-84fd-bd3e76203d47-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 09 09:23:37 crc kubenswrapper[4861]: I0309 09:23:37.583221 4861 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d44ecacc-e8b0-4ffd-84fd-bd3e76203d47-ovsdbserver-nb\") on node \"crc\" 
DevicePath \"\"" Mar 09 09:23:37 crc kubenswrapper[4861]: I0309 09:23:37.583229 4861 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d44ecacc-e8b0-4ffd-84fd-bd3e76203d47-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:23:37 crc kubenswrapper[4861]: I0309 09:23:37.633343 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b57d9888c-8wmfs"] Mar 09 09:23:38 crc kubenswrapper[4861]: I0309 09:23:38.242572 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Mar 09 09:23:38 crc kubenswrapper[4861]: I0309 09:23:38.242996 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Mar 09 09:23:38 crc kubenswrapper[4861]: I0309 09:23:38.315256 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"dea6c69a-803b-498c-b7e2-7d76629de3dc","Type":"ContainerStarted","Data":"0b4ef7be45607577d522fdeb3c3feb248a29c800f7dc263fa391da1f45e7daf6"} Mar 09 09:23:38 crc kubenswrapper[4861]: I0309 09:23:38.316756 4861 generic.go:334] "Generic (PLEG): container finished" podID="c8a795fa-6d2f-4387-92c9-e2e5e4b9937b" containerID="32a3eec9924c00b8c9ce84143fb5ea90e1ab226cafc83f83cdfc2d1c6482acf8" exitCode=0 Mar 09 09:23:38 crc kubenswrapper[4861]: I0309 09:23:38.316828 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b57d9888c-8wmfs" event={"ID":"c8a795fa-6d2f-4387-92c9-e2e5e4b9937b","Type":"ContainerDied","Data":"32a3eec9924c00b8c9ce84143fb5ea90e1ab226cafc83f83cdfc2d1c6482acf8"} Mar 09 09:23:38 crc kubenswrapper[4861]: I0309 09:23:38.316869 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b57d9888c-8wmfs" event={"ID":"c8a795fa-6d2f-4387-92c9-e2e5e4b9937b","Type":"ContainerStarted","Data":"9b8f369092454e048d5e3affe56c2a3b3de0605f0b912fcac19fa731c658b18a"} Mar 09 09:23:38 crc 
kubenswrapper[4861]: I0309 09:23:38.318535 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-xvlmj" event={"ID":"82770fbe-3052-4367-9c2d-a19a11d3a695","Type":"ContainerStarted","Data":"7bdf96a6489a4b9d25076afaebbc07dd9cb849938d3ab81625dfad2d64a49748"} Mar 09 09:23:38 crc kubenswrapper[4861]: I0309 09:23:38.318582 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-xvlmj" event={"ID":"82770fbe-3052-4367-9c2d-a19a11d3a695","Type":"ContainerStarted","Data":"c57fb19035b4e5333bdb875f1ad9d06ef30f80e0e9f3a7cec61524cccd23efd0"} Mar 09 09:23:38 crc kubenswrapper[4861]: I0309 09:23:38.318710 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6444958b7f-68zjz" Mar 09 09:23:38 crc kubenswrapper[4861]: I0309 09:23:38.356853 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-xvlmj" podStartSLOduration=2.356827829 podStartE2EDuration="2.356827829s" podCreationTimestamp="2026-03-09 09:23:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:23:38.353127258 +0000 UTC m=+1061.438166679" watchObservedRunningTime="2026-03-09 09:23:38.356827829 +0000 UTC m=+1061.441867230" Mar 09 09:23:38 crc kubenswrapper[4861]: I0309 09:23:38.392112 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6444958b7f-68zjz"] Mar 09 09:23:38 crc kubenswrapper[4861]: I0309 09:23:38.398462 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6444958b7f-68zjz"] Mar 09 09:23:38 crc kubenswrapper[4861]: E0309 09:23:38.874481 4861 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.163:41060->38.102.83.163:39285: write tcp 38.102.83.163:41060->38.102.83.163:39285: write: broken pipe Mar 09 09:23:39 crc kubenswrapper[4861]: I0309 
09:23:39.326147 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"dea6c69a-803b-498c-b7e2-7d76629de3dc","Type":"ContainerStarted","Data":"3d9eb325a9c720c866c18df5f88f350173096fd82ef8e50dffbb70d3de9f0038"} Mar 09 09:23:39 crc kubenswrapper[4861]: I0309 09:23:39.326200 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Mar 09 09:23:39 crc kubenswrapper[4861]: I0309 09:23:39.326212 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"dea6c69a-803b-498c-b7e2-7d76629de3dc","Type":"ContainerStarted","Data":"8df9cf6a683729a050c2976f1e5f9ec03891ce555b97c2bd6b59e3d7d12bfa9d"} Mar 09 09:23:39 crc kubenswrapper[4861]: I0309 09:23:39.327720 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b57d9888c-8wmfs" event={"ID":"c8a795fa-6d2f-4387-92c9-e2e5e4b9937b","Type":"ContainerStarted","Data":"f53db7f882d5aa744326a47fa892e440034f4966904e607ea0794c6ef99130c0"} Mar 09 09:23:39 crc kubenswrapper[4861]: I0309 09:23:39.348781 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.308895968 podStartE2EDuration="3.348762238s" podCreationTimestamp="2026-03-09 09:23:36 +0000 UTC" firstStartedPulling="2026-03-09 09:23:37.528358513 +0000 UTC m=+1060.613397914" lastFinishedPulling="2026-03-09 09:23:38.568224783 +0000 UTC m=+1061.653264184" observedRunningTime="2026-03-09 09:23:39.342687317 +0000 UTC m=+1062.427726718" watchObservedRunningTime="2026-03-09 09:23:39.348762238 +0000 UTC m=+1062.433801639" Mar 09 09:23:39 crc kubenswrapper[4861]: I0309 09:23:39.368102 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7b57d9888c-8wmfs" podStartSLOduration=3.368080084 podStartE2EDuration="3.368080084s" podCreationTimestamp="2026-03-09 09:23:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:23:39.366442946 +0000 UTC m=+1062.451482357" watchObservedRunningTime="2026-03-09 09:23:39.368080084 +0000 UTC m=+1062.453119485" Mar 09 09:23:39 crc kubenswrapper[4861]: I0309 09:23:39.426745 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Mar 09 09:23:39 crc kubenswrapper[4861]: I0309 09:23:39.490837 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Mar 09 09:23:39 crc kubenswrapper[4861]: I0309 09:23:39.668674 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d44ecacc-e8b0-4ffd-84fd-bd3e76203d47" path="/var/lib/kubelet/pods/d44ecacc-e8b0-4ffd-84fd-bd3e76203d47/volumes" Mar 09 09:23:39 crc kubenswrapper[4861]: I0309 09:23:39.754249 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-a7e9-account-create-update-x5n4n"] Mar 09 09:23:39 crc kubenswrapper[4861]: I0309 09:23:39.755514 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-a7e9-account-create-update-x5n4n" Mar 09 09:23:39 crc kubenswrapper[4861]: I0309 09:23:39.757628 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Mar 09 09:23:39 crc kubenswrapper[4861]: I0309 09:23:39.767429 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-a7e9-account-create-update-x5n4n"] Mar 09 09:23:39 crc kubenswrapper[4861]: I0309 09:23:39.820897 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-4mxxh"] Mar 09 09:23:39 crc kubenswrapper[4861]: I0309 09:23:39.822265 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-4mxxh" Mar 09 09:23:39 crc kubenswrapper[4861]: I0309 09:23:39.823111 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtppc\" (UniqueName: \"kubernetes.io/projected/591c3df3-47fb-4da5-9776-4a6ed3170472-kube-api-access-mtppc\") pod \"placement-a7e9-account-create-update-x5n4n\" (UID: \"591c3df3-47fb-4da5-9776-4a6ed3170472\") " pod="openstack/placement-a7e9-account-create-update-x5n4n" Mar 09 09:23:39 crc kubenswrapper[4861]: I0309 09:23:39.823329 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/591c3df3-47fb-4da5-9776-4a6ed3170472-operator-scripts\") pod \"placement-a7e9-account-create-update-x5n4n\" (UID: \"591c3df3-47fb-4da5-9776-4a6ed3170472\") " pod="openstack/placement-a7e9-account-create-update-x5n4n" Mar 09 09:23:39 crc kubenswrapper[4861]: I0309 09:23:39.834820 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-4mxxh"] Mar 09 09:23:39 crc kubenswrapper[4861]: I0309 09:23:39.924923 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtppc\" (UniqueName: \"kubernetes.io/projected/591c3df3-47fb-4da5-9776-4a6ed3170472-kube-api-access-mtppc\") pod \"placement-a7e9-account-create-update-x5n4n\" (UID: \"591c3df3-47fb-4da5-9776-4a6ed3170472\") " pod="openstack/placement-a7e9-account-create-update-x5n4n" Mar 09 09:23:39 crc kubenswrapper[4861]: I0309 09:23:39.925952 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tk9r5\" (UniqueName: \"kubernetes.io/projected/8499a71b-bdde-4468-9a1e-781db816f2f0-kube-api-access-tk9r5\") pod \"placement-db-create-4mxxh\" (UID: \"8499a71b-bdde-4468-9a1e-781db816f2f0\") " pod="openstack/placement-db-create-4mxxh" Mar 09 09:23:39 crc 
kubenswrapper[4861]: I0309 09:23:39.926111 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/591c3df3-47fb-4da5-9776-4a6ed3170472-operator-scripts\") pod \"placement-a7e9-account-create-update-x5n4n\" (UID: \"591c3df3-47fb-4da5-9776-4a6ed3170472\") " pod="openstack/placement-a7e9-account-create-update-x5n4n" Mar 09 09:23:39 crc kubenswrapper[4861]: I0309 09:23:39.926238 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8499a71b-bdde-4468-9a1e-781db816f2f0-operator-scripts\") pod \"placement-db-create-4mxxh\" (UID: \"8499a71b-bdde-4468-9a1e-781db816f2f0\") " pod="openstack/placement-db-create-4mxxh" Mar 09 09:23:39 crc kubenswrapper[4861]: I0309 09:23:39.927106 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/591c3df3-47fb-4da5-9776-4a6ed3170472-operator-scripts\") pod \"placement-a7e9-account-create-update-x5n4n\" (UID: \"591c3df3-47fb-4da5-9776-4a6ed3170472\") " pod="openstack/placement-a7e9-account-create-update-x5n4n" Mar 09 09:23:39 crc kubenswrapper[4861]: I0309 09:23:39.943711 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtppc\" (UniqueName: \"kubernetes.io/projected/591c3df3-47fb-4da5-9776-4a6ed3170472-kube-api-access-mtppc\") pod \"placement-a7e9-account-create-update-x5n4n\" (UID: \"591c3df3-47fb-4da5-9776-4a6ed3170472\") " pod="openstack/placement-a7e9-account-create-update-x5n4n" Mar 09 09:23:40 crc kubenswrapper[4861]: I0309 09:23:40.028033 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tk9r5\" (UniqueName: \"kubernetes.io/projected/8499a71b-bdde-4468-9a1e-781db816f2f0-kube-api-access-tk9r5\") pod \"placement-db-create-4mxxh\" (UID: \"8499a71b-bdde-4468-9a1e-781db816f2f0\") " 
pod="openstack/placement-db-create-4mxxh" Mar 09 09:23:40 crc kubenswrapper[4861]: I0309 09:23:40.028114 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8499a71b-bdde-4468-9a1e-781db816f2f0-operator-scripts\") pod \"placement-db-create-4mxxh\" (UID: \"8499a71b-bdde-4468-9a1e-781db816f2f0\") " pod="openstack/placement-db-create-4mxxh" Mar 09 09:23:40 crc kubenswrapper[4861]: I0309 09:23:40.028859 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8499a71b-bdde-4468-9a1e-781db816f2f0-operator-scripts\") pod \"placement-db-create-4mxxh\" (UID: \"8499a71b-bdde-4468-9a1e-781db816f2f0\") " pod="openstack/placement-db-create-4mxxh" Mar 09 09:23:40 crc kubenswrapper[4861]: I0309 09:23:40.044644 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tk9r5\" (UniqueName: \"kubernetes.io/projected/8499a71b-bdde-4468-9a1e-781db816f2f0-kube-api-access-tk9r5\") pod \"placement-db-create-4mxxh\" (UID: \"8499a71b-bdde-4468-9a1e-781db816f2f0\") " pod="openstack/placement-db-create-4mxxh" Mar 09 09:23:40 crc kubenswrapper[4861]: I0309 09:23:40.071076 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-a7e9-account-create-update-x5n4n" Mar 09 09:23:40 crc kubenswrapper[4861]: I0309 09:23:40.137623 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-4mxxh" Mar 09 09:23:40 crc kubenswrapper[4861]: I0309 09:23:40.335921 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7b57d9888c-8wmfs" Mar 09 09:23:40 crc kubenswrapper[4861]: I0309 09:23:40.564398 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-a7e9-account-create-update-x5n4n"] Mar 09 09:23:40 crc kubenswrapper[4861]: W0309 09:23:40.568829 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod591c3df3_47fb_4da5_9776_4a6ed3170472.slice/crio-551b7f81f34d6052e95283d9917584afd361f6f94aeac538f52970ce911465fe WatchSource:0}: Error finding container 551b7f81f34d6052e95283d9917584afd361f6f94aeac538f52970ce911465fe: Status 404 returned error can't find the container with id 551b7f81f34d6052e95283d9917584afd361f6f94aeac538f52970ce911465fe Mar 09 09:23:40 crc kubenswrapper[4861]: I0309 09:23:40.673785 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 09 09:23:40 crc kubenswrapper[4861]: I0309 09:23:40.696100 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b57d9888c-8wmfs"] Mar 09 09:23:40 crc kubenswrapper[4861]: I0309 09:23:40.734351 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f7dd995-89gfl"] Mar 09 09:23:40 crc kubenswrapper[4861]: I0309 09:23:40.735697 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f7dd995-89gfl" Mar 09 09:23:40 crc kubenswrapper[4861]: I0309 09:23:40.767059 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-4mxxh"] Mar 09 09:23:40 crc kubenswrapper[4861]: I0309 09:23:40.779533 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f7dd995-89gfl"] Mar 09 09:23:40 crc kubenswrapper[4861]: I0309 09:23:40.860423 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Mar 09 09:23:40 crc kubenswrapper[4861]: I0309 09:23:40.869577 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59rx2\" (UniqueName: \"kubernetes.io/projected/93d06a80-e66b-4934-8175-b3c6cb1032a9-kube-api-access-59rx2\") pod \"dnsmasq-dns-675f7dd995-89gfl\" (UID: \"93d06a80-e66b-4934-8175-b3c6cb1032a9\") " pod="openstack/dnsmasq-dns-675f7dd995-89gfl" Mar 09 09:23:40 crc kubenswrapper[4861]: I0309 09:23:40.869635 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/93d06a80-e66b-4934-8175-b3c6cb1032a9-dns-svc\") pod \"dnsmasq-dns-675f7dd995-89gfl\" (UID: \"93d06a80-e66b-4934-8175-b3c6cb1032a9\") " pod="openstack/dnsmasq-dns-675f7dd995-89gfl" Mar 09 09:23:40 crc kubenswrapper[4861]: I0309 09:23:40.869684 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/93d06a80-e66b-4934-8175-b3c6cb1032a9-ovsdbserver-sb\") pod \"dnsmasq-dns-675f7dd995-89gfl\" (UID: \"93d06a80-e66b-4934-8175-b3c6cb1032a9\") " pod="openstack/dnsmasq-dns-675f7dd995-89gfl" Mar 09 09:23:40 crc kubenswrapper[4861]: I0309 09:23:40.869714 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/93d06a80-e66b-4934-8175-b3c6cb1032a9-ovsdbserver-nb\") pod \"dnsmasq-dns-675f7dd995-89gfl\" (UID: \"93d06a80-e66b-4934-8175-b3c6cb1032a9\") " pod="openstack/dnsmasq-dns-675f7dd995-89gfl" Mar 09 09:23:40 crc kubenswrapper[4861]: I0309 09:23:40.869746 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93d06a80-e66b-4934-8175-b3c6cb1032a9-config\") pod \"dnsmasq-dns-675f7dd995-89gfl\" (UID: \"93d06a80-e66b-4934-8175-b3c6cb1032a9\") " pod="openstack/dnsmasq-dns-675f7dd995-89gfl" Mar 09 09:23:40 crc kubenswrapper[4861]: I0309 09:23:40.961303 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Mar 09 09:23:40 crc kubenswrapper[4861]: I0309 09:23:40.970969 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59rx2\" (UniqueName: \"kubernetes.io/projected/93d06a80-e66b-4934-8175-b3c6cb1032a9-kube-api-access-59rx2\") pod \"dnsmasq-dns-675f7dd995-89gfl\" (UID: \"93d06a80-e66b-4934-8175-b3c6cb1032a9\") " pod="openstack/dnsmasq-dns-675f7dd995-89gfl" Mar 09 09:23:40 crc kubenswrapper[4861]: I0309 09:23:40.971027 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/93d06a80-e66b-4934-8175-b3c6cb1032a9-dns-svc\") pod \"dnsmasq-dns-675f7dd995-89gfl\" (UID: \"93d06a80-e66b-4934-8175-b3c6cb1032a9\") " pod="openstack/dnsmasq-dns-675f7dd995-89gfl" Mar 09 09:23:40 crc kubenswrapper[4861]: I0309 09:23:40.971068 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/93d06a80-e66b-4934-8175-b3c6cb1032a9-ovsdbserver-sb\") pod \"dnsmasq-dns-675f7dd995-89gfl\" (UID: \"93d06a80-e66b-4934-8175-b3c6cb1032a9\") " pod="openstack/dnsmasq-dns-675f7dd995-89gfl" Mar 09 09:23:40 crc kubenswrapper[4861]: I0309 
09:23:40.971089 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/93d06a80-e66b-4934-8175-b3c6cb1032a9-ovsdbserver-nb\") pod \"dnsmasq-dns-675f7dd995-89gfl\" (UID: \"93d06a80-e66b-4934-8175-b3c6cb1032a9\") " pod="openstack/dnsmasq-dns-675f7dd995-89gfl" Mar 09 09:23:40 crc kubenswrapper[4861]: I0309 09:23:40.971137 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93d06a80-e66b-4934-8175-b3c6cb1032a9-config\") pod \"dnsmasq-dns-675f7dd995-89gfl\" (UID: \"93d06a80-e66b-4934-8175-b3c6cb1032a9\") " pod="openstack/dnsmasq-dns-675f7dd995-89gfl" Mar 09 09:23:40 crc kubenswrapper[4861]: I0309 09:23:40.971991 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/93d06a80-e66b-4934-8175-b3c6cb1032a9-ovsdbserver-nb\") pod \"dnsmasq-dns-675f7dd995-89gfl\" (UID: \"93d06a80-e66b-4934-8175-b3c6cb1032a9\") " pod="openstack/dnsmasq-dns-675f7dd995-89gfl" Mar 09 09:23:40 crc kubenswrapper[4861]: I0309 09:23:40.972015 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/93d06a80-e66b-4934-8175-b3c6cb1032a9-dns-svc\") pod \"dnsmasq-dns-675f7dd995-89gfl\" (UID: \"93d06a80-e66b-4934-8175-b3c6cb1032a9\") " pod="openstack/dnsmasq-dns-675f7dd995-89gfl" Mar 09 09:23:40 crc kubenswrapper[4861]: I0309 09:23:40.972045 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/93d06a80-e66b-4934-8175-b3c6cb1032a9-ovsdbserver-sb\") pod \"dnsmasq-dns-675f7dd995-89gfl\" (UID: \"93d06a80-e66b-4934-8175-b3c6cb1032a9\") " pod="openstack/dnsmasq-dns-675f7dd995-89gfl" Mar 09 09:23:40 crc kubenswrapper[4861]: I0309 09:23:40.972083 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/93d06a80-e66b-4934-8175-b3c6cb1032a9-config\") pod \"dnsmasq-dns-675f7dd995-89gfl\" (UID: \"93d06a80-e66b-4934-8175-b3c6cb1032a9\") " pod="openstack/dnsmasq-dns-675f7dd995-89gfl" Mar 09 09:23:41 crc kubenswrapper[4861]: I0309 09:23:41.003222 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59rx2\" (UniqueName: \"kubernetes.io/projected/93d06a80-e66b-4934-8175-b3c6cb1032a9-kube-api-access-59rx2\") pod \"dnsmasq-dns-675f7dd995-89gfl\" (UID: \"93d06a80-e66b-4934-8175-b3c6cb1032a9\") " pod="openstack/dnsmasq-dns-675f7dd995-89gfl" Mar 09 09:23:41 crc kubenswrapper[4861]: I0309 09:23:41.062474 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f7dd995-89gfl" Mar 09 09:23:41 crc kubenswrapper[4861]: I0309 09:23:41.344055 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-a7e9-account-create-update-x5n4n" event={"ID":"591c3df3-47fb-4da5-9776-4a6ed3170472","Type":"ContainerStarted","Data":"caadf85455a67668f15f9f86aa6e994c39806be3eb7d310b487e21ad7f875c90"} Mar 09 09:23:41 crc kubenswrapper[4861]: I0309 09:23:41.344139 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-a7e9-account-create-update-x5n4n" event={"ID":"591c3df3-47fb-4da5-9776-4a6ed3170472","Type":"ContainerStarted","Data":"551b7f81f34d6052e95283d9917584afd361f6f94aeac538f52970ce911465fe"} Mar 09 09:23:41 crc kubenswrapper[4861]: I0309 09:23:41.347434 4861 generic.go:334] "Generic (PLEG): container finished" podID="8499a71b-bdde-4468-9a1e-781db816f2f0" containerID="212a3f24478517e3b3c1bfc2375f9d399f9c8ddbc22b7dc8da2f5515d3547a9d" exitCode=0 Mar 09 09:23:41 crc kubenswrapper[4861]: I0309 09:23:41.347494 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-4mxxh" 
event={"ID":"8499a71b-bdde-4468-9a1e-781db816f2f0","Type":"ContainerDied","Data":"212a3f24478517e3b3c1bfc2375f9d399f9c8ddbc22b7dc8da2f5515d3547a9d"} Mar 09 09:23:41 crc kubenswrapper[4861]: I0309 09:23:41.348491 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-4mxxh" event={"ID":"8499a71b-bdde-4468-9a1e-781db816f2f0","Type":"ContainerStarted","Data":"085c575b5b898ff2455f65180522a3d416b91eef7157e39dcb85bb567373b241"} Mar 09 09:23:41 crc kubenswrapper[4861]: I0309 09:23:41.362762 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-a7e9-account-create-update-x5n4n" podStartSLOduration=2.362748385 podStartE2EDuration="2.362748385s" podCreationTimestamp="2026-03-09 09:23:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:23:41.360227331 +0000 UTC m=+1064.445266742" watchObservedRunningTime="2026-03-09 09:23:41.362748385 +0000 UTC m=+1064.447787786" Mar 09 09:23:41 crc kubenswrapper[4861]: I0309 09:23:41.496832 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f7dd995-89gfl"] Mar 09 09:23:41 crc kubenswrapper[4861]: W0309 09:23:41.507862 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod93d06a80_e66b_4934_8175_b3c6cb1032a9.slice/crio-9ea281cc13a2d0404fd3aae4c8c6d372f48d614ae14b4b6855504bf3f548b906 WatchSource:0}: Error finding container 9ea281cc13a2d0404fd3aae4c8c6d372f48d614ae14b4b6855504bf3f548b906: Status 404 returned error can't find the container with id 9ea281cc13a2d0404fd3aae4c8c6d372f48d614ae14b4b6855504bf3f548b906 Mar 09 09:23:41 crc kubenswrapper[4861]: I0309 09:23:41.900462 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Mar 09 09:23:41 crc kubenswrapper[4861]: I0309 09:23:41.905790 4861 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openstack/swift-storage-0" Mar 09 09:23:41 crc kubenswrapper[4861]: I0309 09:23:41.908649 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Mar 09 09:23:41 crc kubenswrapper[4861]: I0309 09:23:41.908676 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Mar 09 09:23:41 crc kubenswrapper[4861]: I0309 09:23:41.908730 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Mar 09 09:23:41 crc kubenswrapper[4861]: I0309 09:23:41.911894 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-qp4gr" Mar 09 09:23:41 crc kubenswrapper[4861]: I0309 09:23:41.932199 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 09 09:23:42 crc kubenswrapper[4861]: I0309 09:23:41.998140 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d1ad2fa7-36fc-4cd0-98ac-07b48c42e794-etc-swift\") pod \"swift-storage-0\" (UID: \"d1ad2fa7-36fc-4cd0-98ac-07b48c42e794\") " pod="openstack/swift-storage-0" Mar 09 09:23:42 crc kubenswrapper[4861]: I0309 09:23:41.998208 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtw4l\" (UniqueName: \"kubernetes.io/projected/d1ad2fa7-36fc-4cd0-98ac-07b48c42e794-kube-api-access-xtw4l\") pod \"swift-storage-0\" (UID: \"d1ad2fa7-36fc-4cd0-98ac-07b48c42e794\") " pod="openstack/swift-storage-0" Mar 09 09:23:42 crc kubenswrapper[4861]: I0309 09:23:41.998239 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"swift-storage-0\" (UID: \"d1ad2fa7-36fc-4cd0-98ac-07b48c42e794\") " 
pod="openstack/swift-storage-0" Mar 09 09:23:42 crc kubenswrapper[4861]: I0309 09:23:41.998281 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/d1ad2fa7-36fc-4cd0-98ac-07b48c42e794-lock\") pod \"swift-storage-0\" (UID: \"d1ad2fa7-36fc-4cd0-98ac-07b48c42e794\") " pod="openstack/swift-storage-0" Mar 09 09:23:42 crc kubenswrapper[4861]: I0309 09:23:41.998309 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/d1ad2fa7-36fc-4cd0-98ac-07b48c42e794-cache\") pod \"swift-storage-0\" (UID: \"d1ad2fa7-36fc-4cd0-98ac-07b48c42e794\") " pod="openstack/swift-storage-0" Mar 09 09:23:42 crc kubenswrapper[4861]: I0309 09:23:41.998331 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1ad2fa7-36fc-4cd0-98ac-07b48c42e794-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"d1ad2fa7-36fc-4cd0-98ac-07b48c42e794\") " pod="openstack/swift-storage-0" Mar 09 09:23:42 crc kubenswrapper[4861]: I0309 09:23:42.099726 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d1ad2fa7-36fc-4cd0-98ac-07b48c42e794-etc-swift\") pod \"swift-storage-0\" (UID: \"d1ad2fa7-36fc-4cd0-98ac-07b48c42e794\") " pod="openstack/swift-storage-0" Mar 09 09:23:42 crc kubenswrapper[4861]: I0309 09:23:42.099799 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xtw4l\" (UniqueName: \"kubernetes.io/projected/d1ad2fa7-36fc-4cd0-98ac-07b48c42e794-kube-api-access-xtw4l\") pod \"swift-storage-0\" (UID: \"d1ad2fa7-36fc-4cd0-98ac-07b48c42e794\") " pod="openstack/swift-storage-0" Mar 09 09:23:42 crc kubenswrapper[4861]: I0309 09:23:42.099835 4861 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"swift-storage-0\" (UID: \"d1ad2fa7-36fc-4cd0-98ac-07b48c42e794\") " pod="openstack/swift-storage-0" Mar 09 09:23:42 crc kubenswrapper[4861]: I0309 09:23:42.099884 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/d1ad2fa7-36fc-4cd0-98ac-07b48c42e794-lock\") pod \"swift-storage-0\" (UID: \"d1ad2fa7-36fc-4cd0-98ac-07b48c42e794\") " pod="openstack/swift-storage-0" Mar 09 09:23:42 crc kubenswrapper[4861]: I0309 09:23:42.099912 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/d1ad2fa7-36fc-4cd0-98ac-07b48c42e794-cache\") pod \"swift-storage-0\" (UID: \"d1ad2fa7-36fc-4cd0-98ac-07b48c42e794\") " pod="openstack/swift-storage-0" Mar 09 09:23:42 crc kubenswrapper[4861]: E0309 09:23:42.099927 4861 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 09 09:23:42 crc kubenswrapper[4861]: E0309 09:23:42.099952 4861 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 09 09:23:42 crc kubenswrapper[4861]: E0309 09:23:42.100001 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d1ad2fa7-36fc-4cd0-98ac-07b48c42e794-etc-swift podName:d1ad2fa7-36fc-4cd0-98ac-07b48c42e794 nodeName:}" failed. No retries permitted until 2026-03-09 09:23:42.59998447 +0000 UTC m=+1065.685023871 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/d1ad2fa7-36fc-4cd0-98ac-07b48c42e794-etc-swift") pod "swift-storage-0" (UID: "d1ad2fa7-36fc-4cd0-98ac-07b48c42e794") : configmap "swift-ring-files" not found Mar 09 09:23:42 crc kubenswrapper[4861]: I0309 09:23:42.099936 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1ad2fa7-36fc-4cd0-98ac-07b48c42e794-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"d1ad2fa7-36fc-4cd0-98ac-07b48c42e794\") " pod="openstack/swift-storage-0" Mar 09 09:23:42 crc kubenswrapper[4861]: I0309 09:23:42.100403 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/d1ad2fa7-36fc-4cd0-98ac-07b48c42e794-lock\") pod \"swift-storage-0\" (UID: \"d1ad2fa7-36fc-4cd0-98ac-07b48c42e794\") " pod="openstack/swift-storage-0" Mar 09 09:23:42 crc kubenswrapper[4861]: I0309 09:23:42.100484 4861 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"swift-storage-0\" (UID: \"d1ad2fa7-36fc-4cd0-98ac-07b48c42e794\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/swift-storage-0" Mar 09 09:23:42 crc kubenswrapper[4861]: I0309 09:23:42.100574 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/d1ad2fa7-36fc-4cd0-98ac-07b48c42e794-cache\") pod \"swift-storage-0\" (UID: \"d1ad2fa7-36fc-4cd0-98ac-07b48c42e794\") " pod="openstack/swift-storage-0" Mar 09 09:23:42 crc kubenswrapper[4861]: I0309 09:23:42.109361 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1ad2fa7-36fc-4cd0-98ac-07b48c42e794-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"d1ad2fa7-36fc-4cd0-98ac-07b48c42e794\") " 
pod="openstack/swift-storage-0" Mar 09 09:23:42 crc kubenswrapper[4861]: I0309 09:23:42.124221 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtw4l\" (UniqueName: \"kubernetes.io/projected/d1ad2fa7-36fc-4cd0-98ac-07b48c42e794-kube-api-access-xtw4l\") pod \"swift-storage-0\" (UID: \"d1ad2fa7-36fc-4cd0-98ac-07b48c42e794\") " pod="openstack/swift-storage-0" Mar 09 09:23:42 crc kubenswrapper[4861]: I0309 09:23:42.124948 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"swift-storage-0\" (UID: \"d1ad2fa7-36fc-4cd0-98ac-07b48c42e794\") " pod="openstack/swift-storage-0" Mar 09 09:23:42 crc kubenswrapper[4861]: I0309 09:23:42.356960 4861 generic.go:334] "Generic (PLEG): container finished" podID="591c3df3-47fb-4da5-9776-4a6ed3170472" containerID="caadf85455a67668f15f9f86aa6e994c39806be3eb7d310b487e21ad7f875c90" exitCode=0 Mar 09 09:23:42 crc kubenswrapper[4861]: I0309 09:23:42.357105 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-a7e9-account-create-update-x5n4n" event={"ID":"591c3df3-47fb-4da5-9776-4a6ed3170472","Type":"ContainerDied","Data":"caadf85455a67668f15f9f86aa6e994c39806be3eb7d310b487e21ad7f875c90"} Mar 09 09:23:42 crc kubenswrapper[4861]: I0309 09:23:42.359223 4861 generic.go:334] "Generic (PLEG): container finished" podID="93d06a80-e66b-4934-8175-b3c6cb1032a9" containerID="f34953ee28da5102b6104c93af2218f5d6cb81001860a73bbf189530c1b3bc8d" exitCode=0 Mar 09 09:23:42 crc kubenswrapper[4861]: I0309 09:23:42.359416 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f7dd995-89gfl" event={"ID":"93d06a80-e66b-4934-8175-b3c6cb1032a9","Type":"ContainerDied","Data":"f34953ee28da5102b6104c93af2218f5d6cb81001860a73bbf189530c1b3bc8d"} Mar 09 09:23:42 crc kubenswrapper[4861]: I0309 09:23:42.359447 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-675f7dd995-89gfl" event={"ID":"93d06a80-e66b-4934-8175-b3c6cb1032a9","Type":"ContainerStarted","Data":"9ea281cc13a2d0404fd3aae4c8c6d372f48d614ae14b4b6855504bf3f548b906"} Mar 09 09:23:42 crc kubenswrapper[4861]: I0309 09:23:42.359589 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7b57d9888c-8wmfs" podUID="c8a795fa-6d2f-4387-92c9-e2e5e4b9937b" containerName="dnsmasq-dns" containerID="cri-o://f53db7f882d5aa744326a47fa892e440034f4966904e607ea0794c6ef99130c0" gracePeriod=10 Mar 09 09:23:42 crc kubenswrapper[4861]: I0309 09:23:42.607994 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d1ad2fa7-36fc-4cd0-98ac-07b48c42e794-etc-swift\") pod \"swift-storage-0\" (UID: \"d1ad2fa7-36fc-4cd0-98ac-07b48c42e794\") " pod="openstack/swift-storage-0" Mar 09 09:23:42 crc kubenswrapper[4861]: E0309 09:23:42.608282 4861 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 09 09:23:42 crc kubenswrapper[4861]: E0309 09:23:42.608314 4861 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 09 09:23:42 crc kubenswrapper[4861]: E0309 09:23:42.608409 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d1ad2fa7-36fc-4cd0-98ac-07b48c42e794-etc-swift podName:d1ad2fa7-36fc-4cd0-98ac-07b48c42e794 nodeName:}" failed. No retries permitted until 2026-03-09 09:23:43.6083547 +0000 UTC m=+1066.693394101 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/d1ad2fa7-36fc-4cd0-98ac-07b48c42e794-etc-swift") pod "swift-storage-0" (UID: "d1ad2fa7-36fc-4cd0-98ac-07b48c42e794") : configmap "swift-ring-files" not found Mar 09 09:23:42 crc kubenswrapper[4861]: I0309 09:23:42.666537 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-4mxxh" Mar 09 09:23:42 crc kubenswrapper[4861]: I0309 09:23:42.709813 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8499a71b-bdde-4468-9a1e-781db816f2f0-operator-scripts\") pod \"8499a71b-bdde-4468-9a1e-781db816f2f0\" (UID: \"8499a71b-bdde-4468-9a1e-781db816f2f0\") " Mar 09 09:23:42 crc kubenswrapper[4861]: I0309 09:23:42.710063 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk9r5\" (UniqueName: \"kubernetes.io/projected/8499a71b-bdde-4468-9a1e-781db816f2f0-kube-api-access-tk9r5\") pod \"8499a71b-bdde-4468-9a1e-781db816f2f0\" (UID: \"8499a71b-bdde-4468-9a1e-781db816f2f0\") " Mar 09 09:23:42 crc kubenswrapper[4861]: I0309 09:23:42.712057 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8499a71b-bdde-4468-9a1e-781db816f2f0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8499a71b-bdde-4468-9a1e-781db816f2f0" (UID: "8499a71b-bdde-4468-9a1e-781db816f2f0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:23:42 crc kubenswrapper[4861]: I0309 09:23:42.718117 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8499a71b-bdde-4468-9a1e-781db816f2f0-kube-api-access-tk9r5" (OuterVolumeSpecName: "kube-api-access-tk9r5") pod "8499a71b-bdde-4468-9a1e-781db816f2f0" (UID: "8499a71b-bdde-4468-9a1e-781db816f2f0"). 
InnerVolumeSpecName "kube-api-access-tk9r5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:23:42 crc kubenswrapper[4861]: I0309 09:23:42.812107 4861 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8499a71b-bdde-4468-9a1e-781db816f2f0-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:23:42 crc kubenswrapper[4861]: I0309 09:23:42.812139 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk9r5\" (UniqueName: \"kubernetes.io/projected/8499a71b-bdde-4468-9a1e-781db816f2f0-kube-api-access-tk9r5\") on node \"crc\" DevicePath \"\"" Mar 09 09:23:42 crc kubenswrapper[4861]: I0309 09:23:42.814518 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b57d9888c-8wmfs" Mar 09 09:23:42 crc kubenswrapper[4861]: I0309 09:23:42.913205 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c8a795fa-6d2f-4387-92c9-e2e5e4b9937b-dns-svc\") pod \"c8a795fa-6d2f-4387-92c9-e2e5e4b9937b\" (UID: \"c8a795fa-6d2f-4387-92c9-e2e5e4b9937b\") " Mar 09 09:23:42 crc kubenswrapper[4861]: I0309 09:23:42.913638 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c8a795fa-6d2f-4387-92c9-e2e5e4b9937b-ovsdbserver-sb\") pod \"c8a795fa-6d2f-4387-92c9-e2e5e4b9937b\" (UID: \"c8a795fa-6d2f-4387-92c9-e2e5e4b9937b\") " Mar 09 09:23:42 crc kubenswrapper[4861]: I0309 09:23:42.913673 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngncj\" (UniqueName: \"kubernetes.io/projected/c8a795fa-6d2f-4387-92c9-e2e5e4b9937b-kube-api-access-ngncj\") pod \"c8a795fa-6d2f-4387-92c9-e2e5e4b9937b\" (UID: \"c8a795fa-6d2f-4387-92c9-e2e5e4b9937b\") " Mar 09 09:23:42 crc kubenswrapper[4861]: I0309 09:23:42.913750 4861 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c8a795fa-6d2f-4387-92c9-e2e5e4b9937b-ovsdbserver-nb\") pod \"c8a795fa-6d2f-4387-92c9-e2e5e4b9937b\" (UID: \"c8a795fa-6d2f-4387-92c9-e2e5e4b9937b\") " Mar 09 09:23:42 crc kubenswrapper[4861]: I0309 09:23:42.913885 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8a795fa-6d2f-4387-92c9-e2e5e4b9937b-config\") pod \"c8a795fa-6d2f-4387-92c9-e2e5e4b9937b\" (UID: \"c8a795fa-6d2f-4387-92c9-e2e5e4b9937b\") " Mar 09 09:23:42 crc kubenswrapper[4861]: I0309 09:23:42.918122 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8a795fa-6d2f-4387-92c9-e2e5e4b9937b-kube-api-access-ngncj" (OuterVolumeSpecName: "kube-api-access-ngncj") pod "c8a795fa-6d2f-4387-92c9-e2e5e4b9937b" (UID: "c8a795fa-6d2f-4387-92c9-e2e5e4b9937b"). InnerVolumeSpecName "kube-api-access-ngncj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:23:42 crc kubenswrapper[4861]: I0309 09:23:42.960387 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c8a795fa-6d2f-4387-92c9-e2e5e4b9937b-config" (OuterVolumeSpecName: "config") pod "c8a795fa-6d2f-4387-92c9-e2e5e4b9937b" (UID: "c8a795fa-6d2f-4387-92c9-e2e5e4b9937b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:23:42 crc kubenswrapper[4861]: I0309 09:23:42.965745 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c8a795fa-6d2f-4387-92c9-e2e5e4b9937b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c8a795fa-6d2f-4387-92c9-e2e5e4b9937b" (UID: "c8a795fa-6d2f-4387-92c9-e2e5e4b9937b"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:23:42 crc kubenswrapper[4861]: I0309 09:23:42.967850 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c8a795fa-6d2f-4387-92c9-e2e5e4b9937b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c8a795fa-6d2f-4387-92c9-e2e5e4b9937b" (UID: "c8a795fa-6d2f-4387-92c9-e2e5e4b9937b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:23:42 crc kubenswrapper[4861]: I0309 09:23:42.999417 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c8a795fa-6d2f-4387-92c9-e2e5e4b9937b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c8a795fa-6d2f-4387-92c9-e2e5e4b9937b" (UID: "c8a795fa-6d2f-4387-92c9-e2e5e4b9937b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:23:43 crc kubenswrapper[4861]: I0309 09:23:43.015353 4861 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c8a795fa-6d2f-4387-92c9-e2e5e4b9937b-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 09 09:23:43 crc kubenswrapper[4861]: I0309 09:23:43.015413 4861 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c8a795fa-6d2f-4387-92c9-e2e5e4b9937b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 09 09:23:43 crc kubenswrapper[4861]: I0309 09:23:43.015430 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngncj\" (UniqueName: \"kubernetes.io/projected/c8a795fa-6d2f-4387-92c9-e2e5e4b9937b-kube-api-access-ngncj\") on node \"crc\" DevicePath \"\"" Mar 09 09:23:43 crc kubenswrapper[4861]: I0309 09:23:43.015443 4861 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c8a795fa-6d2f-4387-92c9-e2e5e4b9937b-ovsdbserver-nb\") on node \"crc\" 
DevicePath \"\"" Mar 09 09:23:43 crc kubenswrapper[4861]: I0309 09:23:43.015455 4861 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c8a795fa-6d2f-4387-92c9-e2e5e4b9937b-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:23:43 crc kubenswrapper[4861]: I0309 09:23:43.392233 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f7dd995-89gfl" event={"ID":"93d06a80-e66b-4934-8175-b3c6cb1032a9","Type":"ContainerStarted","Data":"b197f180ac090e170e93e652be3e09eb33318df6ba4c570ab71921b801e1c882"} Mar 09 09:23:43 crc kubenswrapper[4861]: I0309 09:23:43.392498 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-675f7dd995-89gfl" Mar 09 09:23:43 crc kubenswrapper[4861]: I0309 09:23:43.394428 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-4mxxh" event={"ID":"8499a71b-bdde-4468-9a1e-781db816f2f0","Type":"ContainerDied","Data":"085c575b5b898ff2455f65180522a3d416b91eef7157e39dcb85bb567373b241"} Mar 09 09:23:43 crc kubenswrapper[4861]: I0309 09:23:43.394471 4861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="085c575b5b898ff2455f65180522a3d416b91eef7157e39dcb85bb567373b241" Mar 09 09:23:43 crc kubenswrapper[4861]: I0309 09:23:43.394522 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-4mxxh" Mar 09 09:23:43 crc kubenswrapper[4861]: I0309 09:23:43.398460 4861 generic.go:334] "Generic (PLEG): container finished" podID="c8a795fa-6d2f-4387-92c9-e2e5e4b9937b" containerID="f53db7f882d5aa744326a47fa892e440034f4966904e607ea0794c6ef99130c0" exitCode=0 Mar 09 09:23:43 crc kubenswrapper[4861]: I0309 09:23:43.398512 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7b57d9888c-8wmfs" Mar 09 09:23:43 crc kubenswrapper[4861]: I0309 09:23:43.398543 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b57d9888c-8wmfs" event={"ID":"c8a795fa-6d2f-4387-92c9-e2e5e4b9937b","Type":"ContainerDied","Data":"f53db7f882d5aa744326a47fa892e440034f4966904e607ea0794c6ef99130c0"} Mar 09 09:23:43 crc kubenswrapper[4861]: I0309 09:23:43.398607 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b57d9888c-8wmfs" event={"ID":"c8a795fa-6d2f-4387-92c9-e2e5e4b9937b","Type":"ContainerDied","Data":"9b8f369092454e048d5e3affe56c2a3b3de0605f0b912fcac19fa731c658b18a"} Mar 09 09:23:43 crc kubenswrapper[4861]: I0309 09:23:43.398638 4861 scope.go:117] "RemoveContainer" containerID="f53db7f882d5aa744326a47fa892e440034f4966904e607ea0794c6ef99130c0" Mar 09 09:23:43 crc kubenswrapper[4861]: I0309 09:23:43.416053 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-675f7dd995-89gfl" podStartSLOduration=3.416032576 podStartE2EDuration="3.416032576s" podCreationTimestamp="2026-03-09 09:23:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:23:43.415884451 +0000 UTC m=+1066.500923872" watchObservedRunningTime="2026-03-09 09:23:43.416032576 +0000 UTC m=+1066.501071987" Mar 09 09:23:43 crc kubenswrapper[4861]: I0309 09:23:43.434822 4861 scope.go:117] "RemoveContainer" containerID="32a3eec9924c00b8c9ce84143fb5ea90e1ab226cafc83f83cdfc2d1c6482acf8" Mar 09 09:23:43 crc kubenswrapper[4861]: I0309 09:23:43.441076 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b57d9888c-8wmfs"] Mar 09 09:23:43 crc kubenswrapper[4861]: I0309 09:23:43.452249 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7b57d9888c-8wmfs"] Mar 09 09:23:43 crc kubenswrapper[4861]: I0309 
09:23:43.473624 4861 scope.go:117] "RemoveContainer" containerID="f53db7f882d5aa744326a47fa892e440034f4966904e607ea0794c6ef99130c0" Mar 09 09:23:43 crc kubenswrapper[4861]: E0309 09:23:43.474309 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f53db7f882d5aa744326a47fa892e440034f4966904e607ea0794c6ef99130c0\": container with ID starting with f53db7f882d5aa744326a47fa892e440034f4966904e607ea0794c6ef99130c0 not found: ID does not exist" containerID="f53db7f882d5aa744326a47fa892e440034f4966904e607ea0794c6ef99130c0" Mar 09 09:23:43 crc kubenswrapper[4861]: I0309 09:23:43.474353 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f53db7f882d5aa744326a47fa892e440034f4966904e607ea0794c6ef99130c0"} err="failed to get container status \"f53db7f882d5aa744326a47fa892e440034f4966904e607ea0794c6ef99130c0\": rpc error: code = NotFound desc = could not find container \"f53db7f882d5aa744326a47fa892e440034f4966904e607ea0794c6ef99130c0\": container with ID starting with f53db7f882d5aa744326a47fa892e440034f4966904e607ea0794c6ef99130c0 not found: ID does not exist" Mar 09 09:23:43 crc kubenswrapper[4861]: I0309 09:23:43.474400 4861 scope.go:117] "RemoveContainer" containerID="32a3eec9924c00b8c9ce84143fb5ea90e1ab226cafc83f83cdfc2d1c6482acf8" Mar 09 09:23:43 crc kubenswrapper[4861]: E0309 09:23:43.474774 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32a3eec9924c00b8c9ce84143fb5ea90e1ab226cafc83f83cdfc2d1c6482acf8\": container with ID starting with 32a3eec9924c00b8c9ce84143fb5ea90e1ab226cafc83f83cdfc2d1c6482acf8 not found: ID does not exist" containerID="32a3eec9924c00b8c9ce84143fb5ea90e1ab226cafc83f83cdfc2d1c6482acf8" Mar 09 09:23:43 crc kubenswrapper[4861]: I0309 09:23:43.474794 4861 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"32a3eec9924c00b8c9ce84143fb5ea90e1ab226cafc83f83cdfc2d1c6482acf8"} err="failed to get container status \"32a3eec9924c00b8c9ce84143fb5ea90e1ab226cafc83f83cdfc2d1c6482acf8\": rpc error: code = NotFound desc = could not find container \"32a3eec9924c00b8c9ce84143fb5ea90e1ab226cafc83f83cdfc2d1c6482acf8\": container with ID starting with 32a3eec9924c00b8c9ce84143fb5ea90e1ab226cafc83f83cdfc2d1c6482acf8 not found: ID does not exist" Mar 09 09:23:43 crc kubenswrapper[4861]: I0309 09:23:43.573995 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-7k92z"] Mar 09 09:23:43 crc kubenswrapper[4861]: E0309 09:23:43.579594 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8a795fa-6d2f-4387-92c9-e2e5e4b9937b" containerName="dnsmasq-dns" Mar 09 09:23:43 crc kubenswrapper[4861]: I0309 09:23:43.579628 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8a795fa-6d2f-4387-92c9-e2e5e4b9937b" containerName="dnsmasq-dns" Mar 09 09:23:43 crc kubenswrapper[4861]: E0309 09:23:43.579682 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8499a71b-bdde-4468-9a1e-781db816f2f0" containerName="mariadb-database-create" Mar 09 09:23:43 crc kubenswrapper[4861]: I0309 09:23:43.579694 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="8499a71b-bdde-4468-9a1e-781db816f2f0" containerName="mariadb-database-create" Mar 09 09:23:43 crc kubenswrapper[4861]: E0309 09:23:43.579718 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8a795fa-6d2f-4387-92c9-e2e5e4b9937b" containerName="init" Mar 09 09:23:43 crc kubenswrapper[4861]: I0309 09:23:43.579726 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8a795fa-6d2f-4387-92c9-e2e5e4b9937b" containerName="init" Mar 09 09:23:43 crc kubenswrapper[4861]: I0309 09:23:43.580089 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8a795fa-6d2f-4387-92c9-e2e5e4b9937b" containerName="dnsmasq-dns" Mar 
09 09:23:43 crc kubenswrapper[4861]: I0309 09:23:43.580101 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="8499a71b-bdde-4468-9a1e-781db816f2f0" containerName="mariadb-database-create" Mar 09 09:23:43 crc kubenswrapper[4861]: I0309 09:23:43.580754 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-7k92z" Mar 09 09:23:43 crc kubenswrapper[4861]: I0309 09:23:43.587220 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-7k92z"] Mar 09 09:23:43 crc kubenswrapper[4861]: I0309 09:23:43.624737 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d1ad2fa7-36fc-4cd0-98ac-07b48c42e794-etc-swift\") pod \"swift-storage-0\" (UID: \"d1ad2fa7-36fc-4cd0-98ac-07b48c42e794\") " pod="openstack/swift-storage-0" Mar 09 09:23:43 crc kubenswrapper[4861]: I0309 09:23:43.624821 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c3468cc7-9c05-4934-bc2d-287e80b966a4-operator-scripts\") pod \"glance-db-create-7k92z\" (UID: \"c3468cc7-9c05-4934-bc2d-287e80b966a4\") " pod="openstack/glance-db-create-7k92z" Mar 09 09:23:43 crc kubenswrapper[4861]: I0309 09:23:43.624855 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjw9f\" (UniqueName: \"kubernetes.io/projected/c3468cc7-9c05-4934-bc2d-287e80b966a4-kube-api-access-zjw9f\") pod \"glance-db-create-7k92z\" (UID: \"c3468cc7-9c05-4934-bc2d-287e80b966a4\") " pod="openstack/glance-db-create-7k92z" Mar 09 09:23:43 crc kubenswrapper[4861]: E0309 09:23:43.625217 4861 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 09 09:23:43 crc kubenswrapper[4861]: E0309 09:23:43.625243 4861 projected.go:194] Error preparing data for 
projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 09 09:23:43 crc kubenswrapper[4861]: E0309 09:23:43.625615 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d1ad2fa7-36fc-4cd0-98ac-07b48c42e794-etc-swift podName:d1ad2fa7-36fc-4cd0-98ac-07b48c42e794 nodeName:}" failed. No retries permitted until 2026-03-09 09:23:45.625288585 +0000 UTC m=+1068.710327986 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/d1ad2fa7-36fc-4cd0-98ac-07b48c42e794-etc-swift") pod "swift-storage-0" (UID: "d1ad2fa7-36fc-4cd0-98ac-07b48c42e794") : configmap "swift-ring-files" not found Mar 09 09:23:43 crc kubenswrapper[4861]: I0309 09:23:43.678961 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8a795fa-6d2f-4387-92c9-e2e5e4b9937b" path="/var/lib/kubelet/pods/c8a795fa-6d2f-4387-92c9-e2e5e4b9937b/volumes" Mar 09 09:23:43 crc kubenswrapper[4861]: I0309 09:23:43.679612 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-51c1-account-create-update-2g9qr"] Mar 09 09:23:43 crc kubenswrapper[4861]: I0309 09:23:43.680654 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-51c1-account-create-update-2g9qr" Mar 09 09:23:43 crc kubenswrapper[4861]: I0309 09:23:43.683742 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-51c1-account-create-update-2g9qr"] Mar 09 09:23:43 crc kubenswrapper[4861]: I0309 09:23:43.684753 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Mar 09 09:23:43 crc kubenswrapper[4861]: I0309 09:23:43.726153 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2de22eb1-f0a3-41f2-a06e-53ce14fedaf8-operator-scripts\") pod \"glance-51c1-account-create-update-2g9qr\" (UID: \"2de22eb1-f0a3-41f2-a06e-53ce14fedaf8\") " pod="openstack/glance-51c1-account-create-update-2g9qr" Mar 09 09:23:43 crc kubenswrapper[4861]: I0309 09:23:43.726280 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rgdh\" (UniqueName: \"kubernetes.io/projected/2de22eb1-f0a3-41f2-a06e-53ce14fedaf8-kube-api-access-2rgdh\") pod \"glance-51c1-account-create-update-2g9qr\" (UID: \"2de22eb1-f0a3-41f2-a06e-53ce14fedaf8\") " pod="openstack/glance-51c1-account-create-update-2g9qr" Mar 09 09:23:43 crc kubenswrapper[4861]: I0309 09:23:43.726511 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c3468cc7-9c05-4934-bc2d-287e80b966a4-operator-scripts\") pod \"glance-db-create-7k92z\" (UID: \"c3468cc7-9c05-4934-bc2d-287e80b966a4\") " pod="openstack/glance-db-create-7k92z" Mar 09 09:23:43 crc kubenswrapper[4861]: I0309 09:23:43.726559 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjw9f\" (UniqueName: \"kubernetes.io/projected/c3468cc7-9c05-4934-bc2d-287e80b966a4-kube-api-access-zjw9f\") pod \"glance-db-create-7k92z\" (UID: 
\"c3468cc7-9c05-4934-bc2d-287e80b966a4\") " pod="openstack/glance-db-create-7k92z" Mar 09 09:23:43 crc kubenswrapper[4861]: I0309 09:23:43.727381 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c3468cc7-9c05-4934-bc2d-287e80b966a4-operator-scripts\") pod \"glance-db-create-7k92z\" (UID: \"c3468cc7-9c05-4934-bc2d-287e80b966a4\") " pod="openstack/glance-db-create-7k92z" Mar 09 09:23:43 crc kubenswrapper[4861]: I0309 09:23:43.745605 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjw9f\" (UniqueName: \"kubernetes.io/projected/c3468cc7-9c05-4934-bc2d-287e80b966a4-kube-api-access-zjw9f\") pod \"glance-db-create-7k92z\" (UID: \"c3468cc7-9c05-4934-bc2d-287e80b966a4\") " pod="openstack/glance-db-create-7k92z" Mar 09 09:23:43 crc kubenswrapper[4861]: I0309 09:23:43.752358 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-a7e9-account-create-update-x5n4n" Mar 09 09:23:43 crc kubenswrapper[4861]: I0309 09:23:43.828335 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mtppc\" (UniqueName: \"kubernetes.io/projected/591c3df3-47fb-4da5-9776-4a6ed3170472-kube-api-access-mtppc\") pod \"591c3df3-47fb-4da5-9776-4a6ed3170472\" (UID: \"591c3df3-47fb-4da5-9776-4a6ed3170472\") " Mar 09 09:23:43 crc kubenswrapper[4861]: I0309 09:23:43.828399 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/591c3df3-47fb-4da5-9776-4a6ed3170472-operator-scripts\") pod \"591c3df3-47fb-4da5-9776-4a6ed3170472\" (UID: \"591c3df3-47fb-4da5-9776-4a6ed3170472\") " Mar 09 09:23:43 crc kubenswrapper[4861]: I0309 09:23:43.828679 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2rgdh\" (UniqueName: 
\"kubernetes.io/projected/2de22eb1-f0a3-41f2-a06e-53ce14fedaf8-kube-api-access-2rgdh\") pod \"glance-51c1-account-create-update-2g9qr\" (UID: \"2de22eb1-f0a3-41f2-a06e-53ce14fedaf8\") " pod="openstack/glance-51c1-account-create-update-2g9qr" Mar 09 09:23:43 crc kubenswrapper[4861]: I0309 09:23:43.828799 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2de22eb1-f0a3-41f2-a06e-53ce14fedaf8-operator-scripts\") pod \"glance-51c1-account-create-update-2g9qr\" (UID: \"2de22eb1-f0a3-41f2-a06e-53ce14fedaf8\") " pod="openstack/glance-51c1-account-create-update-2g9qr" Mar 09 09:23:43 crc kubenswrapper[4861]: I0309 09:23:43.828983 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/591c3df3-47fb-4da5-9776-4a6ed3170472-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "591c3df3-47fb-4da5-9776-4a6ed3170472" (UID: "591c3df3-47fb-4da5-9776-4a6ed3170472"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:23:43 crc kubenswrapper[4861]: I0309 09:23:43.829462 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2de22eb1-f0a3-41f2-a06e-53ce14fedaf8-operator-scripts\") pod \"glance-51c1-account-create-update-2g9qr\" (UID: \"2de22eb1-f0a3-41f2-a06e-53ce14fedaf8\") " pod="openstack/glance-51c1-account-create-update-2g9qr" Mar 09 09:23:43 crc kubenswrapper[4861]: I0309 09:23:43.832386 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/591c3df3-47fb-4da5-9776-4a6ed3170472-kube-api-access-mtppc" (OuterVolumeSpecName: "kube-api-access-mtppc") pod "591c3df3-47fb-4da5-9776-4a6ed3170472" (UID: "591c3df3-47fb-4da5-9776-4a6ed3170472"). InnerVolumeSpecName "kube-api-access-mtppc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:23:43 crc kubenswrapper[4861]: I0309 09:23:43.845565 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rgdh\" (UniqueName: \"kubernetes.io/projected/2de22eb1-f0a3-41f2-a06e-53ce14fedaf8-kube-api-access-2rgdh\") pod \"glance-51c1-account-create-update-2g9qr\" (UID: \"2de22eb1-f0a3-41f2-a06e-53ce14fedaf8\") " pod="openstack/glance-51c1-account-create-update-2g9qr" Mar 09 09:23:43 crc kubenswrapper[4861]: I0309 09:23:43.901823 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-7k92z" Mar 09 09:23:43 crc kubenswrapper[4861]: I0309 09:23:43.931145 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mtppc\" (UniqueName: \"kubernetes.io/projected/591c3df3-47fb-4da5-9776-4a6ed3170472-kube-api-access-mtppc\") on node \"crc\" DevicePath \"\"" Mar 09 09:23:43 crc kubenswrapper[4861]: I0309 09:23:43.931185 4861 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/591c3df3-47fb-4da5-9776-4a6ed3170472-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:23:44 crc kubenswrapper[4861]: I0309 09:23:44.052841 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-51c1-account-create-update-2g9qr" Mar 09 09:23:44 crc kubenswrapper[4861]: I0309 09:23:44.345783 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-7k92z"] Mar 09 09:23:44 crc kubenswrapper[4861]: W0309 09:23:44.353578 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc3468cc7_9c05_4934_bc2d_287e80b966a4.slice/crio-6e6c780abd11d17afd00ea5f30aa97963e606a21f3595a1e1455bb545cf8630b WatchSource:0}: Error finding container 6e6c780abd11d17afd00ea5f30aa97963e606a21f3595a1e1455bb545cf8630b: Status 404 returned error can't find the container with id 6e6c780abd11d17afd00ea5f30aa97963e606a21f3595a1e1455bb545cf8630b Mar 09 09:23:44 crc kubenswrapper[4861]: I0309 09:23:44.417754 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-7k92z" event={"ID":"c3468cc7-9c05-4934-bc2d-287e80b966a4","Type":"ContainerStarted","Data":"6e6c780abd11d17afd00ea5f30aa97963e606a21f3595a1e1455bb545cf8630b"} Mar 09 09:23:44 crc kubenswrapper[4861]: I0309 09:23:44.420859 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-a7e9-account-create-update-x5n4n" Mar 09 09:23:44 crc kubenswrapper[4861]: I0309 09:23:44.420892 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-a7e9-account-create-update-x5n4n" event={"ID":"591c3df3-47fb-4da5-9776-4a6ed3170472","Type":"ContainerDied","Data":"551b7f81f34d6052e95283d9917584afd361f6f94aeac538f52970ce911465fe"} Mar 09 09:23:44 crc kubenswrapper[4861]: I0309 09:23:44.421647 4861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="551b7f81f34d6052e95283d9917584afd361f6f94aeac538f52970ce911465fe" Mar 09 09:23:44 crc kubenswrapper[4861]: I0309 09:23:44.480308 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-51c1-account-create-update-2g9qr"] Mar 09 09:23:44 crc kubenswrapper[4861]: W0309 09:23:44.483320 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2de22eb1_f0a3_41f2_a06e_53ce14fedaf8.slice/crio-0ba9fabc02e5f3b8b81d99ed23e4805242778f08b900b275bcecf2f0f99348b5 WatchSource:0}: Error finding container 0ba9fabc02e5f3b8b81d99ed23e4805242778f08b900b275bcecf2f0f99348b5: Status 404 returned error can't find the container with id 0ba9fabc02e5f3b8b81d99ed23e4805242778f08b900b275bcecf2f0f99348b5 Mar 09 09:23:45 crc kubenswrapper[4861]: I0309 09:23:45.434949 4861 generic.go:334] "Generic (PLEG): container finished" podID="2de22eb1-f0a3-41f2-a06e-53ce14fedaf8" containerID="fb9822da780ebde3553751d1ca305edfeb98f9507135fc567902c2d2d7719f0c" exitCode=0 Mar 09 09:23:45 crc kubenswrapper[4861]: I0309 09:23:45.435039 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-51c1-account-create-update-2g9qr" event={"ID":"2de22eb1-f0a3-41f2-a06e-53ce14fedaf8","Type":"ContainerDied","Data":"fb9822da780ebde3553751d1ca305edfeb98f9507135fc567902c2d2d7719f0c"} Mar 09 09:23:45 crc kubenswrapper[4861]: I0309 09:23:45.436603 4861 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-51c1-account-create-update-2g9qr" event={"ID":"2de22eb1-f0a3-41f2-a06e-53ce14fedaf8","Type":"ContainerStarted","Data":"0ba9fabc02e5f3b8b81d99ed23e4805242778f08b900b275bcecf2f0f99348b5"} Mar 09 09:23:45 crc kubenswrapper[4861]: I0309 09:23:45.439664 4861 generic.go:334] "Generic (PLEG): container finished" podID="c3468cc7-9c05-4934-bc2d-287e80b966a4" containerID="f501d035d0d6df4266406c5d6d4937c22a47640e8366864942666f5e6ab3438d" exitCode=0 Mar 09 09:23:45 crc kubenswrapper[4861]: I0309 09:23:45.439716 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-7k92z" event={"ID":"c3468cc7-9c05-4934-bc2d-287e80b966a4","Type":"ContainerDied","Data":"f501d035d0d6df4266406c5d6d4937c22a47640e8366864942666f5e6ab3438d"} Mar 09 09:23:45 crc kubenswrapper[4861]: I0309 09:23:45.618353 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-57ggq"] Mar 09 09:23:45 crc kubenswrapper[4861]: E0309 09:23:45.618843 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="591c3df3-47fb-4da5-9776-4a6ed3170472" containerName="mariadb-account-create-update" Mar 09 09:23:45 crc kubenswrapper[4861]: I0309 09:23:45.618871 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="591c3df3-47fb-4da5-9776-4a6ed3170472" containerName="mariadb-account-create-update" Mar 09 09:23:45 crc kubenswrapper[4861]: I0309 09:23:45.619229 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="591c3df3-47fb-4da5-9776-4a6ed3170472" containerName="mariadb-account-create-update" Mar 09 09:23:45 crc kubenswrapper[4861]: I0309 09:23:45.620030 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-57ggq" Mar 09 09:23:45 crc kubenswrapper[4861]: I0309 09:23:45.631303 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 09 09:23:45 crc kubenswrapper[4861]: I0309 09:23:45.640806 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-57ggq"] Mar 09 09:23:45 crc kubenswrapper[4861]: I0309 09:23:45.661216 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d1ad2fa7-36fc-4cd0-98ac-07b48c42e794-etc-swift\") pod \"swift-storage-0\" (UID: \"d1ad2fa7-36fc-4cd0-98ac-07b48c42e794\") " pod="openstack/swift-storage-0" Mar 09 09:23:45 crc kubenswrapper[4861]: E0309 09:23:45.661455 4861 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 09 09:23:45 crc kubenswrapper[4861]: E0309 09:23:45.661469 4861 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 09 09:23:45 crc kubenswrapper[4861]: E0309 09:23:45.661513 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d1ad2fa7-36fc-4cd0-98ac-07b48c42e794-etc-swift podName:d1ad2fa7-36fc-4cd0-98ac-07b48c42e794 nodeName:}" failed. No retries permitted until 2026-03-09 09:23:49.661496696 +0000 UTC m=+1072.746536097 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/d1ad2fa7-36fc-4cd0-98ac-07b48c42e794-etc-swift") pod "swift-storage-0" (UID: "d1ad2fa7-36fc-4cd0-98ac-07b48c42e794") : configmap "swift-ring-files" not found Mar 09 09:23:45 crc kubenswrapper[4861]: I0309 09:23:45.763908 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnwbq\" (UniqueName: \"kubernetes.io/projected/ca09ea8f-7fc8-4f28-9716-64ecc377fdb9-kube-api-access-qnwbq\") pod \"root-account-create-update-57ggq\" (UID: \"ca09ea8f-7fc8-4f28-9716-64ecc377fdb9\") " pod="openstack/root-account-create-update-57ggq" Mar 09 09:23:45 crc kubenswrapper[4861]: I0309 09:23:45.764074 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ca09ea8f-7fc8-4f28-9716-64ecc377fdb9-operator-scripts\") pod \"root-account-create-update-57ggq\" (UID: \"ca09ea8f-7fc8-4f28-9716-64ecc377fdb9\") " pod="openstack/root-account-create-update-57ggq" Mar 09 09:23:45 crc kubenswrapper[4861]: I0309 09:23:45.828236 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-qjxtd"] Mar 09 09:23:45 crc kubenswrapper[4861]: I0309 09:23:45.829694 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-qjxtd" Mar 09 09:23:45 crc kubenswrapper[4861]: I0309 09:23:45.835298 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Mar 09 09:23:45 crc kubenswrapper[4861]: I0309 09:23:45.835355 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 09 09:23:45 crc kubenswrapper[4861]: I0309 09:23:45.835556 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Mar 09 09:23:45 crc kubenswrapper[4861]: I0309 09:23:45.846018 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-qjxtd"] Mar 09 09:23:45 crc kubenswrapper[4861]: I0309 09:23:45.865792 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qnwbq\" (UniqueName: \"kubernetes.io/projected/ca09ea8f-7fc8-4f28-9716-64ecc377fdb9-kube-api-access-qnwbq\") pod \"root-account-create-update-57ggq\" (UID: \"ca09ea8f-7fc8-4f28-9716-64ecc377fdb9\") " pod="openstack/root-account-create-update-57ggq" Mar 09 09:23:45 crc kubenswrapper[4861]: I0309 09:23:45.865910 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ca09ea8f-7fc8-4f28-9716-64ecc377fdb9-operator-scripts\") pod \"root-account-create-update-57ggq\" (UID: \"ca09ea8f-7fc8-4f28-9716-64ecc377fdb9\") " pod="openstack/root-account-create-update-57ggq" Mar 09 09:23:45 crc kubenswrapper[4861]: I0309 09:23:45.866618 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ca09ea8f-7fc8-4f28-9716-64ecc377fdb9-operator-scripts\") pod \"root-account-create-update-57ggq\" (UID: \"ca09ea8f-7fc8-4f28-9716-64ecc377fdb9\") " pod="openstack/root-account-create-update-57ggq" Mar 09 09:23:45 crc kubenswrapper[4861]: I0309 
09:23:45.895433 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnwbq\" (UniqueName: \"kubernetes.io/projected/ca09ea8f-7fc8-4f28-9716-64ecc377fdb9-kube-api-access-qnwbq\") pod \"root-account-create-update-57ggq\" (UID: \"ca09ea8f-7fc8-4f28-9716-64ecc377fdb9\") " pod="openstack/root-account-create-update-57ggq" Mar 09 09:23:45 crc kubenswrapper[4861]: I0309 09:23:45.944013 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-57ggq" Mar 09 09:23:45 crc kubenswrapper[4861]: I0309 09:23:45.966990 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6f0d289d-af18-4534-a0c6-c90f51e93fd8-scripts\") pod \"swift-ring-rebalance-qjxtd\" (UID: \"6f0d289d-af18-4534-a0c6-c90f51e93fd8\") " pod="openstack/swift-ring-rebalance-qjxtd" Mar 09 09:23:45 crc kubenswrapper[4861]: I0309 09:23:45.967055 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6f0d289d-af18-4534-a0c6-c90f51e93fd8-dispersionconf\") pod \"swift-ring-rebalance-qjxtd\" (UID: \"6f0d289d-af18-4534-a0c6-c90f51e93fd8\") " pod="openstack/swift-ring-rebalance-qjxtd" Mar 09 09:23:45 crc kubenswrapper[4861]: I0309 09:23:45.967086 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6f0d289d-af18-4534-a0c6-c90f51e93fd8-ring-data-devices\") pod \"swift-ring-rebalance-qjxtd\" (UID: \"6f0d289d-af18-4534-a0c6-c90f51e93fd8\") " pod="openstack/swift-ring-rebalance-qjxtd" Mar 09 09:23:45 crc kubenswrapper[4861]: I0309 09:23:45.967170 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/empty-dir/6f0d289d-af18-4534-a0c6-c90f51e93fd8-etc-swift\") pod \"swift-ring-rebalance-qjxtd\" (UID: \"6f0d289d-af18-4534-a0c6-c90f51e93fd8\") " pod="openstack/swift-ring-rebalance-qjxtd" Mar 09 09:23:45 crc kubenswrapper[4861]: I0309 09:23:45.967198 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pv4bf\" (UniqueName: \"kubernetes.io/projected/6f0d289d-af18-4534-a0c6-c90f51e93fd8-kube-api-access-pv4bf\") pod \"swift-ring-rebalance-qjxtd\" (UID: \"6f0d289d-af18-4534-a0c6-c90f51e93fd8\") " pod="openstack/swift-ring-rebalance-qjxtd" Mar 09 09:23:45 crc kubenswrapper[4861]: I0309 09:23:45.967221 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f0d289d-af18-4534-a0c6-c90f51e93fd8-combined-ca-bundle\") pod \"swift-ring-rebalance-qjxtd\" (UID: \"6f0d289d-af18-4534-a0c6-c90f51e93fd8\") " pod="openstack/swift-ring-rebalance-qjxtd" Mar 09 09:23:45 crc kubenswrapper[4861]: I0309 09:23:45.967240 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6f0d289d-af18-4534-a0c6-c90f51e93fd8-swiftconf\") pod \"swift-ring-rebalance-qjxtd\" (UID: \"6f0d289d-af18-4534-a0c6-c90f51e93fd8\") " pod="openstack/swift-ring-rebalance-qjxtd" Mar 09 09:23:46 crc kubenswrapper[4861]: I0309 09:23:46.068593 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6f0d289d-af18-4534-a0c6-c90f51e93fd8-dispersionconf\") pod \"swift-ring-rebalance-qjxtd\" (UID: \"6f0d289d-af18-4534-a0c6-c90f51e93fd8\") " pod="openstack/swift-ring-rebalance-qjxtd" Mar 09 09:23:46 crc kubenswrapper[4861]: I0309 09:23:46.068967 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: 
\"kubernetes.io/configmap/6f0d289d-af18-4534-a0c6-c90f51e93fd8-ring-data-devices\") pod \"swift-ring-rebalance-qjxtd\" (UID: \"6f0d289d-af18-4534-a0c6-c90f51e93fd8\") " pod="openstack/swift-ring-rebalance-qjxtd" Mar 09 09:23:46 crc kubenswrapper[4861]: I0309 09:23:46.069097 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6f0d289d-af18-4534-a0c6-c90f51e93fd8-etc-swift\") pod \"swift-ring-rebalance-qjxtd\" (UID: \"6f0d289d-af18-4534-a0c6-c90f51e93fd8\") " pod="openstack/swift-ring-rebalance-qjxtd" Mar 09 09:23:46 crc kubenswrapper[4861]: I0309 09:23:46.069160 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pv4bf\" (UniqueName: \"kubernetes.io/projected/6f0d289d-af18-4534-a0c6-c90f51e93fd8-kube-api-access-pv4bf\") pod \"swift-ring-rebalance-qjxtd\" (UID: \"6f0d289d-af18-4534-a0c6-c90f51e93fd8\") " pod="openstack/swift-ring-rebalance-qjxtd" Mar 09 09:23:46 crc kubenswrapper[4861]: I0309 09:23:46.069194 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f0d289d-af18-4534-a0c6-c90f51e93fd8-combined-ca-bundle\") pod \"swift-ring-rebalance-qjxtd\" (UID: \"6f0d289d-af18-4534-a0c6-c90f51e93fd8\") " pod="openstack/swift-ring-rebalance-qjxtd" Mar 09 09:23:46 crc kubenswrapper[4861]: I0309 09:23:46.069245 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6f0d289d-af18-4534-a0c6-c90f51e93fd8-swiftconf\") pod \"swift-ring-rebalance-qjxtd\" (UID: \"6f0d289d-af18-4534-a0c6-c90f51e93fd8\") " pod="openstack/swift-ring-rebalance-qjxtd" Mar 09 09:23:46 crc kubenswrapper[4861]: I0309 09:23:46.069283 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6f0d289d-af18-4534-a0c6-c90f51e93fd8-scripts\") pod 
\"swift-ring-rebalance-qjxtd\" (UID: \"6f0d289d-af18-4534-a0c6-c90f51e93fd8\") " pod="openstack/swift-ring-rebalance-qjxtd" Mar 09 09:23:46 crc kubenswrapper[4861]: I0309 09:23:46.070748 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6f0d289d-af18-4534-a0c6-c90f51e93fd8-ring-data-devices\") pod \"swift-ring-rebalance-qjxtd\" (UID: \"6f0d289d-af18-4534-a0c6-c90f51e93fd8\") " pod="openstack/swift-ring-rebalance-qjxtd" Mar 09 09:23:46 crc kubenswrapper[4861]: I0309 09:23:46.070994 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6f0d289d-af18-4534-a0c6-c90f51e93fd8-etc-swift\") pod \"swift-ring-rebalance-qjxtd\" (UID: \"6f0d289d-af18-4534-a0c6-c90f51e93fd8\") " pod="openstack/swift-ring-rebalance-qjxtd" Mar 09 09:23:46 crc kubenswrapper[4861]: I0309 09:23:46.074043 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6f0d289d-af18-4534-a0c6-c90f51e93fd8-scripts\") pod \"swift-ring-rebalance-qjxtd\" (UID: \"6f0d289d-af18-4534-a0c6-c90f51e93fd8\") " pod="openstack/swift-ring-rebalance-qjxtd" Mar 09 09:23:46 crc kubenswrapper[4861]: I0309 09:23:46.074803 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6f0d289d-af18-4534-a0c6-c90f51e93fd8-dispersionconf\") pod \"swift-ring-rebalance-qjxtd\" (UID: \"6f0d289d-af18-4534-a0c6-c90f51e93fd8\") " pod="openstack/swift-ring-rebalance-qjxtd" Mar 09 09:23:46 crc kubenswrapper[4861]: I0309 09:23:46.076429 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f0d289d-af18-4534-a0c6-c90f51e93fd8-combined-ca-bundle\") pod \"swift-ring-rebalance-qjxtd\" (UID: \"6f0d289d-af18-4534-a0c6-c90f51e93fd8\") " pod="openstack/swift-ring-rebalance-qjxtd" Mar 09 
09:23:46 crc kubenswrapper[4861]: I0309 09:23:46.079733 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6f0d289d-af18-4534-a0c6-c90f51e93fd8-swiftconf\") pod \"swift-ring-rebalance-qjxtd\" (UID: \"6f0d289d-af18-4534-a0c6-c90f51e93fd8\") " pod="openstack/swift-ring-rebalance-qjxtd" Mar 09 09:23:46 crc kubenswrapper[4861]: I0309 09:23:46.086628 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pv4bf\" (UniqueName: \"kubernetes.io/projected/6f0d289d-af18-4534-a0c6-c90f51e93fd8-kube-api-access-pv4bf\") pod \"swift-ring-rebalance-qjxtd\" (UID: \"6f0d289d-af18-4534-a0c6-c90f51e93fd8\") " pod="openstack/swift-ring-rebalance-qjxtd" Mar 09 09:23:46 crc kubenswrapper[4861]: I0309 09:23:46.158820 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-qjxtd" Mar 09 09:23:46 crc kubenswrapper[4861]: W0309 09:23:46.374106 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podca09ea8f_7fc8_4f28_9716_64ecc377fdb9.slice/crio-0b4ba730115dc77a4586911b452ac0f2b3daa883ea103085a872c302c5dcb4e9 WatchSource:0}: Error finding container 0b4ba730115dc77a4586911b452ac0f2b3daa883ea103085a872c302c5dcb4e9: Status 404 returned error can't find the container with id 0b4ba730115dc77a4586911b452ac0f2b3daa883ea103085a872c302c5dcb4e9 Mar 09 09:23:46 crc kubenswrapper[4861]: I0309 09:23:46.375113 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-57ggq"] Mar 09 09:23:46 crc kubenswrapper[4861]: I0309 09:23:46.457857 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-57ggq" event={"ID":"ca09ea8f-7fc8-4f28-9716-64ecc377fdb9","Type":"ContainerStarted","Data":"0b4ba730115dc77a4586911b452ac0f2b3daa883ea103085a872c302c5dcb4e9"} Mar 09 09:23:46 crc kubenswrapper[4861]: 
I0309 09:23:46.591628 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-qjxtd"] Mar 09 09:23:46 crc kubenswrapper[4861]: W0309 09:23:46.599718 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6f0d289d_af18_4534_a0c6_c90f51e93fd8.slice/crio-d8a71743eaa0e4cc6c173f2241ce883628cf6a3f3ce210d7d2d56ebbf44c749f WatchSource:0}: Error finding container d8a71743eaa0e4cc6c173f2241ce883628cf6a3f3ce210d7d2d56ebbf44c749f: Status 404 returned error can't find the container with id d8a71743eaa0e4cc6c173f2241ce883628cf6a3f3ce210d7d2d56ebbf44c749f Mar 09 09:23:46 crc kubenswrapper[4861]: I0309 09:23:46.790362 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-51c1-account-create-update-2g9qr" Mar 09 09:23:46 crc kubenswrapper[4861]: I0309 09:23:46.829323 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-7k92z" Mar 09 09:23:46 crc kubenswrapper[4861]: I0309 09:23:46.983564 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2de22eb1-f0a3-41f2-a06e-53ce14fedaf8-operator-scripts\") pod \"2de22eb1-f0a3-41f2-a06e-53ce14fedaf8\" (UID: \"2de22eb1-f0a3-41f2-a06e-53ce14fedaf8\") " Mar 09 09:23:46 crc kubenswrapper[4861]: I0309 09:23:46.983687 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2rgdh\" (UniqueName: \"kubernetes.io/projected/2de22eb1-f0a3-41f2-a06e-53ce14fedaf8-kube-api-access-2rgdh\") pod \"2de22eb1-f0a3-41f2-a06e-53ce14fedaf8\" (UID: \"2de22eb1-f0a3-41f2-a06e-53ce14fedaf8\") " Mar 09 09:23:46 crc kubenswrapper[4861]: I0309 09:23:46.983750 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zjw9f\" (UniqueName: 
\"kubernetes.io/projected/c3468cc7-9c05-4934-bc2d-287e80b966a4-kube-api-access-zjw9f\") pod \"c3468cc7-9c05-4934-bc2d-287e80b966a4\" (UID: \"c3468cc7-9c05-4934-bc2d-287e80b966a4\") " Mar 09 09:23:46 crc kubenswrapper[4861]: I0309 09:23:46.983811 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c3468cc7-9c05-4934-bc2d-287e80b966a4-operator-scripts\") pod \"c3468cc7-9c05-4934-bc2d-287e80b966a4\" (UID: \"c3468cc7-9c05-4934-bc2d-287e80b966a4\") " Mar 09 09:23:46 crc kubenswrapper[4861]: I0309 09:23:46.984675 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2de22eb1-f0a3-41f2-a06e-53ce14fedaf8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2de22eb1-f0a3-41f2-a06e-53ce14fedaf8" (UID: "2de22eb1-f0a3-41f2-a06e-53ce14fedaf8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:23:46 crc kubenswrapper[4861]: I0309 09:23:46.984684 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3468cc7-9c05-4934-bc2d-287e80b966a4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c3468cc7-9c05-4934-bc2d-287e80b966a4" (UID: "c3468cc7-9c05-4934-bc2d-287e80b966a4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:23:46 crc kubenswrapper[4861]: I0309 09:23:46.988403 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2de22eb1-f0a3-41f2-a06e-53ce14fedaf8-kube-api-access-2rgdh" (OuterVolumeSpecName: "kube-api-access-2rgdh") pod "2de22eb1-f0a3-41f2-a06e-53ce14fedaf8" (UID: "2de22eb1-f0a3-41f2-a06e-53ce14fedaf8"). InnerVolumeSpecName "kube-api-access-2rgdh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:23:46 crc kubenswrapper[4861]: I0309 09:23:46.988898 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3468cc7-9c05-4934-bc2d-287e80b966a4-kube-api-access-zjw9f" (OuterVolumeSpecName: "kube-api-access-zjw9f") pod "c3468cc7-9c05-4934-bc2d-287e80b966a4" (UID: "c3468cc7-9c05-4934-bc2d-287e80b966a4"). InnerVolumeSpecName "kube-api-access-zjw9f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:23:47 crc kubenswrapper[4861]: I0309 09:23:47.085850 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2rgdh\" (UniqueName: \"kubernetes.io/projected/2de22eb1-f0a3-41f2-a06e-53ce14fedaf8-kube-api-access-2rgdh\") on node \"crc\" DevicePath \"\"" Mar 09 09:23:47 crc kubenswrapper[4861]: I0309 09:23:47.085880 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zjw9f\" (UniqueName: \"kubernetes.io/projected/c3468cc7-9c05-4934-bc2d-287e80b966a4-kube-api-access-zjw9f\") on node \"crc\" DevicePath \"\"" Mar 09 09:23:47 crc kubenswrapper[4861]: I0309 09:23:47.085894 4861 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c3468cc7-9c05-4934-bc2d-287e80b966a4-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:23:47 crc kubenswrapper[4861]: I0309 09:23:47.085906 4861 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2de22eb1-f0a3-41f2-a06e-53ce14fedaf8-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:23:47 crc kubenswrapper[4861]: I0309 09:23:47.467347 4861 generic.go:334] "Generic (PLEG): container finished" podID="ca09ea8f-7fc8-4f28-9716-64ecc377fdb9" containerID="88eb70e641682e41216df017500a8be0f5e835caaf804e626b55c86a4abe6876" exitCode=0 Mar 09 09:23:47 crc kubenswrapper[4861]: I0309 09:23:47.467437 4861 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/root-account-create-update-57ggq" event={"ID":"ca09ea8f-7fc8-4f28-9716-64ecc377fdb9","Type":"ContainerDied","Data":"88eb70e641682e41216df017500a8be0f5e835caaf804e626b55c86a4abe6876"} Mar 09 09:23:47 crc kubenswrapper[4861]: I0309 09:23:47.469426 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-51c1-account-create-update-2g9qr" event={"ID":"2de22eb1-f0a3-41f2-a06e-53ce14fedaf8","Type":"ContainerDied","Data":"0ba9fabc02e5f3b8b81d99ed23e4805242778f08b900b275bcecf2f0f99348b5"} Mar 09 09:23:47 crc kubenswrapper[4861]: I0309 09:23:47.469450 4861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ba9fabc02e5f3b8b81d99ed23e4805242778f08b900b275bcecf2f0f99348b5" Mar 09 09:23:47 crc kubenswrapper[4861]: I0309 09:23:47.469490 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-51c1-account-create-update-2g9qr" Mar 09 09:23:47 crc kubenswrapper[4861]: I0309 09:23:47.477624 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-qjxtd" event={"ID":"6f0d289d-af18-4534-a0c6-c90f51e93fd8","Type":"ContainerStarted","Data":"d8a71743eaa0e4cc6c173f2241ce883628cf6a3f3ce210d7d2d56ebbf44c749f"} Mar 09 09:23:47 crc kubenswrapper[4861]: I0309 09:23:47.479249 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-7k92z" event={"ID":"c3468cc7-9c05-4934-bc2d-287e80b966a4","Type":"ContainerDied","Data":"6e6c780abd11d17afd00ea5f30aa97963e606a21f3595a1e1455bb545cf8630b"} Mar 09 09:23:47 crc kubenswrapper[4861]: I0309 09:23:47.479279 4861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6e6c780abd11d17afd00ea5f30aa97963e606a21f3595a1e1455bb545cf8630b" Mar 09 09:23:47 crc kubenswrapper[4861]: I0309 09:23:47.479361 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-7k92z" Mar 09 09:23:48 crc kubenswrapper[4861]: I0309 09:23:48.794311 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-5kg66"] Mar 09 09:23:48 crc kubenswrapper[4861]: E0309 09:23:48.795020 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2de22eb1-f0a3-41f2-a06e-53ce14fedaf8" containerName="mariadb-account-create-update" Mar 09 09:23:48 crc kubenswrapper[4861]: I0309 09:23:48.795034 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="2de22eb1-f0a3-41f2-a06e-53ce14fedaf8" containerName="mariadb-account-create-update" Mar 09 09:23:48 crc kubenswrapper[4861]: E0309 09:23:48.795052 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3468cc7-9c05-4934-bc2d-287e80b966a4" containerName="mariadb-database-create" Mar 09 09:23:48 crc kubenswrapper[4861]: I0309 09:23:48.795058 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3468cc7-9c05-4934-bc2d-287e80b966a4" containerName="mariadb-database-create" Mar 09 09:23:48 crc kubenswrapper[4861]: I0309 09:23:48.795189 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="2de22eb1-f0a3-41f2-a06e-53ce14fedaf8" containerName="mariadb-account-create-update" Mar 09 09:23:48 crc kubenswrapper[4861]: I0309 09:23:48.795212 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3468cc7-9c05-4934-bc2d-287e80b966a4" containerName="mariadb-database-create" Mar 09 09:23:48 crc kubenswrapper[4861]: I0309 09:23:48.800994 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-5kg66" Mar 09 09:23:48 crc kubenswrapper[4861]: I0309 09:23:48.804384 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Mar 09 09:23:48 crc kubenswrapper[4861]: I0309 09:23:48.804538 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-vwtrw" Mar 09 09:23:48 crc kubenswrapper[4861]: I0309 09:23:48.808640 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-5kg66"] Mar 09 09:23:48 crc kubenswrapper[4861]: I0309 09:23:48.927477 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7drvd\" (UniqueName: \"kubernetes.io/projected/634dd56c-c726-49cc-9a71-ef57a7d0a984-kube-api-access-7drvd\") pod \"glance-db-sync-5kg66\" (UID: \"634dd56c-c726-49cc-9a71-ef57a7d0a984\") " pod="openstack/glance-db-sync-5kg66" Mar 09 09:23:48 crc kubenswrapper[4861]: I0309 09:23:48.927539 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/634dd56c-c726-49cc-9a71-ef57a7d0a984-combined-ca-bundle\") pod \"glance-db-sync-5kg66\" (UID: \"634dd56c-c726-49cc-9a71-ef57a7d0a984\") " pod="openstack/glance-db-sync-5kg66" Mar 09 09:23:48 crc kubenswrapper[4861]: I0309 09:23:48.927588 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/634dd56c-c726-49cc-9a71-ef57a7d0a984-db-sync-config-data\") pod \"glance-db-sync-5kg66\" (UID: \"634dd56c-c726-49cc-9a71-ef57a7d0a984\") " pod="openstack/glance-db-sync-5kg66" Mar 09 09:23:48 crc kubenswrapper[4861]: I0309 09:23:48.927650 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/634dd56c-c726-49cc-9a71-ef57a7d0a984-config-data\") pod \"glance-db-sync-5kg66\" (UID: \"634dd56c-c726-49cc-9a71-ef57a7d0a984\") " pod="openstack/glance-db-sync-5kg66" Mar 09 09:23:49 crc kubenswrapper[4861]: I0309 09:23:49.028601 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/634dd56c-c726-49cc-9a71-ef57a7d0a984-db-sync-config-data\") pod \"glance-db-sync-5kg66\" (UID: \"634dd56c-c726-49cc-9a71-ef57a7d0a984\") " pod="openstack/glance-db-sync-5kg66" Mar 09 09:23:49 crc kubenswrapper[4861]: I0309 09:23:49.029022 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/634dd56c-c726-49cc-9a71-ef57a7d0a984-config-data\") pod \"glance-db-sync-5kg66\" (UID: \"634dd56c-c726-49cc-9a71-ef57a7d0a984\") " pod="openstack/glance-db-sync-5kg66" Mar 09 09:23:49 crc kubenswrapper[4861]: I0309 09:23:49.029183 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7drvd\" (UniqueName: \"kubernetes.io/projected/634dd56c-c726-49cc-9a71-ef57a7d0a984-kube-api-access-7drvd\") pod \"glance-db-sync-5kg66\" (UID: \"634dd56c-c726-49cc-9a71-ef57a7d0a984\") " pod="openstack/glance-db-sync-5kg66" Mar 09 09:23:49 crc kubenswrapper[4861]: I0309 09:23:49.029279 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/634dd56c-c726-49cc-9a71-ef57a7d0a984-combined-ca-bundle\") pod \"glance-db-sync-5kg66\" (UID: \"634dd56c-c726-49cc-9a71-ef57a7d0a984\") " pod="openstack/glance-db-sync-5kg66" Mar 09 09:23:49 crc kubenswrapper[4861]: I0309 09:23:49.036318 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/634dd56c-c726-49cc-9a71-ef57a7d0a984-db-sync-config-data\") pod \"glance-db-sync-5kg66\" (UID: 
\"634dd56c-c726-49cc-9a71-ef57a7d0a984\") " pod="openstack/glance-db-sync-5kg66" Mar 09 09:23:49 crc kubenswrapper[4861]: I0309 09:23:49.036807 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/634dd56c-c726-49cc-9a71-ef57a7d0a984-combined-ca-bundle\") pod \"glance-db-sync-5kg66\" (UID: \"634dd56c-c726-49cc-9a71-ef57a7d0a984\") " pod="openstack/glance-db-sync-5kg66" Mar 09 09:23:49 crc kubenswrapper[4861]: I0309 09:23:49.036939 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/634dd56c-c726-49cc-9a71-ef57a7d0a984-config-data\") pod \"glance-db-sync-5kg66\" (UID: \"634dd56c-c726-49cc-9a71-ef57a7d0a984\") " pod="openstack/glance-db-sync-5kg66" Mar 09 09:23:49 crc kubenswrapper[4861]: I0309 09:23:49.051469 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7drvd\" (UniqueName: \"kubernetes.io/projected/634dd56c-c726-49cc-9a71-ef57a7d0a984-kube-api-access-7drvd\") pod \"glance-db-sync-5kg66\" (UID: \"634dd56c-c726-49cc-9a71-ef57a7d0a984\") " pod="openstack/glance-db-sync-5kg66" Mar 09 09:23:49 crc kubenswrapper[4861]: I0309 09:23:49.128509 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-5kg66" Mar 09 09:23:49 crc kubenswrapper[4861]: I0309 09:23:49.499358 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-bc782"] Mar 09 09:23:49 crc kubenswrapper[4861]: I0309 09:23:49.500711 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-bc782" Mar 09 09:23:49 crc kubenswrapper[4861]: I0309 09:23:49.514189 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-bc782"] Mar 09 09:23:49 crc kubenswrapper[4861]: I0309 09:23:49.616701 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-3c6b-account-create-update-g9zvd"] Mar 09 09:23:49 crc kubenswrapper[4861]: I0309 09:23:49.617851 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-3c6b-account-create-update-g9zvd" Mar 09 09:23:49 crc kubenswrapper[4861]: I0309 09:23:49.619533 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Mar 09 09:23:49 crc kubenswrapper[4861]: I0309 09:23:49.623729 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-3c6b-account-create-update-g9zvd"] Mar 09 09:23:49 crc kubenswrapper[4861]: I0309 09:23:49.638933 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a022a77f-33dc-449f-b20f-b91978014c94-operator-scripts\") pod \"keystone-db-create-bc782\" (UID: \"a022a77f-33dc-449f-b20f-b91978014c94\") " pod="openstack/keystone-db-create-bc782" Mar 09 09:23:49 crc kubenswrapper[4861]: I0309 09:23:49.639009 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkxm2\" (UniqueName: \"kubernetes.io/projected/a022a77f-33dc-449f-b20f-b91978014c94-kube-api-access-zkxm2\") pod \"keystone-db-create-bc782\" (UID: \"a022a77f-33dc-449f-b20f-b91978014c94\") " pod="openstack/keystone-db-create-bc782" Mar 09 09:23:49 crc kubenswrapper[4861]: I0309 09:23:49.741429 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/e36bbce2-29bf-46ab-bc4f-a7afc2423059-operator-scripts\") pod \"keystone-3c6b-account-create-update-g9zvd\" (UID: \"e36bbce2-29bf-46ab-bc4f-a7afc2423059\") " pod="openstack/keystone-3c6b-account-create-update-g9zvd" Mar 09 09:23:49 crc kubenswrapper[4861]: I0309 09:23:49.741655 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a022a77f-33dc-449f-b20f-b91978014c94-operator-scripts\") pod \"keystone-db-create-bc782\" (UID: \"a022a77f-33dc-449f-b20f-b91978014c94\") " pod="openstack/keystone-db-create-bc782" Mar 09 09:23:49 crc kubenswrapper[4861]: I0309 09:23:49.742442 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a022a77f-33dc-449f-b20f-b91978014c94-operator-scripts\") pod \"keystone-db-create-bc782\" (UID: \"a022a77f-33dc-449f-b20f-b91978014c94\") " pod="openstack/keystone-db-create-bc782" Mar 09 09:23:49 crc kubenswrapper[4861]: I0309 09:23:49.742642 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zkxm2\" (UniqueName: \"kubernetes.io/projected/a022a77f-33dc-449f-b20f-b91978014c94-kube-api-access-zkxm2\") pod \"keystone-db-create-bc782\" (UID: \"a022a77f-33dc-449f-b20f-b91978014c94\") " pod="openstack/keystone-db-create-bc782" Mar 09 09:23:49 crc kubenswrapper[4861]: I0309 09:23:49.742799 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d1ad2fa7-36fc-4cd0-98ac-07b48c42e794-etc-swift\") pod \"swift-storage-0\" (UID: \"d1ad2fa7-36fc-4cd0-98ac-07b48c42e794\") " pod="openstack/swift-storage-0" Mar 09 09:23:49 crc kubenswrapper[4861]: I0309 09:23:49.742828 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmhjg\" (UniqueName: 
\"kubernetes.io/projected/e36bbce2-29bf-46ab-bc4f-a7afc2423059-kube-api-access-dmhjg\") pod \"keystone-3c6b-account-create-update-g9zvd\" (UID: \"e36bbce2-29bf-46ab-bc4f-a7afc2423059\") " pod="openstack/keystone-3c6b-account-create-update-g9zvd" Mar 09 09:23:49 crc kubenswrapper[4861]: E0309 09:23:49.742961 4861 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 09 09:23:49 crc kubenswrapper[4861]: E0309 09:23:49.742976 4861 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 09 09:23:49 crc kubenswrapper[4861]: E0309 09:23:49.743048 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d1ad2fa7-36fc-4cd0-98ac-07b48c42e794-etc-swift podName:d1ad2fa7-36fc-4cd0-98ac-07b48c42e794 nodeName:}" failed. No retries permitted until 2026-03-09 09:23:57.743032919 +0000 UTC m=+1080.828072320 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/d1ad2fa7-36fc-4cd0-98ac-07b48c42e794-etc-swift") pod "swift-storage-0" (UID: "d1ad2fa7-36fc-4cd0-98ac-07b48c42e794") : configmap "swift-ring-files" not found Mar 09 09:23:49 crc kubenswrapper[4861]: I0309 09:23:49.759281 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkxm2\" (UniqueName: \"kubernetes.io/projected/a022a77f-33dc-449f-b20f-b91978014c94-kube-api-access-zkxm2\") pod \"keystone-db-create-bc782\" (UID: \"a022a77f-33dc-449f-b20f-b91978014c94\") " pod="openstack/keystone-db-create-bc782" Mar 09 09:23:49 crc kubenswrapper[4861]: I0309 09:23:49.824964 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-bc782" Mar 09 09:23:49 crc kubenswrapper[4861]: I0309 09:23:49.845905 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dmhjg\" (UniqueName: \"kubernetes.io/projected/e36bbce2-29bf-46ab-bc4f-a7afc2423059-kube-api-access-dmhjg\") pod \"keystone-3c6b-account-create-update-g9zvd\" (UID: \"e36bbce2-29bf-46ab-bc4f-a7afc2423059\") " pod="openstack/keystone-3c6b-account-create-update-g9zvd" Mar 09 09:23:49 crc kubenswrapper[4861]: I0309 09:23:49.846066 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e36bbce2-29bf-46ab-bc4f-a7afc2423059-operator-scripts\") pod \"keystone-3c6b-account-create-update-g9zvd\" (UID: \"e36bbce2-29bf-46ab-bc4f-a7afc2423059\") " pod="openstack/keystone-3c6b-account-create-update-g9zvd" Mar 09 09:23:49 crc kubenswrapper[4861]: I0309 09:23:49.847485 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e36bbce2-29bf-46ab-bc4f-a7afc2423059-operator-scripts\") pod \"keystone-3c6b-account-create-update-g9zvd\" (UID: \"e36bbce2-29bf-46ab-bc4f-a7afc2423059\") " pod="openstack/keystone-3c6b-account-create-update-g9zvd" Mar 09 09:23:49 crc kubenswrapper[4861]: I0309 09:23:49.862580 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmhjg\" (UniqueName: \"kubernetes.io/projected/e36bbce2-29bf-46ab-bc4f-a7afc2423059-kube-api-access-dmhjg\") pod \"keystone-3c6b-account-create-update-g9zvd\" (UID: \"e36bbce2-29bf-46ab-bc4f-a7afc2423059\") " pod="openstack/keystone-3c6b-account-create-update-g9zvd" Mar 09 09:23:49 crc kubenswrapper[4861]: I0309 09:23:49.939642 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-3c6b-account-create-update-g9zvd" Mar 09 09:23:50 crc kubenswrapper[4861]: I0309 09:23:50.800653 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-57ggq" Mar 09 09:23:50 crc kubenswrapper[4861]: I0309 09:23:50.870538 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qnwbq\" (UniqueName: \"kubernetes.io/projected/ca09ea8f-7fc8-4f28-9716-64ecc377fdb9-kube-api-access-qnwbq\") pod \"ca09ea8f-7fc8-4f28-9716-64ecc377fdb9\" (UID: \"ca09ea8f-7fc8-4f28-9716-64ecc377fdb9\") " Mar 09 09:23:50 crc kubenswrapper[4861]: I0309 09:23:50.870625 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ca09ea8f-7fc8-4f28-9716-64ecc377fdb9-operator-scripts\") pod \"ca09ea8f-7fc8-4f28-9716-64ecc377fdb9\" (UID: \"ca09ea8f-7fc8-4f28-9716-64ecc377fdb9\") " Mar 09 09:23:50 crc kubenswrapper[4861]: I0309 09:23:50.871491 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca09ea8f-7fc8-4f28-9716-64ecc377fdb9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ca09ea8f-7fc8-4f28-9716-64ecc377fdb9" (UID: "ca09ea8f-7fc8-4f28-9716-64ecc377fdb9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:23:50 crc kubenswrapper[4861]: I0309 09:23:50.874062 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca09ea8f-7fc8-4f28-9716-64ecc377fdb9-kube-api-access-qnwbq" (OuterVolumeSpecName: "kube-api-access-qnwbq") pod "ca09ea8f-7fc8-4f28-9716-64ecc377fdb9" (UID: "ca09ea8f-7fc8-4f28-9716-64ecc377fdb9"). InnerVolumeSpecName "kube-api-access-qnwbq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:23:50 crc kubenswrapper[4861]: I0309 09:23:50.971837 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qnwbq\" (UniqueName: \"kubernetes.io/projected/ca09ea8f-7fc8-4f28-9716-64ecc377fdb9-kube-api-access-qnwbq\") on node \"crc\" DevicePath \"\"" Mar 09 09:23:50 crc kubenswrapper[4861]: I0309 09:23:50.971870 4861 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ca09ea8f-7fc8-4f28-9716-64ecc377fdb9-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:23:51 crc kubenswrapper[4861]: I0309 09:23:51.064114 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-675f7dd995-89gfl" Mar 09 09:23:51 crc kubenswrapper[4861]: I0309 09:23:51.127065 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c47bcb9f9-zbnlj"] Mar 09 09:23:51 crc kubenswrapper[4861]: I0309 09:23:51.127652 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7c47bcb9f9-zbnlj" podUID="d4847dbd-e086-4995-b488-c7611173b6e8" containerName="dnsmasq-dns" containerID="cri-o://c4af947936706a88621a7b2d5ec3639ed298559375ad0f3c182cb888cf1d8117" gracePeriod=10 Mar 09 09:23:51 crc kubenswrapper[4861]: I0309 09:23:51.191188 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-bc782"] Mar 09 09:23:51 crc kubenswrapper[4861]: I0309 09:23:51.286677 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-5kg66"] Mar 09 09:23:51 crc kubenswrapper[4861]: W0309 09:23:51.363276 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode36bbce2_29bf_46ab_bc4f_a7afc2423059.slice/crio-91b0e98a87b87803150b10c25564f6724bdcaccc11ea75648989f5953013b2d1 WatchSource:0}: Error finding container 
91b0e98a87b87803150b10c25564f6724bdcaccc11ea75648989f5953013b2d1: Status 404 returned error can't find the container with id 91b0e98a87b87803150b10c25564f6724bdcaccc11ea75648989f5953013b2d1 Mar 09 09:23:51 crc kubenswrapper[4861]: I0309 09:23:51.363493 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-3c6b-account-create-update-g9zvd"] Mar 09 09:23:51 crc kubenswrapper[4861]: I0309 09:23:51.522957 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-qjxtd" event={"ID":"6f0d289d-af18-4534-a0c6-c90f51e93fd8","Type":"ContainerStarted","Data":"6df0b0ac53bfc579c855ba801966917a12b5d9040dfae21f97a4098fd42b52ae"} Mar 09 09:23:51 crc kubenswrapper[4861]: I0309 09:23:51.524609 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-3c6b-account-create-update-g9zvd" event={"ID":"e36bbce2-29bf-46ab-bc4f-a7afc2423059","Type":"ContainerStarted","Data":"91b0e98a87b87803150b10c25564f6724bdcaccc11ea75648989f5953013b2d1"} Mar 09 09:23:51 crc kubenswrapper[4861]: I0309 09:23:51.526416 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-bc782" event={"ID":"a022a77f-33dc-449f-b20f-b91978014c94","Type":"ContainerStarted","Data":"a0300fc10e488ebd33207cec458035e8e06b9da87be58b19fafc10fcb6e4aeb7"} Mar 09 09:23:51 crc kubenswrapper[4861]: I0309 09:23:51.526567 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-bc782" event={"ID":"a022a77f-33dc-449f-b20f-b91978014c94","Type":"ContainerStarted","Data":"4c4696ee84428c91d6b2490d01275bf258b56b3e842e360100890ba3c02c0138"} Mar 09 09:23:51 crc kubenswrapper[4861]: I0309 09:23:51.532995 4861 generic.go:334] "Generic (PLEG): container finished" podID="d4847dbd-e086-4995-b488-c7611173b6e8" containerID="c4af947936706a88621a7b2d5ec3639ed298559375ad0f3c182cb888cf1d8117" exitCode=0 Mar 09 09:23:51 crc kubenswrapper[4861]: I0309 09:23:51.533054 4861 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/dnsmasq-dns-7c47bcb9f9-zbnlj" event={"ID":"d4847dbd-e086-4995-b488-c7611173b6e8","Type":"ContainerDied","Data":"c4af947936706a88621a7b2d5ec3639ed298559375ad0f3c182cb888cf1d8117"} Mar 09 09:23:51 crc kubenswrapper[4861]: I0309 09:23:51.534135 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-57ggq" event={"ID":"ca09ea8f-7fc8-4f28-9716-64ecc377fdb9","Type":"ContainerDied","Data":"0b4ba730115dc77a4586911b452ac0f2b3daa883ea103085a872c302c5dcb4e9"} Mar 09 09:23:51 crc kubenswrapper[4861]: I0309 09:23:51.534163 4861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0b4ba730115dc77a4586911b452ac0f2b3daa883ea103085a872c302c5dcb4e9" Mar 09 09:23:51 crc kubenswrapper[4861]: I0309 09:23:51.534212 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-57ggq" Mar 09 09:23:51 crc kubenswrapper[4861]: I0309 09:23:51.550426 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-5kg66" event={"ID":"634dd56c-c726-49cc-9a71-ef57a7d0a984","Type":"ContainerStarted","Data":"a70863bf785d32bda3c2c8580403592edf1366dc3350d66a8a5c291505db7cb1"} Mar 09 09:23:51 crc kubenswrapper[4861]: I0309 09:23:51.562706 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-qjxtd" podStartSLOduration=2.4401550260000002 podStartE2EDuration="6.562682641s" podCreationTimestamp="2026-03-09 09:23:45 +0000 UTC" firstStartedPulling="2026-03-09 09:23:46.602282311 +0000 UTC m=+1069.687321712" lastFinishedPulling="2026-03-09 09:23:50.724809926 +0000 UTC m=+1073.809849327" observedRunningTime="2026-03-09 09:23:51.541796218 +0000 UTC m=+1074.626835639" watchObservedRunningTime="2026-03-09 09:23:51.562682641 +0000 UTC m=+1074.647722032" Mar 09 09:23:51 crc kubenswrapper[4861]: I0309 09:23:51.563884 4861 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/keystone-db-create-bc782" podStartSLOduration=2.563879537 podStartE2EDuration="2.563879537s" podCreationTimestamp="2026-03-09 09:23:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:23:51.555844447 +0000 UTC m=+1074.640883868" watchObservedRunningTime="2026-03-09 09:23:51.563879537 +0000 UTC m=+1074.648918928" Mar 09 09:23:51 crc kubenswrapper[4861]: I0309 09:23:51.575478 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c47bcb9f9-zbnlj" Mar 09 09:23:51 crc kubenswrapper[4861]: I0309 09:23:51.583789 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d4847dbd-e086-4995-b488-c7611173b6e8-dns-svc\") pod \"d4847dbd-e086-4995-b488-c7611173b6e8\" (UID: \"d4847dbd-e086-4995-b488-c7611173b6e8\") " Mar 09 09:23:51 crc kubenswrapper[4861]: I0309 09:23:51.583844 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4847dbd-e086-4995-b488-c7611173b6e8-config\") pod \"d4847dbd-e086-4995-b488-c7611173b6e8\" (UID: \"d4847dbd-e086-4995-b488-c7611173b6e8\") " Mar 09 09:23:51 crc kubenswrapper[4861]: I0309 09:23:51.634925 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4847dbd-e086-4995-b488-c7611173b6e8-config" (OuterVolumeSpecName: "config") pod "d4847dbd-e086-4995-b488-c7611173b6e8" (UID: "d4847dbd-e086-4995-b488-c7611173b6e8"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:23:51 crc kubenswrapper[4861]: I0309 09:23:51.650010 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4847dbd-e086-4995-b488-c7611173b6e8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d4847dbd-e086-4995-b488-c7611173b6e8" (UID: "d4847dbd-e086-4995-b488-c7611173b6e8"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:23:51 crc kubenswrapper[4861]: I0309 09:23:51.688546 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-76d8l\" (UniqueName: \"kubernetes.io/projected/d4847dbd-e086-4995-b488-c7611173b6e8-kube-api-access-76d8l\") pod \"d4847dbd-e086-4995-b488-c7611173b6e8\" (UID: \"d4847dbd-e086-4995-b488-c7611173b6e8\") " Mar 09 09:23:51 crc kubenswrapper[4861]: I0309 09:23:51.690543 4861 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d4847dbd-e086-4995-b488-c7611173b6e8-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 09 09:23:51 crc kubenswrapper[4861]: I0309 09:23:51.690573 4861 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d4847dbd-e086-4995-b488-c7611173b6e8-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:23:51 crc kubenswrapper[4861]: I0309 09:23:51.692454 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4847dbd-e086-4995-b488-c7611173b6e8-kube-api-access-76d8l" (OuterVolumeSpecName: "kube-api-access-76d8l") pod "d4847dbd-e086-4995-b488-c7611173b6e8" (UID: "d4847dbd-e086-4995-b488-c7611173b6e8"). InnerVolumeSpecName "kube-api-access-76d8l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:23:51 crc kubenswrapper[4861]: I0309 09:23:51.792352 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-76d8l\" (UniqueName: \"kubernetes.io/projected/d4847dbd-e086-4995-b488-c7611173b6e8-kube-api-access-76d8l\") on node \"crc\" DevicePath \"\"" Mar 09 09:23:51 crc kubenswrapper[4861]: I0309 09:23:51.984875 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-57ggq"] Mar 09 09:23:51 crc kubenswrapper[4861]: I0309 09:23:51.991635 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-57ggq"] Mar 09 09:23:52 crc kubenswrapper[4861]: I0309 09:23:52.561840 4861 generic.go:334] "Generic (PLEG): container finished" podID="a022a77f-33dc-449f-b20f-b91978014c94" containerID="a0300fc10e488ebd33207cec458035e8e06b9da87be58b19fafc10fcb6e4aeb7" exitCode=0 Mar 09 09:23:52 crc kubenswrapper[4861]: I0309 09:23:52.562023 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-bc782" event={"ID":"a022a77f-33dc-449f-b20f-b91978014c94","Type":"ContainerDied","Data":"a0300fc10e488ebd33207cec458035e8e06b9da87be58b19fafc10fcb6e4aeb7"} Mar 09 09:23:52 crc kubenswrapper[4861]: I0309 09:23:52.565792 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c47bcb9f9-zbnlj" event={"ID":"d4847dbd-e086-4995-b488-c7611173b6e8","Type":"ContainerDied","Data":"3d4e3b249f8679d3df8a380646926af0a3cfdd7bee76416d14e5f3488415e2c1"} Mar 09 09:23:52 crc kubenswrapper[4861]: I0309 09:23:52.565834 4861 scope.go:117] "RemoveContainer" containerID="c4af947936706a88621a7b2d5ec3639ed298559375ad0f3c182cb888cf1d8117" Mar 09 09:23:52 crc kubenswrapper[4861]: I0309 09:23:52.565958 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7c47bcb9f9-zbnlj" Mar 09 09:23:52 crc kubenswrapper[4861]: I0309 09:23:52.570468 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-3c6b-account-create-update-g9zvd" event={"ID":"e36bbce2-29bf-46ab-bc4f-a7afc2423059","Type":"ContainerDied","Data":"def0dc52c7f8a566bd281da8497a5f4ce26474566e2eff6c5fbe140b4718b0af"} Mar 09 09:23:52 crc kubenswrapper[4861]: I0309 09:23:52.570508 4861 generic.go:334] "Generic (PLEG): container finished" podID="e36bbce2-29bf-46ab-bc4f-a7afc2423059" containerID="def0dc52c7f8a566bd281da8497a5f4ce26474566e2eff6c5fbe140b4718b0af" exitCode=0 Mar 09 09:23:52 crc kubenswrapper[4861]: I0309 09:23:52.589588 4861 scope.go:117] "RemoveContainer" containerID="6ddbfec8faee2cdbe27eb2e4243165e64f47f0d6cb0889ca6757e48b1a09f1d2" Mar 09 09:23:52 crc kubenswrapper[4861]: I0309 09:23:52.614073 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c47bcb9f9-zbnlj"] Mar 09 09:23:52 crc kubenswrapper[4861]: I0309 09:23:52.622484 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7c47bcb9f9-zbnlj"] Mar 09 09:23:53 crc kubenswrapper[4861]: I0309 09:23:53.673661 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca09ea8f-7fc8-4f28-9716-64ecc377fdb9" path="/var/lib/kubelet/pods/ca09ea8f-7fc8-4f28-9716-64ecc377fdb9/volumes" Mar 09 09:23:53 crc kubenswrapper[4861]: I0309 09:23:53.674742 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4847dbd-e086-4995-b488-c7611173b6e8" path="/var/lib/kubelet/pods/d4847dbd-e086-4995-b488-c7611173b6e8/volumes" Mar 09 09:23:53 crc kubenswrapper[4861]: I0309 09:23:53.975964 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-3c6b-account-create-update-g9zvd" Mar 09 09:23:53 crc kubenswrapper[4861]: I0309 09:23:53.985721 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-bc782" Mar 09 09:23:54 crc kubenswrapper[4861]: I0309 09:23:54.141752 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkxm2\" (UniqueName: \"kubernetes.io/projected/a022a77f-33dc-449f-b20f-b91978014c94-kube-api-access-zkxm2\") pod \"a022a77f-33dc-449f-b20f-b91978014c94\" (UID: \"a022a77f-33dc-449f-b20f-b91978014c94\") " Mar 09 09:23:54 crc kubenswrapper[4861]: I0309 09:23:54.141831 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e36bbce2-29bf-46ab-bc4f-a7afc2423059-operator-scripts\") pod \"e36bbce2-29bf-46ab-bc4f-a7afc2423059\" (UID: \"e36bbce2-29bf-46ab-bc4f-a7afc2423059\") " Mar 09 09:23:54 crc kubenswrapper[4861]: I0309 09:23:54.141947 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a022a77f-33dc-449f-b20f-b91978014c94-operator-scripts\") pod \"a022a77f-33dc-449f-b20f-b91978014c94\" (UID: \"a022a77f-33dc-449f-b20f-b91978014c94\") " Mar 09 09:23:54 crc kubenswrapper[4861]: I0309 09:23:54.141977 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dmhjg\" (UniqueName: \"kubernetes.io/projected/e36bbce2-29bf-46ab-bc4f-a7afc2423059-kube-api-access-dmhjg\") pod \"e36bbce2-29bf-46ab-bc4f-a7afc2423059\" (UID: \"e36bbce2-29bf-46ab-bc4f-a7afc2423059\") " Mar 09 09:23:54 crc kubenswrapper[4861]: I0309 09:23:54.142860 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a022a77f-33dc-449f-b20f-b91978014c94-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a022a77f-33dc-449f-b20f-b91978014c94" (UID: "a022a77f-33dc-449f-b20f-b91978014c94"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:23:54 crc kubenswrapper[4861]: I0309 09:23:54.143052 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e36bbce2-29bf-46ab-bc4f-a7afc2423059-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e36bbce2-29bf-46ab-bc4f-a7afc2423059" (UID: "e36bbce2-29bf-46ab-bc4f-a7afc2423059"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:23:54 crc kubenswrapper[4861]: I0309 09:23:54.147755 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e36bbce2-29bf-46ab-bc4f-a7afc2423059-kube-api-access-dmhjg" (OuterVolumeSpecName: "kube-api-access-dmhjg") pod "e36bbce2-29bf-46ab-bc4f-a7afc2423059" (UID: "e36bbce2-29bf-46ab-bc4f-a7afc2423059"). InnerVolumeSpecName "kube-api-access-dmhjg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:23:54 crc kubenswrapper[4861]: I0309 09:23:54.149056 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a022a77f-33dc-449f-b20f-b91978014c94-kube-api-access-zkxm2" (OuterVolumeSpecName: "kube-api-access-zkxm2") pod "a022a77f-33dc-449f-b20f-b91978014c94" (UID: "a022a77f-33dc-449f-b20f-b91978014c94"). InnerVolumeSpecName "kube-api-access-zkxm2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:23:54 crc kubenswrapper[4861]: I0309 09:23:54.244002 4861 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e36bbce2-29bf-46ab-bc4f-a7afc2423059-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:23:54 crc kubenswrapper[4861]: I0309 09:23:54.244034 4861 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a022a77f-33dc-449f-b20f-b91978014c94-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:23:54 crc kubenswrapper[4861]: I0309 09:23:54.244050 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dmhjg\" (UniqueName: \"kubernetes.io/projected/e36bbce2-29bf-46ab-bc4f-a7afc2423059-kube-api-access-dmhjg\") on node \"crc\" DevicePath \"\"" Mar 09 09:23:54 crc kubenswrapper[4861]: I0309 09:23:54.244062 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkxm2\" (UniqueName: \"kubernetes.io/projected/a022a77f-33dc-449f-b20f-b91978014c94-kube-api-access-zkxm2\") on node \"crc\" DevicePath \"\"" Mar 09 09:23:54 crc kubenswrapper[4861]: I0309 09:23:54.631240 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-3c6b-account-create-update-g9zvd" event={"ID":"e36bbce2-29bf-46ab-bc4f-a7afc2423059","Type":"ContainerDied","Data":"91b0e98a87b87803150b10c25564f6724bdcaccc11ea75648989f5953013b2d1"} Mar 09 09:23:54 crc kubenswrapper[4861]: I0309 09:23:54.632574 4861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="91b0e98a87b87803150b10c25564f6724bdcaccc11ea75648989f5953013b2d1" Mar 09 09:23:54 crc kubenswrapper[4861]: I0309 09:23:54.631497 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-3c6b-account-create-update-g9zvd" Mar 09 09:23:54 crc kubenswrapper[4861]: I0309 09:23:54.634549 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-bc782" event={"ID":"a022a77f-33dc-449f-b20f-b91978014c94","Type":"ContainerDied","Data":"4c4696ee84428c91d6b2490d01275bf258b56b3e842e360100890ba3c02c0138"} Mar 09 09:23:54 crc kubenswrapper[4861]: I0309 09:23:54.634585 4861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4c4696ee84428c91d6b2490d01275bf258b56b3e842e360100890ba3c02c0138" Mar 09 09:23:54 crc kubenswrapper[4861]: I0309 09:23:54.634641 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-bc782" Mar 09 09:23:55 crc kubenswrapper[4861]: I0309 09:23:55.621714 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-dzt7t"] Mar 09 09:23:55 crc kubenswrapper[4861]: E0309 09:23:55.622048 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a022a77f-33dc-449f-b20f-b91978014c94" containerName="mariadb-database-create" Mar 09 09:23:55 crc kubenswrapper[4861]: I0309 09:23:55.622064 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="a022a77f-33dc-449f-b20f-b91978014c94" containerName="mariadb-database-create" Mar 09 09:23:55 crc kubenswrapper[4861]: E0309 09:23:55.622103 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca09ea8f-7fc8-4f28-9716-64ecc377fdb9" containerName="mariadb-account-create-update" Mar 09 09:23:55 crc kubenswrapper[4861]: I0309 09:23:55.622112 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca09ea8f-7fc8-4f28-9716-64ecc377fdb9" containerName="mariadb-account-create-update" Mar 09 09:23:55 crc kubenswrapper[4861]: E0309 09:23:55.622127 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4847dbd-e086-4995-b488-c7611173b6e8" containerName="dnsmasq-dns" 
Mar 09 09:23:55 crc kubenswrapper[4861]: I0309 09:23:55.622135 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4847dbd-e086-4995-b488-c7611173b6e8" containerName="dnsmasq-dns" Mar 09 09:23:55 crc kubenswrapper[4861]: E0309 09:23:55.622160 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e36bbce2-29bf-46ab-bc4f-a7afc2423059" containerName="mariadb-account-create-update" Mar 09 09:23:55 crc kubenswrapper[4861]: I0309 09:23:55.622168 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="e36bbce2-29bf-46ab-bc4f-a7afc2423059" containerName="mariadb-account-create-update" Mar 09 09:23:55 crc kubenswrapper[4861]: E0309 09:23:55.622229 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4847dbd-e086-4995-b488-c7611173b6e8" containerName="init" Mar 09 09:23:55 crc kubenswrapper[4861]: I0309 09:23:55.622237 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4847dbd-e086-4995-b488-c7611173b6e8" containerName="init" Mar 09 09:23:55 crc kubenswrapper[4861]: I0309 09:23:55.622429 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4847dbd-e086-4995-b488-c7611173b6e8" containerName="dnsmasq-dns" Mar 09 09:23:55 crc kubenswrapper[4861]: I0309 09:23:55.622449 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="a022a77f-33dc-449f-b20f-b91978014c94" containerName="mariadb-database-create" Mar 09 09:23:55 crc kubenswrapper[4861]: I0309 09:23:55.622461 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca09ea8f-7fc8-4f28-9716-64ecc377fdb9" containerName="mariadb-account-create-update" Mar 09 09:23:55 crc kubenswrapper[4861]: I0309 09:23:55.622483 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="e36bbce2-29bf-46ab-bc4f-a7afc2423059" containerName="mariadb-account-create-update" Mar 09 09:23:55 crc kubenswrapper[4861]: I0309 09:23:55.623032 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-dzt7t" Mar 09 09:23:55 crc kubenswrapper[4861]: I0309 09:23:55.632924 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-dzt7t"] Mar 09 09:23:55 crc kubenswrapper[4861]: I0309 09:23:55.633299 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 09 09:23:55 crc kubenswrapper[4861]: I0309 09:23:55.766955 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/969317e8-5323-499c-84a5-e78139bef859-operator-scripts\") pod \"root-account-create-update-dzt7t\" (UID: \"969317e8-5323-499c-84a5-e78139bef859\") " pod="openstack/root-account-create-update-dzt7t" Mar 09 09:23:55 crc kubenswrapper[4861]: I0309 09:23:55.767013 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrn9q\" (UniqueName: \"kubernetes.io/projected/969317e8-5323-499c-84a5-e78139bef859-kube-api-access-qrn9q\") pod \"root-account-create-update-dzt7t\" (UID: \"969317e8-5323-499c-84a5-e78139bef859\") " pod="openstack/root-account-create-update-dzt7t" Mar 09 09:23:55 crc kubenswrapper[4861]: I0309 09:23:55.869587 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrn9q\" (UniqueName: \"kubernetes.io/projected/969317e8-5323-499c-84a5-e78139bef859-kube-api-access-qrn9q\") pod \"root-account-create-update-dzt7t\" (UID: \"969317e8-5323-499c-84a5-e78139bef859\") " pod="openstack/root-account-create-update-dzt7t" Mar 09 09:23:55 crc kubenswrapper[4861]: I0309 09:23:55.869768 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/969317e8-5323-499c-84a5-e78139bef859-operator-scripts\") pod \"root-account-create-update-dzt7t\" (UID: 
\"969317e8-5323-499c-84a5-e78139bef859\") " pod="openstack/root-account-create-update-dzt7t" Mar 09 09:23:55 crc kubenswrapper[4861]: I0309 09:23:55.870669 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/969317e8-5323-499c-84a5-e78139bef859-operator-scripts\") pod \"root-account-create-update-dzt7t\" (UID: \"969317e8-5323-499c-84a5-e78139bef859\") " pod="openstack/root-account-create-update-dzt7t" Mar 09 09:23:55 crc kubenswrapper[4861]: I0309 09:23:55.887441 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrn9q\" (UniqueName: \"kubernetes.io/projected/969317e8-5323-499c-84a5-e78139bef859-kube-api-access-qrn9q\") pod \"root-account-create-update-dzt7t\" (UID: \"969317e8-5323-499c-84a5-e78139bef859\") " pod="openstack/root-account-create-update-dzt7t" Mar 09 09:23:55 crc kubenswrapper[4861]: I0309 09:23:55.949036 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-dzt7t" Mar 09 09:23:56 crc kubenswrapper[4861]: I0309 09:23:56.376162 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-dzt7t"] Mar 09 09:23:56 crc kubenswrapper[4861]: W0309 09:23:56.376777 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod969317e8_5323_499c_84a5_e78139bef859.slice/crio-ee5b84ac96c0f055f8a1de809f2e98670af9832e318109e6c1c079d2c855ba23 WatchSource:0}: Error finding container ee5b84ac96c0f055f8a1de809f2e98670af9832e318109e6c1c079d2c855ba23: Status 404 returned error can't find the container with id ee5b84ac96c0f055f8a1de809f2e98670af9832e318109e6c1c079d2c855ba23 Mar 09 09:23:56 crc kubenswrapper[4861]: I0309 09:23:56.654726 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-dzt7t" event={"ID":"969317e8-5323-499c-84a5-e78139bef859","Type":"ContainerStarted","Data":"48a24b34c684d5a93e2d113832b569fd6f40b813ce52a311f4c0a30652f38851"} Mar 09 09:23:56 crc kubenswrapper[4861]: I0309 09:23:56.655113 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-dzt7t" event={"ID":"969317e8-5323-499c-84a5-e78139bef859","Type":"ContainerStarted","Data":"ee5b84ac96c0f055f8a1de809f2e98670af9832e318109e6c1c079d2c855ba23"} Mar 09 09:23:56 crc kubenswrapper[4861]: I0309 09:23:56.678326 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-dzt7t" podStartSLOduration=1.678305812 podStartE2EDuration="1.678305812s" podCreationTimestamp="2026-03-09 09:23:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:23:56.673468598 +0000 UTC m=+1079.758507999" watchObservedRunningTime="2026-03-09 09:23:56.678305812 +0000 UTC m=+1079.763345213" Mar 
09 09:23:57 crc kubenswrapper[4861]: I0309 09:23:57.113071 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Mar 09 09:23:59 crc kubenswrapper[4861]: I0309 09:23:59.024790 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d1ad2fa7-36fc-4cd0-98ac-07b48c42e794-etc-swift\") pod \"swift-storage-0\" (UID: \"d1ad2fa7-36fc-4cd0-98ac-07b48c42e794\") " pod="openstack/swift-storage-0" Mar 09 09:23:59 crc kubenswrapper[4861]: E0309 09:23:59.087070 4861 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 09 09:23:59 crc kubenswrapper[4861]: E0309 09:23:59.087108 4861 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 09 09:23:59 crc kubenswrapper[4861]: E0309 09:23:59.087163 4861 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d1ad2fa7-36fc-4cd0-98ac-07b48c42e794-etc-swift podName:d1ad2fa7-36fc-4cd0-98ac-07b48c42e794 nodeName:}" failed. No retries permitted until 2026-03-09 09:24:15.087142576 +0000 UTC m=+1098.172181977 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/d1ad2fa7-36fc-4cd0-98ac-07b48c42e794-etc-swift") pod "swift-storage-0" (UID: "d1ad2fa7-36fc-4cd0-98ac-07b48c42e794") : configmap "swift-ring-files" not found Mar 09 09:23:59 crc kubenswrapper[4861]: I0309 09:23:59.193923 4861 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-s7nq5" podUID="4e354b06-2ae2-41af-b5d7-2909bca8cff6" containerName="ovn-controller" probeResult="failure" output=< Mar 09 09:23:59 crc kubenswrapper[4861]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 09 09:23:59 crc kubenswrapper[4861]: > Mar 09 09:23:59 crc kubenswrapper[4861]: I0309 09:23:59.218654 4861 generic.go:334] "Generic (PLEG): container finished" podID="969317e8-5323-499c-84a5-e78139bef859" containerID="48a24b34c684d5a93e2d113832b569fd6f40b813ce52a311f4c0a30652f38851" exitCode=0 Mar 09 09:23:59 crc kubenswrapper[4861]: I0309 09:23:59.218710 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-dzt7t" event={"ID":"969317e8-5323-499c-84a5-e78139bef859","Type":"ContainerDied","Data":"48a24b34c684d5a93e2d113832b569fd6f40b813ce52a311f4c0a30652f38851"} Mar 09 09:24:00 crc kubenswrapper[4861]: I0309 09:24:00.139654 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550804-lfm7c"] Mar 09 09:24:00 crc kubenswrapper[4861]: I0309 09:24:00.140936 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550804-lfm7c" Mar 09 09:24:00 crc kubenswrapper[4861]: I0309 09:24:00.144124 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 09:24:00 crc kubenswrapper[4861]: I0309 09:24:00.144337 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k86l8" Mar 09 09:24:00 crc kubenswrapper[4861]: I0309 09:24:00.149788 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 09:24:00 crc kubenswrapper[4861]: I0309 09:24:00.156771 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550804-lfm7c"] Mar 09 09:24:00 crc kubenswrapper[4861]: I0309 09:24:00.208073 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnk7q\" (UniqueName: \"kubernetes.io/projected/a4a629d1-4904-48e6-9a69-e436c68cfdbc-kube-api-access-dnk7q\") pod \"auto-csr-approver-29550804-lfm7c\" (UID: \"a4a629d1-4904-48e6-9a69-e436c68cfdbc\") " pod="openshift-infra/auto-csr-approver-29550804-lfm7c" Mar 09 09:24:00 crc kubenswrapper[4861]: I0309 09:24:00.337191 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dnk7q\" (UniqueName: \"kubernetes.io/projected/a4a629d1-4904-48e6-9a69-e436c68cfdbc-kube-api-access-dnk7q\") pod \"auto-csr-approver-29550804-lfm7c\" (UID: \"a4a629d1-4904-48e6-9a69-e436c68cfdbc\") " pod="openshift-infra/auto-csr-approver-29550804-lfm7c" Mar 09 09:24:00 crc kubenswrapper[4861]: I0309 09:24:00.371749 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnk7q\" (UniqueName: \"kubernetes.io/projected/a4a629d1-4904-48e6-9a69-e436c68cfdbc-kube-api-access-dnk7q\") pod \"auto-csr-approver-29550804-lfm7c\" (UID: \"a4a629d1-4904-48e6-9a69-e436c68cfdbc\") " 
pod="openshift-infra/auto-csr-approver-29550804-lfm7c" Mar 09 09:24:00 crc kubenswrapper[4861]: I0309 09:24:00.472806 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550804-lfm7c" Mar 09 09:24:01 crc kubenswrapper[4861]: I0309 09:24:01.235252 4861 generic.go:334] "Generic (PLEG): container finished" podID="6f0d289d-af18-4534-a0c6-c90f51e93fd8" containerID="6df0b0ac53bfc579c855ba801966917a12b5d9040dfae21f97a4098fd42b52ae" exitCode=0 Mar 09 09:24:01 crc kubenswrapper[4861]: I0309 09:24:01.235409 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-qjxtd" event={"ID":"6f0d289d-af18-4534-a0c6-c90f51e93fd8","Type":"ContainerDied","Data":"6df0b0ac53bfc579c855ba801966917a12b5d9040dfae21f97a4098fd42b52ae"} Mar 09 09:24:02 crc kubenswrapper[4861]: I0309 09:24:02.250665 4861 generic.go:334] "Generic (PLEG): container finished" podID="03452acf-c21f-4d68-a813-772c30604a60" containerID="a9f20eea0867df7623d564d9ee258fb8c480222ebfae15bc43c9f672e31b8bfc" exitCode=0 Mar 09 09:24:02 crc kubenswrapper[4861]: I0309 09:24:02.250737 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"03452acf-c21f-4d68-a813-772c30604a60","Type":"ContainerDied","Data":"a9f20eea0867df7623d564d9ee258fb8c480222ebfae15bc43c9f672e31b8bfc"} Mar 09 09:24:02 crc kubenswrapper[4861]: I0309 09:24:02.262788 4861 generic.go:334] "Generic (PLEG): container finished" podID="b9b83355-ea40-4408-9b77-c717df91e1a9" containerID="31586b8681c909460b5f41e700f0e45d5675654a1c0fc223423f5e764a90eb87" exitCode=0 Mar 09 09:24:02 crc kubenswrapper[4861]: I0309 09:24:02.262876 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"b9b83355-ea40-4408-9b77-c717df91e1a9","Type":"ContainerDied","Data":"31586b8681c909460b5f41e700f0e45d5675654a1c0fc223423f5e764a90eb87"} Mar 09 09:24:03 crc kubenswrapper[4861]: I0309 09:24:03.792125 
4861 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-s7nq5" podUID="4e354b06-2ae2-41af-b5d7-2909bca8cff6" containerName="ovn-controller" probeResult="failure" output=< Mar 09 09:24:03 crc kubenswrapper[4861]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 09 09:24:03 crc kubenswrapper[4861]: > Mar 09 09:24:03 crc kubenswrapper[4861]: I0309 09:24:03.817962 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-hmb5w" Mar 09 09:24:03 crc kubenswrapper[4861]: I0309 09:24:03.819618 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-hmb5w" Mar 09 09:24:04 crc kubenswrapper[4861]: I0309 09:24:04.094418 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-s7nq5-config-ndmn6"] Mar 09 09:24:04 crc kubenswrapper[4861]: I0309 09:24:04.095445 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-s7nq5-config-ndmn6" Mar 09 09:24:04 crc kubenswrapper[4861]: I0309 09:24:04.098844 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 09 09:24:04 crc kubenswrapper[4861]: I0309 09:24:04.112683 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-s7nq5-config-ndmn6"] Mar 09 09:24:04 crc kubenswrapper[4861]: I0309 09:24:04.116159 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/85060605-9da6-428f-bc42-872620763791-scripts\") pod \"ovn-controller-s7nq5-config-ndmn6\" (UID: \"85060605-9da6-428f-bc42-872620763791\") " pod="openstack/ovn-controller-s7nq5-config-ndmn6" Mar 09 09:24:04 crc kubenswrapper[4861]: I0309 09:24:04.116228 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/85060605-9da6-428f-bc42-872620763791-var-log-ovn\") pod \"ovn-controller-s7nq5-config-ndmn6\" (UID: \"85060605-9da6-428f-bc42-872620763791\") " pod="openstack/ovn-controller-s7nq5-config-ndmn6" Mar 09 09:24:04 crc kubenswrapper[4861]: I0309 09:24:04.116306 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/85060605-9da6-428f-bc42-872620763791-var-run\") pod \"ovn-controller-s7nq5-config-ndmn6\" (UID: \"85060605-9da6-428f-bc42-872620763791\") " pod="openstack/ovn-controller-s7nq5-config-ndmn6" Mar 09 09:24:04 crc kubenswrapper[4861]: I0309 09:24:04.116368 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/85060605-9da6-428f-bc42-872620763791-additional-scripts\") pod \"ovn-controller-s7nq5-config-ndmn6\" (UID: 
\"85060605-9da6-428f-bc42-872620763791\") " pod="openstack/ovn-controller-s7nq5-config-ndmn6" Mar 09 09:24:04 crc kubenswrapper[4861]: I0309 09:24:04.116418 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/85060605-9da6-428f-bc42-872620763791-var-run-ovn\") pod \"ovn-controller-s7nq5-config-ndmn6\" (UID: \"85060605-9da6-428f-bc42-872620763791\") " pod="openstack/ovn-controller-s7nq5-config-ndmn6" Mar 09 09:24:04 crc kubenswrapper[4861]: I0309 09:24:04.116464 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tddd5\" (UniqueName: \"kubernetes.io/projected/85060605-9da6-428f-bc42-872620763791-kube-api-access-tddd5\") pod \"ovn-controller-s7nq5-config-ndmn6\" (UID: \"85060605-9da6-428f-bc42-872620763791\") " pod="openstack/ovn-controller-s7nq5-config-ndmn6" Mar 09 09:24:04 crc kubenswrapper[4861]: I0309 09:24:04.217534 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/85060605-9da6-428f-bc42-872620763791-scripts\") pod \"ovn-controller-s7nq5-config-ndmn6\" (UID: \"85060605-9da6-428f-bc42-872620763791\") " pod="openstack/ovn-controller-s7nq5-config-ndmn6" Mar 09 09:24:04 crc kubenswrapper[4861]: I0309 09:24:04.217589 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/85060605-9da6-428f-bc42-872620763791-var-log-ovn\") pod \"ovn-controller-s7nq5-config-ndmn6\" (UID: \"85060605-9da6-428f-bc42-872620763791\") " pod="openstack/ovn-controller-s7nq5-config-ndmn6" Mar 09 09:24:04 crc kubenswrapper[4861]: I0309 09:24:04.217637 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/85060605-9da6-428f-bc42-872620763791-var-run\") pod 
\"ovn-controller-s7nq5-config-ndmn6\" (UID: \"85060605-9da6-428f-bc42-872620763791\") " pod="openstack/ovn-controller-s7nq5-config-ndmn6" Mar 09 09:24:04 crc kubenswrapper[4861]: I0309 09:24:04.217680 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/85060605-9da6-428f-bc42-872620763791-additional-scripts\") pod \"ovn-controller-s7nq5-config-ndmn6\" (UID: \"85060605-9da6-428f-bc42-872620763791\") " pod="openstack/ovn-controller-s7nq5-config-ndmn6" Mar 09 09:24:04 crc kubenswrapper[4861]: I0309 09:24:04.217701 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/85060605-9da6-428f-bc42-872620763791-var-run-ovn\") pod \"ovn-controller-s7nq5-config-ndmn6\" (UID: \"85060605-9da6-428f-bc42-872620763791\") " pod="openstack/ovn-controller-s7nq5-config-ndmn6" Mar 09 09:24:04 crc kubenswrapper[4861]: I0309 09:24:04.217729 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tddd5\" (UniqueName: \"kubernetes.io/projected/85060605-9da6-428f-bc42-872620763791-kube-api-access-tddd5\") pod \"ovn-controller-s7nq5-config-ndmn6\" (UID: \"85060605-9da6-428f-bc42-872620763791\") " pod="openstack/ovn-controller-s7nq5-config-ndmn6" Mar 09 09:24:04 crc kubenswrapper[4861]: I0309 09:24:04.218298 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/85060605-9da6-428f-bc42-872620763791-var-log-ovn\") pod \"ovn-controller-s7nq5-config-ndmn6\" (UID: \"85060605-9da6-428f-bc42-872620763791\") " pod="openstack/ovn-controller-s7nq5-config-ndmn6" Mar 09 09:24:04 crc kubenswrapper[4861]: I0309 09:24:04.218318 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/85060605-9da6-428f-bc42-872620763791-var-run-ovn\") pod 
\"ovn-controller-s7nq5-config-ndmn6\" (UID: \"85060605-9da6-428f-bc42-872620763791\") " pod="openstack/ovn-controller-s7nq5-config-ndmn6" Mar 09 09:24:04 crc kubenswrapper[4861]: I0309 09:24:04.218369 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/85060605-9da6-428f-bc42-872620763791-var-run\") pod \"ovn-controller-s7nq5-config-ndmn6\" (UID: \"85060605-9da6-428f-bc42-872620763791\") " pod="openstack/ovn-controller-s7nq5-config-ndmn6" Mar 09 09:24:04 crc kubenswrapper[4861]: I0309 09:24:04.218860 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/85060605-9da6-428f-bc42-872620763791-additional-scripts\") pod \"ovn-controller-s7nq5-config-ndmn6\" (UID: \"85060605-9da6-428f-bc42-872620763791\") " pod="openstack/ovn-controller-s7nq5-config-ndmn6" Mar 09 09:24:04 crc kubenswrapper[4861]: I0309 09:24:04.221041 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/85060605-9da6-428f-bc42-872620763791-scripts\") pod \"ovn-controller-s7nq5-config-ndmn6\" (UID: \"85060605-9da6-428f-bc42-872620763791\") " pod="openstack/ovn-controller-s7nq5-config-ndmn6" Mar 09 09:24:04 crc kubenswrapper[4861]: I0309 09:24:04.256257 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tddd5\" (UniqueName: \"kubernetes.io/projected/85060605-9da6-428f-bc42-872620763791-kube-api-access-tddd5\") pod \"ovn-controller-s7nq5-config-ndmn6\" (UID: \"85060605-9da6-428f-bc42-872620763791\") " pod="openstack/ovn-controller-s7nq5-config-ndmn6" Mar 09 09:24:04 crc kubenswrapper[4861]: I0309 09:24:04.415945 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-s7nq5-config-ndmn6" Mar 09 09:24:08 crc kubenswrapper[4861]: E0309 09:24:08.250195 4861 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-glance-api@sha256:ed912eee9adeda5c44804688cc7661695a42ab1a40fa46b28bdc819cefa98f07" Mar 09 09:24:08 crc kubenswrapper[4861]: E0309 09:24:08.250796 4861 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:glance-db-sync,Image:quay.io/podified-antelope-centos9/openstack-glance-api@sha256:ed912eee9adeda5c44804688cc7661695a42ab1a40fa46b28bdc819cefa98f07,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/glance/glance.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7drvd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImageP
ullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42415,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42415,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-db-sync-5kg66_openstack(634dd56c-c726-49cc-9a71-ef57a7d0a984): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 09 09:24:08 crc kubenswrapper[4861]: E0309 09:24:08.251961 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/glance-db-sync-5kg66" podUID="634dd56c-c726-49cc-9a71-ef57a7d0a984" Mar 09 09:24:08 crc kubenswrapper[4861]: I0309 09:24:08.359684 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-qjxtd" event={"ID":"6f0d289d-af18-4534-a0c6-c90f51e93fd8","Type":"ContainerDied","Data":"d8a71743eaa0e4cc6c173f2241ce883628cf6a3f3ce210d7d2d56ebbf44c749f"} Mar 09 09:24:08 crc kubenswrapper[4861]: I0309 09:24:08.359746 4861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d8a71743eaa0e4cc6c173f2241ce883628cf6a3f3ce210d7d2d56ebbf44c749f" Mar 09 09:24:08 crc kubenswrapper[4861]: I0309 09:24:08.363451 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-dzt7t" Mar 09 09:24:08 crc kubenswrapper[4861]: I0309 09:24:08.365995 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-dzt7t" event={"ID":"969317e8-5323-499c-84a5-e78139bef859","Type":"ContainerDied","Data":"ee5b84ac96c0f055f8a1de809f2e98670af9832e318109e6c1c079d2c855ba23"} Mar 09 09:24:08 crc kubenswrapper[4861]: I0309 09:24:08.366074 4861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ee5b84ac96c0f055f8a1de809f2e98670af9832e318109e6c1c079d2c855ba23" Mar 09 09:24:08 crc kubenswrapper[4861]: I0309 09:24:08.370661 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-qjxtd" Mar 09 09:24:08 crc kubenswrapper[4861]: E0309 09:24:08.372559 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-glance-api@sha256:ed912eee9adeda5c44804688cc7661695a42ab1a40fa46b28bdc819cefa98f07\\\"\"" pod="openstack/glance-db-sync-5kg66" podUID="634dd56c-c726-49cc-9a71-ef57a7d0a984" Mar 09 09:24:08 crc kubenswrapper[4861]: I0309 09:24:08.489785 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6f0d289d-af18-4534-a0c6-c90f51e93fd8-scripts\") pod \"6f0d289d-af18-4534-a0c6-c90f51e93fd8\" (UID: \"6f0d289d-af18-4534-a0c6-c90f51e93fd8\") " Mar 09 09:24:08 crc kubenswrapper[4861]: I0309 09:24:08.490241 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6f0d289d-af18-4534-a0c6-c90f51e93fd8-swiftconf\") pod \"6f0d289d-af18-4534-a0c6-c90f51e93fd8\" (UID: \"6f0d289d-af18-4534-a0c6-c90f51e93fd8\") " Mar 09 09:24:08 crc kubenswrapper[4861]: I0309 09:24:08.490360 4861 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6f0d289d-af18-4534-a0c6-c90f51e93fd8-ring-data-devices\") pod \"6f0d289d-af18-4534-a0c6-c90f51e93fd8\" (UID: \"6f0d289d-af18-4534-a0c6-c90f51e93fd8\") " Mar 09 09:24:08 crc kubenswrapper[4861]: I0309 09:24:08.490484 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6f0d289d-af18-4534-a0c6-c90f51e93fd8-dispersionconf\") pod \"6f0d289d-af18-4534-a0c6-c90f51e93fd8\" (UID: \"6f0d289d-af18-4534-a0c6-c90f51e93fd8\") " Mar 09 09:24:08 crc kubenswrapper[4861]: I0309 09:24:08.490519 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6f0d289d-af18-4534-a0c6-c90f51e93fd8-etc-swift\") pod \"6f0d289d-af18-4534-a0c6-c90f51e93fd8\" (UID: \"6f0d289d-af18-4534-a0c6-c90f51e93fd8\") " Mar 09 09:24:08 crc kubenswrapper[4861]: I0309 09:24:08.490542 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/969317e8-5323-499c-84a5-e78139bef859-operator-scripts\") pod \"969317e8-5323-499c-84a5-e78139bef859\" (UID: \"969317e8-5323-499c-84a5-e78139bef859\") " Mar 09 09:24:08 crc kubenswrapper[4861]: I0309 09:24:08.490572 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f0d289d-af18-4534-a0c6-c90f51e93fd8-combined-ca-bundle\") pod \"6f0d289d-af18-4534-a0c6-c90f51e93fd8\" (UID: \"6f0d289d-af18-4534-a0c6-c90f51e93fd8\") " Mar 09 09:24:08 crc kubenswrapper[4861]: I0309 09:24:08.490611 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qrn9q\" (UniqueName: \"kubernetes.io/projected/969317e8-5323-499c-84a5-e78139bef859-kube-api-access-qrn9q\") pod 
\"969317e8-5323-499c-84a5-e78139bef859\" (UID: \"969317e8-5323-499c-84a5-e78139bef859\") " Mar 09 09:24:08 crc kubenswrapper[4861]: I0309 09:24:08.490655 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pv4bf\" (UniqueName: \"kubernetes.io/projected/6f0d289d-af18-4534-a0c6-c90f51e93fd8-kube-api-access-pv4bf\") pod \"6f0d289d-af18-4534-a0c6-c90f51e93fd8\" (UID: \"6f0d289d-af18-4534-a0c6-c90f51e93fd8\") " Mar 09 09:24:08 crc kubenswrapper[4861]: I0309 09:24:08.491662 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f0d289d-af18-4534-a0c6-c90f51e93fd8-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "6f0d289d-af18-4534-a0c6-c90f51e93fd8" (UID: "6f0d289d-af18-4534-a0c6-c90f51e93fd8"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:24:08 crc kubenswrapper[4861]: I0309 09:24:08.492467 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f0d289d-af18-4534-a0c6-c90f51e93fd8-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "6f0d289d-af18-4534-a0c6-c90f51e93fd8" (UID: "6f0d289d-af18-4534-a0c6-c90f51e93fd8"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:24:08 crc kubenswrapper[4861]: I0309 09:24:08.492959 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/969317e8-5323-499c-84a5-e78139bef859-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "969317e8-5323-499c-84a5-e78139bef859" (UID: "969317e8-5323-499c-84a5-e78139bef859"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:24:08 crc kubenswrapper[4861]: I0309 09:24:08.495186 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f0d289d-af18-4534-a0c6-c90f51e93fd8-kube-api-access-pv4bf" (OuterVolumeSpecName: "kube-api-access-pv4bf") pod "6f0d289d-af18-4534-a0c6-c90f51e93fd8" (UID: "6f0d289d-af18-4534-a0c6-c90f51e93fd8"). InnerVolumeSpecName "kube-api-access-pv4bf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:24:08 crc kubenswrapper[4861]: I0309 09:24:08.495683 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/969317e8-5323-499c-84a5-e78139bef859-kube-api-access-qrn9q" (OuterVolumeSpecName: "kube-api-access-qrn9q") pod "969317e8-5323-499c-84a5-e78139bef859" (UID: "969317e8-5323-499c-84a5-e78139bef859"). InnerVolumeSpecName "kube-api-access-qrn9q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:24:08 crc kubenswrapper[4861]: I0309 09:24:08.499537 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f0d289d-af18-4534-a0c6-c90f51e93fd8-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "6f0d289d-af18-4534-a0c6-c90f51e93fd8" (UID: "6f0d289d-af18-4534-a0c6-c90f51e93fd8"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:24:08 crc kubenswrapper[4861]: I0309 09:24:08.517314 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f0d289d-af18-4534-a0c6-c90f51e93fd8-scripts" (OuterVolumeSpecName: "scripts") pod "6f0d289d-af18-4534-a0c6-c90f51e93fd8" (UID: "6f0d289d-af18-4534-a0c6-c90f51e93fd8"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:24:08 crc kubenswrapper[4861]: I0309 09:24:08.528111 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f0d289d-af18-4534-a0c6-c90f51e93fd8-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "6f0d289d-af18-4534-a0c6-c90f51e93fd8" (UID: "6f0d289d-af18-4534-a0c6-c90f51e93fd8"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:24:08 crc kubenswrapper[4861]: I0309 09:24:08.535548 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f0d289d-af18-4534-a0c6-c90f51e93fd8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6f0d289d-af18-4534-a0c6-c90f51e93fd8" (UID: "6f0d289d-af18-4534-a0c6-c90f51e93fd8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:24:08 crc kubenswrapper[4861]: I0309 09:24:08.592614 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pv4bf\" (UniqueName: \"kubernetes.io/projected/6f0d289d-af18-4534-a0c6-c90f51e93fd8-kube-api-access-pv4bf\") on node \"crc\" DevicePath \"\"" Mar 09 09:24:08 crc kubenswrapper[4861]: I0309 09:24:08.592638 4861 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6f0d289d-af18-4534-a0c6-c90f51e93fd8-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:24:08 crc kubenswrapper[4861]: I0309 09:24:08.592647 4861 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6f0d289d-af18-4534-a0c6-c90f51e93fd8-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:24:08 crc kubenswrapper[4861]: I0309 09:24:08.592656 4861 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6f0d289d-af18-4534-a0c6-c90f51e93fd8-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 
09:24:08 crc kubenswrapper[4861]: I0309 09:24:08.592664 4861 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6f0d289d-af18-4534-a0c6-c90f51e93fd8-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 09:24:08 crc kubenswrapper[4861]: I0309 09:24:08.592671 4861 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6f0d289d-af18-4534-a0c6-c90f51e93fd8-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 09:24:08 crc kubenswrapper[4861]: I0309 09:24:08.592681 4861 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/969317e8-5323-499c-84a5-e78139bef859-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:24:08 crc kubenswrapper[4861]: I0309 09:24:08.592689 4861 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f0d289d-af18-4534-a0c6-c90f51e93fd8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 09:24:08 crc kubenswrapper[4861]: I0309 09:24:08.592697 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qrn9q\" (UniqueName: \"kubernetes.io/projected/969317e8-5323-499c-84a5-e78139bef859-kube-api-access-qrn9q\") on node \"crc\" DevicePath \"\"" Mar 09 09:24:08 crc kubenswrapper[4861]: I0309 09:24:08.719258 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550804-lfm7c"] Mar 09 09:24:08 crc kubenswrapper[4861]: W0309 09:24:08.720051 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda4a629d1_4904_48e6_9a69_e436c68cfdbc.slice/crio-9707b1c18da480f73cfbb7093fe58e70a4397eadd84e837638a5ef64d893c65a WatchSource:0}: Error finding container 9707b1c18da480f73cfbb7093fe58e70a4397eadd84e837638a5ef64d893c65a: Status 404 returned error can't find the container with id 
9707b1c18da480f73cfbb7093fe58e70a4397eadd84e837638a5ef64d893c65a Mar 09 09:24:08 crc kubenswrapper[4861]: I0309 09:24:08.722403 4861 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 09 09:24:08 crc kubenswrapper[4861]: I0309 09:24:08.788865 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-s7nq5-config-ndmn6"] Mar 09 09:24:08 crc kubenswrapper[4861]: I0309 09:24:08.800682 4861 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-s7nq5" podUID="4e354b06-2ae2-41af-b5d7-2909bca8cff6" containerName="ovn-controller" probeResult="failure" output=< Mar 09 09:24:08 crc kubenswrapper[4861]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 09 09:24:08 crc kubenswrapper[4861]: > Mar 09 09:24:08 crc kubenswrapper[4861]: W0309 09:24:08.820410 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod85060605_9da6_428f_bc42_872620763791.slice/crio-165d37c5f1424d684e4b42ea2b036428bc19d2214a88b46ce1ba33d2823d974c WatchSource:0}: Error finding container 165d37c5f1424d684e4b42ea2b036428bc19d2214a88b46ce1ba33d2823d974c: Status 404 returned error can't find the container with id 165d37c5f1424d684e4b42ea2b036428bc19d2214a88b46ce1ba33d2823d974c Mar 09 09:24:09 crc kubenswrapper[4861]: I0309 09:24:09.374607 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550804-lfm7c" event={"ID":"a4a629d1-4904-48e6-9a69-e436c68cfdbc","Type":"ContainerStarted","Data":"9707b1c18da480f73cfbb7093fe58e70a4397eadd84e837638a5ef64d893c65a"} Mar 09 09:24:09 crc kubenswrapper[4861]: I0309 09:24:09.377747 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"b9b83355-ea40-4408-9b77-c717df91e1a9","Type":"ContainerStarted","Data":"346826633ece68c2b0831d42c9df1b8a5515d5eb3dc4b95444c645750c0ab2c1"} Mar 09 09:24:09 crc kubenswrapper[4861]: I0309 09:24:09.378004 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 09 09:24:09 crc kubenswrapper[4861]: I0309 09:24:09.390730 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"03452acf-c21f-4d68-a813-772c30604a60","Type":"ContainerStarted","Data":"e064de58e735f17e2c137502c0da87ac3828eb3e26b02f3443abe788b0ffe732"} Mar 09 09:24:09 crc kubenswrapper[4861]: I0309 09:24:09.391028 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:24:09 crc kubenswrapper[4861]: I0309 09:24:09.432315 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-qjxtd" Mar 09 09:24:09 crc kubenswrapper[4861]: I0309 09:24:09.432652 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-s7nq5-config-ndmn6" event={"ID":"85060605-9da6-428f-bc42-872620763791","Type":"ContainerStarted","Data":"a7322d7bcc83d5419742604674ffba70d24ba7fbe97cb4ccc03de61b7d2abc47"} Mar 09 09:24:09 crc kubenswrapper[4861]: I0309 09:24:09.432815 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-s7nq5-config-ndmn6" event={"ID":"85060605-9da6-428f-bc42-872620763791","Type":"ContainerStarted","Data":"165d37c5f1424d684e4b42ea2b036428bc19d2214a88b46ce1ba33d2823d974c"} Mar 09 09:24:09 crc kubenswrapper[4861]: I0309 09:24:09.433128 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-dzt7t" Mar 09 09:24:09 crc kubenswrapper[4861]: I0309 09:24:09.466264 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=58.250416355 podStartE2EDuration="1m5.466244753s" podCreationTimestamp="2026-03-09 09:23:04 +0000 UTC" firstStartedPulling="2026-03-09 09:23:20.241502513 +0000 UTC m=+1043.326541924" lastFinishedPulling="2026-03-09 09:23:27.457330921 +0000 UTC m=+1050.542370322" observedRunningTime="2026-03-09 09:24:09.418320524 +0000 UTC m=+1092.503359945" watchObservedRunningTime="2026-03-09 09:24:09.466244753 +0000 UTC m=+1092.551284154" Mar 09 09:24:09 crc kubenswrapper[4861]: I0309 09:24:09.486125 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=59.421608038 podStartE2EDuration="1m6.486105165s" podCreationTimestamp="2026-03-09 09:23:03 +0000 UTC" firstStartedPulling="2026-03-09 09:23:20.241493842 +0000 UTC m=+1043.326533243" lastFinishedPulling="2026-03-09 09:23:27.305990969 +0000 UTC m=+1050.391030370" observedRunningTime="2026-03-09 09:24:09.470336745 +0000 UTC m=+1092.555376146" watchObservedRunningTime="2026-03-09 09:24:09.486105165 +0000 UTC m=+1092.571144566" Mar 09 09:24:10 crc kubenswrapper[4861]: I0309 09:24:10.444422 4861 generic.go:334] "Generic (PLEG): container finished" podID="85060605-9da6-428f-bc42-872620763791" containerID="a7322d7bcc83d5419742604674ffba70d24ba7fbe97cb4ccc03de61b7d2abc47" exitCode=0 Mar 09 09:24:10 crc kubenswrapper[4861]: I0309 09:24:10.444495 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-s7nq5-config-ndmn6" event={"ID":"85060605-9da6-428f-bc42-872620763791","Type":"ContainerDied","Data":"a7322d7bcc83d5419742604674ffba70d24ba7fbe97cb4ccc03de61b7d2abc47"} Mar 09 09:24:10 crc kubenswrapper[4861]: I0309 09:24:10.448571 4861 generic.go:334] "Generic (PLEG): container 
finished" podID="a4a629d1-4904-48e6-9a69-e436c68cfdbc" containerID="e117a32444d886a78c67cb565fdc8024e6468ba42d37d84957d2eaf74bef8159" exitCode=0 Mar 09 09:24:10 crc kubenswrapper[4861]: I0309 09:24:10.448632 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550804-lfm7c" event={"ID":"a4a629d1-4904-48e6-9a69-e436c68cfdbc","Type":"ContainerDied","Data":"e117a32444d886a78c67cb565fdc8024e6468ba42d37d84957d2eaf74bef8159"} Mar 09 09:24:10 crc kubenswrapper[4861]: I0309 09:24:10.780765 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-s7nq5-config-ndmn6" Mar 09 09:24:10 crc kubenswrapper[4861]: I0309 09:24:10.930477 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/85060605-9da6-428f-bc42-872620763791-var-run-ovn\") pod \"85060605-9da6-428f-bc42-872620763791\" (UID: \"85060605-9da6-428f-bc42-872620763791\") " Mar 09 09:24:10 crc kubenswrapper[4861]: I0309 09:24:10.930545 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/85060605-9da6-428f-bc42-872620763791-additional-scripts\") pod \"85060605-9da6-428f-bc42-872620763791\" (UID: \"85060605-9da6-428f-bc42-872620763791\") " Mar 09 09:24:10 crc kubenswrapper[4861]: I0309 09:24:10.930601 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/85060605-9da6-428f-bc42-872620763791-var-log-ovn\") pod \"85060605-9da6-428f-bc42-872620763791\" (UID: \"85060605-9da6-428f-bc42-872620763791\") " Mar 09 09:24:10 crc kubenswrapper[4861]: I0309 09:24:10.930598 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/85060605-9da6-428f-bc42-872620763791-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod 
"85060605-9da6-428f-bc42-872620763791" (UID: "85060605-9da6-428f-bc42-872620763791"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 09:24:10 crc kubenswrapper[4861]: I0309 09:24:10.930654 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/85060605-9da6-428f-bc42-872620763791-scripts\") pod \"85060605-9da6-428f-bc42-872620763791\" (UID: \"85060605-9da6-428f-bc42-872620763791\") " Mar 09 09:24:10 crc kubenswrapper[4861]: I0309 09:24:10.930710 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/85060605-9da6-428f-bc42-872620763791-var-run\") pod \"85060605-9da6-428f-bc42-872620763791\" (UID: \"85060605-9da6-428f-bc42-872620763791\") " Mar 09 09:24:10 crc kubenswrapper[4861]: I0309 09:24:10.930768 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tddd5\" (UniqueName: \"kubernetes.io/projected/85060605-9da6-428f-bc42-872620763791-kube-api-access-tddd5\") pod \"85060605-9da6-428f-bc42-872620763791\" (UID: \"85060605-9da6-428f-bc42-872620763791\") " Mar 09 09:24:10 crc kubenswrapper[4861]: I0309 09:24:10.930757 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/85060605-9da6-428f-bc42-872620763791-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "85060605-9da6-428f-bc42-872620763791" (UID: "85060605-9da6-428f-bc42-872620763791"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 09:24:10 crc kubenswrapper[4861]: I0309 09:24:10.930807 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/85060605-9da6-428f-bc42-872620763791-var-run" (OuterVolumeSpecName: "var-run") pod "85060605-9da6-428f-bc42-872620763791" (UID: "85060605-9da6-428f-bc42-872620763791"). 
InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 09:24:10 crc kubenswrapper[4861]: I0309 09:24:10.931416 4861 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/85060605-9da6-428f-bc42-872620763791-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 09 09:24:10 crc kubenswrapper[4861]: I0309 09:24:10.931445 4861 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/85060605-9da6-428f-bc42-872620763791-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 09 09:24:10 crc kubenswrapper[4861]: I0309 09:24:10.931461 4861 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/85060605-9da6-428f-bc42-872620763791-var-run\") on node \"crc\" DevicePath \"\"" Mar 09 09:24:10 crc kubenswrapper[4861]: I0309 09:24:10.931640 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85060605-9da6-428f-bc42-872620763791-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "85060605-9da6-428f-bc42-872620763791" (UID: "85060605-9da6-428f-bc42-872620763791"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:24:10 crc kubenswrapper[4861]: I0309 09:24:10.932008 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85060605-9da6-428f-bc42-872620763791-scripts" (OuterVolumeSpecName: "scripts") pod "85060605-9da6-428f-bc42-872620763791" (UID: "85060605-9da6-428f-bc42-872620763791"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:24:10 crc kubenswrapper[4861]: I0309 09:24:10.938611 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85060605-9da6-428f-bc42-872620763791-kube-api-access-tddd5" (OuterVolumeSpecName: "kube-api-access-tddd5") pod "85060605-9da6-428f-bc42-872620763791" (UID: "85060605-9da6-428f-bc42-872620763791"). InnerVolumeSpecName "kube-api-access-tddd5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:24:11 crc kubenswrapper[4861]: I0309 09:24:11.033723 4861 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/85060605-9da6-428f-bc42-872620763791-additional-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:24:11 crc kubenswrapper[4861]: I0309 09:24:11.033770 4861 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/85060605-9da6-428f-bc42-872620763791-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:24:11 crc kubenswrapper[4861]: I0309 09:24:11.033781 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tddd5\" (UniqueName: \"kubernetes.io/projected/85060605-9da6-428f-bc42-872620763791-kube-api-access-tddd5\") on node \"crc\" DevicePath \"\"" Mar 09 09:24:11 crc kubenswrapper[4861]: I0309 09:24:11.461367 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-s7nq5-config-ndmn6" event={"ID":"85060605-9da6-428f-bc42-872620763791","Type":"ContainerDied","Data":"165d37c5f1424d684e4b42ea2b036428bc19d2214a88b46ce1ba33d2823d974c"} Mar 09 09:24:11 crc kubenswrapper[4861]: I0309 09:24:11.461444 4861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="165d37c5f1424d684e4b42ea2b036428bc19d2214a88b46ce1ba33d2823d974c" Mar 09 09:24:11 crc kubenswrapper[4861]: I0309 09:24:11.461493 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-s7nq5-config-ndmn6" Mar 09 09:24:11 crc kubenswrapper[4861]: I0309 09:24:11.790789 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550804-lfm7c" Mar 09 09:24:11 crc kubenswrapper[4861]: I0309 09:24:11.848775 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dnk7q\" (UniqueName: \"kubernetes.io/projected/a4a629d1-4904-48e6-9a69-e436c68cfdbc-kube-api-access-dnk7q\") pod \"a4a629d1-4904-48e6-9a69-e436c68cfdbc\" (UID: \"a4a629d1-4904-48e6-9a69-e436c68cfdbc\") " Mar 09 09:24:11 crc kubenswrapper[4861]: I0309 09:24:11.854040 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4a629d1-4904-48e6-9a69-e436c68cfdbc-kube-api-access-dnk7q" (OuterVolumeSpecName: "kube-api-access-dnk7q") pod "a4a629d1-4904-48e6-9a69-e436c68cfdbc" (UID: "a4a629d1-4904-48e6-9a69-e436c68cfdbc"). InnerVolumeSpecName "kube-api-access-dnk7q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:24:11 crc kubenswrapper[4861]: I0309 09:24:11.911744 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-s7nq5-config-ndmn6"] Mar 09 09:24:11 crc kubenswrapper[4861]: I0309 09:24:11.919549 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-s7nq5-config-ndmn6"] Mar 09 09:24:11 crc kubenswrapper[4861]: I0309 09:24:11.950594 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dnk7q\" (UniqueName: \"kubernetes.io/projected/a4a629d1-4904-48e6-9a69-e436c68cfdbc-kube-api-access-dnk7q\") on node \"crc\" DevicePath \"\"" Mar 09 09:24:12 crc kubenswrapper[4861]: I0309 09:24:12.000235 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-dzt7t"] Mar 09 09:24:12 crc kubenswrapper[4861]: I0309 09:24:12.008550 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-dzt7t"] Mar 09 09:24:12 crc kubenswrapper[4861]: I0309 09:24:12.474072 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550804-lfm7c" event={"ID":"a4a629d1-4904-48e6-9a69-e436c68cfdbc","Type":"ContainerDied","Data":"9707b1c18da480f73cfbb7093fe58e70a4397eadd84e837638a5ef64d893c65a"} Mar 09 09:24:12 crc kubenswrapper[4861]: I0309 09:24:12.474158 4861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9707b1c18da480f73cfbb7093fe58e70a4397eadd84e837638a5ef64d893c65a" Mar 09 09:24:12 crc kubenswrapper[4861]: I0309 09:24:12.474322 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550804-lfm7c" Mar 09 09:24:12 crc kubenswrapper[4861]: I0309 09:24:12.849761 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550798-cd9xx"] Mar 09 09:24:12 crc kubenswrapper[4861]: I0309 09:24:12.861057 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550798-cd9xx"] Mar 09 09:24:13 crc kubenswrapper[4861]: I0309 09:24:13.670795 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85060605-9da6-428f-bc42-872620763791" path="/var/lib/kubelet/pods/85060605-9da6-428f-bc42-872620763791/volumes" Mar 09 09:24:13 crc kubenswrapper[4861]: I0309 09:24:13.671494 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="969317e8-5323-499c-84a5-e78139bef859" path="/var/lib/kubelet/pods/969317e8-5323-499c-84a5-e78139bef859/volumes" Mar 09 09:24:13 crc kubenswrapper[4861]: I0309 09:24:13.672085 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e992bde7-1734-4260-9ab3-0da5ab187665" path="/var/lib/kubelet/pods/e992bde7-1734-4260-9ab3-0da5ab187665/volumes" Mar 09 09:24:13 crc kubenswrapper[4861]: I0309 09:24:13.807446 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-s7nq5" Mar 09 09:24:15 crc kubenswrapper[4861]: I0309 09:24:15.102081 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d1ad2fa7-36fc-4cd0-98ac-07b48c42e794-etc-swift\") pod \"swift-storage-0\" (UID: \"d1ad2fa7-36fc-4cd0-98ac-07b48c42e794\") " pod="openstack/swift-storage-0" Mar 09 09:24:15 crc kubenswrapper[4861]: I0309 09:24:15.111643 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d1ad2fa7-36fc-4cd0-98ac-07b48c42e794-etc-swift\") pod \"swift-storage-0\" (UID: 
\"d1ad2fa7-36fc-4cd0-98ac-07b48c42e794\") " pod="openstack/swift-storage-0" Mar 09 09:24:15 crc kubenswrapper[4861]: I0309 09:24:15.282230 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Mar 09 09:24:15 crc kubenswrapper[4861]: I0309 09:24:15.623454 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 09 09:24:16 crc kubenswrapper[4861]: I0309 09:24:16.507412 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d1ad2fa7-36fc-4cd0-98ac-07b48c42e794","Type":"ContainerStarted","Data":"779ffbdab76b0111e52cf69193c185781c4d9dfdfbbfe4c4d79eb094ea5f31c3"} Mar 09 09:24:17 crc kubenswrapper[4861]: I0309 09:24:17.029388 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-7qv25"] Mar 09 09:24:17 crc kubenswrapper[4861]: E0309 09:24:17.030106 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="969317e8-5323-499c-84a5-e78139bef859" containerName="mariadb-account-create-update" Mar 09 09:24:17 crc kubenswrapper[4861]: I0309 09:24:17.030127 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="969317e8-5323-499c-84a5-e78139bef859" containerName="mariadb-account-create-update" Mar 09 09:24:17 crc kubenswrapper[4861]: E0309 09:24:17.030147 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4a629d1-4904-48e6-9a69-e436c68cfdbc" containerName="oc" Mar 09 09:24:17 crc kubenswrapper[4861]: I0309 09:24:17.030156 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4a629d1-4904-48e6-9a69-e436c68cfdbc" containerName="oc" Mar 09 09:24:17 crc kubenswrapper[4861]: E0309 09:24:17.030196 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f0d289d-af18-4534-a0c6-c90f51e93fd8" containerName="swift-ring-rebalance" Mar 09 09:24:17 crc kubenswrapper[4861]: I0309 09:24:17.030204 4861 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="6f0d289d-af18-4534-a0c6-c90f51e93fd8" containerName="swift-ring-rebalance" Mar 09 09:24:17 crc kubenswrapper[4861]: E0309 09:24:17.030227 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85060605-9da6-428f-bc42-872620763791" containerName="ovn-config" Mar 09 09:24:17 crc kubenswrapper[4861]: I0309 09:24:17.030234 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="85060605-9da6-428f-bc42-872620763791" containerName="ovn-config" Mar 09 09:24:17 crc kubenswrapper[4861]: I0309 09:24:17.030575 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4a629d1-4904-48e6-9a69-e436c68cfdbc" containerName="oc" Mar 09 09:24:17 crc kubenswrapper[4861]: I0309 09:24:17.030596 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f0d289d-af18-4534-a0c6-c90f51e93fd8" containerName="swift-ring-rebalance" Mar 09 09:24:17 crc kubenswrapper[4861]: I0309 09:24:17.030617 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="85060605-9da6-428f-bc42-872620763791" containerName="ovn-config" Mar 09 09:24:17 crc kubenswrapper[4861]: I0309 09:24:17.030634 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="969317e8-5323-499c-84a5-e78139bef859" containerName="mariadb-account-create-update" Mar 09 09:24:17 crc kubenswrapper[4861]: I0309 09:24:17.031228 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-7qv25" Mar 09 09:24:17 crc kubenswrapper[4861]: I0309 09:24:17.035727 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Mar 09 09:24:17 crc kubenswrapper[4861]: I0309 09:24:17.038916 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-7qv25"] Mar 09 09:24:17 crc kubenswrapper[4861]: I0309 09:24:17.132328 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6e71176a-f707-4517-83d9-09848f66e7bb-operator-scripts\") pod \"root-account-create-update-7qv25\" (UID: \"6e71176a-f707-4517-83d9-09848f66e7bb\") " pod="openstack/root-account-create-update-7qv25" Mar 09 09:24:17 crc kubenswrapper[4861]: I0309 09:24:17.132471 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rbft\" (UniqueName: \"kubernetes.io/projected/6e71176a-f707-4517-83d9-09848f66e7bb-kube-api-access-8rbft\") pod \"root-account-create-update-7qv25\" (UID: \"6e71176a-f707-4517-83d9-09848f66e7bb\") " pod="openstack/root-account-create-update-7qv25" Mar 09 09:24:17 crc kubenswrapper[4861]: I0309 09:24:17.233473 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6e71176a-f707-4517-83d9-09848f66e7bb-operator-scripts\") pod \"root-account-create-update-7qv25\" (UID: \"6e71176a-f707-4517-83d9-09848f66e7bb\") " pod="openstack/root-account-create-update-7qv25" Mar 09 09:24:17 crc kubenswrapper[4861]: I0309 09:24:17.233566 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rbft\" (UniqueName: \"kubernetes.io/projected/6e71176a-f707-4517-83d9-09848f66e7bb-kube-api-access-8rbft\") pod \"root-account-create-update-7qv25\" (UID: 
\"6e71176a-f707-4517-83d9-09848f66e7bb\") " pod="openstack/root-account-create-update-7qv25" Mar 09 09:24:17 crc kubenswrapper[4861]: I0309 09:24:17.235829 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6e71176a-f707-4517-83d9-09848f66e7bb-operator-scripts\") pod \"root-account-create-update-7qv25\" (UID: \"6e71176a-f707-4517-83d9-09848f66e7bb\") " pod="openstack/root-account-create-update-7qv25" Mar 09 09:24:17 crc kubenswrapper[4861]: I0309 09:24:17.250482 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rbft\" (UniqueName: \"kubernetes.io/projected/6e71176a-f707-4517-83d9-09848f66e7bb-kube-api-access-8rbft\") pod \"root-account-create-update-7qv25\" (UID: \"6e71176a-f707-4517-83d9-09848f66e7bb\") " pod="openstack/root-account-create-update-7qv25" Mar 09 09:24:17 crc kubenswrapper[4861]: I0309 09:24:17.364721 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-7qv25" Mar 09 09:24:17 crc kubenswrapper[4861]: I0309 09:24:17.520071 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d1ad2fa7-36fc-4cd0-98ac-07b48c42e794","Type":"ContainerStarted","Data":"0df686a991e22e7f0dd978843a3db7853e2975a76042009c68208fe6cbae0b7b"} Mar 09 09:24:17 crc kubenswrapper[4861]: I0309 09:24:17.520445 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d1ad2fa7-36fc-4cd0-98ac-07b48c42e794","Type":"ContainerStarted","Data":"ba37f76f23f334328b149f6b6b089408c3b96c7ff2d5a0ec3b186363b91b8c3d"} Mar 09 09:24:17 crc kubenswrapper[4861]: I0309 09:24:17.520465 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d1ad2fa7-36fc-4cd0-98ac-07b48c42e794","Type":"ContainerStarted","Data":"b370523d2ab2d4ab468e28abd9598efe8b53ca4dcdb96926c45051f876560563"} Mar 09 09:24:17 crc kubenswrapper[4861]: I0309 09:24:17.520480 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d1ad2fa7-36fc-4cd0-98ac-07b48c42e794","Type":"ContainerStarted","Data":"8795495ac5f45aaf8ba6fd241f7270d883354b936cacf851b91e392aed4e4fc1"} Mar 09 09:24:17 crc kubenswrapper[4861]: I0309 09:24:17.787574 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-7qv25"] Mar 09 09:24:18 crc kubenswrapper[4861]: I0309 09:24:18.529653 4861 generic.go:334] "Generic (PLEG): container finished" podID="6e71176a-f707-4517-83d9-09848f66e7bb" containerID="1fab9aa85d1803e1d3b8653b33579a54369de0af09e2a9c46f1d7ab50349d561" exitCode=0 Mar 09 09:24:18 crc kubenswrapper[4861]: I0309 09:24:18.529767 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-7qv25" 
event={"ID":"6e71176a-f707-4517-83d9-09848f66e7bb","Type":"ContainerDied","Data":"1fab9aa85d1803e1d3b8653b33579a54369de0af09e2a9c46f1d7ab50349d561"} Mar 09 09:24:18 crc kubenswrapper[4861]: I0309 09:24:18.529985 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-7qv25" event={"ID":"6e71176a-f707-4517-83d9-09848f66e7bb","Type":"ContainerStarted","Data":"f40b1db4efa516fa2140f341fd365ebd4f22789139ea0b3d532ea96a6870c3d1"} Mar 09 09:24:19 crc kubenswrapper[4861]: I0309 09:24:19.540751 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d1ad2fa7-36fc-4cd0-98ac-07b48c42e794","Type":"ContainerStarted","Data":"6ded9f38878551758ef4df9a9e04c4b9fe34b664a33cafa73704bba4372ef8d7"} Mar 09 09:24:19 crc kubenswrapper[4861]: I0309 09:24:19.541016 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d1ad2fa7-36fc-4cd0-98ac-07b48c42e794","Type":"ContainerStarted","Data":"9384162a57b45569d68e35573f0ccfc9f1e891cde8d1990d26d0f73460affd31"} Mar 09 09:24:19 crc kubenswrapper[4861]: I0309 09:24:19.541032 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d1ad2fa7-36fc-4cd0-98ac-07b48c42e794","Type":"ContainerStarted","Data":"bb2b090c1cc4d99587184a62ea986a2d763bc72bfc9ae41c8cf56d5d99104f50"} Mar 09 09:24:19 crc kubenswrapper[4861]: I0309 09:24:19.541044 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d1ad2fa7-36fc-4cd0-98ac-07b48c42e794","Type":"ContainerStarted","Data":"46889ae9c850adcaced2b3956167b2ea35d6730447082e6873257d9983499c86"} Mar 09 09:24:19 crc kubenswrapper[4861]: I0309 09:24:19.847668 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-7qv25" Mar 09 09:24:19 crc kubenswrapper[4861]: I0309 09:24:19.977417 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8rbft\" (UniqueName: \"kubernetes.io/projected/6e71176a-f707-4517-83d9-09848f66e7bb-kube-api-access-8rbft\") pod \"6e71176a-f707-4517-83d9-09848f66e7bb\" (UID: \"6e71176a-f707-4517-83d9-09848f66e7bb\") " Mar 09 09:24:19 crc kubenswrapper[4861]: I0309 09:24:19.977526 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6e71176a-f707-4517-83d9-09848f66e7bb-operator-scripts\") pod \"6e71176a-f707-4517-83d9-09848f66e7bb\" (UID: \"6e71176a-f707-4517-83d9-09848f66e7bb\") " Mar 09 09:24:19 crc kubenswrapper[4861]: I0309 09:24:19.978356 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e71176a-f707-4517-83d9-09848f66e7bb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6e71176a-f707-4517-83d9-09848f66e7bb" (UID: "6e71176a-f707-4517-83d9-09848f66e7bb"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:24:19 crc kubenswrapper[4861]: I0309 09:24:19.982156 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e71176a-f707-4517-83d9-09848f66e7bb-kube-api-access-8rbft" (OuterVolumeSpecName: "kube-api-access-8rbft") pod "6e71176a-f707-4517-83d9-09848f66e7bb" (UID: "6e71176a-f707-4517-83d9-09848f66e7bb"). InnerVolumeSpecName "kube-api-access-8rbft". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:24:20 crc kubenswrapper[4861]: I0309 09:24:20.080013 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8rbft\" (UniqueName: \"kubernetes.io/projected/6e71176a-f707-4517-83d9-09848f66e7bb-kube-api-access-8rbft\") on node \"crc\" DevicePath \"\"" Mar 09 09:24:20 crc kubenswrapper[4861]: I0309 09:24:20.080060 4861 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6e71176a-f707-4517-83d9-09848f66e7bb-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:24:20 crc kubenswrapper[4861]: I0309 09:24:20.549568 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-7qv25" event={"ID":"6e71176a-f707-4517-83d9-09848f66e7bb","Type":"ContainerDied","Data":"f40b1db4efa516fa2140f341fd365ebd4f22789139ea0b3d532ea96a6870c3d1"} Mar 09 09:24:20 crc kubenswrapper[4861]: I0309 09:24:20.550040 4861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f40b1db4efa516fa2140f341fd365ebd4f22789139ea0b3d532ea96a6870c3d1" Mar 09 09:24:20 crc kubenswrapper[4861]: I0309 09:24:20.549654 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-7qv25" Mar 09 09:24:21 crc kubenswrapper[4861]: I0309 09:24:21.564625 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d1ad2fa7-36fc-4cd0-98ac-07b48c42e794","Type":"ContainerStarted","Data":"8017e6982fa018d78322b026a3cc9e408796d7d04deb8500f07b07c94495ea84"} Mar 09 09:24:21 crc kubenswrapper[4861]: I0309 09:24:21.564924 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d1ad2fa7-36fc-4cd0-98ac-07b48c42e794","Type":"ContainerStarted","Data":"ad0ca61b2ba1ba8884bd1b4866064182867a0cae6e8f2edc2118eafcf3cc6622"} Mar 09 09:24:21 crc kubenswrapper[4861]: I0309 09:24:21.564939 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d1ad2fa7-36fc-4cd0-98ac-07b48c42e794","Type":"ContainerStarted","Data":"97c8a645be011eb262f9d2d1e85bb8fc885295490c1259b8eec7925bbcc971da"} Mar 09 09:24:21 crc kubenswrapper[4861]: I0309 09:24:21.564952 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d1ad2fa7-36fc-4cd0-98ac-07b48c42e794","Type":"ContainerStarted","Data":"dab053729d42f2cd0c0c8a48e2b5dbc85ee62d25c822f7cace97e9ee4ad6f353"} Mar 09 09:24:21 crc kubenswrapper[4861]: I0309 09:24:21.564963 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d1ad2fa7-36fc-4cd0-98ac-07b48c42e794","Type":"ContainerStarted","Data":"343cb40dd4264eca021db149ad559b8087bddc4c126f519b10324e4a13c625f9"} Mar 09 09:24:21 crc kubenswrapper[4861]: I0309 09:24:21.564974 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d1ad2fa7-36fc-4cd0-98ac-07b48c42e794","Type":"ContainerStarted","Data":"24381c83f4664fd2de1b5c7f0afdeaefb862cc9312de32570c1ec0557d35d183"} Mar 09 09:24:22 crc kubenswrapper[4861]: I0309 09:24:22.578150 4861 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/swift-storage-0" event={"ID":"d1ad2fa7-36fc-4cd0-98ac-07b48c42e794","Type":"ContainerStarted","Data":"c06884a7c1f002d699262eed0a21c9f37ed46269d3ff4a0ff01a2c27235bd217"} Mar 09 09:24:22 crc kubenswrapper[4861]: I0309 09:24:22.627649 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=37.829602725 podStartE2EDuration="42.627623272s" podCreationTimestamp="2026-03-09 09:23:40 +0000 UTC" firstStartedPulling="2026-03-09 09:24:15.632893645 +0000 UTC m=+1098.717933046" lastFinishedPulling="2026-03-09 09:24:20.430914192 +0000 UTC m=+1103.515953593" observedRunningTime="2026-03-09 09:24:22.617499496 +0000 UTC m=+1105.702538917" watchObservedRunningTime="2026-03-09 09:24:22.627623272 +0000 UTC m=+1105.712662693" Mar 09 09:24:22 crc kubenswrapper[4861]: I0309 09:24:22.873767 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7b684fb9f5-wg2v4"] Mar 09 09:24:22 crc kubenswrapper[4861]: E0309 09:24:22.874197 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e71176a-f707-4517-83d9-09848f66e7bb" containerName="mariadb-account-create-update" Mar 09 09:24:22 crc kubenswrapper[4861]: I0309 09:24:22.874214 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e71176a-f707-4517-83d9-09848f66e7bb" containerName="mariadb-account-create-update" Mar 09 09:24:22 crc kubenswrapper[4861]: I0309 09:24:22.874430 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e71176a-f707-4517-83d9-09848f66e7bb" containerName="mariadb-account-create-update" Mar 09 09:24:22 crc kubenswrapper[4861]: I0309 09:24:22.875475 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7b684fb9f5-wg2v4" Mar 09 09:24:22 crc kubenswrapper[4861]: I0309 09:24:22.877305 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Mar 09 09:24:22 crc kubenswrapper[4861]: I0309 09:24:22.889913 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b684fb9f5-wg2v4"] Mar 09 09:24:22 crc kubenswrapper[4861]: I0309 09:24:22.931219 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/45ff338f-f912-422e-bb87-1db962b83fc4-dns-svc\") pod \"dnsmasq-dns-7b684fb9f5-wg2v4\" (UID: \"45ff338f-f912-422e-bb87-1db962b83fc4\") " pod="openstack/dnsmasq-dns-7b684fb9f5-wg2v4" Mar 09 09:24:22 crc kubenswrapper[4861]: I0309 09:24:22.931262 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/45ff338f-f912-422e-bb87-1db962b83fc4-ovsdbserver-sb\") pod \"dnsmasq-dns-7b684fb9f5-wg2v4\" (UID: \"45ff338f-f912-422e-bb87-1db962b83fc4\") " pod="openstack/dnsmasq-dns-7b684fb9f5-wg2v4" Mar 09 09:24:22 crc kubenswrapper[4861]: I0309 09:24:22.931298 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/45ff338f-f912-422e-bb87-1db962b83fc4-ovsdbserver-nb\") pod \"dnsmasq-dns-7b684fb9f5-wg2v4\" (UID: \"45ff338f-f912-422e-bb87-1db962b83fc4\") " pod="openstack/dnsmasq-dns-7b684fb9f5-wg2v4" Mar 09 09:24:22 crc kubenswrapper[4861]: I0309 09:24:22.931323 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/45ff338f-f912-422e-bb87-1db962b83fc4-dns-swift-storage-0\") pod \"dnsmasq-dns-7b684fb9f5-wg2v4\" (UID: \"45ff338f-f912-422e-bb87-1db962b83fc4\") " 
pod="openstack/dnsmasq-dns-7b684fb9f5-wg2v4" Mar 09 09:24:22 crc kubenswrapper[4861]: I0309 09:24:22.931428 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45ff338f-f912-422e-bb87-1db962b83fc4-config\") pod \"dnsmasq-dns-7b684fb9f5-wg2v4\" (UID: \"45ff338f-f912-422e-bb87-1db962b83fc4\") " pod="openstack/dnsmasq-dns-7b684fb9f5-wg2v4" Mar 09 09:24:22 crc kubenswrapper[4861]: I0309 09:24:22.931461 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k87xj\" (UniqueName: \"kubernetes.io/projected/45ff338f-f912-422e-bb87-1db962b83fc4-kube-api-access-k87xj\") pod \"dnsmasq-dns-7b684fb9f5-wg2v4\" (UID: \"45ff338f-f912-422e-bb87-1db962b83fc4\") " pod="openstack/dnsmasq-dns-7b684fb9f5-wg2v4" Mar 09 09:24:23 crc kubenswrapper[4861]: I0309 09:24:23.032602 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/45ff338f-f912-422e-bb87-1db962b83fc4-dns-svc\") pod \"dnsmasq-dns-7b684fb9f5-wg2v4\" (UID: \"45ff338f-f912-422e-bb87-1db962b83fc4\") " pod="openstack/dnsmasq-dns-7b684fb9f5-wg2v4" Mar 09 09:24:23 crc kubenswrapper[4861]: I0309 09:24:23.032893 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/45ff338f-f912-422e-bb87-1db962b83fc4-ovsdbserver-sb\") pod \"dnsmasq-dns-7b684fb9f5-wg2v4\" (UID: \"45ff338f-f912-422e-bb87-1db962b83fc4\") " pod="openstack/dnsmasq-dns-7b684fb9f5-wg2v4" Mar 09 09:24:23 crc kubenswrapper[4861]: I0309 09:24:23.033000 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/45ff338f-f912-422e-bb87-1db962b83fc4-ovsdbserver-nb\") pod \"dnsmasq-dns-7b684fb9f5-wg2v4\" (UID: \"45ff338f-f912-422e-bb87-1db962b83fc4\") " 
pod="openstack/dnsmasq-dns-7b684fb9f5-wg2v4" Mar 09 09:24:23 crc kubenswrapper[4861]: I0309 09:24:23.033096 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/45ff338f-f912-422e-bb87-1db962b83fc4-dns-swift-storage-0\") pod \"dnsmasq-dns-7b684fb9f5-wg2v4\" (UID: \"45ff338f-f912-422e-bb87-1db962b83fc4\") " pod="openstack/dnsmasq-dns-7b684fb9f5-wg2v4" Mar 09 09:24:23 crc kubenswrapper[4861]: I0309 09:24:23.033251 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45ff338f-f912-422e-bb87-1db962b83fc4-config\") pod \"dnsmasq-dns-7b684fb9f5-wg2v4\" (UID: \"45ff338f-f912-422e-bb87-1db962b83fc4\") " pod="openstack/dnsmasq-dns-7b684fb9f5-wg2v4" Mar 09 09:24:23 crc kubenswrapper[4861]: I0309 09:24:23.033396 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k87xj\" (UniqueName: \"kubernetes.io/projected/45ff338f-f912-422e-bb87-1db962b83fc4-kube-api-access-k87xj\") pod \"dnsmasq-dns-7b684fb9f5-wg2v4\" (UID: \"45ff338f-f912-422e-bb87-1db962b83fc4\") " pod="openstack/dnsmasq-dns-7b684fb9f5-wg2v4" Mar 09 09:24:23 crc kubenswrapper[4861]: I0309 09:24:23.033788 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/45ff338f-f912-422e-bb87-1db962b83fc4-dns-svc\") pod \"dnsmasq-dns-7b684fb9f5-wg2v4\" (UID: \"45ff338f-f912-422e-bb87-1db962b83fc4\") " pod="openstack/dnsmasq-dns-7b684fb9f5-wg2v4" Mar 09 09:24:23 crc kubenswrapper[4861]: I0309 09:24:23.033901 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/45ff338f-f912-422e-bb87-1db962b83fc4-ovsdbserver-nb\") pod \"dnsmasq-dns-7b684fb9f5-wg2v4\" (UID: \"45ff338f-f912-422e-bb87-1db962b83fc4\") " pod="openstack/dnsmasq-dns-7b684fb9f5-wg2v4" Mar 09 09:24:23 crc 
kubenswrapper[4861]: I0309 09:24:23.033898 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/45ff338f-f912-422e-bb87-1db962b83fc4-dns-swift-storage-0\") pod \"dnsmasq-dns-7b684fb9f5-wg2v4\" (UID: \"45ff338f-f912-422e-bb87-1db962b83fc4\") " pod="openstack/dnsmasq-dns-7b684fb9f5-wg2v4" Mar 09 09:24:23 crc kubenswrapper[4861]: I0309 09:24:23.034209 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/45ff338f-f912-422e-bb87-1db962b83fc4-ovsdbserver-sb\") pod \"dnsmasq-dns-7b684fb9f5-wg2v4\" (UID: \"45ff338f-f912-422e-bb87-1db962b83fc4\") " pod="openstack/dnsmasq-dns-7b684fb9f5-wg2v4" Mar 09 09:24:23 crc kubenswrapper[4861]: I0309 09:24:23.034608 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45ff338f-f912-422e-bb87-1db962b83fc4-config\") pod \"dnsmasq-dns-7b684fb9f5-wg2v4\" (UID: \"45ff338f-f912-422e-bb87-1db962b83fc4\") " pod="openstack/dnsmasq-dns-7b684fb9f5-wg2v4" Mar 09 09:24:23 crc kubenswrapper[4861]: I0309 09:24:23.049977 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k87xj\" (UniqueName: \"kubernetes.io/projected/45ff338f-f912-422e-bb87-1db962b83fc4-kube-api-access-k87xj\") pod \"dnsmasq-dns-7b684fb9f5-wg2v4\" (UID: \"45ff338f-f912-422e-bb87-1db962b83fc4\") " pod="openstack/dnsmasq-dns-7b684fb9f5-wg2v4" Mar 09 09:24:23 crc kubenswrapper[4861]: I0309 09:24:23.201696 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7b684fb9f5-wg2v4" Mar 09 09:24:23 crc kubenswrapper[4861]: I0309 09:24:23.723759 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b684fb9f5-wg2v4"] Mar 09 09:24:23 crc kubenswrapper[4861]: W0309 09:24:23.728284 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod45ff338f_f912_422e_bb87_1db962b83fc4.slice/crio-ff7a90ae1a764e1edeab3f7a6760f2f83ea9deef382ab60da6b9cf73efcb3cbb WatchSource:0}: Error finding container ff7a90ae1a764e1edeab3f7a6760f2f83ea9deef382ab60da6b9cf73efcb3cbb: Status 404 returned error can't find the container with id ff7a90ae1a764e1edeab3f7a6760f2f83ea9deef382ab60da6b9cf73efcb3cbb Mar 09 09:24:24 crc kubenswrapper[4861]: I0309 09:24:24.598997 4861 generic.go:334] "Generic (PLEG): container finished" podID="45ff338f-f912-422e-bb87-1db962b83fc4" containerID="26543cef1088372389596a46fd4618f07a932da0e0015f8554a64e7090e44470" exitCode=0 Mar 09 09:24:24 crc kubenswrapper[4861]: I0309 09:24:24.599795 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b684fb9f5-wg2v4" event={"ID":"45ff338f-f912-422e-bb87-1db962b83fc4","Type":"ContainerDied","Data":"26543cef1088372389596a46fd4618f07a932da0e0015f8554a64e7090e44470"} Mar 09 09:24:24 crc kubenswrapper[4861]: I0309 09:24:24.600585 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b684fb9f5-wg2v4" event={"ID":"45ff338f-f912-422e-bb87-1db962b83fc4","Type":"ContainerStarted","Data":"ff7a90ae1a764e1edeab3f7a6760f2f83ea9deef382ab60da6b9cf73efcb3cbb"} Mar 09 09:24:25 crc kubenswrapper[4861]: I0309 09:24:25.274642 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:24:25 crc kubenswrapper[4861]: I0309 09:24:25.561558 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/rabbitmq-server-0" Mar 09 09:24:25 crc kubenswrapper[4861]: I0309 09:24:25.623118 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b684fb9f5-wg2v4" event={"ID":"45ff338f-f912-422e-bb87-1db962b83fc4","Type":"ContainerStarted","Data":"e52331302f6d484869f3c328b0ea4ef009c57e85ef9a0f6696c7579c7b951bd3"} Mar 09 09:24:25 crc kubenswrapper[4861]: I0309 09:24:25.623437 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7b684fb9f5-wg2v4" Mar 09 09:24:25 crc kubenswrapper[4861]: I0309 09:24:25.624908 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-5kg66" event={"ID":"634dd56c-c726-49cc-9a71-ef57a7d0a984","Type":"ContainerStarted","Data":"d43a5bb9b44c57a2c41149c014eb0b93c247b7214b4694b9ba5f4555d49bcb4f"} Mar 09 09:24:25 crc kubenswrapper[4861]: I0309 09:24:25.649867 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7b684fb9f5-wg2v4" podStartSLOduration=3.649848601 podStartE2EDuration="3.649848601s" podCreationTimestamp="2026-03-09 09:24:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:24:25.648580954 +0000 UTC m=+1108.733620375" watchObservedRunningTime="2026-03-09 09:24:25.649848601 +0000 UTC m=+1108.734888002" Mar 09 09:24:27 crc kubenswrapper[4861]: I0309 09:24:27.010250 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-5kg66" podStartSLOduration=6.167819857 podStartE2EDuration="39.010231274s" podCreationTimestamp="2026-03-09 09:23:48 +0000 UTC" firstStartedPulling="2026-03-09 09:23:51.314549142 +0000 UTC m=+1074.399588543" lastFinishedPulling="2026-03-09 09:24:24.156960559 +0000 UTC m=+1107.241999960" observedRunningTime="2026-03-09 09:24:25.668140745 +0000 UTC m=+1108.753180146" watchObservedRunningTime="2026-03-09 09:24:27.010231274 +0000 
UTC m=+1110.095270675" Mar 09 09:24:27 crc kubenswrapper[4861]: I0309 09:24:27.013301 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-crjkx"] Mar 09 09:24:27 crc kubenswrapper[4861]: I0309 09:24:27.014235 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-crjkx" Mar 09 09:24:27 crc kubenswrapper[4861]: I0309 09:24:27.031620 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-crjkx"] Mar 09 09:24:27 crc kubenswrapper[4861]: I0309 09:24:27.100432 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ljps\" (UniqueName: \"kubernetes.io/projected/3cdcc666-9e35-47b3-a84b-0cd31afdc84a-kube-api-access-9ljps\") pod \"cinder-db-create-crjkx\" (UID: \"3cdcc666-9e35-47b3-a84b-0cd31afdc84a\") " pod="openstack/cinder-db-create-crjkx" Mar 09 09:24:27 crc kubenswrapper[4861]: I0309 09:24:27.100561 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3cdcc666-9e35-47b3-a84b-0cd31afdc84a-operator-scripts\") pod \"cinder-db-create-crjkx\" (UID: \"3cdcc666-9e35-47b3-a84b-0cd31afdc84a\") " pod="openstack/cinder-db-create-crjkx" Mar 09 09:24:27 crc kubenswrapper[4861]: I0309 09:24:27.141038 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-f21b-account-create-update-cmptx"] Mar 09 09:24:27 crc kubenswrapper[4861]: I0309 09:24:27.142081 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-f21b-account-create-update-cmptx" Mar 09 09:24:27 crc kubenswrapper[4861]: I0309 09:24:27.144680 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Mar 09 09:24:27 crc kubenswrapper[4861]: I0309 09:24:27.170724 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-f21b-account-create-update-cmptx"] Mar 09 09:24:27 crc kubenswrapper[4861]: I0309 09:24:27.202199 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntz7v\" (UniqueName: \"kubernetes.io/projected/0413fd17-3445-4719-9d24-42d8a9e41905-kube-api-access-ntz7v\") pod \"cinder-f21b-account-create-update-cmptx\" (UID: \"0413fd17-3445-4719-9d24-42d8a9e41905\") " pod="openstack/cinder-f21b-account-create-update-cmptx" Mar 09 09:24:27 crc kubenswrapper[4861]: I0309 09:24:27.202317 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0413fd17-3445-4719-9d24-42d8a9e41905-operator-scripts\") pod \"cinder-f21b-account-create-update-cmptx\" (UID: \"0413fd17-3445-4719-9d24-42d8a9e41905\") " pod="openstack/cinder-f21b-account-create-update-cmptx" Mar 09 09:24:27 crc kubenswrapper[4861]: I0309 09:24:27.202359 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3cdcc666-9e35-47b3-a84b-0cd31afdc84a-operator-scripts\") pod \"cinder-db-create-crjkx\" (UID: \"3cdcc666-9e35-47b3-a84b-0cd31afdc84a\") " pod="openstack/cinder-db-create-crjkx" Mar 09 09:24:27 crc kubenswrapper[4861]: I0309 09:24:27.202410 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9ljps\" (UniqueName: \"kubernetes.io/projected/3cdcc666-9e35-47b3-a84b-0cd31afdc84a-kube-api-access-9ljps\") pod \"cinder-db-create-crjkx\" (UID: 
\"3cdcc666-9e35-47b3-a84b-0cd31afdc84a\") " pod="openstack/cinder-db-create-crjkx" Mar 09 09:24:27 crc kubenswrapper[4861]: I0309 09:24:27.203318 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3cdcc666-9e35-47b3-a84b-0cd31afdc84a-operator-scripts\") pod \"cinder-db-create-crjkx\" (UID: \"3cdcc666-9e35-47b3-a84b-0cd31afdc84a\") " pod="openstack/cinder-db-create-crjkx" Mar 09 09:24:27 crc kubenswrapper[4861]: I0309 09:24:27.221813 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ljps\" (UniqueName: \"kubernetes.io/projected/3cdcc666-9e35-47b3-a84b-0cd31afdc84a-kube-api-access-9ljps\") pod \"cinder-db-create-crjkx\" (UID: \"3cdcc666-9e35-47b3-a84b-0cd31afdc84a\") " pod="openstack/cinder-db-create-crjkx" Mar 09 09:24:27 crc kubenswrapper[4861]: I0309 09:24:27.223195 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-q252w"] Mar 09 09:24:27 crc kubenswrapper[4861]: I0309 09:24:27.225982 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-q252w" Mar 09 09:24:27 crc kubenswrapper[4861]: I0309 09:24:27.253019 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-q252w"] Mar 09 09:24:27 crc kubenswrapper[4861]: I0309 09:24:27.303898 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwq9g\" (UniqueName: \"kubernetes.io/projected/caf1f04f-5f46-4a52-9e17-58aa6a2b61e7-kube-api-access-wwq9g\") pod \"barbican-db-create-q252w\" (UID: \"caf1f04f-5f46-4a52-9e17-58aa6a2b61e7\") " pod="openstack/barbican-db-create-q252w" Mar 09 09:24:27 crc kubenswrapper[4861]: I0309 09:24:27.304245 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/caf1f04f-5f46-4a52-9e17-58aa6a2b61e7-operator-scripts\") pod \"barbican-db-create-q252w\" (UID: \"caf1f04f-5f46-4a52-9e17-58aa6a2b61e7\") " pod="openstack/barbican-db-create-q252w" Mar 09 09:24:27 crc kubenswrapper[4861]: I0309 09:24:27.304326 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ntz7v\" (UniqueName: \"kubernetes.io/projected/0413fd17-3445-4719-9d24-42d8a9e41905-kube-api-access-ntz7v\") pod \"cinder-f21b-account-create-update-cmptx\" (UID: \"0413fd17-3445-4719-9d24-42d8a9e41905\") " pod="openstack/cinder-f21b-account-create-update-cmptx" Mar 09 09:24:27 crc kubenswrapper[4861]: I0309 09:24:27.304692 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0413fd17-3445-4719-9d24-42d8a9e41905-operator-scripts\") pod \"cinder-f21b-account-create-update-cmptx\" (UID: \"0413fd17-3445-4719-9d24-42d8a9e41905\") " pod="openstack/cinder-f21b-account-create-update-cmptx" Mar 09 09:24:27 crc kubenswrapper[4861]: I0309 09:24:27.305430 4861 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0413fd17-3445-4719-9d24-42d8a9e41905-operator-scripts\") pod \"cinder-f21b-account-create-update-cmptx\" (UID: \"0413fd17-3445-4719-9d24-42d8a9e41905\") " pod="openstack/cinder-f21b-account-create-update-cmptx" Mar 09 09:24:27 crc kubenswrapper[4861]: I0309 09:24:27.323580 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntz7v\" (UniqueName: \"kubernetes.io/projected/0413fd17-3445-4719-9d24-42d8a9e41905-kube-api-access-ntz7v\") pod \"cinder-f21b-account-create-update-cmptx\" (UID: \"0413fd17-3445-4719-9d24-42d8a9e41905\") " pod="openstack/cinder-f21b-account-create-update-cmptx" Mar 09 09:24:27 crc kubenswrapper[4861]: I0309 09:24:27.329488 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-crjkx" Mar 09 09:24:27 crc kubenswrapper[4861]: I0309 09:24:27.329797 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-69b0-account-create-update-5pw85"] Mar 09 09:24:27 crc kubenswrapper[4861]: I0309 09:24:27.330765 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-69b0-account-create-update-5pw85" Mar 09 09:24:27 crc kubenswrapper[4861]: I0309 09:24:27.340382 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-69b0-account-create-update-5pw85"] Mar 09 09:24:27 crc kubenswrapper[4861]: I0309 09:24:27.340659 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Mar 09 09:24:27 crc kubenswrapper[4861]: I0309 09:24:27.402628 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-n79q7"] Mar 09 09:24:27 crc kubenswrapper[4861]: I0309 09:24:27.403689 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-n79q7" Mar 09 09:24:27 crc kubenswrapper[4861]: I0309 09:24:27.405971 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-sk77l" Mar 09 09:24:27 crc kubenswrapper[4861]: I0309 09:24:27.407964 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 09 09:24:27 crc kubenswrapper[4861]: I0309 09:24:27.408124 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 09 09:24:27 crc kubenswrapper[4861]: I0309 09:24:27.408395 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 09 09:24:27 crc kubenswrapper[4861]: I0309 09:24:27.416246 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xgj9\" (UniqueName: \"kubernetes.io/projected/40076fb0-0798-4425-a7ec-2638a66ee6f5-kube-api-access-9xgj9\") pod \"barbican-69b0-account-create-update-5pw85\" (UID: \"40076fb0-0798-4425-a7ec-2638a66ee6f5\") " pod="openstack/barbican-69b0-account-create-update-5pw85" Mar 09 09:24:27 crc kubenswrapper[4861]: I0309 09:24:27.416459 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/40076fb0-0798-4425-a7ec-2638a66ee6f5-operator-scripts\") pod \"barbican-69b0-account-create-update-5pw85\" (UID: \"40076fb0-0798-4425-a7ec-2638a66ee6f5\") " pod="openstack/barbican-69b0-account-create-update-5pw85" Mar 09 09:24:27 crc kubenswrapper[4861]: I0309 09:24:27.416731 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wwq9g\" (UniqueName: \"kubernetes.io/projected/caf1f04f-5f46-4a52-9e17-58aa6a2b61e7-kube-api-access-wwq9g\") pod \"barbican-db-create-q252w\" (UID: \"caf1f04f-5f46-4a52-9e17-58aa6a2b61e7\") " 
pod="openstack/barbican-db-create-q252w" Mar 09 09:24:27 crc kubenswrapper[4861]: I0309 09:24:27.416761 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/caf1f04f-5f46-4a52-9e17-58aa6a2b61e7-operator-scripts\") pod \"barbican-db-create-q252w\" (UID: \"caf1f04f-5f46-4a52-9e17-58aa6a2b61e7\") " pod="openstack/barbican-db-create-q252w" Mar 09 09:24:27 crc kubenswrapper[4861]: I0309 09:24:27.417699 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/caf1f04f-5f46-4a52-9e17-58aa6a2b61e7-operator-scripts\") pod \"barbican-db-create-q252w\" (UID: \"caf1f04f-5f46-4a52-9e17-58aa6a2b61e7\") " pod="openstack/barbican-db-create-q252w" Mar 09 09:24:27 crc kubenswrapper[4861]: I0309 09:24:27.427962 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-n79q7"] Mar 09 09:24:27 crc kubenswrapper[4861]: I0309 09:24:27.451894 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwq9g\" (UniqueName: \"kubernetes.io/projected/caf1f04f-5f46-4a52-9e17-58aa6a2b61e7-kube-api-access-wwq9g\") pod \"barbican-db-create-q252w\" (UID: \"caf1f04f-5f46-4a52-9e17-58aa6a2b61e7\") " pod="openstack/barbican-db-create-q252w" Mar 09 09:24:27 crc kubenswrapper[4861]: I0309 09:24:27.456673 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-f21b-account-create-update-cmptx" Mar 09 09:24:27 crc kubenswrapper[4861]: I0309 09:24:27.519882 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9xgj9\" (UniqueName: \"kubernetes.io/projected/40076fb0-0798-4425-a7ec-2638a66ee6f5-kube-api-access-9xgj9\") pod \"barbican-69b0-account-create-update-5pw85\" (UID: \"40076fb0-0798-4425-a7ec-2638a66ee6f5\") " pod="openstack/barbican-69b0-account-create-update-5pw85" Mar 09 09:24:27 crc kubenswrapper[4861]: I0309 09:24:27.519961 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/40076fb0-0798-4425-a7ec-2638a66ee6f5-operator-scripts\") pod \"barbican-69b0-account-create-update-5pw85\" (UID: \"40076fb0-0798-4425-a7ec-2638a66ee6f5\") " pod="openstack/barbican-69b0-account-create-update-5pw85" Mar 09 09:24:27 crc kubenswrapper[4861]: I0309 09:24:27.520017 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24508739-f77b-4cb1-8b0e-bc18a292c6f0-combined-ca-bundle\") pod \"keystone-db-sync-n79q7\" (UID: \"24508739-f77b-4cb1-8b0e-bc18a292c6f0\") " pod="openstack/keystone-db-sync-n79q7" Mar 09 09:24:27 crc kubenswrapper[4861]: I0309 09:24:27.520054 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p854n\" (UniqueName: \"kubernetes.io/projected/24508739-f77b-4cb1-8b0e-bc18a292c6f0-kube-api-access-p854n\") pod \"keystone-db-sync-n79q7\" (UID: \"24508739-f77b-4cb1-8b0e-bc18a292c6f0\") " pod="openstack/keystone-db-sync-n79q7" Mar 09 09:24:27 crc kubenswrapper[4861]: I0309 09:24:27.520092 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/24508739-f77b-4cb1-8b0e-bc18a292c6f0-config-data\") pod \"keystone-db-sync-n79q7\" (UID: \"24508739-f77b-4cb1-8b0e-bc18a292c6f0\") " pod="openstack/keystone-db-sync-n79q7" Mar 09 09:24:27 crc kubenswrapper[4861]: I0309 09:24:27.521758 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/40076fb0-0798-4425-a7ec-2638a66ee6f5-operator-scripts\") pod \"barbican-69b0-account-create-update-5pw85\" (UID: \"40076fb0-0798-4425-a7ec-2638a66ee6f5\") " pod="openstack/barbican-69b0-account-create-update-5pw85" Mar 09 09:24:27 crc kubenswrapper[4861]: I0309 09:24:27.534065 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-6pxsf"] Mar 09 09:24:27 crc kubenswrapper[4861]: I0309 09:24:27.535648 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-6pxsf" Mar 09 09:24:27 crc kubenswrapper[4861]: I0309 09:24:27.546107 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xgj9\" (UniqueName: \"kubernetes.io/projected/40076fb0-0798-4425-a7ec-2638a66ee6f5-kube-api-access-9xgj9\") pod \"barbican-69b0-account-create-update-5pw85\" (UID: \"40076fb0-0798-4425-a7ec-2638a66ee6f5\") " pod="openstack/barbican-69b0-account-create-update-5pw85" Mar 09 09:24:27 crc kubenswrapper[4861]: I0309 09:24:27.558205 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-1884-account-create-update-fl8nw"] Mar 09 09:24:27 crc kubenswrapper[4861]: I0309 09:24:27.559324 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-1884-account-create-update-fl8nw" Mar 09 09:24:27 crc kubenswrapper[4861]: I0309 09:24:27.561340 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Mar 09 09:24:27 crc kubenswrapper[4861]: I0309 09:24:27.567131 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-6pxsf"] Mar 09 09:24:27 crc kubenswrapper[4861]: I0309 09:24:27.578360 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-q252w" Mar 09 09:24:27 crc kubenswrapper[4861]: I0309 09:24:27.595503 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-1884-account-create-update-fl8nw"] Mar 09 09:24:27 crc kubenswrapper[4861]: I0309 09:24:27.623265 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24508739-f77b-4cb1-8b0e-bc18a292c6f0-config-data\") pod \"keystone-db-sync-n79q7\" (UID: \"24508739-f77b-4cb1-8b0e-bc18a292c6f0\") " pod="openstack/keystone-db-sync-n79q7" Mar 09 09:24:27 crc kubenswrapper[4861]: I0309 09:24:27.623313 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwfr9\" (UniqueName: \"kubernetes.io/projected/4263ee95-7df1-4358-80b0-c3516f030ff6-kube-api-access-wwfr9\") pod \"neutron-1884-account-create-update-fl8nw\" (UID: \"4263ee95-7df1-4358-80b0-c3516f030ff6\") " pod="openstack/neutron-1884-account-create-update-fl8nw" Mar 09 09:24:27 crc kubenswrapper[4861]: I0309 09:24:27.623354 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/98aa826b-02b5-4db3-b496-10eb34917427-operator-scripts\") pod \"neutron-db-create-6pxsf\" (UID: \"98aa826b-02b5-4db3-b496-10eb34917427\") " pod="openstack/neutron-db-create-6pxsf" Mar 09 09:24:27 crc 
kubenswrapper[4861]: I0309 09:24:27.623402 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4263ee95-7df1-4358-80b0-c3516f030ff6-operator-scripts\") pod \"neutron-1884-account-create-update-fl8nw\" (UID: \"4263ee95-7df1-4358-80b0-c3516f030ff6\") " pod="openstack/neutron-1884-account-create-update-fl8nw" Mar 09 09:24:27 crc kubenswrapper[4861]: I0309 09:24:27.623460 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gpdsh\" (UniqueName: \"kubernetes.io/projected/98aa826b-02b5-4db3-b496-10eb34917427-kube-api-access-gpdsh\") pod \"neutron-db-create-6pxsf\" (UID: \"98aa826b-02b5-4db3-b496-10eb34917427\") " pod="openstack/neutron-db-create-6pxsf" Mar 09 09:24:27 crc kubenswrapper[4861]: I0309 09:24:27.623490 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24508739-f77b-4cb1-8b0e-bc18a292c6f0-combined-ca-bundle\") pod \"keystone-db-sync-n79q7\" (UID: \"24508739-f77b-4cb1-8b0e-bc18a292c6f0\") " pod="openstack/keystone-db-sync-n79q7" Mar 09 09:24:27 crc kubenswrapper[4861]: I0309 09:24:27.623519 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p854n\" (UniqueName: \"kubernetes.io/projected/24508739-f77b-4cb1-8b0e-bc18a292c6f0-kube-api-access-p854n\") pod \"keystone-db-sync-n79q7\" (UID: \"24508739-f77b-4cb1-8b0e-bc18a292c6f0\") " pod="openstack/keystone-db-sync-n79q7" Mar 09 09:24:27 crc kubenswrapper[4861]: I0309 09:24:27.628611 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24508739-f77b-4cb1-8b0e-bc18a292c6f0-config-data\") pod \"keystone-db-sync-n79q7\" (UID: \"24508739-f77b-4cb1-8b0e-bc18a292c6f0\") " pod="openstack/keystone-db-sync-n79q7" Mar 09 09:24:27 crc 
kubenswrapper[4861]: I0309 09:24:27.632214 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24508739-f77b-4cb1-8b0e-bc18a292c6f0-combined-ca-bundle\") pod \"keystone-db-sync-n79q7\" (UID: \"24508739-f77b-4cb1-8b0e-bc18a292c6f0\") " pod="openstack/keystone-db-sync-n79q7" Mar 09 09:24:27 crc kubenswrapper[4861]: I0309 09:24:27.644047 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p854n\" (UniqueName: \"kubernetes.io/projected/24508739-f77b-4cb1-8b0e-bc18a292c6f0-kube-api-access-p854n\") pod \"keystone-db-sync-n79q7\" (UID: \"24508739-f77b-4cb1-8b0e-bc18a292c6f0\") " pod="openstack/keystone-db-sync-n79q7" Mar 09 09:24:27 crc kubenswrapper[4861]: I0309 09:24:27.724766 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gpdsh\" (UniqueName: \"kubernetes.io/projected/98aa826b-02b5-4db3-b496-10eb34917427-kube-api-access-gpdsh\") pod \"neutron-db-create-6pxsf\" (UID: \"98aa826b-02b5-4db3-b496-10eb34917427\") " pod="openstack/neutron-db-create-6pxsf" Mar 09 09:24:27 crc kubenswrapper[4861]: I0309 09:24:27.724882 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wwfr9\" (UniqueName: \"kubernetes.io/projected/4263ee95-7df1-4358-80b0-c3516f030ff6-kube-api-access-wwfr9\") pod \"neutron-1884-account-create-update-fl8nw\" (UID: \"4263ee95-7df1-4358-80b0-c3516f030ff6\") " pod="openstack/neutron-1884-account-create-update-fl8nw" Mar 09 09:24:27 crc kubenswrapper[4861]: I0309 09:24:27.724934 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/98aa826b-02b5-4db3-b496-10eb34917427-operator-scripts\") pod \"neutron-db-create-6pxsf\" (UID: \"98aa826b-02b5-4db3-b496-10eb34917427\") " pod="openstack/neutron-db-create-6pxsf" Mar 09 09:24:27 crc kubenswrapper[4861]: I0309 
09:24:27.724975 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4263ee95-7df1-4358-80b0-c3516f030ff6-operator-scripts\") pod \"neutron-1884-account-create-update-fl8nw\" (UID: \"4263ee95-7df1-4358-80b0-c3516f030ff6\") " pod="openstack/neutron-1884-account-create-update-fl8nw" Mar 09 09:24:27 crc kubenswrapper[4861]: I0309 09:24:27.726584 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/98aa826b-02b5-4db3-b496-10eb34917427-operator-scripts\") pod \"neutron-db-create-6pxsf\" (UID: \"98aa826b-02b5-4db3-b496-10eb34917427\") " pod="openstack/neutron-db-create-6pxsf" Mar 09 09:24:27 crc kubenswrapper[4861]: I0309 09:24:27.727130 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4263ee95-7df1-4358-80b0-c3516f030ff6-operator-scripts\") pod \"neutron-1884-account-create-update-fl8nw\" (UID: \"4263ee95-7df1-4358-80b0-c3516f030ff6\") " pod="openstack/neutron-1884-account-create-update-fl8nw" Mar 09 09:24:27 crc kubenswrapper[4861]: I0309 09:24:27.745163 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gpdsh\" (UniqueName: \"kubernetes.io/projected/98aa826b-02b5-4db3-b496-10eb34917427-kube-api-access-gpdsh\") pod \"neutron-db-create-6pxsf\" (UID: \"98aa826b-02b5-4db3-b496-10eb34917427\") " pod="openstack/neutron-db-create-6pxsf" Mar 09 09:24:27 crc kubenswrapper[4861]: I0309 09:24:27.745556 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwfr9\" (UniqueName: \"kubernetes.io/projected/4263ee95-7df1-4358-80b0-c3516f030ff6-kube-api-access-wwfr9\") pod \"neutron-1884-account-create-update-fl8nw\" (UID: \"4263ee95-7df1-4358-80b0-c3516f030ff6\") " pod="openstack/neutron-1884-account-create-update-fl8nw" Mar 09 09:24:27 crc kubenswrapper[4861]: 
I0309 09:24:27.802894 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-69b0-account-create-update-5pw85" Mar 09 09:24:27 crc kubenswrapper[4861]: I0309 09:24:27.812260 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-n79q7" Mar 09 09:24:27 crc kubenswrapper[4861]: I0309 09:24:27.882006 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-6pxsf" Mar 09 09:24:27 crc kubenswrapper[4861]: I0309 09:24:27.896451 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-1884-account-create-update-fl8nw" Mar 09 09:24:27 crc kubenswrapper[4861]: I0309 09:24:27.933643 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-crjkx"] Mar 09 09:24:28 crc kubenswrapper[4861]: I0309 09:24:28.081976 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-q252w"] Mar 09 09:24:28 crc kubenswrapper[4861]: W0309 09:24:28.099710 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcaf1f04f_5f46_4a52_9e17_58aa6a2b61e7.slice/crio-fea9963ccde8c8bb66d07772e39a981debc5117bc8fa4d181338da8252ec162f WatchSource:0}: Error finding container fea9963ccde8c8bb66d07772e39a981debc5117bc8fa4d181338da8252ec162f: Status 404 returned error can't find the container with id fea9963ccde8c8bb66d07772e39a981debc5117bc8fa4d181338da8252ec162f Mar 09 09:24:28 crc kubenswrapper[4861]: I0309 09:24:28.100916 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-f21b-account-create-update-cmptx"] Mar 09 09:24:28 crc kubenswrapper[4861]: I0309 09:24:28.280368 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-69b0-account-create-update-5pw85"] Mar 09 09:24:28 crc kubenswrapper[4861]: I0309 09:24:28.675234 4861 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-69b0-account-create-update-5pw85" event={"ID":"40076fb0-0798-4425-a7ec-2638a66ee6f5","Type":"ContainerStarted","Data":"84ca4e525422b3807ce2160168de070c09c37ae2e82c11ecf1efe5cc31bf82e0"} Mar 09 09:24:28 crc kubenswrapper[4861]: I0309 09:24:28.675292 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-69b0-account-create-update-5pw85" event={"ID":"40076fb0-0798-4425-a7ec-2638a66ee6f5","Type":"ContainerStarted","Data":"9b30491254398aec42e2df9abfc1a150c84c5420218519f0d32dfca5f9d2c0ad"} Mar 09 09:24:28 crc kubenswrapper[4861]: I0309 09:24:28.676927 4861 generic.go:334] "Generic (PLEG): container finished" podID="caf1f04f-5f46-4a52-9e17-58aa6a2b61e7" containerID="b1957f5b86896c192575f201cafc98ec5e171cde10b6c66681c4a6d641e37a8f" exitCode=0 Mar 09 09:24:28 crc kubenswrapper[4861]: I0309 09:24:28.676996 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-q252w" event={"ID":"caf1f04f-5f46-4a52-9e17-58aa6a2b61e7","Type":"ContainerDied","Data":"b1957f5b86896c192575f201cafc98ec5e171cde10b6c66681c4a6d641e37a8f"} Mar 09 09:24:28 crc kubenswrapper[4861]: I0309 09:24:28.677041 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-q252w" event={"ID":"caf1f04f-5f46-4a52-9e17-58aa6a2b61e7","Type":"ContainerStarted","Data":"fea9963ccde8c8bb66d07772e39a981debc5117bc8fa4d181338da8252ec162f"} Mar 09 09:24:28 crc kubenswrapper[4861]: I0309 09:24:28.680049 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-f21b-account-create-update-cmptx" event={"ID":"0413fd17-3445-4719-9d24-42d8a9e41905","Type":"ContainerStarted","Data":"d2d5b3f66c142b0dee4cee1521dfa5ac2cec1368e09d96d9f53efdc2af1208fa"} Mar 09 09:24:28 crc kubenswrapper[4861]: I0309 09:24:28.680089 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-f21b-account-create-update-cmptx" 
event={"ID":"0413fd17-3445-4719-9d24-42d8a9e41905","Type":"ContainerStarted","Data":"bf2e050857855f85cc5f90dc01a95c6b4b0377d8a0c8afea992341db25244c10"} Mar 09 09:24:28 crc kubenswrapper[4861]: I0309 09:24:28.694625 4861 generic.go:334] "Generic (PLEG): container finished" podID="3cdcc666-9e35-47b3-a84b-0cd31afdc84a" containerID="40d0a28810d1725fd9d9998dfc3cd8acc2dc4f3e07ec8b89cae28d4bf9d16b10" exitCode=0 Mar 09 09:24:28 crc kubenswrapper[4861]: I0309 09:24:28.694681 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-crjkx" event={"ID":"3cdcc666-9e35-47b3-a84b-0cd31afdc84a","Type":"ContainerDied","Data":"40d0a28810d1725fd9d9998dfc3cd8acc2dc4f3e07ec8b89cae28d4bf9d16b10"} Mar 09 09:24:28 crc kubenswrapper[4861]: I0309 09:24:28.694713 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-crjkx" event={"ID":"3cdcc666-9e35-47b3-a84b-0cd31afdc84a","Type":"ContainerStarted","Data":"2f8bc50804570b3256296169da2462d90096d5c3b2210d24c30f01d233b7109d"} Mar 09 09:24:28 crc kubenswrapper[4861]: I0309 09:24:28.695653 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-69b0-account-create-update-5pw85" podStartSLOduration=1.6956342960000002 podStartE2EDuration="1.695634296s" podCreationTimestamp="2026-03-09 09:24:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:24:28.694952987 +0000 UTC m=+1111.779992388" watchObservedRunningTime="2026-03-09 09:24:28.695634296 +0000 UTC m=+1111.780673697" Mar 09 09:24:28 crc kubenswrapper[4861]: I0309 09:24:28.746096 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-f21b-account-create-update-cmptx" podStartSLOduration=1.7460788489999999 podStartE2EDuration="1.746078849s" podCreationTimestamp="2026-03-09 09:24:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:24:28.74543378 +0000 UTC m=+1111.830473181" watchObservedRunningTime="2026-03-09 09:24:28.746078849 +0000 UTC m=+1111.831118250" Mar 09 09:24:28 crc kubenswrapper[4861]: I0309 09:24:28.939413 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-n79q7"] Mar 09 09:24:29 crc kubenswrapper[4861]: I0309 09:24:29.011982 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-6pxsf"] Mar 09 09:24:29 crc kubenswrapper[4861]: W0309 09:24:29.022456 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod98aa826b_02b5_4db3_b496_10eb34917427.slice/crio-7156865a300391701953e7744af079b041817a8282f2c60a0327701b5f896e94 WatchSource:0}: Error finding container 7156865a300391701953e7744af079b041817a8282f2c60a0327701b5f896e94: Status 404 returned error can't find the container with id 7156865a300391701953e7744af079b041817a8282f2c60a0327701b5f896e94 Mar 09 09:24:29 crc kubenswrapper[4861]: I0309 09:24:29.024359 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-1884-account-create-update-fl8nw"] Mar 09 09:24:29 crc kubenswrapper[4861]: I0309 09:24:29.711009 4861 generic.go:334] "Generic (PLEG): container finished" podID="0413fd17-3445-4719-9d24-42d8a9e41905" containerID="d2d5b3f66c142b0dee4cee1521dfa5ac2cec1368e09d96d9f53efdc2af1208fa" exitCode=0 Mar 09 09:24:29 crc kubenswrapper[4861]: I0309 09:24:29.711090 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-f21b-account-create-update-cmptx" event={"ID":"0413fd17-3445-4719-9d24-42d8a9e41905","Type":"ContainerDied","Data":"d2d5b3f66c142b0dee4cee1521dfa5ac2cec1368e09d96d9f53efdc2af1208fa"} Mar 09 09:24:29 crc kubenswrapper[4861]: I0309 09:24:29.713793 4861 generic.go:334] "Generic (PLEG): container finished" podID="98aa826b-02b5-4db3-b496-10eb34917427" 
containerID="4b60ef0028e183b2bf765de01466d8955fe335836880888abeb310abea8d615a" exitCode=0 Mar 09 09:24:29 crc kubenswrapper[4861]: I0309 09:24:29.713864 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-6pxsf" event={"ID":"98aa826b-02b5-4db3-b496-10eb34917427","Type":"ContainerDied","Data":"4b60ef0028e183b2bf765de01466d8955fe335836880888abeb310abea8d615a"} Mar 09 09:24:29 crc kubenswrapper[4861]: I0309 09:24:29.713895 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-6pxsf" event={"ID":"98aa826b-02b5-4db3-b496-10eb34917427","Type":"ContainerStarted","Data":"7156865a300391701953e7744af079b041817a8282f2c60a0327701b5f896e94"} Mar 09 09:24:29 crc kubenswrapper[4861]: I0309 09:24:29.715470 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-n79q7" event={"ID":"24508739-f77b-4cb1-8b0e-bc18a292c6f0","Type":"ContainerStarted","Data":"9fa2cb52ec902b485647992296acd578feb543379d25b564e250d2b1271f171b"} Mar 09 09:24:29 crc kubenswrapper[4861]: I0309 09:24:29.716609 4861 generic.go:334] "Generic (PLEG): container finished" podID="4263ee95-7df1-4358-80b0-c3516f030ff6" containerID="22d03a47e72066afbb0d4bf06ca8e439442024c3d3063ba6f9eddfa5d222ea60" exitCode=0 Mar 09 09:24:29 crc kubenswrapper[4861]: I0309 09:24:29.716661 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-1884-account-create-update-fl8nw" event={"ID":"4263ee95-7df1-4358-80b0-c3516f030ff6","Type":"ContainerDied","Data":"22d03a47e72066afbb0d4bf06ca8e439442024c3d3063ba6f9eddfa5d222ea60"} Mar 09 09:24:29 crc kubenswrapper[4861]: I0309 09:24:29.716686 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-1884-account-create-update-fl8nw" event={"ID":"4263ee95-7df1-4358-80b0-c3516f030ff6","Type":"ContainerStarted","Data":"6b993bc82f6454a047563cdab61fcc0a385d1a047a7696c5fd64a3bc6a5ca659"} Mar 09 09:24:29 crc kubenswrapper[4861]: I0309 09:24:29.717813 4861 
generic.go:334] "Generic (PLEG): container finished" podID="40076fb0-0798-4425-a7ec-2638a66ee6f5" containerID="84ca4e525422b3807ce2160168de070c09c37ae2e82c11ecf1efe5cc31bf82e0" exitCode=0
Mar 09 09:24:29 crc kubenswrapper[4861]: I0309 09:24:29.718007 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-69b0-account-create-update-5pw85" event={"ID":"40076fb0-0798-4425-a7ec-2638a66ee6f5","Type":"ContainerDied","Data":"84ca4e525422b3807ce2160168de070c09c37ae2e82c11ecf1efe5cc31bf82e0"}
Mar 09 09:24:30 crc kubenswrapper[4861]: I0309 09:24:30.137963 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-q252w"
Mar 09 09:24:30 crc kubenswrapper[4861]: I0309 09:24:30.146796 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-crjkx"
Mar 09 09:24:30 crc kubenswrapper[4861]: I0309 09:24:30.176359 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wwq9g\" (UniqueName: \"kubernetes.io/projected/caf1f04f-5f46-4a52-9e17-58aa6a2b61e7-kube-api-access-wwq9g\") pod \"caf1f04f-5f46-4a52-9e17-58aa6a2b61e7\" (UID: \"caf1f04f-5f46-4a52-9e17-58aa6a2b61e7\") "
Mar 09 09:24:30 crc kubenswrapper[4861]: I0309 09:24:30.176453 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/caf1f04f-5f46-4a52-9e17-58aa6a2b61e7-operator-scripts\") pod \"caf1f04f-5f46-4a52-9e17-58aa6a2b61e7\" (UID: \"caf1f04f-5f46-4a52-9e17-58aa6a2b61e7\") "
Mar 09 09:24:30 crc kubenswrapper[4861]: I0309 09:24:30.176473 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3cdcc666-9e35-47b3-a84b-0cd31afdc84a-operator-scripts\") pod \"3cdcc666-9e35-47b3-a84b-0cd31afdc84a\" (UID: \"3cdcc666-9e35-47b3-a84b-0cd31afdc84a\") "
Mar 09 09:24:30 crc kubenswrapper[4861]: I0309 09:24:30.176632 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9ljps\" (UniqueName: \"kubernetes.io/projected/3cdcc666-9e35-47b3-a84b-0cd31afdc84a-kube-api-access-9ljps\") pod \"3cdcc666-9e35-47b3-a84b-0cd31afdc84a\" (UID: \"3cdcc666-9e35-47b3-a84b-0cd31afdc84a\") "
Mar 09 09:24:30 crc kubenswrapper[4861]: I0309 09:24:30.177461 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cdcc666-9e35-47b3-a84b-0cd31afdc84a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3cdcc666-9e35-47b3-a84b-0cd31afdc84a" (UID: "3cdcc666-9e35-47b3-a84b-0cd31afdc84a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:24:30 crc kubenswrapper[4861]: I0309 09:24:30.177591 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/caf1f04f-5f46-4a52-9e17-58aa6a2b61e7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "caf1f04f-5f46-4a52-9e17-58aa6a2b61e7" (UID: "caf1f04f-5f46-4a52-9e17-58aa6a2b61e7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:24:30 crc kubenswrapper[4861]: I0309 09:24:30.201642 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cdcc666-9e35-47b3-a84b-0cd31afdc84a-kube-api-access-9ljps" (OuterVolumeSpecName: "kube-api-access-9ljps") pod "3cdcc666-9e35-47b3-a84b-0cd31afdc84a" (UID: "3cdcc666-9e35-47b3-a84b-0cd31afdc84a"). InnerVolumeSpecName "kube-api-access-9ljps". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:24:30 crc kubenswrapper[4861]: I0309 09:24:30.202315 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/caf1f04f-5f46-4a52-9e17-58aa6a2b61e7-kube-api-access-wwq9g" (OuterVolumeSpecName: "kube-api-access-wwq9g") pod "caf1f04f-5f46-4a52-9e17-58aa6a2b61e7" (UID: "caf1f04f-5f46-4a52-9e17-58aa6a2b61e7"). InnerVolumeSpecName "kube-api-access-wwq9g". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:24:30 crc kubenswrapper[4861]: I0309 09:24:30.279299 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wwq9g\" (UniqueName: \"kubernetes.io/projected/caf1f04f-5f46-4a52-9e17-58aa6a2b61e7-kube-api-access-wwq9g\") on node \"crc\" DevicePath \"\""
Mar 09 09:24:30 crc kubenswrapper[4861]: I0309 09:24:30.279411 4861 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/caf1f04f-5f46-4a52-9e17-58aa6a2b61e7-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 09 09:24:30 crc kubenswrapper[4861]: I0309 09:24:30.279422 4861 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3cdcc666-9e35-47b3-a84b-0cd31afdc84a-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 09 09:24:30 crc kubenswrapper[4861]: I0309 09:24:30.279433 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9ljps\" (UniqueName: \"kubernetes.io/projected/3cdcc666-9e35-47b3-a84b-0cd31afdc84a-kube-api-access-9ljps\") on node \"crc\" DevicePath \"\""
Mar 09 09:24:30 crc kubenswrapper[4861]: I0309 09:24:30.726671 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-q252w" event={"ID":"caf1f04f-5f46-4a52-9e17-58aa6a2b61e7","Type":"ContainerDied","Data":"fea9963ccde8c8bb66d07772e39a981debc5117bc8fa4d181338da8252ec162f"}
Mar 09 09:24:30 crc kubenswrapper[4861]: I0309 09:24:30.726717 4861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fea9963ccde8c8bb66d07772e39a981debc5117bc8fa4d181338da8252ec162f"
Mar 09 09:24:30 crc kubenswrapper[4861]: I0309 09:24:30.726737 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-q252w"
Mar 09 09:24:30 crc kubenswrapper[4861]: I0309 09:24:30.729427 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-crjkx"
Mar 09 09:24:30 crc kubenswrapper[4861]: I0309 09:24:30.731516 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-crjkx" event={"ID":"3cdcc666-9e35-47b3-a84b-0cd31afdc84a","Type":"ContainerDied","Data":"2f8bc50804570b3256296169da2462d90096d5c3b2210d24c30f01d233b7109d"}
Mar 09 09:24:30 crc kubenswrapper[4861]: I0309 09:24:30.731578 4861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2f8bc50804570b3256296169da2462d90096d5c3b2210d24c30f01d233b7109d"
Mar 09 09:24:32 crc kubenswrapper[4861]: I0309 09:24:32.748440 4861 generic.go:334] "Generic (PLEG): container finished" podID="634dd56c-c726-49cc-9a71-ef57a7d0a984" containerID="d43a5bb9b44c57a2c41149c014eb0b93c247b7214b4694b9ba5f4555d49bcb4f" exitCode=0
Mar 09 09:24:32 crc kubenswrapper[4861]: I0309 09:24:32.748528 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-5kg66" event={"ID":"634dd56c-c726-49cc-9a71-ef57a7d0a984","Type":"ContainerDied","Data":"d43a5bb9b44c57a2c41149c014eb0b93c247b7214b4694b9ba5f4555d49bcb4f"}
Mar 09 09:24:33 crc kubenswrapper[4861]: I0309 09:24:33.203642 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7b684fb9f5-wg2v4"
Mar 09 09:24:33 crc kubenswrapper[4861]: I0309 09:24:33.275141 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f7dd995-89gfl"]
Mar 09 09:24:33 crc kubenswrapper[4861]: I0309 09:24:33.275468 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-675f7dd995-89gfl" podUID="93d06a80-e66b-4934-8175-b3c6cb1032a9" containerName="dnsmasq-dns" containerID="cri-o://b197f180ac090e170e93e652be3e09eb33318df6ba4c570ab71921b801e1c882" gracePeriod=10
Mar 09 09:24:33 crc kubenswrapper[4861]: I0309 09:24:33.573170 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-6pxsf"
Mar 09 09:24:33 crc kubenswrapper[4861]: I0309 09:24:33.581031 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-1884-account-create-update-fl8nw"
Mar 09 09:24:33 crc kubenswrapper[4861]: I0309 09:24:33.591238 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-f21b-account-create-update-cmptx"
Mar 09 09:24:33 crc kubenswrapper[4861]: I0309 09:24:33.598915 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-69b0-account-create-update-5pw85"
Mar 09 09:24:33 crc kubenswrapper[4861]: I0309 09:24:33.687192 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f7dd995-89gfl"
Mar 09 09:24:33 crc kubenswrapper[4861]: I0309 09:24:33.732438 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wwfr9\" (UniqueName: \"kubernetes.io/projected/4263ee95-7df1-4358-80b0-c3516f030ff6-kube-api-access-wwfr9\") pod \"4263ee95-7df1-4358-80b0-c3516f030ff6\" (UID: \"4263ee95-7df1-4358-80b0-c3516f030ff6\") "
Mar 09 09:24:33 crc kubenswrapper[4861]: I0309 09:24:33.732692 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ntz7v\" (UniqueName: \"kubernetes.io/projected/0413fd17-3445-4719-9d24-42d8a9e41905-kube-api-access-ntz7v\") pod \"0413fd17-3445-4719-9d24-42d8a9e41905\" (UID: \"0413fd17-3445-4719-9d24-42d8a9e41905\") "
Mar 09 09:24:33 crc kubenswrapper[4861]: I0309 09:24:33.732800 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0413fd17-3445-4719-9d24-42d8a9e41905-operator-scripts\") pod \"0413fd17-3445-4719-9d24-42d8a9e41905\" (UID: \"0413fd17-3445-4719-9d24-42d8a9e41905\") "
Mar 09 09:24:33 crc kubenswrapper[4861]: I0309 09:24:33.732871 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/98aa826b-02b5-4db3-b496-10eb34917427-operator-scripts\") pod \"98aa826b-02b5-4db3-b496-10eb34917427\" (UID: \"98aa826b-02b5-4db3-b496-10eb34917427\") "
Mar 09 09:24:33 crc kubenswrapper[4861]: I0309 09:24:33.732955 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/40076fb0-0798-4425-a7ec-2638a66ee6f5-operator-scripts\") pod \"40076fb0-0798-4425-a7ec-2638a66ee6f5\" (UID: \"40076fb0-0798-4425-a7ec-2638a66ee6f5\") "
Mar 09 09:24:33 crc kubenswrapper[4861]: I0309 09:24:33.733039 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4263ee95-7df1-4358-80b0-c3516f030ff6-operator-scripts\") pod \"4263ee95-7df1-4358-80b0-c3516f030ff6\" (UID: \"4263ee95-7df1-4358-80b0-c3516f030ff6\") "
Mar 09 09:24:33 crc kubenswrapper[4861]: I0309 09:24:33.733115 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xgj9\" (UniqueName: \"kubernetes.io/projected/40076fb0-0798-4425-a7ec-2638a66ee6f5-kube-api-access-9xgj9\") pod \"40076fb0-0798-4425-a7ec-2638a66ee6f5\" (UID: \"40076fb0-0798-4425-a7ec-2638a66ee6f5\") "
Mar 09 09:24:33 crc kubenswrapper[4861]: I0309 09:24:33.733241 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gpdsh\" (UniqueName: \"kubernetes.io/projected/98aa826b-02b5-4db3-b496-10eb34917427-kube-api-access-gpdsh\") pod \"98aa826b-02b5-4db3-b496-10eb34917427\" (UID: \"98aa826b-02b5-4db3-b496-10eb34917427\") "
Mar 09 09:24:33 crc kubenswrapper[4861]: I0309 09:24:33.733502 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0413fd17-3445-4719-9d24-42d8a9e41905-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0413fd17-3445-4719-9d24-42d8a9e41905" (UID: "0413fd17-3445-4719-9d24-42d8a9e41905"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:24:33 crc kubenswrapper[4861]: I0309 09:24:33.733701 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40076fb0-0798-4425-a7ec-2638a66ee6f5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "40076fb0-0798-4425-a7ec-2638a66ee6f5" (UID: "40076fb0-0798-4425-a7ec-2638a66ee6f5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:24:33 crc kubenswrapper[4861]: I0309 09:24:33.733817 4861 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0413fd17-3445-4719-9d24-42d8a9e41905-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 09 09:24:33 crc kubenswrapper[4861]: I0309 09:24:33.733832 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4263ee95-7df1-4358-80b0-c3516f030ff6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4263ee95-7df1-4358-80b0-c3516f030ff6" (UID: "4263ee95-7df1-4358-80b0-c3516f030ff6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:24:33 crc kubenswrapper[4861]: I0309 09:24:33.733899 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98aa826b-02b5-4db3-b496-10eb34917427-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "98aa826b-02b5-4db3-b496-10eb34917427" (UID: "98aa826b-02b5-4db3-b496-10eb34917427"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:24:33 crc kubenswrapper[4861]: I0309 09:24:33.737915 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4263ee95-7df1-4358-80b0-c3516f030ff6-kube-api-access-wwfr9" (OuterVolumeSpecName: "kube-api-access-wwfr9") pod "4263ee95-7df1-4358-80b0-c3516f030ff6" (UID: "4263ee95-7df1-4358-80b0-c3516f030ff6"). InnerVolumeSpecName "kube-api-access-wwfr9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:24:33 crc kubenswrapper[4861]: I0309 09:24:33.738133 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40076fb0-0798-4425-a7ec-2638a66ee6f5-kube-api-access-9xgj9" (OuterVolumeSpecName: "kube-api-access-9xgj9") pod "40076fb0-0798-4425-a7ec-2638a66ee6f5" (UID: "40076fb0-0798-4425-a7ec-2638a66ee6f5"). InnerVolumeSpecName "kube-api-access-9xgj9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:24:33 crc kubenswrapper[4861]: I0309 09:24:33.738522 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0413fd17-3445-4719-9d24-42d8a9e41905-kube-api-access-ntz7v" (OuterVolumeSpecName: "kube-api-access-ntz7v") pod "0413fd17-3445-4719-9d24-42d8a9e41905" (UID: "0413fd17-3445-4719-9d24-42d8a9e41905"). InnerVolumeSpecName "kube-api-access-ntz7v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:24:33 crc kubenswrapper[4861]: I0309 09:24:33.738740 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98aa826b-02b5-4db3-b496-10eb34917427-kube-api-access-gpdsh" (OuterVolumeSpecName: "kube-api-access-gpdsh") pod "98aa826b-02b5-4db3-b496-10eb34917427" (UID: "98aa826b-02b5-4db3-b496-10eb34917427"). InnerVolumeSpecName "kube-api-access-gpdsh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:24:33 crc kubenswrapper[4861]: I0309 09:24:33.758481 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-f21b-account-create-update-cmptx"
Mar 09 09:24:33 crc kubenswrapper[4861]: I0309 09:24:33.758480 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-f21b-account-create-update-cmptx" event={"ID":"0413fd17-3445-4719-9d24-42d8a9e41905","Type":"ContainerDied","Data":"bf2e050857855f85cc5f90dc01a95c6b4b0377d8a0c8afea992341db25244c10"}
Mar 09 09:24:33 crc kubenswrapper[4861]: I0309 09:24:33.758578 4861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bf2e050857855f85cc5f90dc01a95c6b4b0377d8a0c8afea992341db25244c10"
Mar 09 09:24:33 crc kubenswrapper[4861]: I0309 09:24:33.762104 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-6pxsf" event={"ID":"98aa826b-02b5-4db3-b496-10eb34917427","Type":"ContainerDied","Data":"7156865a300391701953e7744af079b041817a8282f2c60a0327701b5f896e94"}
Mar 09 09:24:33 crc kubenswrapper[4861]: I0309 09:24:33.762127 4861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7156865a300391701953e7744af079b041817a8282f2c60a0327701b5f896e94"
Mar 09 09:24:33 crc kubenswrapper[4861]: I0309 09:24:33.762214 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-6pxsf"
Mar 09 09:24:33 crc kubenswrapper[4861]: I0309 09:24:33.764569 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-n79q7" event={"ID":"24508739-f77b-4cb1-8b0e-bc18a292c6f0","Type":"ContainerStarted","Data":"a66850429ce2fd114f33f4f8a53c612097eff74b5c4b688fc3a6dd8e084522ab"}
Mar 09 09:24:33 crc kubenswrapper[4861]: I0309 09:24:33.767712 4861 generic.go:334] "Generic (PLEG): container finished" podID="93d06a80-e66b-4934-8175-b3c6cb1032a9" containerID="b197f180ac090e170e93e652be3e09eb33318df6ba4c570ab71921b801e1c882" exitCode=0
Mar 09 09:24:33 crc kubenswrapper[4861]: I0309 09:24:33.767812 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f7dd995-89gfl" event={"ID":"93d06a80-e66b-4934-8175-b3c6cb1032a9","Type":"ContainerDied","Data":"b197f180ac090e170e93e652be3e09eb33318df6ba4c570ab71921b801e1c882"}
Mar 09 09:24:33 crc kubenswrapper[4861]: I0309 09:24:33.767881 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f7dd995-89gfl" event={"ID":"93d06a80-e66b-4934-8175-b3c6cb1032a9","Type":"ContainerDied","Data":"9ea281cc13a2d0404fd3aae4c8c6d372f48d614ae14b4b6855504bf3f548b906"}
Mar 09 09:24:33 crc kubenswrapper[4861]: I0309 09:24:33.767914 4861 scope.go:117] "RemoveContainer" containerID="b197f180ac090e170e93e652be3e09eb33318df6ba4c570ab71921b801e1c882"
Mar 09 09:24:33 crc kubenswrapper[4861]: I0309 09:24:33.768133 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f7dd995-89gfl"
Mar 09 09:24:33 crc kubenswrapper[4861]: I0309 09:24:33.779013 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-1884-account-create-update-fl8nw"
Mar 09 09:24:33 crc kubenswrapper[4861]: I0309 09:24:33.779038 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-1884-account-create-update-fl8nw" event={"ID":"4263ee95-7df1-4358-80b0-c3516f030ff6","Type":"ContainerDied","Data":"6b993bc82f6454a047563cdab61fcc0a385d1a047a7696c5fd64a3bc6a5ca659"}
Mar 09 09:24:33 crc kubenswrapper[4861]: I0309 09:24:33.779111 4861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6b993bc82f6454a047563cdab61fcc0a385d1a047a7696c5fd64a3bc6a5ca659"
Mar 09 09:24:33 crc kubenswrapper[4861]: I0309 09:24:33.791693 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-69b0-account-create-update-5pw85" event={"ID":"40076fb0-0798-4425-a7ec-2638a66ee6f5","Type":"ContainerDied","Data":"9b30491254398aec42e2df9abfc1a150c84c5420218519f0d32dfca5f9d2c0ad"}
Mar 09 09:24:33 crc kubenswrapper[4861]: I0309 09:24:33.791766 4861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9b30491254398aec42e2df9abfc1a150c84c5420218519f0d32dfca5f9d2c0ad"
Mar 09 09:24:33 crc kubenswrapper[4861]: I0309 09:24:33.791843 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-69b0-account-create-update-5pw85"
Mar 09 09:24:33 crc kubenswrapper[4861]: I0309 09:24:33.797708 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-n79q7" podStartSLOduration=2.384995431 podStartE2EDuration="6.797690252s" podCreationTimestamp="2026-03-09 09:24:27 +0000 UTC" firstStartedPulling="2026-03-09 09:24:28.944004527 +0000 UTC m=+1112.029043938" lastFinishedPulling="2026-03-09 09:24:33.356699338 +0000 UTC m=+1116.441738759" observedRunningTime="2026-03-09 09:24:33.78047636 +0000 UTC m=+1116.865515771" watchObservedRunningTime="2026-03-09 09:24:33.797690252 +0000 UTC m=+1116.882729653"
Mar 09 09:24:33 crc kubenswrapper[4861]: I0309 09:24:33.828763 4861 scope.go:117] "RemoveContainer" containerID="f34953ee28da5102b6104c93af2218f5d6cb81001860a73bbf189530c1b3bc8d"
Mar 09 09:24:33 crc kubenswrapper[4861]: I0309 09:24:33.834314 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-59rx2\" (UniqueName: \"kubernetes.io/projected/93d06a80-e66b-4934-8175-b3c6cb1032a9-kube-api-access-59rx2\") pod \"93d06a80-e66b-4934-8175-b3c6cb1032a9\" (UID: \"93d06a80-e66b-4934-8175-b3c6cb1032a9\") "
Mar 09 09:24:33 crc kubenswrapper[4861]: I0309 09:24:33.834361 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/93d06a80-e66b-4934-8175-b3c6cb1032a9-ovsdbserver-sb\") pod \"93d06a80-e66b-4934-8175-b3c6cb1032a9\" (UID: \"93d06a80-e66b-4934-8175-b3c6cb1032a9\") "
Mar 09 09:24:33 crc kubenswrapper[4861]: I0309 09:24:33.834421 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93d06a80-e66b-4934-8175-b3c6cb1032a9-config\") pod \"93d06a80-e66b-4934-8175-b3c6cb1032a9\" (UID: \"93d06a80-e66b-4934-8175-b3c6cb1032a9\") "
Mar 09 09:24:33 crc kubenswrapper[4861]: I0309 09:24:33.834486 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/93d06a80-e66b-4934-8175-b3c6cb1032a9-ovsdbserver-nb\") pod \"93d06a80-e66b-4934-8175-b3c6cb1032a9\" (UID: \"93d06a80-e66b-4934-8175-b3c6cb1032a9\") "
Mar 09 09:24:33 crc kubenswrapper[4861]: I0309 09:24:33.834518 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/93d06a80-e66b-4934-8175-b3c6cb1032a9-dns-svc\") pod \"93d06a80-e66b-4934-8175-b3c6cb1032a9\" (UID: \"93d06a80-e66b-4934-8175-b3c6cb1032a9\") "
Mar 09 09:24:33 crc kubenswrapper[4861]: I0309 09:24:33.834885 4861 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/98aa826b-02b5-4db3-b496-10eb34917427-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 09 09:24:33 crc kubenswrapper[4861]: I0309 09:24:33.834896 4861 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/40076fb0-0798-4425-a7ec-2638a66ee6f5-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 09 09:24:33 crc kubenswrapper[4861]: I0309 09:24:33.834906 4861 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4263ee95-7df1-4358-80b0-c3516f030ff6-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 09 09:24:33 crc kubenswrapper[4861]: I0309 09:24:33.834917 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xgj9\" (UniqueName: \"kubernetes.io/projected/40076fb0-0798-4425-a7ec-2638a66ee6f5-kube-api-access-9xgj9\") on node \"crc\" DevicePath \"\""
Mar 09 09:24:33 crc kubenswrapper[4861]: I0309 09:24:33.834927 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gpdsh\" (UniqueName: \"kubernetes.io/projected/98aa826b-02b5-4db3-b496-10eb34917427-kube-api-access-gpdsh\") on node \"crc\" DevicePath \"\""
Mar 09 09:24:33 crc kubenswrapper[4861]: I0309 09:24:33.834935 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wwfr9\" (UniqueName: \"kubernetes.io/projected/4263ee95-7df1-4358-80b0-c3516f030ff6-kube-api-access-wwfr9\") on node \"crc\" DevicePath \"\""
Mar 09 09:24:33 crc kubenswrapper[4861]: I0309 09:24:33.834945 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ntz7v\" (UniqueName: \"kubernetes.io/projected/0413fd17-3445-4719-9d24-42d8a9e41905-kube-api-access-ntz7v\") on node \"crc\" DevicePath \"\""
Mar 09 09:24:33 crc kubenswrapper[4861]: I0309 09:24:33.839644 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93d06a80-e66b-4934-8175-b3c6cb1032a9-kube-api-access-59rx2" (OuterVolumeSpecName: "kube-api-access-59rx2") pod "93d06a80-e66b-4934-8175-b3c6cb1032a9" (UID: "93d06a80-e66b-4934-8175-b3c6cb1032a9"). InnerVolumeSpecName "kube-api-access-59rx2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:24:33 crc kubenswrapper[4861]: I0309 09:24:33.863482 4861 scope.go:117] "RemoveContainer" containerID="b197f180ac090e170e93e652be3e09eb33318df6ba4c570ab71921b801e1c882"
Mar 09 09:24:33 crc kubenswrapper[4861]: E0309 09:24:33.865042 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b197f180ac090e170e93e652be3e09eb33318df6ba4c570ab71921b801e1c882\": container with ID starting with b197f180ac090e170e93e652be3e09eb33318df6ba4c570ab71921b801e1c882 not found: ID does not exist" containerID="b197f180ac090e170e93e652be3e09eb33318df6ba4c570ab71921b801e1c882"
Mar 09 09:24:33 crc kubenswrapper[4861]: I0309 09:24:33.865076 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b197f180ac090e170e93e652be3e09eb33318df6ba4c570ab71921b801e1c882"} err="failed to get container status \"b197f180ac090e170e93e652be3e09eb33318df6ba4c570ab71921b801e1c882\": rpc error: code = NotFound desc = could not find container \"b197f180ac090e170e93e652be3e09eb33318df6ba4c570ab71921b801e1c882\": container with ID starting with b197f180ac090e170e93e652be3e09eb33318df6ba4c570ab71921b801e1c882 not found: ID does not exist"
Mar 09 09:24:33 crc kubenswrapper[4861]: I0309 09:24:33.865095 4861 scope.go:117] "RemoveContainer" containerID="f34953ee28da5102b6104c93af2218f5d6cb81001860a73bbf189530c1b3bc8d"
Mar 09 09:24:33 crc kubenswrapper[4861]: E0309 09:24:33.865357 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f34953ee28da5102b6104c93af2218f5d6cb81001860a73bbf189530c1b3bc8d\": container with ID starting with f34953ee28da5102b6104c93af2218f5d6cb81001860a73bbf189530c1b3bc8d not found: ID does not exist" containerID="f34953ee28da5102b6104c93af2218f5d6cb81001860a73bbf189530c1b3bc8d"
Mar 09 09:24:33 crc kubenswrapper[4861]: I0309 09:24:33.865454 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f34953ee28da5102b6104c93af2218f5d6cb81001860a73bbf189530c1b3bc8d"} err="failed to get container status \"f34953ee28da5102b6104c93af2218f5d6cb81001860a73bbf189530c1b3bc8d\": rpc error: code = NotFound desc = could not find container \"f34953ee28da5102b6104c93af2218f5d6cb81001860a73bbf189530c1b3bc8d\": container with ID starting with f34953ee28da5102b6104c93af2218f5d6cb81001860a73bbf189530c1b3bc8d not found: ID does not exist"
Mar 09 09:24:33 crc kubenswrapper[4861]: I0309 09:24:33.877088 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93d06a80-e66b-4934-8175-b3c6cb1032a9-config" (OuterVolumeSpecName: "config") pod "93d06a80-e66b-4934-8175-b3c6cb1032a9" (UID: "93d06a80-e66b-4934-8175-b3c6cb1032a9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:24:33 crc kubenswrapper[4861]: I0309 09:24:33.880252 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93d06a80-e66b-4934-8175-b3c6cb1032a9-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "93d06a80-e66b-4934-8175-b3c6cb1032a9" (UID: "93d06a80-e66b-4934-8175-b3c6cb1032a9"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:24:33 crc kubenswrapper[4861]: I0309 09:24:33.892279 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93d06a80-e66b-4934-8175-b3c6cb1032a9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "93d06a80-e66b-4934-8175-b3c6cb1032a9" (UID: "93d06a80-e66b-4934-8175-b3c6cb1032a9"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:24:33 crc kubenswrapper[4861]: I0309 09:24:33.896854 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93d06a80-e66b-4934-8175-b3c6cb1032a9-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "93d06a80-e66b-4934-8175-b3c6cb1032a9" (UID: "93d06a80-e66b-4934-8175-b3c6cb1032a9"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:24:33 crc kubenswrapper[4861]: I0309 09:24:33.936050 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-59rx2\" (UniqueName: \"kubernetes.io/projected/93d06a80-e66b-4934-8175-b3c6cb1032a9-kube-api-access-59rx2\") on node \"crc\" DevicePath \"\""
Mar 09 09:24:33 crc kubenswrapper[4861]: I0309 09:24:33.936078 4861 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/93d06a80-e66b-4934-8175-b3c6cb1032a9-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 09 09:24:33 crc kubenswrapper[4861]: I0309 09:24:33.936087 4861 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93d06a80-e66b-4934-8175-b3c6cb1032a9-config\") on node \"crc\" DevicePath \"\""
Mar 09 09:24:33 crc kubenswrapper[4861]: I0309 09:24:33.936096 4861 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/93d06a80-e66b-4934-8175-b3c6cb1032a9-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 09 09:24:33 crc kubenswrapper[4861]: I0309 09:24:33.936104 4861 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/93d06a80-e66b-4934-8175-b3c6cb1032a9-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 09 09:24:34 crc kubenswrapper[4861]: I0309 09:24:34.068420 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-5kg66"
Mar 09 09:24:34 crc kubenswrapper[4861]: I0309 09:24:34.117094 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f7dd995-89gfl"]
Mar 09 09:24:34 crc kubenswrapper[4861]: I0309 09:24:34.134528 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f7dd995-89gfl"]
Mar 09 09:24:34 crc kubenswrapper[4861]: I0309 09:24:34.242883 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/634dd56c-c726-49cc-9a71-ef57a7d0a984-combined-ca-bundle\") pod \"634dd56c-c726-49cc-9a71-ef57a7d0a984\" (UID: \"634dd56c-c726-49cc-9a71-ef57a7d0a984\") "
Mar 09 09:24:34 crc kubenswrapper[4861]: I0309 09:24:34.242942 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/634dd56c-c726-49cc-9a71-ef57a7d0a984-db-sync-config-data\") pod \"634dd56c-c726-49cc-9a71-ef57a7d0a984\" (UID: \"634dd56c-c726-49cc-9a71-ef57a7d0a984\") "
Mar 09 09:24:34 crc kubenswrapper[4861]: I0309 09:24:34.243052 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/634dd56c-c726-49cc-9a71-ef57a7d0a984-config-data\") pod \"634dd56c-c726-49cc-9a71-ef57a7d0a984\" (UID: \"634dd56c-c726-49cc-9a71-ef57a7d0a984\") "
Mar 09 09:24:34 crc kubenswrapper[4861]: I0309 09:24:34.243072 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7drvd\" (UniqueName: \"kubernetes.io/projected/634dd56c-c726-49cc-9a71-ef57a7d0a984-kube-api-access-7drvd\") pod \"634dd56c-c726-49cc-9a71-ef57a7d0a984\" (UID: \"634dd56c-c726-49cc-9a71-ef57a7d0a984\") "
Mar 09 09:24:34 crc kubenswrapper[4861]: I0309 09:24:34.254205 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/634dd56c-c726-49cc-9a71-ef57a7d0a984-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "634dd56c-c726-49cc-9a71-ef57a7d0a984" (UID: "634dd56c-c726-49cc-9a71-ef57a7d0a984"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:24:34 crc kubenswrapper[4861]: I0309 09:24:34.260558 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/634dd56c-c726-49cc-9a71-ef57a7d0a984-kube-api-access-7drvd" (OuterVolumeSpecName: "kube-api-access-7drvd") pod "634dd56c-c726-49cc-9a71-ef57a7d0a984" (UID: "634dd56c-c726-49cc-9a71-ef57a7d0a984"). InnerVolumeSpecName "kube-api-access-7drvd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:24:34 crc kubenswrapper[4861]: I0309 09:24:34.298788 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/634dd56c-c726-49cc-9a71-ef57a7d0a984-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "634dd56c-c726-49cc-9a71-ef57a7d0a984" (UID: "634dd56c-c726-49cc-9a71-ef57a7d0a984"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:24:34 crc kubenswrapper[4861]: I0309 09:24:34.331434 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/634dd56c-c726-49cc-9a71-ef57a7d0a984-config-data" (OuterVolumeSpecName: "config-data") pod "634dd56c-c726-49cc-9a71-ef57a7d0a984" (UID: "634dd56c-c726-49cc-9a71-ef57a7d0a984"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:24:34 crc kubenswrapper[4861]: I0309 09:24:34.346191 4861 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/634dd56c-c726-49cc-9a71-ef57a7d0a984-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 09 09:24:34 crc kubenswrapper[4861]: I0309 09:24:34.346234 4861 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/634dd56c-c726-49cc-9a71-ef57a7d0a984-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Mar 09 09:24:34 crc kubenswrapper[4861]: I0309 09:24:34.346249 4861 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/634dd56c-c726-49cc-9a71-ef57a7d0a984-config-data\") on node \"crc\" DevicePath \"\""
Mar 09 09:24:34 crc kubenswrapper[4861]: I0309 09:24:34.346261 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7drvd\" (UniqueName: \"kubernetes.io/projected/634dd56c-c726-49cc-9a71-ef57a7d0a984-kube-api-access-7drvd\") on node \"crc\" DevicePath \"\""
Mar 09 09:24:34 crc kubenswrapper[4861]: I0309 09:24:34.804140 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-5kg66"
Mar 09 09:24:34 crc kubenswrapper[4861]: I0309 09:24:34.804210 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-5kg66" event={"ID":"634dd56c-c726-49cc-9a71-ef57a7d0a984","Type":"ContainerDied","Data":"a70863bf785d32bda3c2c8580403592edf1366dc3350d66a8a5c291505db7cb1"}
Mar 09 09:24:34 crc kubenswrapper[4861]: I0309 09:24:34.804248 4861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a70863bf785d32bda3c2c8580403592edf1366dc3350d66a8a5c291505db7cb1"
Mar 09 09:24:35 crc kubenswrapper[4861]: I0309 09:24:35.130979 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-75c886f8b5-wq27s"]
Mar 09 09:24:35 crc kubenswrapper[4861]: E0309 09:24:35.131404 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3cdcc666-9e35-47b3-a84b-0cd31afdc84a" containerName="mariadb-database-create"
Mar 09 09:24:35 crc kubenswrapper[4861]: I0309 09:24:35.131425 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cdcc666-9e35-47b3-a84b-0cd31afdc84a" containerName="mariadb-database-create"
Mar 09 09:24:35 crc kubenswrapper[4861]: E0309 09:24:35.131438 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="634dd56c-c726-49cc-9a71-ef57a7d0a984" containerName="glance-db-sync"
Mar 09 09:24:35 crc kubenswrapper[4861]: I0309 09:24:35.131444 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="634dd56c-c726-49cc-9a71-ef57a7d0a984" containerName="glance-db-sync"
Mar 09 09:24:35 crc kubenswrapper[4861]: E0309 09:24:35.131455 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4263ee95-7df1-4358-80b0-c3516f030ff6" containerName="mariadb-account-create-update"
Mar 09 09:24:35 crc kubenswrapper[4861]: I0309 09:24:35.131463 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="4263ee95-7df1-4358-80b0-c3516f030ff6" containerName="mariadb-account-create-update"
Mar 09 09:24:35 crc kubenswrapper[4861]: E0309 09:24:35.131477 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40076fb0-0798-4425-a7ec-2638a66ee6f5" containerName="mariadb-account-create-update"
Mar 09 09:24:35 crc kubenswrapper[4861]: I0309 09:24:35.131483 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="40076fb0-0798-4425-a7ec-2638a66ee6f5" containerName="mariadb-account-create-update"
Mar 09 09:24:35 crc kubenswrapper[4861]: E0309 09:24:35.131496 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0413fd17-3445-4719-9d24-42d8a9e41905" containerName="mariadb-account-create-update"
Mar 09 09:24:35 crc kubenswrapper[4861]: I0309 09:24:35.131501 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="0413fd17-3445-4719-9d24-42d8a9e41905" containerName="mariadb-account-create-update"
Mar 09 09:24:35 crc kubenswrapper[4861]: E0309 09:24:35.131511 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93d06a80-e66b-4934-8175-b3c6cb1032a9" containerName="dnsmasq-dns"
Mar 09 09:24:35 crc kubenswrapper[4861]: I0309 09:24:35.131516 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="93d06a80-e66b-4934-8175-b3c6cb1032a9" containerName="dnsmasq-dns"
Mar 09 09:24:35 crc kubenswrapper[4861]: E0309 09:24:35.131526 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="caf1f04f-5f46-4a52-9e17-58aa6a2b61e7" containerName="mariadb-database-create"
Mar 09 09:24:35 crc kubenswrapper[4861]: I0309 09:24:35.131532 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="caf1f04f-5f46-4a52-9e17-58aa6a2b61e7" containerName="mariadb-database-create"
Mar 09 09:24:35 crc kubenswrapper[4861]: E0309 09:24:35.131544 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98aa826b-02b5-4db3-b496-10eb34917427" containerName="mariadb-database-create"
Mar 09 09:24:35 crc kubenswrapper[4861]: I0309 09:24:35.131549 4861 state_mem.go:107] "Deleted CPUSet assignment"
podUID="98aa826b-02b5-4db3-b496-10eb34917427" containerName="mariadb-database-create" Mar 09 09:24:35 crc kubenswrapper[4861]: E0309 09:24:35.131560 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93d06a80-e66b-4934-8175-b3c6cb1032a9" containerName="init" Mar 09 09:24:35 crc kubenswrapper[4861]: I0309 09:24:35.131565 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="93d06a80-e66b-4934-8175-b3c6cb1032a9" containerName="init" Mar 09 09:24:35 crc kubenswrapper[4861]: I0309 09:24:35.135336 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="0413fd17-3445-4719-9d24-42d8a9e41905" containerName="mariadb-account-create-update" Mar 09 09:24:35 crc kubenswrapper[4861]: I0309 09:24:35.135391 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="98aa826b-02b5-4db3-b496-10eb34917427" containerName="mariadb-database-create" Mar 09 09:24:35 crc kubenswrapper[4861]: I0309 09:24:35.135406 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="caf1f04f-5f46-4a52-9e17-58aa6a2b61e7" containerName="mariadb-database-create" Mar 09 09:24:35 crc kubenswrapper[4861]: I0309 09:24:35.135417 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="3cdcc666-9e35-47b3-a84b-0cd31afdc84a" containerName="mariadb-database-create" Mar 09 09:24:35 crc kubenswrapper[4861]: I0309 09:24:35.135426 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="40076fb0-0798-4425-a7ec-2638a66ee6f5" containerName="mariadb-account-create-update" Mar 09 09:24:35 crc kubenswrapper[4861]: I0309 09:24:35.135438 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="634dd56c-c726-49cc-9a71-ef57a7d0a984" containerName="glance-db-sync" Mar 09 09:24:35 crc kubenswrapper[4861]: I0309 09:24:35.135445 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="4263ee95-7df1-4358-80b0-c3516f030ff6" containerName="mariadb-account-create-update" Mar 09 09:24:35 crc kubenswrapper[4861]: I0309 
09:24:35.135455 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="93d06a80-e66b-4934-8175-b3c6cb1032a9" containerName="dnsmasq-dns" Mar 09 09:24:35 crc kubenswrapper[4861]: I0309 09:24:35.136406 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75c886f8b5-wq27s" Mar 09 09:24:35 crc kubenswrapper[4861]: I0309 09:24:35.146761 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75c886f8b5-wq27s"] Mar 09 09:24:35 crc kubenswrapper[4861]: I0309 09:24:35.260469 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e2c54f63-d580-4793-bb62-e293cdef7ae3-ovsdbserver-sb\") pod \"dnsmasq-dns-75c886f8b5-wq27s\" (UID: \"e2c54f63-d580-4793-bb62-e293cdef7ae3\") " pod="openstack/dnsmasq-dns-75c886f8b5-wq27s" Mar 09 09:24:35 crc kubenswrapper[4861]: I0309 09:24:35.260541 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2c54f63-d580-4793-bb62-e293cdef7ae3-config\") pod \"dnsmasq-dns-75c886f8b5-wq27s\" (UID: \"e2c54f63-d580-4793-bb62-e293cdef7ae3\") " pod="openstack/dnsmasq-dns-75c886f8b5-wq27s" Mar 09 09:24:35 crc kubenswrapper[4861]: I0309 09:24:35.260570 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9t9sh\" (UniqueName: \"kubernetes.io/projected/e2c54f63-d580-4793-bb62-e293cdef7ae3-kube-api-access-9t9sh\") pod \"dnsmasq-dns-75c886f8b5-wq27s\" (UID: \"e2c54f63-d580-4793-bb62-e293cdef7ae3\") " pod="openstack/dnsmasq-dns-75c886f8b5-wq27s" Mar 09 09:24:35 crc kubenswrapper[4861]: I0309 09:24:35.260610 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/e2c54f63-d580-4793-bb62-e293cdef7ae3-dns-swift-storage-0\") pod \"dnsmasq-dns-75c886f8b5-wq27s\" (UID: \"e2c54f63-d580-4793-bb62-e293cdef7ae3\") " pod="openstack/dnsmasq-dns-75c886f8b5-wq27s" Mar 09 09:24:35 crc kubenswrapper[4861]: I0309 09:24:35.260638 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e2c54f63-d580-4793-bb62-e293cdef7ae3-dns-svc\") pod \"dnsmasq-dns-75c886f8b5-wq27s\" (UID: \"e2c54f63-d580-4793-bb62-e293cdef7ae3\") " pod="openstack/dnsmasq-dns-75c886f8b5-wq27s" Mar 09 09:24:35 crc kubenswrapper[4861]: I0309 09:24:35.260680 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e2c54f63-d580-4793-bb62-e293cdef7ae3-ovsdbserver-nb\") pod \"dnsmasq-dns-75c886f8b5-wq27s\" (UID: \"e2c54f63-d580-4793-bb62-e293cdef7ae3\") " pod="openstack/dnsmasq-dns-75c886f8b5-wq27s" Mar 09 09:24:35 crc kubenswrapper[4861]: I0309 09:24:35.362529 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e2c54f63-d580-4793-bb62-e293cdef7ae3-dns-svc\") pod \"dnsmasq-dns-75c886f8b5-wq27s\" (UID: \"e2c54f63-d580-4793-bb62-e293cdef7ae3\") " pod="openstack/dnsmasq-dns-75c886f8b5-wq27s" Mar 09 09:24:35 crc kubenswrapper[4861]: I0309 09:24:35.362848 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e2c54f63-d580-4793-bb62-e293cdef7ae3-ovsdbserver-nb\") pod \"dnsmasq-dns-75c886f8b5-wq27s\" (UID: \"e2c54f63-d580-4793-bb62-e293cdef7ae3\") " pod="openstack/dnsmasq-dns-75c886f8b5-wq27s" Mar 09 09:24:35 crc kubenswrapper[4861]: I0309 09:24:35.362897 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/e2c54f63-d580-4793-bb62-e293cdef7ae3-ovsdbserver-sb\") pod \"dnsmasq-dns-75c886f8b5-wq27s\" (UID: \"e2c54f63-d580-4793-bb62-e293cdef7ae3\") " pod="openstack/dnsmasq-dns-75c886f8b5-wq27s" Mar 09 09:24:35 crc kubenswrapper[4861]: I0309 09:24:35.362952 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2c54f63-d580-4793-bb62-e293cdef7ae3-config\") pod \"dnsmasq-dns-75c886f8b5-wq27s\" (UID: \"e2c54f63-d580-4793-bb62-e293cdef7ae3\") " pod="openstack/dnsmasq-dns-75c886f8b5-wq27s" Mar 09 09:24:35 crc kubenswrapper[4861]: I0309 09:24:35.362984 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9t9sh\" (UniqueName: \"kubernetes.io/projected/e2c54f63-d580-4793-bb62-e293cdef7ae3-kube-api-access-9t9sh\") pod \"dnsmasq-dns-75c886f8b5-wq27s\" (UID: \"e2c54f63-d580-4793-bb62-e293cdef7ae3\") " pod="openstack/dnsmasq-dns-75c886f8b5-wq27s" Mar 09 09:24:35 crc kubenswrapper[4861]: I0309 09:24:35.363043 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e2c54f63-d580-4793-bb62-e293cdef7ae3-dns-swift-storage-0\") pod \"dnsmasq-dns-75c886f8b5-wq27s\" (UID: \"e2c54f63-d580-4793-bb62-e293cdef7ae3\") " pod="openstack/dnsmasq-dns-75c886f8b5-wq27s" Mar 09 09:24:35 crc kubenswrapper[4861]: I0309 09:24:35.363629 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e2c54f63-d580-4793-bb62-e293cdef7ae3-dns-svc\") pod \"dnsmasq-dns-75c886f8b5-wq27s\" (UID: \"e2c54f63-d580-4793-bb62-e293cdef7ae3\") " pod="openstack/dnsmasq-dns-75c886f8b5-wq27s" Mar 09 09:24:35 crc kubenswrapper[4861]: I0309 09:24:35.363877 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/e2c54f63-d580-4793-bb62-e293cdef7ae3-dns-swift-storage-0\") pod \"dnsmasq-dns-75c886f8b5-wq27s\" (UID: \"e2c54f63-d580-4793-bb62-e293cdef7ae3\") " pod="openstack/dnsmasq-dns-75c886f8b5-wq27s" Mar 09 09:24:35 crc kubenswrapper[4861]: I0309 09:24:35.364275 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e2c54f63-d580-4793-bb62-e293cdef7ae3-ovsdbserver-sb\") pod \"dnsmasq-dns-75c886f8b5-wq27s\" (UID: \"e2c54f63-d580-4793-bb62-e293cdef7ae3\") " pod="openstack/dnsmasq-dns-75c886f8b5-wq27s" Mar 09 09:24:35 crc kubenswrapper[4861]: I0309 09:24:35.364439 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2c54f63-d580-4793-bb62-e293cdef7ae3-config\") pod \"dnsmasq-dns-75c886f8b5-wq27s\" (UID: \"e2c54f63-d580-4793-bb62-e293cdef7ae3\") " pod="openstack/dnsmasq-dns-75c886f8b5-wq27s" Mar 09 09:24:35 crc kubenswrapper[4861]: I0309 09:24:35.364805 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e2c54f63-d580-4793-bb62-e293cdef7ae3-ovsdbserver-nb\") pod \"dnsmasq-dns-75c886f8b5-wq27s\" (UID: \"e2c54f63-d580-4793-bb62-e293cdef7ae3\") " pod="openstack/dnsmasq-dns-75c886f8b5-wq27s" Mar 09 09:24:35 crc kubenswrapper[4861]: I0309 09:24:35.383985 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9t9sh\" (UniqueName: \"kubernetes.io/projected/e2c54f63-d580-4793-bb62-e293cdef7ae3-kube-api-access-9t9sh\") pod \"dnsmasq-dns-75c886f8b5-wq27s\" (UID: \"e2c54f63-d580-4793-bb62-e293cdef7ae3\") " pod="openstack/dnsmasq-dns-75c886f8b5-wq27s" Mar 09 09:24:35 crc kubenswrapper[4861]: I0309 09:24:35.469419 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75c886f8b5-wq27s" Mar 09 09:24:35 crc kubenswrapper[4861]: I0309 09:24:35.671877 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93d06a80-e66b-4934-8175-b3c6cb1032a9" path="/var/lib/kubelet/pods/93d06a80-e66b-4934-8175-b3c6cb1032a9/volumes" Mar 09 09:24:35 crc kubenswrapper[4861]: I0309 09:24:35.980984 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75c886f8b5-wq27s"] Mar 09 09:24:35 crc kubenswrapper[4861]: W0309 09:24:35.990008 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode2c54f63_d580_4793_bb62_e293cdef7ae3.slice/crio-be2b95886ed7e4e6a793ccadc23d178a6a11504b28bbef88c739899e893ffb00 WatchSource:0}: Error finding container be2b95886ed7e4e6a793ccadc23d178a6a11504b28bbef88c739899e893ffb00: Status 404 returned error can't find the container with id be2b95886ed7e4e6a793ccadc23d178a6a11504b28bbef88c739899e893ffb00 Mar 09 09:24:36 crc kubenswrapper[4861]: I0309 09:24:36.820288 4861 generic.go:334] "Generic (PLEG): container finished" podID="e2c54f63-d580-4793-bb62-e293cdef7ae3" containerID="d29bfac4c915a0273fce2c1c7f9964bf2bc24c64c52b27bc0f6173b430ed3cc6" exitCode=0 Mar 09 09:24:36 crc kubenswrapper[4861]: I0309 09:24:36.820466 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c886f8b5-wq27s" event={"ID":"e2c54f63-d580-4793-bb62-e293cdef7ae3","Type":"ContainerDied","Data":"d29bfac4c915a0273fce2c1c7f9964bf2bc24c64c52b27bc0f6173b430ed3cc6"} Mar 09 09:24:36 crc kubenswrapper[4861]: I0309 09:24:36.820682 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c886f8b5-wq27s" event={"ID":"e2c54f63-d580-4793-bb62-e293cdef7ae3","Type":"ContainerStarted","Data":"be2b95886ed7e4e6a793ccadc23d178a6a11504b28bbef88c739899e893ffb00"} Mar 09 09:24:36 crc kubenswrapper[4861]: I0309 09:24:36.822877 4861 generic.go:334] 
"Generic (PLEG): container finished" podID="24508739-f77b-4cb1-8b0e-bc18a292c6f0" containerID="a66850429ce2fd114f33f4f8a53c612097eff74b5c4b688fc3a6dd8e084522ab" exitCode=0 Mar 09 09:24:36 crc kubenswrapper[4861]: I0309 09:24:36.822906 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-n79q7" event={"ID":"24508739-f77b-4cb1-8b0e-bc18a292c6f0","Type":"ContainerDied","Data":"a66850429ce2fd114f33f4f8a53c612097eff74b5c4b688fc3a6dd8e084522ab"} Mar 09 09:24:37 crc kubenswrapper[4861]: I0309 09:24:37.832041 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c886f8b5-wq27s" event={"ID":"e2c54f63-d580-4793-bb62-e293cdef7ae3","Type":"ContainerStarted","Data":"d5a7fec222bc50f31223a619c031d95f1d36876e8287e2b77be3256b87578703"} Mar 09 09:24:37 crc kubenswrapper[4861]: I0309 09:24:37.832465 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-75c886f8b5-wq27s" Mar 09 09:24:37 crc kubenswrapper[4861]: I0309 09:24:37.861996 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-75c886f8b5-wq27s" podStartSLOduration=2.861977892 podStartE2EDuration="2.861977892s" podCreationTimestamp="2026-03-09 09:24:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:24:37.860393576 +0000 UTC m=+1120.945432987" watchObservedRunningTime="2026-03-09 09:24:37.861977892 +0000 UTC m=+1120.947017293" Mar 09 09:24:38 crc kubenswrapper[4861]: I0309 09:24:38.152449 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-n79q7" Mar 09 09:24:38 crc kubenswrapper[4861]: I0309 09:24:38.238875 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24508739-f77b-4cb1-8b0e-bc18a292c6f0-combined-ca-bundle\") pod \"24508739-f77b-4cb1-8b0e-bc18a292c6f0\" (UID: \"24508739-f77b-4cb1-8b0e-bc18a292c6f0\") " Mar 09 09:24:38 crc kubenswrapper[4861]: I0309 09:24:38.239023 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24508739-f77b-4cb1-8b0e-bc18a292c6f0-config-data\") pod \"24508739-f77b-4cb1-8b0e-bc18a292c6f0\" (UID: \"24508739-f77b-4cb1-8b0e-bc18a292c6f0\") " Mar 09 09:24:38 crc kubenswrapper[4861]: I0309 09:24:38.239102 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p854n\" (UniqueName: \"kubernetes.io/projected/24508739-f77b-4cb1-8b0e-bc18a292c6f0-kube-api-access-p854n\") pod \"24508739-f77b-4cb1-8b0e-bc18a292c6f0\" (UID: \"24508739-f77b-4cb1-8b0e-bc18a292c6f0\") " Mar 09 09:24:38 crc kubenswrapper[4861]: I0309 09:24:38.249606 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24508739-f77b-4cb1-8b0e-bc18a292c6f0-kube-api-access-p854n" (OuterVolumeSpecName: "kube-api-access-p854n") pod "24508739-f77b-4cb1-8b0e-bc18a292c6f0" (UID: "24508739-f77b-4cb1-8b0e-bc18a292c6f0"). InnerVolumeSpecName "kube-api-access-p854n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:24:38 crc kubenswrapper[4861]: I0309 09:24:38.277686 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24508739-f77b-4cb1-8b0e-bc18a292c6f0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "24508739-f77b-4cb1-8b0e-bc18a292c6f0" (UID: "24508739-f77b-4cb1-8b0e-bc18a292c6f0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:24:38 crc kubenswrapper[4861]: I0309 09:24:38.327261 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24508739-f77b-4cb1-8b0e-bc18a292c6f0-config-data" (OuterVolumeSpecName: "config-data") pod "24508739-f77b-4cb1-8b0e-bc18a292c6f0" (UID: "24508739-f77b-4cb1-8b0e-bc18a292c6f0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:24:38 crc kubenswrapper[4861]: I0309 09:24:38.340636 4861 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24508739-f77b-4cb1-8b0e-bc18a292c6f0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 09:24:38 crc kubenswrapper[4861]: I0309 09:24:38.340669 4861 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24508739-f77b-4cb1-8b0e-bc18a292c6f0-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 09:24:38 crc kubenswrapper[4861]: I0309 09:24:38.340682 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p854n\" (UniqueName: \"kubernetes.io/projected/24508739-f77b-4cb1-8b0e-bc18a292c6f0-kube-api-access-p854n\") on node \"crc\" DevicePath \"\"" Mar 09 09:24:38 crc kubenswrapper[4861]: I0309 09:24:38.840152 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-n79q7" Mar 09 09:24:38 crc kubenswrapper[4861]: I0309 09:24:38.840151 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-n79q7" event={"ID":"24508739-f77b-4cb1-8b0e-bc18a292c6f0","Type":"ContainerDied","Data":"9fa2cb52ec902b485647992296acd578feb543379d25b564e250d2b1271f171b"} Mar 09 09:24:38 crc kubenswrapper[4861]: I0309 09:24:38.840355 4861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9fa2cb52ec902b485647992296acd578feb543379d25b564e250d2b1271f171b" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.115419 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75c886f8b5-wq27s"] Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.154296 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5985c59c55-2fk4t"] Mar 09 09:24:39 crc kubenswrapper[4861]: E0309 09:24:39.154702 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24508739-f77b-4cb1-8b0e-bc18a292c6f0" containerName="keystone-db-sync" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.154723 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="24508739-f77b-4cb1-8b0e-bc18a292c6f0" containerName="keystone-db-sync" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.154943 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="24508739-f77b-4cb1-8b0e-bc18a292c6f0" containerName="keystone-db-sync" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.155854 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5985c59c55-2fk4t" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.180451 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-ktlzn"] Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.181478 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-ktlzn" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.188464 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.188639 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.188747 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.188848 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.196531 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-sk77l" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.198486 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5985c59c55-2fk4t"] Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.227326 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-ktlzn"] Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.254258 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1b84f271-fc53-4d64-ae57-acfd2a5b1696-ovsdbserver-nb\") pod \"dnsmasq-dns-5985c59c55-2fk4t\" (UID: \"1b84f271-fc53-4d64-ae57-acfd2a5b1696\") " pod="openstack/dnsmasq-dns-5985c59c55-2fk4t" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.254318 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-274cv\" (UniqueName: \"kubernetes.io/projected/35527270-268b-4b4d-864b-de9c9c6182ea-kube-api-access-274cv\") pod \"keystone-bootstrap-ktlzn\" (UID: \"35527270-268b-4b4d-864b-de9c9c6182ea\") " 
pod="openstack/keystone-bootstrap-ktlzn" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.254387 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/35527270-268b-4b4d-864b-de9c9c6182ea-credential-keys\") pod \"keystone-bootstrap-ktlzn\" (UID: \"35527270-268b-4b4d-864b-de9c9c6182ea\") " pod="openstack/keystone-bootstrap-ktlzn" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.254417 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b84f271-fc53-4d64-ae57-acfd2a5b1696-config\") pod \"dnsmasq-dns-5985c59c55-2fk4t\" (UID: \"1b84f271-fc53-4d64-ae57-acfd2a5b1696\") " pod="openstack/dnsmasq-dns-5985c59c55-2fk4t" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.254441 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mnfb\" (UniqueName: \"kubernetes.io/projected/1b84f271-fc53-4d64-ae57-acfd2a5b1696-kube-api-access-6mnfb\") pod \"dnsmasq-dns-5985c59c55-2fk4t\" (UID: \"1b84f271-fc53-4d64-ae57-acfd2a5b1696\") " pod="openstack/dnsmasq-dns-5985c59c55-2fk4t" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.254476 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1b84f271-fc53-4d64-ae57-acfd2a5b1696-dns-svc\") pod \"dnsmasq-dns-5985c59c55-2fk4t\" (UID: \"1b84f271-fc53-4d64-ae57-acfd2a5b1696\") " pod="openstack/dnsmasq-dns-5985c59c55-2fk4t" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.254504 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/35527270-268b-4b4d-864b-de9c9c6182ea-fernet-keys\") pod \"keystone-bootstrap-ktlzn\" (UID: \"35527270-268b-4b4d-864b-de9c9c6182ea\") " 
pod="openstack/keystone-bootstrap-ktlzn" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.254540 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/35527270-268b-4b4d-864b-de9c9c6182ea-scripts\") pod \"keystone-bootstrap-ktlzn\" (UID: \"35527270-268b-4b4d-864b-de9c9c6182ea\") " pod="openstack/keystone-bootstrap-ktlzn" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.254574 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35527270-268b-4b4d-864b-de9c9c6182ea-config-data\") pod \"keystone-bootstrap-ktlzn\" (UID: \"35527270-268b-4b4d-864b-de9c9c6182ea\") " pod="openstack/keystone-bootstrap-ktlzn" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.254625 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35527270-268b-4b4d-864b-de9c9c6182ea-combined-ca-bundle\") pod \"keystone-bootstrap-ktlzn\" (UID: \"35527270-268b-4b4d-864b-de9c9c6182ea\") " pod="openstack/keystone-bootstrap-ktlzn" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.254656 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1b84f271-fc53-4d64-ae57-acfd2a5b1696-ovsdbserver-sb\") pod \"dnsmasq-dns-5985c59c55-2fk4t\" (UID: \"1b84f271-fc53-4d64-ae57-acfd2a5b1696\") " pod="openstack/dnsmasq-dns-5985c59c55-2fk4t" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.254681 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1b84f271-fc53-4d64-ae57-acfd2a5b1696-dns-swift-storage-0\") pod \"dnsmasq-dns-5985c59c55-2fk4t\" (UID: \"1b84f271-fc53-4d64-ae57-acfd2a5b1696\") 
" pod="openstack/dnsmasq-dns-5985c59c55-2fk4t" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.318116 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7f474b7d57-rss5x"] Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.336020 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7f474b7d57-rss5x" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.357168 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.358238 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.358474 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-jkcms" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.358859 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2ff07d5b-934f-40e1-a938-4c9c6d6bd846-config-data\") pod \"horizon-7f474b7d57-rss5x\" (UID: \"2ff07d5b-934f-40e1-a938-4c9c6d6bd846\") " pod="openstack/horizon-7f474b7d57-rss5x" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.358894 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/35527270-268b-4b4d-864b-de9c9c6182ea-scripts\") pod \"keystone-bootstrap-ktlzn\" (UID: \"35527270-268b-4b4d-864b-de9c9c6182ea\") " pod="openstack/keystone-bootstrap-ktlzn" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.358922 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2ff07d5b-934f-40e1-a938-4c9c6d6bd846-scripts\") pod \"horizon-7f474b7d57-rss5x\" (UID: \"2ff07d5b-934f-40e1-a938-4c9c6d6bd846\") " 
pod="openstack/horizon-7f474b7d57-rss5x" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.358940 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35527270-268b-4b4d-864b-de9c9c6182ea-config-data\") pod \"keystone-bootstrap-ktlzn\" (UID: \"35527270-268b-4b4d-864b-de9c9c6182ea\") " pod="openstack/keystone-bootstrap-ktlzn" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.358980 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ff07d5b-934f-40e1-a938-4c9c6d6bd846-logs\") pod \"horizon-7f474b7d57-rss5x\" (UID: \"2ff07d5b-934f-40e1-a938-4c9c6d6bd846\") " pod="openstack/horizon-7f474b7d57-rss5x" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.358998 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2ff07d5b-934f-40e1-a938-4c9c6d6bd846-horizon-secret-key\") pod \"horizon-7f474b7d57-rss5x\" (UID: \"2ff07d5b-934f-40e1-a938-4c9c6d6bd846\") " pod="openstack/horizon-7f474b7d57-rss5x" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.359019 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35527270-268b-4b4d-864b-de9c9c6182ea-combined-ca-bundle\") pod \"keystone-bootstrap-ktlzn\" (UID: \"35527270-268b-4b4d-864b-de9c9c6182ea\") " pod="openstack/keystone-bootstrap-ktlzn" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.359040 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1b84f271-fc53-4d64-ae57-acfd2a5b1696-ovsdbserver-sb\") pod \"dnsmasq-dns-5985c59c55-2fk4t\" (UID: \"1b84f271-fc53-4d64-ae57-acfd2a5b1696\") " pod="openstack/dnsmasq-dns-5985c59c55-2fk4t" Mar 09 09:24:39 crc 
kubenswrapper[4861]: I0309 09:24:39.359061 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1b84f271-fc53-4d64-ae57-acfd2a5b1696-dns-swift-storage-0\") pod \"dnsmasq-dns-5985c59c55-2fk4t\" (UID: \"1b84f271-fc53-4d64-ae57-acfd2a5b1696\") " pod="openstack/dnsmasq-dns-5985c59c55-2fk4t" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.359090 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1b84f271-fc53-4d64-ae57-acfd2a5b1696-ovsdbserver-nb\") pod \"dnsmasq-dns-5985c59c55-2fk4t\" (UID: \"1b84f271-fc53-4d64-ae57-acfd2a5b1696\") " pod="openstack/dnsmasq-dns-5985c59c55-2fk4t" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.359108 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-274cv\" (UniqueName: \"kubernetes.io/projected/35527270-268b-4b4d-864b-de9c9c6182ea-kube-api-access-274cv\") pod \"keystone-bootstrap-ktlzn\" (UID: \"35527270-268b-4b4d-864b-de9c9c6182ea\") " pod="openstack/keystone-bootstrap-ktlzn" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.359138 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/35527270-268b-4b4d-864b-de9c9c6182ea-credential-keys\") pod \"keystone-bootstrap-ktlzn\" (UID: \"35527270-268b-4b4d-864b-de9c9c6182ea\") " pod="openstack/keystone-bootstrap-ktlzn" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.359161 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b84f271-fc53-4d64-ae57-acfd2a5b1696-config\") pod \"dnsmasq-dns-5985c59c55-2fk4t\" (UID: \"1b84f271-fc53-4d64-ae57-acfd2a5b1696\") " pod="openstack/dnsmasq-dns-5985c59c55-2fk4t" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.359179 4861 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mnfb\" (UniqueName: \"kubernetes.io/projected/1b84f271-fc53-4d64-ae57-acfd2a5b1696-kube-api-access-6mnfb\") pod \"dnsmasq-dns-5985c59c55-2fk4t\" (UID: \"1b84f271-fc53-4d64-ae57-acfd2a5b1696\") " pod="openstack/dnsmasq-dns-5985c59c55-2fk4t" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.359204 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1b84f271-fc53-4d64-ae57-acfd2a5b1696-dns-svc\") pod \"dnsmasq-dns-5985c59c55-2fk4t\" (UID: \"1b84f271-fc53-4d64-ae57-acfd2a5b1696\") " pod="openstack/dnsmasq-dns-5985c59c55-2fk4t" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.359224 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zs7g4\" (UniqueName: \"kubernetes.io/projected/2ff07d5b-934f-40e1-a938-4c9c6d6bd846-kube-api-access-zs7g4\") pod \"horizon-7f474b7d57-rss5x\" (UID: \"2ff07d5b-934f-40e1-a938-4c9c6d6bd846\") " pod="openstack/horizon-7f474b7d57-rss5x" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.359245 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/35527270-268b-4b4d-864b-de9c9c6182ea-fernet-keys\") pod \"keystone-bootstrap-ktlzn\" (UID: \"35527270-268b-4b4d-864b-de9c9c6182ea\") " pod="openstack/keystone-bootstrap-ktlzn" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.361291 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1b84f271-fc53-4d64-ae57-acfd2a5b1696-dns-swift-storage-0\") pod \"dnsmasq-dns-5985c59c55-2fk4t\" (UID: \"1b84f271-fc53-4d64-ae57-acfd2a5b1696\") " pod="openstack/dnsmasq-dns-5985c59c55-2fk4t" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.363887 4861 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/35527270-268b-4b4d-864b-de9c9c6182ea-scripts\") pod \"keystone-bootstrap-ktlzn\" (UID: \"35527270-268b-4b4d-864b-de9c9c6182ea\") " pod="openstack/keystone-bootstrap-ktlzn" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.364306 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b84f271-fc53-4d64-ae57-acfd2a5b1696-config\") pod \"dnsmasq-dns-5985c59c55-2fk4t\" (UID: \"1b84f271-fc53-4d64-ae57-acfd2a5b1696\") " pod="openstack/dnsmasq-dns-5985c59c55-2fk4t" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.364776 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1b84f271-fc53-4d64-ae57-acfd2a5b1696-ovsdbserver-sb\") pod \"dnsmasq-dns-5985c59c55-2fk4t\" (UID: \"1b84f271-fc53-4d64-ae57-acfd2a5b1696\") " pod="openstack/dnsmasq-dns-5985c59c55-2fk4t" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.365107 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1b84f271-fc53-4d64-ae57-acfd2a5b1696-ovsdbserver-nb\") pod \"dnsmasq-dns-5985c59c55-2fk4t\" (UID: \"1b84f271-fc53-4d64-ae57-acfd2a5b1696\") " pod="openstack/dnsmasq-dns-5985c59c55-2fk4t" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.365830 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.368090 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35527270-268b-4b4d-864b-de9c9c6182ea-config-data\") pod \"keystone-bootstrap-ktlzn\" (UID: \"35527270-268b-4b4d-864b-de9c9c6182ea\") " pod="openstack/keystone-bootstrap-ktlzn" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.368445 4861 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1b84f271-fc53-4d64-ae57-acfd2a5b1696-dns-svc\") pod \"dnsmasq-dns-5985c59c55-2fk4t\" (UID: \"1b84f271-fc53-4d64-ae57-acfd2a5b1696\") " pod="openstack/dnsmasq-dns-5985c59c55-2fk4t" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.369047 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/35527270-268b-4b4d-864b-de9c9c6182ea-fernet-keys\") pod \"keystone-bootstrap-ktlzn\" (UID: \"35527270-268b-4b4d-864b-de9c9c6182ea\") " pod="openstack/keystone-bootstrap-ktlzn" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.394781 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7f474b7d57-rss5x"] Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.395122 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35527270-268b-4b4d-864b-de9c9c6182ea-combined-ca-bundle\") pod \"keystone-bootstrap-ktlzn\" (UID: \"35527270-268b-4b4d-864b-de9c9c6182ea\") " pod="openstack/keystone-bootstrap-ktlzn" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.397068 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/35527270-268b-4b4d-864b-de9c9c6182ea-credential-keys\") pod \"keystone-bootstrap-ktlzn\" (UID: \"35527270-268b-4b4d-864b-de9c9c6182ea\") " pod="openstack/keystone-bootstrap-ktlzn" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.427082 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-sr27s"] Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.428114 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-sr27s" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.433206 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-w94sx" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.441544 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.441780 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.449768 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-274cv\" (UniqueName: \"kubernetes.io/projected/35527270-268b-4b4d-864b-de9c9c6182ea-kube-api-access-274cv\") pod \"keystone-bootstrap-ktlzn\" (UID: \"35527270-268b-4b4d-864b-de9c9c6182ea\") " pod="openstack/keystone-bootstrap-ktlzn" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.458586 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-sr27s"] Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.466195 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/deb8e24b-1a6f-4173-9a5f-62974b0331a5-config-data\") pod \"cinder-db-sync-sr27s\" (UID: \"deb8e24b-1a6f-4173-9a5f-62974b0331a5\") " pod="openstack/cinder-db-sync-sr27s" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.466246 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zs7g4\" (UniqueName: \"kubernetes.io/projected/2ff07d5b-934f-40e1-a938-4c9c6d6bd846-kube-api-access-zs7g4\") pod \"horizon-7f474b7d57-rss5x\" (UID: \"2ff07d5b-934f-40e1-a938-4c9c6d6bd846\") " pod="openstack/horizon-7f474b7d57-rss5x" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.466290 4861 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/deb8e24b-1a6f-4173-9a5f-62974b0331a5-db-sync-config-data\") pod \"cinder-db-sync-sr27s\" (UID: \"deb8e24b-1a6f-4173-9a5f-62974b0331a5\") " pod="openstack/cinder-db-sync-sr27s" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.466324 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/deb8e24b-1a6f-4173-9a5f-62974b0331a5-combined-ca-bundle\") pod \"cinder-db-sync-sr27s\" (UID: \"deb8e24b-1a6f-4173-9a5f-62974b0331a5\") " pod="openstack/cinder-db-sync-sr27s" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.466346 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2ff07d5b-934f-40e1-a938-4c9c6d6bd846-config-data\") pod \"horizon-7f474b7d57-rss5x\" (UID: \"2ff07d5b-934f-40e1-a938-4c9c6d6bd846\") " pod="openstack/horizon-7f474b7d57-rss5x" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.466381 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2ff07d5b-934f-40e1-a938-4c9c6d6bd846-scripts\") pod \"horizon-7f474b7d57-rss5x\" (UID: \"2ff07d5b-934f-40e1-a938-4c9c6d6bd846\") " pod="openstack/horizon-7f474b7d57-rss5x" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.466414 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/deb8e24b-1a6f-4173-9a5f-62974b0331a5-scripts\") pod \"cinder-db-sync-sr27s\" (UID: \"deb8e24b-1a6f-4173-9a5f-62974b0331a5\") " pod="openstack/cinder-db-sync-sr27s" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.466432 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-vtc75\" (UniqueName: \"kubernetes.io/projected/deb8e24b-1a6f-4173-9a5f-62974b0331a5-kube-api-access-vtc75\") pod \"cinder-db-sync-sr27s\" (UID: \"deb8e24b-1a6f-4173-9a5f-62974b0331a5\") " pod="openstack/cinder-db-sync-sr27s" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.466452 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ff07d5b-934f-40e1-a938-4c9c6d6bd846-logs\") pod \"horizon-7f474b7d57-rss5x\" (UID: \"2ff07d5b-934f-40e1-a938-4c9c6d6bd846\") " pod="openstack/horizon-7f474b7d57-rss5x" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.466469 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2ff07d5b-934f-40e1-a938-4c9c6d6bd846-horizon-secret-key\") pod \"horizon-7f474b7d57-rss5x\" (UID: \"2ff07d5b-934f-40e1-a938-4c9c6d6bd846\") " pod="openstack/horizon-7f474b7d57-rss5x" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.466489 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/deb8e24b-1a6f-4173-9a5f-62974b0331a5-etc-machine-id\") pod \"cinder-db-sync-sr27s\" (UID: \"deb8e24b-1a6f-4173-9a5f-62974b0331a5\") " pod="openstack/cinder-db-sync-sr27s" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.468082 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2ff07d5b-934f-40e1-a938-4c9c6d6bd846-config-data\") pod \"horizon-7f474b7d57-rss5x\" (UID: \"2ff07d5b-934f-40e1-a938-4c9c6d6bd846\") " pod="openstack/horizon-7f474b7d57-rss5x" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.468212 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2ff07d5b-934f-40e1-a938-4c9c6d6bd846-scripts\") pod 
\"horizon-7f474b7d57-rss5x\" (UID: \"2ff07d5b-934f-40e1-a938-4c9c6d6bd846\") " pod="openstack/horizon-7f474b7d57-rss5x" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.468322 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ff07d5b-934f-40e1-a938-4c9c6d6bd846-logs\") pod \"horizon-7f474b7d57-rss5x\" (UID: \"2ff07d5b-934f-40e1-a938-4c9c6d6bd846\") " pod="openstack/horizon-7f474b7d57-rss5x" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.470167 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mnfb\" (UniqueName: \"kubernetes.io/projected/1b84f271-fc53-4d64-ae57-acfd2a5b1696-kube-api-access-6mnfb\") pod \"dnsmasq-dns-5985c59c55-2fk4t\" (UID: \"1b84f271-fc53-4d64-ae57-acfd2a5b1696\") " pod="openstack/dnsmasq-dns-5985c59c55-2fk4t" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.477833 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2ff07d5b-934f-40e1-a938-4c9c6d6bd846-horizon-secret-key\") pod \"horizon-7f474b7d57-rss5x\" (UID: \"2ff07d5b-934f-40e1-a938-4c9c6d6bd846\") " pod="openstack/horizon-7f474b7d57-rss5x" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.484857 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5985c59c55-2fk4t" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.488432 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-ntvbd"] Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.489530 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-ntvbd" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.496914 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.517659 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-mhz5j" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.517801 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-ktlzn" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.518257 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.527931 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-ntvbd"] Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.563237 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zs7g4\" (UniqueName: \"kubernetes.io/projected/2ff07d5b-934f-40e1-a938-4c9c6d6bd846-kube-api-access-zs7g4\") pod \"horizon-7f474b7d57-rss5x\" (UID: \"2ff07d5b-934f-40e1-a938-4c9c6d6bd846\") " pod="openstack/horizon-7f474b7d57-rss5x" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.580357 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e5455e9-fa02-46bd-8786-1888543b55cc-combined-ca-bundle\") pod \"neutron-db-sync-ntvbd\" (UID: \"4e5455e9-fa02-46bd-8786-1888543b55cc\") " pod="openstack/neutron-db-sync-ntvbd" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.580444 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/deb8e24b-1a6f-4173-9a5f-62974b0331a5-config-data\") pod \"cinder-db-sync-sr27s\" (UID: 
\"deb8e24b-1a6f-4173-9a5f-62974b0331a5\") " pod="openstack/cinder-db-sync-sr27s" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.580471 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/deb8e24b-1a6f-4173-9a5f-62974b0331a5-db-sync-config-data\") pod \"cinder-db-sync-sr27s\" (UID: \"deb8e24b-1a6f-4173-9a5f-62974b0331a5\") " pod="openstack/cinder-db-sync-sr27s" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.580491 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4e5455e9-fa02-46bd-8786-1888543b55cc-config\") pod \"neutron-db-sync-ntvbd\" (UID: \"4e5455e9-fa02-46bd-8786-1888543b55cc\") " pod="openstack/neutron-db-sync-ntvbd" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.580526 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/deb8e24b-1a6f-4173-9a5f-62974b0331a5-combined-ca-bundle\") pod \"cinder-db-sync-sr27s\" (UID: \"deb8e24b-1a6f-4173-9a5f-62974b0331a5\") " pod="openstack/cinder-db-sync-sr27s" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.580561 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqf5r\" (UniqueName: \"kubernetes.io/projected/4e5455e9-fa02-46bd-8786-1888543b55cc-kube-api-access-tqf5r\") pod \"neutron-db-sync-ntvbd\" (UID: \"4e5455e9-fa02-46bd-8786-1888543b55cc\") " pod="openstack/neutron-db-sync-ntvbd" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.580595 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/deb8e24b-1a6f-4173-9a5f-62974b0331a5-scripts\") pod \"cinder-db-sync-sr27s\" (UID: \"deb8e24b-1a6f-4173-9a5f-62974b0331a5\") " pod="openstack/cinder-db-sync-sr27s" Mar 09 09:24:39 crc 
kubenswrapper[4861]: I0309 09:24:39.580612 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vtc75\" (UniqueName: \"kubernetes.io/projected/deb8e24b-1a6f-4173-9a5f-62974b0331a5-kube-api-access-vtc75\") pod \"cinder-db-sync-sr27s\" (UID: \"deb8e24b-1a6f-4173-9a5f-62974b0331a5\") " pod="openstack/cinder-db-sync-sr27s" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.580643 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/deb8e24b-1a6f-4173-9a5f-62974b0331a5-etc-machine-id\") pod \"cinder-db-sync-sr27s\" (UID: \"deb8e24b-1a6f-4173-9a5f-62974b0331a5\") " pod="openstack/cinder-db-sync-sr27s" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.580723 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/deb8e24b-1a6f-4173-9a5f-62974b0331a5-etc-machine-id\") pod \"cinder-db-sync-sr27s\" (UID: \"deb8e24b-1a6f-4173-9a5f-62974b0331a5\") " pod="openstack/cinder-db-sync-sr27s" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.581093 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7f474b7d57-rss5x" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.588437 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.618128 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/deb8e24b-1a6f-4173-9a5f-62974b0331a5-db-sync-config-data\") pod \"cinder-db-sync-sr27s\" (UID: \"deb8e24b-1a6f-4173-9a5f-62974b0331a5\") " pod="openstack/cinder-db-sync-sr27s" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.618651 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/deb8e24b-1a6f-4173-9a5f-62974b0331a5-combined-ca-bundle\") pod \"cinder-db-sync-sr27s\" (UID: \"deb8e24b-1a6f-4173-9a5f-62974b0331a5\") " pod="openstack/cinder-db-sync-sr27s" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.618923 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/deb8e24b-1a6f-4173-9a5f-62974b0331a5-scripts\") pod \"cinder-db-sync-sr27s\" (UID: \"deb8e24b-1a6f-4173-9a5f-62974b0331a5\") " pod="openstack/cinder-db-sync-sr27s" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.622240 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/deb8e24b-1a6f-4173-9a5f-62974b0331a5-config-data\") pod \"cinder-db-sync-sr27s\" (UID: \"deb8e24b-1a6f-4173-9a5f-62974b0331a5\") " pod="openstack/cinder-db-sync-sr27s" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.659006 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtc75\" (UniqueName: \"kubernetes.io/projected/deb8e24b-1a6f-4173-9a5f-62974b0331a5-kube-api-access-vtc75\") pod \"cinder-db-sync-sr27s\" (UID: \"deb8e24b-1a6f-4173-9a5f-62974b0331a5\") " 
pod="openstack/cinder-db-sync-sr27s" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.662247 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-sr27s" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.673780 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.678333 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.682402 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.684307 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4e5455e9-fa02-46bd-8786-1888543b55cc-config\") pod \"neutron-db-sync-ntvbd\" (UID: \"4e5455e9-fa02-46bd-8786-1888543b55cc\") " pod="openstack/neutron-db-sync-ntvbd" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.684556 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqf5r\" (UniqueName: \"kubernetes.io/projected/4e5455e9-fa02-46bd-8786-1888543b55cc-kube-api-access-tqf5r\") pod \"neutron-db-sync-ntvbd\" (UID: \"4e5455e9-fa02-46bd-8786-1888543b55cc\") " pod="openstack/neutron-db-sync-ntvbd" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.684793 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e5455e9-fa02-46bd-8786-1888543b55cc-combined-ca-bundle\") pod \"neutron-db-sync-ntvbd\" (UID: \"4e5455e9-fa02-46bd-8786-1888543b55cc\") " pod="openstack/neutron-db-sync-ntvbd" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.690124 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/4e5455e9-fa02-46bd-8786-1888543b55cc-combined-ca-bundle\") pod \"neutron-db-sync-ntvbd\" (UID: \"4e5455e9-fa02-46bd-8786-1888543b55cc\") " pod="openstack/neutron-db-sync-ntvbd" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.671470 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.701647 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/4e5455e9-fa02-46bd-8786-1888543b55cc-config\") pod \"neutron-db-sync-ntvbd\" (UID: \"4e5455e9-fa02-46bd-8786-1888543b55cc\") " pod="openstack/neutron-db-sync-ntvbd" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.796231 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3b893f89-a9bc-4a39-bd26-b394cbb0a374-run-httpd\") pod \"ceilometer-0\" (UID: \"3b893f89-a9bc-4a39-bd26-b394cbb0a374\") " pod="openstack/ceilometer-0" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.796621 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3b893f89-a9bc-4a39-bd26-b394cbb0a374-scripts\") pod \"ceilometer-0\" (UID: \"3b893f89-a9bc-4a39-bd26-b394cbb0a374\") " pod="openstack/ceilometer-0" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.796676 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jf4rm\" (UniqueName: \"kubernetes.io/projected/3b893f89-a9bc-4a39-bd26-b394cbb0a374-kube-api-access-jf4rm\") pod \"ceilometer-0\" (UID: \"3b893f89-a9bc-4a39-bd26-b394cbb0a374\") " pod="openstack/ceilometer-0" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.796707 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3b893f89-a9bc-4a39-bd26-b394cbb0a374-log-httpd\") pod \"ceilometer-0\" (UID: \"3b893f89-a9bc-4a39-bd26-b394cbb0a374\") " pod="openstack/ceilometer-0" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.796736 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b893f89-a9bc-4a39-bd26-b394cbb0a374-config-data\") pod \"ceilometer-0\" (UID: \"3b893f89-a9bc-4a39-bd26-b394cbb0a374\") " pod="openstack/ceilometer-0" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.796765 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b893f89-a9bc-4a39-bd26-b394cbb0a374-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3b893f89-a9bc-4a39-bd26-b394cbb0a374\") " pod="openstack/ceilometer-0" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.796824 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3b893f89-a9bc-4a39-bd26-b394cbb0a374-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3b893f89-a9bc-4a39-bd26-b394cbb0a374\") " pod="openstack/ceilometer-0" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.816611 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqf5r\" (UniqueName: \"kubernetes.io/projected/4e5455e9-fa02-46bd-8786-1888543b55cc-kube-api-access-tqf5r\") pod \"neutron-db-sync-ntvbd\" (UID: \"4e5455e9-fa02-46bd-8786-1888543b55cc\") " pod="openstack/neutron-db-sync-ntvbd" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.849876 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-8wrdd"] Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.851212 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-8wrdd" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.856051 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.857397 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-rqcfw" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.867921 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-8wrdd"] Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.873050 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-75c886f8b5-wq27s" podUID="e2c54f63-d580-4793-bb62-e293cdef7ae3" containerName="dnsmasq-dns" containerID="cri-o://d5a7fec222bc50f31223a619c031d95f1d36876e8287e2b77be3256b87578703" gracePeriod=10 Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.881423 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-648545bc7-vkgcg"] Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.882849 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-648545bc7-vkgcg" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.890001 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-648545bc7-vkgcg"] Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.899148 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jf4rm\" (UniqueName: \"kubernetes.io/projected/3b893f89-a9bc-4a39-bd26-b394cbb0a374-kube-api-access-jf4rm\") pod \"ceilometer-0\" (UID: \"3b893f89-a9bc-4a39-bd26-b394cbb0a374\") " pod="openstack/ceilometer-0" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.899206 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3b893f89-a9bc-4a39-bd26-b394cbb0a374-log-httpd\") pod \"ceilometer-0\" (UID: \"3b893f89-a9bc-4a39-bd26-b394cbb0a374\") " pod="openstack/ceilometer-0" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.899230 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/44c35b48-50b9-4dd8-846a-99714c14d3ab-db-sync-config-data\") pod \"barbican-db-sync-8wrdd\" (UID: \"44c35b48-50b9-4dd8-846a-99714c14d3ab\") " pod="openstack/barbican-db-sync-8wrdd" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.899261 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4drhb\" (UniqueName: \"kubernetes.io/projected/44c35b48-50b9-4dd8-846a-99714c14d3ab-kube-api-access-4drhb\") pod \"barbican-db-sync-8wrdd\" (UID: \"44c35b48-50b9-4dd8-846a-99714c14d3ab\") " pod="openstack/barbican-db-sync-8wrdd" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.899277 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/3b893f89-a9bc-4a39-bd26-b394cbb0a374-config-data\") pod \"ceilometer-0\" (UID: \"3b893f89-a9bc-4a39-bd26-b394cbb0a374\") " pod="openstack/ceilometer-0" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.899302 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b893f89-a9bc-4a39-bd26-b394cbb0a374-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3b893f89-a9bc-4a39-bd26-b394cbb0a374\") " pod="openstack/ceilometer-0" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.899327 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44c35b48-50b9-4dd8-846a-99714c14d3ab-combined-ca-bundle\") pod \"barbican-db-sync-8wrdd\" (UID: \"44c35b48-50b9-4dd8-846a-99714c14d3ab\") " pod="openstack/barbican-db-sync-8wrdd" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.899348 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3b893f89-a9bc-4a39-bd26-b394cbb0a374-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3b893f89-a9bc-4a39-bd26-b394cbb0a374\") " pod="openstack/ceilometer-0" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.899400 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3b893f89-a9bc-4a39-bd26-b394cbb0a374-run-httpd\") pod \"ceilometer-0\" (UID: \"3b893f89-a9bc-4a39-bd26-b394cbb0a374\") " pod="openstack/ceilometer-0" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.899423 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3b893f89-a9bc-4a39-bd26-b394cbb0a374-scripts\") pod \"ceilometer-0\" (UID: \"3b893f89-a9bc-4a39-bd26-b394cbb0a374\") " pod="openstack/ceilometer-0" Mar 09 
09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.904013 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3b893f89-a9bc-4a39-bd26-b394cbb0a374-scripts\") pod \"ceilometer-0\" (UID: \"3b893f89-a9bc-4a39-bd26-b394cbb0a374\") " pod="openstack/ceilometer-0" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.904608 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3b893f89-a9bc-4a39-bd26-b394cbb0a374-log-httpd\") pod \"ceilometer-0\" (UID: \"3b893f89-a9bc-4a39-bd26-b394cbb0a374\") " pod="openstack/ceilometer-0" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.905647 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5985c59c55-2fk4t"] Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.911722 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b893f89-a9bc-4a39-bd26-b394cbb0a374-config-data\") pod \"ceilometer-0\" (UID: \"3b893f89-a9bc-4a39-bd26-b394cbb0a374\") " pod="openstack/ceilometer-0" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.912391 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3b893f89-a9bc-4a39-bd26-b394cbb0a374-run-httpd\") pod \"ceilometer-0\" (UID: \"3b893f89-a9bc-4a39-bd26-b394cbb0a374\") " pod="openstack/ceilometer-0" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.913876 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3b893f89-a9bc-4a39-bd26-b394cbb0a374-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3b893f89-a9bc-4a39-bd26-b394cbb0a374\") " pod="openstack/ceilometer-0" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.913934 4861 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/placement-db-sync-k46zw"] Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.915686 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-k46zw" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.926818 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-k46zw"] Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.939986 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-ccd7c9f8f-j7b59"] Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.942701 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-ccd7c9f8f-j7b59" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.950463 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-ccd7c9f8f-j7b59"] Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.956628 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.956816 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-6mklb" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.956936 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.957029 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b893f89-a9bc-4a39-bd26-b394cbb0a374-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3b893f89-a9bc-4a39-bd26-b394cbb0a374\") " pod="openstack/ceilometer-0" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.957360 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-ntvbd" Mar 09 09:24:39 crc kubenswrapper[4861]: I0309 09:24:39.958118 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jf4rm\" (UniqueName: \"kubernetes.io/projected/3b893f89-a9bc-4a39-bd26-b394cbb0a374-kube-api-access-jf4rm\") pod \"ceilometer-0\" (UID: \"3b893f89-a9bc-4a39-bd26-b394cbb0a374\") " pod="openstack/ceilometer-0" Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.000186 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.001695 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldwqp\" (UniqueName: \"kubernetes.io/projected/c9820d89-3a89-4982-8520-f23dd0d099ad-kube-api-access-ldwqp\") pod \"placement-db-sync-k46zw\" (UID: \"c9820d89-3a89-4982-8520-f23dd0d099ad\") " pod="openstack/placement-db-sync-k46zw" Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.001744 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9820d89-3a89-4982-8520-f23dd0d099ad-logs\") pod \"placement-db-sync-k46zw\" (UID: \"c9820d89-3a89-4982-8520-f23dd0d099ad\") " pod="openstack/placement-db-sync-k46zw" Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.001771 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44c35b48-50b9-4dd8-846a-99714c14d3ab-combined-ca-bundle\") pod \"barbican-db-sync-8wrdd\" (UID: \"44c35b48-50b9-4dd8-846a-99714c14d3ab\") " pod="openstack/barbican-db-sync-8wrdd" Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.001794 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/236224b4-5b63-4290-8f78-6367a7567dd5-config-data\") pod \"horizon-648545bc7-vkgcg\" (UID: \"236224b4-5b63-4290-8f78-6367a7567dd5\") " pod="openstack/horizon-648545bc7-vkgcg" Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.001812 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9820d89-3a89-4982-8520-f23dd0d099ad-config-data\") pod \"placement-db-sync-k46zw\" (UID: \"c9820d89-3a89-4982-8520-f23dd0d099ad\") " pod="openstack/placement-db-sync-k46zw" Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.001842 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/236224b4-5b63-4290-8f78-6367a7567dd5-scripts\") pod \"horizon-648545bc7-vkgcg\" (UID: \"236224b4-5b63-4290-8f78-6367a7567dd5\") " pod="openstack/horizon-648545bc7-vkgcg" Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.001861 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhcns\" (UniqueName: \"kubernetes.io/projected/ebf4ebf5-adc0-48b4-b80b-0b0d88e64910-kube-api-access-qhcns\") pod \"dnsmasq-dns-ccd7c9f8f-j7b59\" (UID: \"ebf4ebf5-adc0-48b4-b80b-0b0d88e64910\") " pod="openstack/dnsmasq-dns-ccd7c9f8f-j7b59" Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.001896 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9820d89-3a89-4982-8520-f23dd0d099ad-scripts\") pod \"placement-db-sync-k46zw\" (UID: \"c9820d89-3a89-4982-8520-f23dd0d099ad\") " pod="openstack/placement-db-sync-k46zw" Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.001918 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/ebf4ebf5-adc0-48b4-b80b-0b0d88e64910-ovsdbserver-sb\") pod \"dnsmasq-dns-ccd7c9f8f-j7b59\" (UID: \"ebf4ebf5-adc0-48b4-b80b-0b0d88e64910\") " pod="openstack/dnsmasq-dns-ccd7c9f8f-j7b59" Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.001938 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/236224b4-5b63-4290-8f78-6367a7567dd5-horizon-secret-key\") pod \"horizon-648545bc7-vkgcg\" (UID: \"236224b4-5b63-4290-8f78-6367a7567dd5\") " pod="openstack/horizon-648545bc7-vkgcg" Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.001960 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9820d89-3a89-4982-8520-f23dd0d099ad-combined-ca-bundle\") pod \"placement-db-sync-k46zw\" (UID: \"c9820d89-3a89-4982-8520-f23dd0d099ad\") " pod="openstack/placement-db-sync-k46zw" Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.001981 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgf9l\" (UniqueName: \"kubernetes.io/projected/236224b4-5b63-4290-8f78-6367a7567dd5-kube-api-access-sgf9l\") pod \"horizon-648545bc7-vkgcg\" (UID: \"236224b4-5b63-4290-8f78-6367a7567dd5\") " pod="openstack/horizon-648545bc7-vkgcg" Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.002016 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ebf4ebf5-adc0-48b4-b80b-0b0d88e64910-dns-svc\") pod \"dnsmasq-dns-ccd7c9f8f-j7b59\" (UID: \"ebf4ebf5-adc0-48b4-b80b-0b0d88e64910\") " pod="openstack/dnsmasq-dns-ccd7c9f8f-j7b59" Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.002041 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/ebf4ebf5-adc0-48b4-b80b-0b0d88e64910-config\") pod \"dnsmasq-dns-ccd7c9f8f-j7b59\" (UID: \"ebf4ebf5-adc0-48b4-b80b-0b0d88e64910\") " pod="openstack/dnsmasq-dns-ccd7c9f8f-j7b59" Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.002062 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/236224b4-5b63-4290-8f78-6367a7567dd5-logs\") pod \"horizon-648545bc7-vkgcg\" (UID: \"236224b4-5b63-4290-8f78-6367a7567dd5\") " pod="openstack/horizon-648545bc7-vkgcg" Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.002095 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ebf4ebf5-adc0-48b4-b80b-0b0d88e64910-dns-swift-storage-0\") pod \"dnsmasq-dns-ccd7c9f8f-j7b59\" (UID: \"ebf4ebf5-adc0-48b4-b80b-0b0d88e64910\") " pod="openstack/dnsmasq-dns-ccd7c9f8f-j7b59" Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.002113 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/44c35b48-50b9-4dd8-846a-99714c14d3ab-db-sync-config-data\") pod \"barbican-db-sync-8wrdd\" (UID: \"44c35b48-50b9-4dd8-846a-99714c14d3ab\") " pod="openstack/barbican-db-sync-8wrdd" Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.002139 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ebf4ebf5-adc0-48b4-b80b-0b0d88e64910-ovsdbserver-nb\") pod \"dnsmasq-dns-ccd7c9f8f-j7b59\" (UID: \"ebf4ebf5-adc0-48b4-b80b-0b0d88e64910\") " pod="openstack/dnsmasq-dns-ccd7c9f8f-j7b59" Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.002158 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4drhb\" (UniqueName: 
\"kubernetes.io/projected/44c35b48-50b9-4dd8-846a-99714c14d3ab-kube-api-access-4drhb\") pod \"barbican-db-sync-8wrdd\" (UID: \"44c35b48-50b9-4dd8-846a-99714c14d3ab\") " pod="openstack/barbican-db-sync-8wrdd" Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.003168 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.006652 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.006980 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.007192 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-vwtrw" Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.007433 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.007676 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44c35b48-50b9-4dd8-846a-99714c14d3ab-combined-ca-bundle\") pod \"barbican-db-sync-8wrdd\" (UID: \"44c35b48-50b9-4dd8-846a-99714c14d3ab\") " pod="openstack/barbican-db-sync-8wrdd" Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.010059 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.019402 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/44c35b48-50b9-4dd8-846a-99714c14d3ab-db-sync-config-data\") pod \"barbican-db-sync-8wrdd\" (UID: \"44c35b48-50b9-4dd8-846a-99714c14d3ab\") " pod="openstack/barbican-db-sync-8wrdd" Mar 
09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.039270 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4drhb\" (UniqueName: \"kubernetes.io/projected/44c35b48-50b9-4dd8-846a-99714c14d3ab-kube-api-access-4drhb\") pod \"barbican-db-sync-8wrdd\" (UID: \"44c35b48-50b9-4dd8-846a-99714c14d3ab\") " pod="openstack/barbican-db-sync-8wrdd" Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.097649 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.099044 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.104144 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.105877 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prvxq\" (UniqueName: \"kubernetes.io/projected/0e84a83c-79e9-4576-9586-656933127b06-kube-api-access-prvxq\") pod \"glance-default-external-api-0\" (UID: \"0e84a83c-79e9-4576-9586-656933127b06\") " pod="openstack/glance-default-external-api-0" Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.105929 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9820d89-3a89-4982-8520-f23dd0d099ad-scripts\") pod \"placement-db-sync-k46zw\" (UID: \"c9820d89-3a89-4982-8520-f23dd0d099ad\") " pod="openstack/placement-db-sync-k46zw" Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.105957 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ebf4ebf5-adc0-48b4-b80b-0b0d88e64910-ovsdbserver-sb\") pod \"dnsmasq-dns-ccd7c9f8f-j7b59\" (UID: 
\"ebf4ebf5-adc0-48b4-b80b-0b0d88e64910\") " pod="openstack/dnsmasq-dns-ccd7c9f8f-j7b59" Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.105976 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/236224b4-5b63-4290-8f78-6367a7567dd5-horizon-secret-key\") pod \"horizon-648545bc7-vkgcg\" (UID: \"236224b4-5b63-4290-8f78-6367a7567dd5\") " pod="openstack/horizon-648545bc7-vkgcg" Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.105998 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9820d89-3a89-4982-8520-f23dd0d099ad-combined-ca-bundle\") pod \"placement-db-sync-k46zw\" (UID: \"c9820d89-3a89-4982-8520-f23dd0d099ad\") " pod="openstack/placement-db-sync-k46zw" Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.106020 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sgf9l\" (UniqueName: \"kubernetes.io/projected/236224b4-5b63-4290-8f78-6367a7567dd5-kube-api-access-sgf9l\") pod \"horizon-648545bc7-vkgcg\" (UID: \"236224b4-5b63-4290-8f78-6367a7567dd5\") " pod="openstack/horizon-648545bc7-vkgcg" Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.106045 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ebf4ebf5-adc0-48b4-b80b-0b0d88e64910-dns-svc\") pod \"dnsmasq-dns-ccd7c9f8f-j7b59\" (UID: \"ebf4ebf5-adc0-48b4-b80b-0b0d88e64910\") " pod="openstack/dnsmasq-dns-ccd7c9f8f-j7b59" Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.106063 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e84a83c-79e9-4576-9586-656933127b06-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"0e84a83c-79e9-4576-9586-656933127b06\") " 
pod="openstack/glance-default-external-api-0" Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.106091 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebf4ebf5-adc0-48b4-b80b-0b0d88e64910-config\") pod \"dnsmasq-dns-ccd7c9f8f-j7b59\" (UID: \"ebf4ebf5-adc0-48b4-b80b-0b0d88e64910\") " pod="openstack/dnsmasq-dns-ccd7c9f8f-j7b59" Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.106126 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e84a83c-79e9-4576-9586-656933127b06-logs\") pod \"glance-default-external-api-0\" (UID: \"0e84a83c-79e9-4576-9586-656933127b06\") " pod="openstack/glance-default-external-api-0" Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.106149 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/236224b4-5b63-4290-8f78-6367a7567dd5-logs\") pod \"horizon-648545bc7-vkgcg\" (UID: \"236224b4-5b63-4290-8f78-6367a7567dd5\") " pod="openstack/horizon-648545bc7-vkgcg" Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.106171 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ebf4ebf5-adc0-48b4-b80b-0b0d88e64910-dns-swift-storage-0\") pod \"dnsmasq-dns-ccd7c9f8f-j7b59\" (UID: \"ebf4ebf5-adc0-48b4-b80b-0b0d88e64910\") " pod="openstack/dnsmasq-dns-ccd7c9f8f-j7b59" Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.106192 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0e84a83c-79e9-4576-9586-656933127b06-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"0e84a83c-79e9-4576-9586-656933127b06\") " pod="openstack/glance-default-external-api-0" Mar 09 09:24:40 crc 
kubenswrapper[4861]: I0309 09:24:40.106228 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ebf4ebf5-adc0-48b4-b80b-0b0d88e64910-ovsdbserver-nb\") pod \"dnsmasq-dns-ccd7c9f8f-j7b59\" (UID: \"ebf4ebf5-adc0-48b4-b80b-0b0d88e64910\") " pod="openstack/dnsmasq-dns-ccd7c9f8f-j7b59" Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.106259 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ldwqp\" (UniqueName: \"kubernetes.io/projected/c9820d89-3a89-4982-8520-f23dd0d099ad-kube-api-access-ldwqp\") pod \"placement-db-sync-k46zw\" (UID: \"c9820d89-3a89-4982-8520-f23dd0d099ad\") " pod="openstack/placement-db-sync-k46zw" Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.106281 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e84a83c-79e9-4576-9586-656933127b06-config-data\") pod \"glance-default-external-api-0\" (UID: \"0e84a83c-79e9-4576-9586-656933127b06\") " pod="openstack/glance-default-external-api-0" Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.106301 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"0e84a83c-79e9-4576-9586-656933127b06\") " pod="openstack/glance-default-external-api-0" Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.106317 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0e84a83c-79e9-4576-9586-656933127b06-scripts\") pod \"glance-default-external-api-0\" (UID: \"0e84a83c-79e9-4576-9586-656933127b06\") " pod="openstack/glance-default-external-api-0" Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 
09:24:40.106350 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9820d89-3a89-4982-8520-f23dd0d099ad-logs\") pod \"placement-db-sync-k46zw\" (UID: \"c9820d89-3a89-4982-8520-f23dd0d099ad\") " pod="openstack/placement-db-sync-k46zw" Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.106400 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/236224b4-5b63-4290-8f78-6367a7567dd5-config-data\") pod \"horizon-648545bc7-vkgcg\" (UID: \"236224b4-5b63-4290-8f78-6367a7567dd5\") " pod="openstack/horizon-648545bc7-vkgcg" Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.106416 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9820d89-3a89-4982-8520-f23dd0d099ad-config-data\") pod \"placement-db-sync-k46zw\" (UID: \"c9820d89-3a89-4982-8520-f23dd0d099ad\") " pod="openstack/placement-db-sync-k46zw" Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.106443 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/236224b4-5b63-4290-8f78-6367a7567dd5-scripts\") pod \"horizon-648545bc7-vkgcg\" (UID: \"236224b4-5b63-4290-8f78-6367a7567dd5\") " pod="openstack/horizon-648545bc7-vkgcg" Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.107104 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e84a83c-79e9-4576-9586-656933127b06-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"0e84a83c-79e9-4576-9586-656933127b06\") " pod="openstack/glance-default-external-api-0" Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.107133 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-qhcns\" (UniqueName: \"kubernetes.io/projected/ebf4ebf5-adc0-48b4-b80b-0b0d88e64910-kube-api-access-qhcns\") pod \"dnsmasq-dns-ccd7c9f8f-j7b59\" (UID: \"ebf4ebf5-adc0-48b4-b80b-0b0d88e64910\") " pod="openstack/dnsmasq-dns-ccd7c9f8f-j7b59" Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.107989 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.108568 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.109480 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ebf4ebf5-adc0-48b4-b80b-0b0d88e64910-ovsdbserver-sb\") pod \"dnsmasq-dns-ccd7c9f8f-j7b59\" (UID: \"ebf4ebf5-adc0-48b4-b80b-0b0d88e64910\") " pod="openstack/dnsmasq-dns-ccd7c9f8f-j7b59" Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.109715 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ebf4ebf5-adc0-48b4-b80b-0b0d88e64910-ovsdbserver-nb\") pod \"dnsmasq-dns-ccd7c9f8f-j7b59\" (UID: \"ebf4ebf5-adc0-48b4-b80b-0b0d88e64910\") " pod="openstack/dnsmasq-dns-ccd7c9f8f-j7b59" Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.109986 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9820d89-3a89-4982-8520-f23dd0d099ad-logs\") pod \"placement-db-sync-k46zw\" (UID: \"c9820d89-3a89-4982-8520-f23dd0d099ad\") " pod="openstack/placement-db-sync-k46zw" Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.110754 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ebf4ebf5-adc0-48b4-b80b-0b0d88e64910-dns-swift-storage-0\") pod 
\"dnsmasq-dns-ccd7c9f8f-j7b59\" (UID: \"ebf4ebf5-adc0-48b4-b80b-0b0d88e64910\") " pod="openstack/dnsmasq-dns-ccd7c9f8f-j7b59" Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.111023 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/236224b4-5b63-4290-8f78-6367a7567dd5-scripts\") pod \"horizon-648545bc7-vkgcg\" (UID: \"236224b4-5b63-4290-8f78-6367a7567dd5\") " pod="openstack/horizon-648545bc7-vkgcg" Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.111493 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ebf4ebf5-adc0-48b4-b80b-0b0d88e64910-dns-svc\") pod \"dnsmasq-dns-ccd7c9f8f-j7b59\" (UID: \"ebf4ebf5-adc0-48b4-b80b-0b0d88e64910\") " pod="openstack/dnsmasq-dns-ccd7c9f8f-j7b59" Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.111745 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/236224b4-5b63-4290-8f78-6367a7567dd5-logs\") pod \"horizon-648545bc7-vkgcg\" (UID: \"236224b4-5b63-4290-8f78-6367a7567dd5\") " pod="openstack/horizon-648545bc7-vkgcg" Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.112928 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/236224b4-5b63-4290-8f78-6367a7567dd5-config-data\") pod \"horizon-648545bc7-vkgcg\" (UID: \"236224b4-5b63-4290-8f78-6367a7567dd5\") " pod="openstack/horizon-648545bc7-vkgcg" Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.114447 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebf4ebf5-adc0-48b4-b80b-0b0d88e64910-config\") pod \"dnsmasq-dns-ccd7c9f8f-j7b59\" (UID: \"ebf4ebf5-adc0-48b4-b80b-0b0d88e64910\") " pod="openstack/dnsmasq-dns-ccd7c9f8f-j7b59" Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.121016 4861 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.132022 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9820d89-3a89-4982-8520-f23dd0d099ad-config-data\") pod \"placement-db-sync-k46zw\" (UID: \"c9820d89-3a89-4982-8520-f23dd0d099ad\") " pod="openstack/placement-db-sync-k46zw" Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.139274 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldwqp\" (UniqueName: \"kubernetes.io/projected/c9820d89-3a89-4982-8520-f23dd0d099ad-kube-api-access-ldwqp\") pod \"placement-db-sync-k46zw\" (UID: \"c9820d89-3a89-4982-8520-f23dd0d099ad\") " pod="openstack/placement-db-sync-k46zw" Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.144568 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9820d89-3a89-4982-8520-f23dd0d099ad-scripts\") pod \"placement-db-sync-k46zw\" (UID: \"c9820d89-3a89-4982-8520-f23dd0d099ad\") " pod="openstack/placement-db-sync-k46zw" Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.144582 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgf9l\" (UniqueName: \"kubernetes.io/projected/236224b4-5b63-4290-8f78-6367a7567dd5-kube-api-access-sgf9l\") pod \"horizon-648545bc7-vkgcg\" (UID: \"236224b4-5b63-4290-8f78-6367a7567dd5\") " pod="openstack/horizon-648545bc7-vkgcg" Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.144956 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/236224b4-5b63-4290-8f78-6367a7567dd5-horizon-secret-key\") pod \"horizon-648545bc7-vkgcg\" (UID: \"236224b4-5b63-4290-8f78-6367a7567dd5\") " pod="openstack/horizon-648545bc7-vkgcg" Mar 09 09:24:40 crc 
kubenswrapper[4861]: I0309 09:24:40.146630 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9820d89-3a89-4982-8520-f23dd0d099ad-combined-ca-bundle\") pod \"placement-db-sync-k46zw\" (UID: \"c9820d89-3a89-4982-8520-f23dd0d099ad\") " pod="openstack/placement-db-sync-k46zw" Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.147074 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhcns\" (UniqueName: \"kubernetes.io/projected/ebf4ebf5-adc0-48b4-b80b-0b0d88e64910-kube-api-access-qhcns\") pod \"dnsmasq-dns-ccd7c9f8f-j7b59\" (UID: \"ebf4ebf5-adc0-48b4-b80b-0b0d88e64910\") " pod="openstack/dnsmasq-dns-ccd7c9f8f-j7b59" Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.194215 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-8wrdd" Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.209816 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e84a83c-79e9-4576-9586-656933127b06-logs\") pod \"glance-default-external-api-0\" (UID: \"0e84a83c-79e9-4576-9586-656933127b06\") " pod="openstack/glance-default-external-api-0" Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.210032 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0e84a83c-79e9-4576-9586-656933127b06-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"0e84a83c-79e9-4576-9586-656933127b06\") " pod="openstack/glance-default-external-api-0" Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.210148 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e84a83c-79e9-4576-9586-656933127b06-config-data\") pod \"glance-default-external-api-0\" (UID: 
\"0e84a83c-79e9-4576-9586-656933127b06\") " pod="openstack/glance-default-external-api-0" Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.210216 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"0e84a83c-79e9-4576-9586-656933127b06\") " pod="openstack/glance-default-external-api-0" Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.210295 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0e84a83c-79e9-4576-9586-656933127b06-scripts\") pod \"glance-default-external-api-0\" (UID: \"0e84a83c-79e9-4576-9586-656933127b06\") " pod="openstack/glance-default-external-api-0" Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.211177 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e84a83c-79e9-4576-9586-656933127b06-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"0e84a83c-79e9-4576-9586-656933127b06\") " pod="openstack/glance-default-external-api-0" Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.211430 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-prvxq\" (UniqueName: \"kubernetes.io/projected/0e84a83c-79e9-4576-9586-656933127b06-kube-api-access-prvxq\") pod \"glance-default-external-api-0\" (UID: \"0e84a83c-79e9-4576-9586-656933127b06\") " pod="openstack/glance-default-external-api-0" Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.218281 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e84a83c-79e9-4576-9586-656933127b06-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"0e84a83c-79e9-4576-9586-656933127b06\") " 
pod="openstack/glance-default-external-api-0" Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.220481 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e84a83c-79e9-4576-9586-656933127b06-logs\") pod \"glance-default-external-api-0\" (UID: \"0e84a83c-79e9-4576-9586-656933127b06\") " pod="openstack/glance-default-external-api-0" Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.222803 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0e84a83c-79e9-4576-9586-656933127b06-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"0e84a83c-79e9-4576-9586-656933127b06\") " pod="openstack/glance-default-external-api-0" Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.223740 4861 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"0e84a83c-79e9-4576-9586-656933127b06\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-external-api-0" Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.224869 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e84a83c-79e9-4576-9586-656933127b06-config-data\") pod \"glance-default-external-api-0\" (UID: \"0e84a83c-79e9-4576-9586-656933127b06\") " pod="openstack/glance-default-external-api-0" Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.242851 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-648545bc7-vkgcg" Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.250033 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0e84a83c-79e9-4576-9586-656933127b06-scripts\") pod \"glance-default-external-api-0\" (UID: \"0e84a83c-79e9-4576-9586-656933127b06\") " pod="openstack/glance-default-external-api-0" Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.250835 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-prvxq\" (UniqueName: \"kubernetes.io/projected/0e84a83c-79e9-4576-9586-656933127b06-kube-api-access-prvxq\") pod \"glance-default-external-api-0\" (UID: \"0e84a83c-79e9-4576-9586-656933127b06\") " pod="openstack/glance-default-external-api-0" Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.251231 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e84a83c-79e9-4576-9586-656933127b06-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"0e84a83c-79e9-4576-9586-656933127b06\") " pod="openstack/glance-default-external-api-0" Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.255606 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"0e84a83c-79e9-4576-9586-656933127b06\") " pod="openstack/glance-default-external-api-0" Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.265517 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e84a83c-79e9-4576-9586-656933127b06-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"0e84a83c-79e9-4576-9586-656933127b06\") " pod="openstack/glance-default-external-api-0" Mar 09 09:24:40 crc kubenswrapper[4861]: 
I0309 09:24:40.304084 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-k46zw" Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.319763 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/868b4cd9-0c47-4173-b2f6-710c28e73f16-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"868b4cd9-0c47-4173-b2f6-710c28e73f16\") " pod="openstack/glance-default-internal-api-0" Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.319843 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vww57\" (UniqueName: \"kubernetes.io/projected/868b4cd9-0c47-4173-b2f6-710c28e73f16-kube-api-access-vww57\") pod \"glance-default-internal-api-0\" (UID: \"868b4cd9-0c47-4173-b2f6-710c28e73f16\") " pod="openstack/glance-default-internal-api-0" Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.319878 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/868b4cd9-0c47-4173-b2f6-710c28e73f16-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"868b4cd9-0c47-4173-b2f6-710c28e73f16\") " pod="openstack/glance-default-internal-api-0" Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.319932 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"868b4cd9-0c47-4173-b2f6-710c28e73f16\") " pod="openstack/glance-default-internal-api-0" Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.320002 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/868b4cd9-0c47-4173-b2f6-710c28e73f16-config-data\") pod \"glance-default-internal-api-0\" (UID: \"868b4cd9-0c47-4173-b2f6-710c28e73f16\") " pod="openstack/glance-default-internal-api-0" Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.320038 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/868b4cd9-0c47-4173-b2f6-710c28e73f16-logs\") pod \"glance-default-internal-api-0\" (UID: \"868b4cd9-0c47-4173-b2f6-710c28e73f16\") " pod="openstack/glance-default-internal-api-0" Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.320062 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/868b4cd9-0c47-4173-b2f6-710c28e73f16-scripts\") pod \"glance-default-internal-api-0\" (UID: \"868b4cd9-0c47-4173-b2f6-710c28e73f16\") " pod="openstack/glance-default-internal-api-0" Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.320088 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/868b4cd9-0c47-4173-b2f6-710c28e73f16-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"868b4cd9-0c47-4173-b2f6-710c28e73f16\") " pod="openstack/glance-default-internal-api-0" Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.359734 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-ccd7c9f8f-j7b59" Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.427405 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/868b4cd9-0c47-4173-b2f6-710c28e73f16-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"868b4cd9-0c47-4173-b2f6-710c28e73f16\") " pod="openstack/glance-default-internal-api-0" Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.427671 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vww57\" (UniqueName: \"kubernetes.io/projected/868b4cd9-0c47-4173-b2f6-710c28e73f16-kube-api-access-vww57\") pod \"glance-default-internal-api-0\" (UID: \"868b4cd9-0c47-4173-b2f6-710c28e73f16\") " pod="openstack/glance-default-internal-api-0" Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.427779 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/868b4cd9-0c47-4173-b2f6-710c28e73f16-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"868b4cd9-0c47-4173-b2f6-710c28e73f16\") " pod="openstack/glance-default-internal-api-0" Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.427870 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"868b4cd9-0c47-4173-b2f6-710c28e73f16\") " pod="openstack/glance-default-internal-api-0" Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.427999 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/868b4cd9-0c47-4173-b2f6-710c28e73f16-config-data\") pod \"glance-default-internal-api-0\" (UID: \"868b4cd9-0c47-4173-b2f6-710c28e73f16\") " pod="openstack/glance-default-internal-api-0" 
Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.428112 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/868b4cd9-0c47-4173-b2f6-710c28e73f16-logs\") pod \"glance-default-internal-api-0\" (UID: \"868b4cd9-0c47-4173-b2f6-710c28e73f16\") " pod="openstack/glance-default-internal-api-0" Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.428200 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/868b4cd9-0c47-4173-b2f6-710c28e73f16-scripts\") pod \"glance-default-internal-api-0\" (UID: \"868b4cd9-0c47-4173-b2f6-710c28e73f16\") " pod="openstack/glance-default-internal-api-0" Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.428293 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/868b4cd9-0c47-4173-b2f6-710c28e73f16-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"868b4cd9-0c47-4173-b2f6-710c28e73f16\") " pod="openstack/glance-default-internal-api-0" Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.428534 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/868b4cd9-0c47-4173-b2f6-710c28e73f16-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"868b4cd9-0c47-4173-b2f6-710c28e73f16\") " pod="openstack/glance-default-internal-api-0" Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.430035 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/868b4cd9-0c47-4173-b2f6-710c28e73f16-logs\") pod \"glance-default-internal-api-0\" (UID: \"868b4cd9-0c47-4173-b2f6-710c28e73f16\") " pod="openstack/glance-default-internal-api-0" Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.430427 4861 operation_generator.go:580] "MountVolume.MountDevice 
succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"868b4cd9-0c47-4173-b2f6-710c28e73f16\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/glance-default-internal-api-0" Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.440268 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/868b4cd9-0c47-4173-b2f6-710c28e73f16-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"868b4cd9-0c47-4173-b2f6-710c28e73f16\") " pod="openstack/glance-default-internal-api-0" Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.445410 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/868b4cd9-0c47-4173-b2f6-710c28e73f16-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"868b4cd9-0c47-4173-b2f6-710c28e73f16\") " pod="openstack/glance-default-internal-api-0" Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.454611 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/868b4cd9-0c47-4173-b2f6-710c28e73f16-config-data\") pod \"glance-default-internal-api-0\" (UID: \"868b4cd9-0c47-4173-b2f6-710c28e73f16\") " pod="openstack/glance-default-internal-api-0" Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.474291 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/868b4cd9-0c47-4173-b2f6-710c28e73f16-scripts\") pod \"glance-default-internal-api-0\" (UID: \"868b4cd9-0c47-4173-b2f6-710c28e73f16\") " pod="openstack/glance-default-internal-api-0" Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.474299 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vww57\" (UniqueName: 
\"kubernetes.io/projected/868b4cd9-0c47-4173-b2f6-710c28e73f16-kube-api-access-vww57\") pod \"glance-default-internal-api-0\" (UID: \"868b4cd9-0c47-4173-b2f6-710c28e73f16\") " pod="openstack/glance-default-internal-api-0" Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.484263 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.539678 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"868b4cd9-0c47-4173-b2f6-710c28e73f16\") " pod="openstack/glance-default-internal-api-0" Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.592618 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5985c59c55-2fk4t"] Mar 09 09:24:40 crc kubenswrapper[4861]: W0309 09:24:40.656465 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1b84f271_fc53_4d64_ae57_acfd2a5b1696.slice/crio-933bd051735f7e87abe851669bd08b7af7dbbf5e3f79e121cb770afa96eb0fb3 WatchSource:0}: Error finding container 933bd051735f7e87abe851669bd08b7af7dbbf5e3f79e121cb770afa96eb0fb3: Status 404 returned error can't find the container with id 933bd051735f7e87abe851669bd08b7af7dbbf5e3f79e121cb770afa96eb0fb3 Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.698256 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-sr27s"] Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.745904 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75c886f8b5-wq27s" Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.799554 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.865828 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7f474b7d57-rss5x"] Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.872215 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-ktlzn"] Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.887501 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5985c59c55-2fk4t" event={"ID":"1b84f271-fc53-4d64-ae57-acfd2a5b1696","Type":"ContainerStarted","Data":"933bd051735f7e87abe851669bd08b7af7dbbf5e3f79e121cb770afa96eb0fb3"} Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.896896 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-sr27s" event={"ID":"deb8e24b-1a6f-4173-9a5f-62974b0331a5","Type":"ContainerStarted","Data":"d1519d1ea655a3e17773abb735d5d6d4849433a94b524594ee840073b82eb9fa"} Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.911528 4861 generic.go:334] "Generic (PLEG): container finished" podID="e2c54f63-d580-4793-bb62-e293cdef7ae3" containerID="d5a7fec222bc50f31223a619c031d95f1d36876e8287e2b77be3256b87578703" exitCode=0 Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.911573 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c886f8b5-wq27s" event={"ID":"e2c54f63-d580-4793-bb62-e293cdef7ae3","Type":"ContainerDied","Data":"d5a7fec222bc50f31223a619c031d95f1d36876e8287e2b77be3256b87578703"} Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.911601 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c886f8b5-wq27s" event={"ID":"e2c54f63-d580-4793-bb62-e293cdef7ae3","Type":"ContainerDied","Data":"be2b95886ed7e4e6a793ccadc23d178a6a11504b28bbef88c739899e893ffb00"} Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.911619 4861 scope.go:117] "RemoveContainer" 
containerID="d5a7fec222bc50f31223a619c031d95f1d36876e8287e2b77be3256b87578703" Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.911795 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75c886f8b5-wq27s" Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.939514 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e2c54f63-d580-4793-bb62-e293cdef7ae3-dns-svc\") pod \"e2c54f63-d580-4793-bb62-e293cdef7ae3\" (UID: \"e2c54f63-d580-4793-bb62-e293cdef7ae3\") " Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.939620 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2c54f63-d580-4793-bb62-e293cdef7ae3-config\") pod \"e2c54f63-d580-4793-bb62-e293cdef7ae3\" (UID: \"e2c54f63-d580-4793-bb62-e293cdef7ae3\") " Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.939649 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e2c54f63-d580-4793-bb62-e293cdef7ae3-dns-swift-storage-0\") pod \"e2c54f63-d580-4793-bb62-e293cdef7ae3\" (UID: \"e2c54f63-d580-4793-bb62-e293cdef7ae3\") " Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.939719 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9t9sh\" (UniqueName: \"kubernetes.io/projected/e2c54f63-d580-4793-bb62-e293cdef7ae3-kube-api-access-9t9sh\") pod \"e2c54f63-d580-4793-bb62-e293cdef7ae3\" (UID: \"e2c54f63-d580-4793-bb62-e293cdef7ae3\") " Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.939809 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e2c54f63-d580-4793-bb62-e293cdef7ae3-ovsdbserver-sb\") pod \"e2c54f63-d580-4793-bb62-e293cdef7ae3\" (UID: 
\"e2c54f63-d580-4793-bb62-e293cdef7ae3\") " Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.939836 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e2c54f63-d580-4793-bb62-e293cdef7ae3-ovsdbserver-nb\") pod \"e2c54f63-d580-4793-bb62-e293cdef7ae3\" (UID: \"e2c54f63-d580-4793-bb62-e293cdef7ae3\") " Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.963486 4861 scope.go:117] "RemoveContainer" containerID="d29bfac4c915a0273fce2c1c7f9964bf2bc24c64c52b27bc0f6173b430ed3cc6" Mar 09 09:24:40 crc kubenswrapper[4861]: I0309 09:24:40.964040 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2c54f63-d580-4793-bb62-e293cdef7ae3-kube-api-access-9t9sh" (OuterVolumeSpecName: "kube-api-access-9t9sh") pod "e2c54f63-d580-4793-bb62-e293cdef7ae3" (UID: "e2c54f63-d580-4793-bb62-e293cdef7ae3"). InnerVolumeSpecName "kube-api-access-9t9sh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:24:41 crc kubenswrapper[4861]: I0309 09:24:41.000204 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2c54f63-d580-4793-bb62-e293cdef7ae3-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "e2c54f63-d580-4793-bb62-e293cdef7ae3" (UID: "e2c54f63-d580-4793-bb62-e293cdef7ae3"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:24:41 crc kubenswrapper[4861]: I0309 09:24:41.002639 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2c54f63-d580-4793-bb62-e293cdef7ae3-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e2c54f63-d580-4793-bb62-e293cdef7ae3" (UID: "e2c54f63-d580-4793-bb62-e293cdef7ae3"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:24:41 crc kubenswrapper[4861]: I0309 09:24:41.004123 4861 scope.go:117] "RemoveContainer" containerID="d5a7fec222bc50f31223a619c031d95f1d36876e8287e2b77be3256b87578703" Mar 09 09:24:41 crc kubenswrapper[4861]: E0309 09:24:41.005024 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5a7fec222bc50f31223a619c031d95f1d36876e8287e2b77be3256b87578703\": container with ID starting with d5a7fec222bc50f31223a619c031d95f1d36876e8287e2b77be3256b87578703 not found: ID does not exist" containerID="d5a7fec222bc50f31223a619c031d95f1d36876e8287e2b77be3256b87578703" Mar 09 09:24:41 crc kubenswrapper[4861]: I0309 09:24:41.005638 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5a7fec222bc50f31223a619c031d95f1d36876e8287e2b77be3256b87578703"} err="failed to get container status \"d5a7fec222bc50f31223a619c031d95f1d36876e8287e2b77be3256b87578703\": rpc error: code = NotFound desc = could not find container \"d5a7fec222bc50f31223a619c031d95f1d36876e8287e2b77be3256b87578703\": container with ID starting with d5a7fec222bc50f31223a619c031d95f1d36876e8287e2b77be3256b87578703 not found: ID does not exist" Mar 09 09:24:41 crc kubenswrapper[4861]: I0309 09:24:41.005730 4861 scope.go:117] "RemoveContainer" containerID="d29bfac4c915a0273fce2c1c7f9964bf2bc24c64c52b27bc0f6173b430ed3cc6" Mar 09 09:24:41 crc kubenswrapper[4861]: I0309 09:24:41.006646 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2c54f63-d580-4793-bb62-e293cdef7ae3-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e2c54f63-d580-4793-bb62-e293cdef7ae3" (UID: "e2c54f63-d580-4793-bb62-e293cdef7ae3"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:24:41 crc kubenswrapper[4861]: E0309 09:24:41.007328 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d29bfac4c915a0273fce2c1c7f9964bf2bc24c64c52b27bc0f6173b430ed3cc6\": container with ID starting with d29bfac4c915a0273fce2c1c7f9964bf2bc24c64c52b27bc0f6173b430ed3cc6 not found: ID does not exist" containerID="d29bfac4c915a0273fce2c1c7f9964bf2bc24c64c52b27bc0f6173b430ed3cc6" Mar 09 09:24:41 crc kubenswrapper[4861]: I0309 09:24:41.008736 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d29bfac4c915a0273fce2c1c7f9964bf2bc24c64c52b27bc0f6173b430ed3cc6"} err="failed to get container status \"d29bfac4c915a0273fce2c1c7f9964bf2bc24c64c52b27bc0f6173b430ed3cc6\": rpc error: code = NotFound desc = could not find container \"d29bfac4c915a0273fce2c1c7f9964bf2bc24c64c52b27bc0f6173b430ed3cc6\": container with ID starting with d29bfac4c915a0273fce2c1c7f9964bf2bc24c64c52b27bc0f6173b430ed3cc6 not found: ID does not exist" Mar 09 09:24:41 crc kubenswrapper[4861]: I0309 09:24:41.025491 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2c54f63-d580-4793-bb62-e293cdef7ae3-config" (OuterVolumeSpecName: "config") pod "e2c54f63-d580-4793-bb62-e293cdef7ae3" (UID: "e2c54f63-d580-4793-bb62-e293cdef7ae3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:24:41 crc kubenswrapper[4861]: I0309 09:24:41.032413 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2c54f63-d580-4793-bb62-e293cdef7ae3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e2c54f63-d580-4793-bb62-e293cdef7ae3" (UID: "e2c54f63-d580-4793-bb62-e293cdef7ae3"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:24:41 crc kubenswrapper[4861]: I0309 09:24:41.041798 4861 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2c54f63-d580-4793-bb62-e293cdef7ae3-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:24:41 crc kubenswrapper[4861]: I0309 09:24:41.041845 4861 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e2c54f63-d580-4793-bb62-e293cdef7ae3-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 09 09:24:41 crc kubenswrapper[4861]: I0309 09:24:41.041864 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9t9sh\" (UniqueName: \"kubernetes.io/projected/e2c54f63-d580-4793-bb62-e293cdef7ae3-kube-api-access-9t9sh\") on node \"crc\" DevicePath \"\"" Mar 09 09:24:41 crc kubenswrapper[4861]: I0309 09:24:41.041875 4861 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e2c54f63-d580-4793-bb62-e293cdef7ae3-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 09 09:24:41 crc kubenswrapper[4861]: I0309 09:24:41.041883 4861 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e2c54f63-d580-4793-bb62-e293cdef7ae3-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 09 09:24:41 crc kubenswrapper[4861]: I0309 09:24:41.041891 4861 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e2c54f63-d580-4793-bb62-e293cdef7ae3-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 09 09:24:41 crc kubenswrapper[4861]: I0309 09:24:41.272239 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75c886f8b5-wq27s"] Mar 09 09:24:41 crc kubenswrapper[4861]: I0309 09:24:41.294809 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-75c886f8b5-wq27s"] Mar 09 
09:24:41 crc kubenswrapper[4861]: I0309 09:24:41.336019 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-k46zw"] Mar 09 09:24:41 crc kubenswrapper[4861]: I0309 09:24:41.349989 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-8wrdd"] Mar 09 09:24:41 crc kubenswrapper[4861]: I0309 09:24:41.356900 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 09 09:24:41 crc kubenswrapper[4861]: I0309 09:24:41.363689 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-648545bc7-vkgcg"] Mar 09 09:24:41 crc kubenswrapper[4861]: I0309 09:24:41.370413 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-ntvbd"] Mar 09 09:24:41 crc kubenswrapper[4861]: I0309 09:24:41.395710 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-ccd7c9f8f-j7b59"] Mar 09 09:24:41 crc kubenswrapper[4861]: I0309 09:24:41.467326 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 09 09:24:41 crc kubenswrapper[4861]: W0309 09:24:41.619130 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod868b4cd9_0c47_4173_b2f6_710c28e73f16.slice/crio-fb731a0bc40c976812836be5b517850302a682e0f1ed6490242c787936bfcddb WatchSource:0}: Error finding container fb731a0bc40c976812836be5b517850302a682e0f1ed6490242c787936bfcddb: Status 404 returned error can't find the container with id fb731a0bc40c976812836be5b517850302a682e0f1ed6490242c787936bfcddb Mar 09 09:24:41 crc kubenswrapper[4861]: I0309 09:24:41.628290 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 09 09:24:41 crc kubenswrapper[4861]: I0309 09:24:41.732772 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2c54f63-d580-4793-bb62-e293cdef7ae3" 
path="/var/lib/kubelet/pods/e2c54f63-d580-4793-bb62-e293cdef7ae3/volumes" Mar 09 09:24:41 crc kubenswrapper[4861]: I0309 09:24:41.733324 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-648545bc7-vkgcg"] Mar 09 09:24:41 crc kubenswrapper[4861]: I0309 09:24:41.733346 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 09 09:24:41 crc kubenswrapper[4861]: I0309 09:24:41.733357 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-679c9c695-9vt85"] Mar 09 09:24:41 crc kubenswrapper[4861]: E0309 09:24:41.733851 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2c54f63-d580-4793-bb62-e293cdef7ae3" containerName="init" Mar 09 09:24:41 crc kubenswrapper[4861]: I0309 09:24:41.733864 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2c54f63-d580-4793-bb62-e293cdef7ae3" containerName="init" Mar 09 09:24:41 crc kubenswrapper[4861]: E0309 09:24:41.733876 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2c54f63-d580-4793-bb62-e293cdef7ae3" containerName="dnsmasq-dns" Mar 09 09:24:41 crc kubenswrapper[4861]: I0309 09:24:41.733881 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2c54f63-d580-4793-bb62-e293cdef7ae3" containerName="dnsmasq-dns" Mar 09 09:24:41 crc kubenswrapper[4861]: I0309 09:24:41.734044 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2c54f63-d580-4793-bb62-e293cdef7ae3" containerName="dnsmasq-dns" Mar 09 09:24:41 crc kubenswrapper[4861]: I0309 09:24:41.737187 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-679c9c695-9vt85"] Mar 09 09:24:41 crc kubenswrapper[4861]: I0309 09:24:41.737294 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-679c9c695-9vt85" Mar 09 09:24:41 crc kubenswrapper[4861]: I0309 09:24:41.786457 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 09 09:24:41 crc kubenswrapper[4861]: I0309 09:24:41.810737 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 09 09:24:41 crc kubenswrapper[4861]: I0309 09:24:41.876092 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/906444a4-92dd-48ac-931d-0799f1256e9b-horizon-secret-key\") pod \"horizon-679c9c695-9vt85\" (UID: \"906444a4-92dd-48ac-931d-0799f1256e9b\") " pod="openstack/horizon-679c9c695-9vt85" Mar 09 09:24:41 crc kubenswrapper[4861]: I0309 09:24:41.876140 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/906444a4-92dd-48ac-931d-0799f1256e9b-scripts\") pod \"horizon-679c9c695-9vt85\" (UID: \"906444a4-92dd-48ac-931d-0799f1256e9b\") " pod="openstack/horizon-679c9c695-9vt85" Mar 09 09:24:41 crc kubenswrapper[4861]: I0309 09:24:41.876180 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/906444a4-92dd-48ac-931d-0799f1256e9b-logs\") pod \"horizon-679c9c695-9vt85\" (UID: \"906444a4-92dd-48ac-931d-0799f1256e9b\") " pod="openstack/horizon-679c9c695-9vt85" Mar 09 09:24:41 crc kubenswrapper[4861]: I0309 09:24:41.876223 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/906444a4-92dd-48ac-931d-0799f1256e9b-config-data\") pod \"horizon-679c9c695-9vt85\" (UID: \"906444a4-92dd-48ac-931d-0799f1256e9b\") " pod="openstack/horizon-679c9c695-9vt85" Mar 09 09:24:41 crc kubenswrapper[4861]: I0309 09:24:41.876291 
4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vl74\" (UniqueName: \"kubernetes.io/projected/906444a4-92dd-48ac-931d-0799f1256e9b-kube-api-access-6vl74\") pod \"horizon-679c9c695-9vt85\" (UID: \"906444a4-92dd-48ac-931d-0799f1256e9b\") " pod="openstack/horizon-679c9c695-9vt85" Mar 09 09:24:41 crc kubenswrapper[4861]: I0309 09:24:41.936118 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-ntvbd" event={"ID":"4e5455e9-fa02-46bd-8786-1888543b55cc","Type":"ContainerStarted","Data":"50c06f6ec486776fed27be621771092d458b92984683634750db4b2fa7ca73f7"} Mar 09 09:24:41 crc kubenswrapper[4861]: I0309 09:24:41.936186 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-ntvbd" event={"ID":"4e5455e9-fa02-46bd-8786-1888543b55cc","Type":"ContainerStarted","Data":"b0b2e31f7e30f05d77af04aafb944fe11be8dbc9e906baaef16860bcd9981d52"} Mar 09 09:24:41 crc kubenswrapper[4861]: I0309 09:24:41.939924 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-8wrdd" event={"ID":"44c35b48-50b9-4dd8-846a-99714c14d3ab","Type":"ContainerStarted","Data":"81e0a3d1569509e91b9bda5c83c35a42e9a081d1d969caa44317edbe2b0b1cb1"} Mar 09 09:24:41 crc kubenswrapper[4861]: I0309 09:24:41.954033 4861 generic.go:334] "Generic (PLEG): container finished" podID="ebf4ebf5-adc0-48b4-b80b-0b0d88e64910" containerID="9a8ce14b29f228323a6005799d32951adce6e6653badb4b08b86fea8ceed6624" exitCode=0 Mar 09 09:24:41 crc kubenswrapper[4861]: I0309 09:24:41.954120 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-ccd7c9f8f-j7b59" event={"ID":"ebf4ebf5-adc0-48b4-b80b-0b0d88e64910","Type":"ContainerDied","Data":"9a8ce14b29f228323a6005799d32951adce6e6653badb4b08b86fea8ceed6624"} Mar 09 09:24:41 crc kubenswrapper[4861]: I0309 09:24:41.954148 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-ccd7c9f8f-j7b59" event={"ID":"ebf4ebf5-adc0-48b4-b80b-0b0d88e64910","Type":"ContainerStarted","Data":"2f89109730897ac555a52f2c10697385d9bd7efabbb87dd18441d5e0f0ce854b"} Mar 09 09:24:41 crc kubenswrapper[4861]: I0309 09:24:41.961616 4861 generic.go:334] "Generic (PLEG): container finished" podID="1b84f271-fc53-4d64-ae57-acfd2a5b1696" containerID="1a841917a89f612bdbf92725ecdf117256fc02e449393ce76fe437e4b78c1c6d" exitCode=0 Mar 09 09:24:41 crc kubenswrapper[4861]: I0309 09:24:41.961711 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5985c59c55-2fk4t" event={"ID":"1b84f271-fc53-4d64-ae57-acfd2a5b1696","Type":"ContainerDied","Data":"1a841917a89f612bdbf92725ecdf117256fc02e449393ce76fe437e4b78c1c6d"} Mar 09 09:24:41 crc kubenswrapper[4861]: I0309 09:24:41.981498 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-ntvbd" podStartSLOduration=2.9814809540000002 podStartE2EDuration="2.981480954s" podCreationTimestamp="2026-03-09 09:24:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:24:41.955265288 +0000 UTC m=+1125.040304689" watchObservedRunningTime="2026-03-09 09:24:41.981480954 +0000 UTC m=+1125.066520355" Mar 09 09:24:41 crc kubenswrapper[4861]: I0309 09:24:41.991290 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vl74\" (UniqueName: \"kubernetes.io/projected/906444a4-92dd-48ac-931d-0799f1256e9b-kube-api-access-6vl74\") pod \"horizon-679c9c695-9vt85\" (UID: \"906444a4-92dd-48ac-931d-0799f1256e9b\") " pod="openstack/horizon-679c9c695-9vt85" Mar 09 09:24:41 crc kubenswrapper[4861]: I0309 09:24:41.991376 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/906444a4-92dd-48ac-931d-0799f1256e9b-horizon-secret-key\") pod 
\"horizon-679c9c695-9vt85\" (UID: \"906444a4-92dd-48ac-931d-0799f1256e9b\") " pod="openstack/horizon-679c9c695-9vt85" Mar 09 09:24:41 crc kubenswrapper[4861]: I0309 09:24:41.991398 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/906444a4-92dd-48ac-931d-0799f1256e9b-scripts\") pod \"horizon-679c9c695-9vt85\" (UID: \"906444a4-92dd-48ac-931d-0799f1256e9b\") " pod="openstack/horizon-679c9c695-9vt85" Mar 09 09:24:41 crc kubenswrapper[4861]: I0309 09:24:41.991442 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/906444a4-92dd-48ac-931d-0799f1256e9b-logs\") pod \"horizon-679c9c695-9vt85\" (UID: \"906444a4-92dd-48ac-931d-0799f1256e9b\") " pod="openstack/horizon-679c9c695-9vt85" Mar 09 09:24:41 crc kubenswrapper[4861]: I0309 09:24:41.991487 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/906444a4-92dd-48ac-931d-0799f1256e9b-config-data\") pod \"horizon-679c9c695-9vt85\" (UID: \"906444a4-92dd-48ac-931d-0799f1256e9b\") " pod="openstack/horizon-679c9c695-9vt85" Mar 09 09:24:41 crc kubenswrapper[4861]: I0309 09:24:41.992690 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/906444a4-92dd-48ac-931d-0799f1256e9b-config-data\") pod \"horizon-679c9c695-9vt85\" (UID: \"906444a4-92dd-48ac-931d-0799f1256e9b\") " pod="openstack/horizon-679c9c695-9vt85" Mar 09 09:24:42 crc kubenswrapper[4861]: I0309 09:24:42.005172 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/906444a4-92dd-48ac-931d-0799f1256e9b-scripts\") pod \"horizon-679c9c695-9vt85\" (UID: \"906444a4-92dd-48ac-931d-0799f1256e9b\") " pod="openstack/horizon-679c9c695-9vt85" Mar 09 09:24:42 crc kubenswrapper[4861]: I0309 09:24:42.005304 
4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7f474b7d57-rss5x" event={"ID":"2ff07d5b-934f-40e1-a938-4c9c6d6bd846","Type":"ContainerStarted","Data":"e0cb35cb9a56e24e4200a0ede4e63a9d715328089533f5024ebcfa5561ba6114"} Mar 09 09:24:42 crc kubenswrapper[4861]: I0309 09:24:42.005591 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/906444a4-92dd-48ac-931d-0799f1256e9b-logs\") pod \"horizon-679c9c695-9vt85\" (UID: \"906444a4-92dd-48ac-931d-0799f1256e9b\") " pod="openstack/horizon-679c9c695-9vt85" Mar 09 09:24:42 crc kubenswrapper[4861]: I0309 09:24:42.011307 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-ktlzn" event={"ID":"35527270-268b-4b4d-864b-de9c9c6182ea","Type":"ContainerStarted","Data":"e72c1d9cb3d3ed802613c080c0934aa922e277a849d6ac09e873332c57084265"} Mar 09 09:24:42 crc kubenswrapper[4861]: I0309 09:24:42.029475 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vl74\" (UniqueName: \"kubernetes.io/projected/906444a4-92dd-48ac-931d-0799f1256e9b-kube-api-access-6vl74\") pod \"horizon-679c9c695-9vt85\" (UID: \"906444a4-92dd-48ac-931d-0799f1256e9b\") " pod="openstack/horizon-679c9c695-9vt85" Mar 09 09:24:42 crc kubenswrapper[4861]: I0309 09:24:42.029966 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-ktlzn" event={"ID":"35527270-268b-4b4d-864b-de9c9c6182ea","Type":"ContainerStarted","Data":"da5ab606d7eaee11c4feb596fb941723fa419698c2bc62f5ceff29cf892e29fe"} Mar 09 09:24:42 crc kubenswrapper[4861]: I0309 09:24:42.030019 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"868b4cd9-0c47-4173-b2f6-710c28e73f16","Type":"ContainerStarted","Data":"fb731a0bc40c976812836be5b517850302a682e0f1ed6490242c787936bfcddb"} Mar 09 09:24:42 crc kubenswrapper[4861]: I0309 09:24:42.034821 4861 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3b893f89-a9bc-4a39-bd26-b394cbb0a374","Type":"ContainerStarted","Data":"5c137fca2c97e53ec76c975c0721b95dead1743dd631b64a64772a241c8817b5"} Mar 09 09:24:42 crc kubenswrapper[4861]: I0309 09:24:42.038731 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-k46zw" event={"ID":"c9820d89-3a89-4982-8520-f23dd0d099ad","Type":"ContainerStarted","Data":"86fea03e1d5fad3480ce069e34284c1552eba9c6e60d62cca203f18c0a193adf"} Mar 09 09:24:42 crc kubenswrapper[4861]: I0309 09:24:42.039885 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0e84a83c-79e9-4576-9586-656933127b06","Type":"ContainerStarted","Data":"e2973ff4cb856e7f179ad0fc3bc69aa88a8c21142ec7b5d570029030f597ff4f"} Mar 09 09:24:42 crc kubenswrapper[4861]: I0309 09:24:42.043559 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/906444a4-92dd-48ac-931d-0799f1256e9b-horizon-secret-key\") pod \"horizon-679c9c695-9vt85\" (UID: \"906444a4-92dd-48ac-931d-0799f1256e9b\") " pod="openstack/horizon-679c9c695-9vt85" Mar 09 09:24:42 crc kubenswrapper[4861]: I0309 09:24:42.043993 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-648545bc7-vkgcg" event={"ID":"236224b4-5b63-4290-8f78-6367a7567dd5","Type":"ContainerStarted","Data":"0435b434ad60e57ebfd94b6e8a8c4c7c1dd396652614f429f0b01e627e333c6a"} Mar 09 09:24:42 crc kubenswrapper[4861]: I0309 09:24:42.065528 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-ktlzn" podStartSLOduration=3.065510617 podStartE2EDuration="3.065510617s" podCreationTimestamp="2026-03-09 09:24:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:24:42.045918105 
+0000 UTC m=+1125.130957506" watchObservedRunningTime="2026-03-09 09:24:42.065510617 +0000 UTC m=+1125.150550018" Mar 09 09:24:42 crc kubenswrapper[4861]: I0309 09:24:42.085646 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-679c9c695-9vt85" Mar 09 09:24:42 crc kubenswrapper[4861]: I0309 09:24:42.567481 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5985c59c55-2fk4t" Mar 09 09:24:42 crc kubenswrapper[4861]: I0309 09:24:42.723712 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1b84f271-fc53-4d64-ae57-acfd2a5b1696-dns-swift-storage-0\") pod \"1b84f271-fc53-4d64-ae57-acfd2a5b1696\" (UID: \"1b84f271-fc53-4d64-ae57-acfd2a5b1696\") " Mar 09 09:24:42 crc kubenswrapper[4861]: I0309 09:24:42.724041 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1b84f271-fc53-4d64-ae57-acfd2a5b1696-dns-svc\") pod \"1b84f271-fc53-4d64-ae57-acfd2a5b1696\" (UID: \"1b84f271-fc53-4d64-ae57-acfd2a5b1696\") " Mar 09 09:24:42 crc kubenswrapper[4861]: I0309 09:24:42.724126 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1b84f271-fc53-4d64-ae57-acfd2a5b1696-ovsdbserver-nb\") pod \"1b84f271-fc53-4d64-ae57-acfd2a5b1696\" (UID: \"1b84f271-fc53-4d64-ae57-acfd2a5b1696\") " Mar 09 09:24:42 crc kubenswrapper[4861]: I0309 09:24:42.724145 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b84f271-fc53-4d64-ae57-acfd2a5b1696-config\") pod \"1b84f271-fc53-4d64-ae57-acfd2a5b1696\" (UID: \"1b84f271-fc53-4d64-ae57-acfd2a5b1696\") " Mar 09 09:24:42 crc kubenswrapper[4861]: I0309 09:24:42.724216 4861 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-6mnfb\" (UniqueName: \"kubernetes.io/projected/1b84f271-fc53-4d64-ae57-acfd2a5b1696-kube-api-access-6mnfb\") pod \"1b84f271-fc53-4d64-ae57-acfd2a5b1696\" (UID: \"1b84f271-fc53-4d64-ae57-acfd2a5b1696\") " Mar 09 09:24:42 crc kubenswrapper[4861]: I0309 09:24:42.724231 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1b84f271-fc53-4d64-ae57-acfd2a5b1696-ovsdbserver-sb\") pod \"1b84f271-fc53-4d64-ae57-acfd2a5b1696\" (UID: \"1b84f271-fc53-4d64-ae57-acfd2a5b1696\") " Mar 09 09:24:42 crc kubenswrapper[4861]: I0309 09:24:42.734985 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b84f271-fc53-4d64-ae57-acfd2a5b1696-kube-api-access-6mnfb" (OuterVolumeSpecName: "kube-api-access-6mnfb") pod "1b84f271-fc53-4d64-ae57-acfd2a5b1696" (UID: "1b84f271-fc53-4d64-ae57-acfd2a5b1696"). InnerVolumeSpecName "kube-api-access-6mnfb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:24:42 crc kubenswrapper[4861]: I0309 09:24:42.826892 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6mnfb\" (UniqueName: \"kubernetes.io/projected/1b84f271-fc53-4d64-ae57-acfd2a5b1696-kube-api-access-6mnfb\") on node \"crc\" DevicePath \"\"" Mar 09 09:24:42 crc kubenswrapper[4861]: I0309 09:24:42.846562 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b84f271-fc53-4d64-ae57-acfd2a5b1696-config" (OuterVolumeSpecName: "config") pod "1b84f271-fc53-4d64-ae57-acfd2a5b1696" (UID: "1b84f271-fc53-4d64-ae57-acfd2a5b1696"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:24:42 crc kubenswrapper[4861]: I0309 09:24:42.869657 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b84f271-fc53-4d64-ae57-acfd2a5b1696-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1b84f271-fc53-4d64-ae57-acfd2a5b1696" (UID: "1b84f271-fc53-4d64-ae57-acfd2a5b1696"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:24:42 crc kubenswrapper[4861]: I0309 09:24:42.870342 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b84f271-fc53-4d64-ae57-acfd2a5b1696-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1b84f271-fc53-4d64-ae57-acfd2a5b1696" (UID: "1b84f271-fc53-4d64-ae57-acfd2a5b1696"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:24:42 crc kubenswrapper[4861]: I0309 09:24:42.882827 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b84f271-fc53-4d64-ae57-acfd2a5b1696-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "1b84f271-fc53-4d64-ae57-acfd2a5b1696" (UID: "1b84f271-fc53-4d64-ae57-acfd2a5b1696"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:24:42 crc kubenswrapper[4861]: I0309 09:24:42.920963 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-679c9c695-9vt85"] Mar 09 09:24:42 crc kubenswrapper[4861]: I0309 09:24:42.928222 4861 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1b84f271-fc53-4d64-ae57-acfd2a5b1696-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 09 09:24:42 crc kubenswrapper[4861]: I0309 09:24:42.928256 4861 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1b84f271-fc53-4d64-ae57-acfd2a5b1696-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 09 09:24:42 crc kubenswrapper[4861]: I0309 09:24:42.928270 4861 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1b84f271-fc53-4d64-ae57-acfd2a5b1696-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 09 09:24:42 crc kubenswrapper[4861]: I0309 09:24:42.928282 4861 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b84f271-fc53-4d64-ae57-acfd2a5b1696-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:24:42 crc kubenswrapper[4861]: I0309 09:24:42.933790 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b84f271-fc53-4d64-ae57-acfd2a5b1696-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1b84f271-fc53-4d64-ae57-acfd2a5b1696" (UID: "1b84f271-fc53-4d64-ae57-acfd2a5b1696"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:24:43 crc kubenswrapper[4861]: I0309 09:24:43.031783 4861 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1b84f271-fc53-4d64-ae57-acfd2a5b1696-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 09 09:24:43 crc kubenswrapper[4861]: I0309 09:24:43.063167 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"868b4cd9-0c47-4173-b2f6-710c28e73f16","Type":"ContainerStarted","Data":"f365affb6c3a106ef2c4bea634d0a820dfa2a72f4e8bb311fdfd9cd6db69b6d5"} Mar 09 09:24:43 crc kubenswrapper[4861]: I0309 09:24:43.075851 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-ccd7c9f8f-j7b59" event={"ID":"ebf4ebf5-adc0-48b4-b80b-0b0d88e64910","Type":"ContainerStarted","Data":"2f04fc77f99198d1100f898ca09e71671b5dce06e42013b8980d3efc2cff2c97"} Mar 09 09:24:43 crc kubenswrapper[4861]: I0309 09:24:43.077015 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-ccd7c9f8f-j7b59" Mar 09 09:24:43 crc kubenswrapper[4861]: I0309 09:24:43.081684 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5985c59c55-2fk4t" event={"ID":"1b84f271-fc53-4d64-ae57-acfd2a5b1696","Type":"ContainerDied","Data":"933bd051735f7e87abe851669bd08b7af7dbbf5e3f79e121cb770afa96eb0fb3"} Mar 09 09:24:43 crc kubenswrapper[4861]: I0309 09:24:43.081694 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5985c59c55-2fk4t" Mar 09 09:24:43 crc kubenswrapper[4861]: I0309 09:24:43.081728 4861 scope.go:117] "RemoveContainer" containerID="1a841917a89f612bdbf92725ecdf117256fc02e449393ce76fe437e4b78c1c6d" Mar 09 09:24:43 crc kubenswrapper[4861]: I0309 09:24:43.086706 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0e84a83c-79e9-4576-9586-656933127b06","Type":"ContainerStarted","Data":"85b70180b0939947a13dbd49e81e04864e884ff4dae384fffe528e08033e45ac"} Mar 09 09:24:43 crc kubenswrapper[4861]: I0309 09:24:43.092075 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-679c9c695-9vt85" event={"ID":"906444a4-92dd-48ac-931d-0799f1256e9b","Type":"ContainerStarted","Data":"56eaea71ed937670a5cc2a4657b82d4019963b984dba0e1625515f13dafb8335"} Mar 09 09:24:43 crc kubenswrapper[4861]: I0309 09:24:43.107636 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-ccd7c9f8f-j7b59" podStartSLOduration=4.107613329 podStartE2EDuration="4.107613329s" podCreationTimestamp="2026-03-09 09:24:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:24:43.095579198 +0000 UTC m=+1126.180618599" watchObservedRunningTime="2026-03-09 09:24:43.107613329 +0000 UTC m=+1126.192652730" Mar 09 09:24:43 crc kubenswrapper[4861]: I0309 09:24:43.153846 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5985c59c55-2fk4t"] Mar 09 09:24:43 crc kubenswrapper[4861]: I0309 09:24:43.162714 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5985c59c55-2fk4t"] Mar 09 09:24:43 crc kubenswrapper[4861]: I0309 09:24:43.691696 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b84f271-fc53-4d64-ae57-acfd2a5b1696" 
path="/var/lib/kubelet/pods/1b84f271-fc53-4d64-ae57-acfd2a5b1696/volumes" Mar 09 09:24:44 crc kubenswrapper[4861]: I0309 09:24:44.115927 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="0e84a83c-79e9-4576-9586-656933127b06" containerName="glance-log" containerID="cri-o://85b70180b0939947a13dbd49e81e04864e884ff4dae384fffe528e08033e45ac" gracePeriod=30 Mar 09 09:24:44 crc kubenswrapper[4861]: I0309 09:24:44.115979 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="0e84a83c-79e9-4576-9586-656933127b06" containerName="glance-httpd" containerID="cri-o://891ee311d748d2ad6c414b13478b96e19ee8e6aeba1037c01a21c8326c398466" gracePeriod=30 Mar 09 09:24:44 crc kubenswrapper[4861]: I0309 09:24:44.146204 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.146187669 podStartE2EDuration="5.146187669s" podCreationTimestamp="2026-03-09 09:24:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:24:44.14004118 +0000 UTC m=+1127.225080591" watchObservedRunningTime="2026-03-09 09:24:44.146187669 +0000 UTC m=+1127.231227070" Mar 09 09:24:44 crc kubenswrapper[4861]: E0309 09:24:44.276455 4861 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0e84a83c_79e9_4576_9586_656933127b06.slice/crio-conmon-85b70180b0939947a13dbd49e81e04864e884ff4dae384fffe528e08033e45ac.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0e84a83c_79e9_4576_9586_656933127b06.slice/crio-891ee311d748d2ad6c414b13478b96e19ee8e6aeba1037c01a21c8326c398466.scope\": RecentStats: unable to 
find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0e84a83c_79e9_4576_9586_656933127b06.slice/crio-85b70180b0939947a13dbd49e81e04864e884ff4dae384fffe528e08033e45ac.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0e84a83c_79e9_4576_9586_656933127b06.slice/crio-conmon-891ee311d748d2ad6c414b13478b96e19ee8e6aeba1037c01a21c8326c398466.scope\": RecentStats: unable to find data in memory cache]" Mar 09 09:24:44 crc kubenswrapper[4861]: I0309 09:24:44.844487 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 09 09:24:45 crc kubenswrapper[4861]: I0309 09:24:45.012772 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e84a83c-79e9-4576-9586-656933127b06-logs\") pod \"0e84a83c-79e9-4576-9586-656933127b06\" (UID: \"0e84a83c-79e9-4576-9586-656933127b06\") " Mar 09 09:24:45 crc kubenswrapper[4861]: I0309 09:24:45.013220 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-prvxq\" (UniqueName: \"kubernetes.io/projected/0e84a83c-79e9-4576-9586-656933127b06-kube-api-access-prvxq\") pod \"0e84a83c-79e9-4576-9586-656933127b06\" (UID: \"0e84a83c-79e9-4576-9586-656933127b06\") " Mar 09 09:24:45 crc kubenswrapper[4861]: I0309 09:24:45.013269 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"0e84a83c-79e9-4576-9586-656933127b06\" (UID: \"0e84a83c-79e9-4576-9586-656933127b06\") " Mar 09 09:24:45 crc kubenswrapper[4861]: I0309 09:24:45.013355 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e84a83c-79e9-4576-9586-656933127b06-public-tls-certs\") pod 
\"0e84a83c-79e9-4576-9586-656933127b06\" (UID: \"0e84a83c-79e9-4576-9586-656933127b06\") " Mar 09 09:24:45 crc kubenswrapper[4861]: I0309 09:24:45.013405 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e84a83c-79e9-4576-9586-656933127b06-config-data\") pod \"0e84a83c-79e9-4576-9586-656933127b06\" (UID: \"0e84a83c-79e9-4576-9586-656933127b06\") " Mar 09 09:24:45 crc kubenswrapper[4861]: I0309 09:24:45.013508 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e84a83c-79e9-4576-9586-656933127b06-combined-ca-bundle\") pod \"0e84a83c-79e9-4576-9586-656933127b06\" (UID: \"0e84a83c-79e9-4576-9586-656933127b06\") " Mar 09 09:24:45 crc kubenswrapper[4861]: I0309 09:24:45.013538 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0e84a83c-79e9-4576-9586-656933127b06-scripts\") pod \"0e84a83c-79e9-4576-9586-656933127b06\" (UID: \"0e84a83c-79e9-4576-9586-656933127b06\") " Mar 09 09:24:45 crc kubenswrapper[4861]: I0309 09:24:45.013566 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0e84a83c-79e9-4576-9586-656933127b06-httpd-run\") pod \"0e84a83c-79e9-4576-9586-656933127b06\" (UID: \"0e84a83c-79e9-4576-9586-656933127b06\") " Mar 09 09:24:45 crc kubenswrapper[4861]: I0309 09:24:45.014207 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e84a83c-79e9-4576-9586-656933127b06-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "0e84a83c-79e9-4576-9586-656933127b06" (UID: "0e84a83c-79e9-4576-9586-656933127b06"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:24:45 crc kubenswrapper[4861]: I0309 09:24:45.016845 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e84a83c-79e9-4576-9586-656933127b06-logs" (OuterVolumeSpecName: "logs") pod "0e84a83c-79e9-4576-9586-656933127b06" (UID: "0e84a83c-79e9-4576-9586-656933127b06"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:24:45 crc kubenswrapper[4861]: I0309 09:24:45.021037 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance") pod "0e84a83c-79e9-4576-9586-656933127b06" (UID: "0e84a83c-79e9-4576-9586-656933127b06"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 09 09:24:45 crc kubenswrapper[4861]: I0309 09:24:45.026482 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e84a83c-79e9-4576-9586-656933127b06-scripts" (OuterVolumeSpecName: "scripts") pod "0e84a83c-79e9-4576-9586-656933127b06" (UID: "0e84a83c-79e9-4576-9586-656933127b06"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:24:45 crc kubenswrapper[4861]: I0309 09:24:45.043482 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e84a83c-79e9-4576-9586-656933127b06-kube-api-access-prvxq" (OuterVolumeSpecName: "kube-api-access-prvxq") pod "0e84a83c-79e9-4576-9586-656933127b06" (UID: "0e84a83c-79e9-4576-9586-656933127b06"). InnerVolumeSpecName "kube-api-access-prvxq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:24:45 crc kubenswrapper[4861]: I0309 09:24:45.060538 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e84a83c-79e9-4576-9586-656933127b06-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0e84a83c-79e9-4576-9586-656933127b06" (UID: "0e84a83c-79e9-4576-9586-656933127b06"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:24:45 crc kubenswrapper[4861]: I0309 09:24:45.090610 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e84a83c-79e9-4576-9586-656933127b06-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "0e84a83c-79e9-4576-9586-656933127b06" (UID: "0e84a83c-79e9-4576-9586-656933127b06"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:24:45 crc kubenswrapper[4861]: I0309 09:24:45.095517 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e84a83c-79e9-4576-9586-656933127b06-config-data" (OuterVolumeSpecName: "config-data") pod "0e84a83c-79e9-4576-9586-656933127b06" (UID: "0e84a83c-79e9-4576-9586-656933127b06"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:24:45 crc kubenswrapper[4861]: I0309 09:24:45.115152 4861 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e84a83c-79e9-4576-9586-656933127b06-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 09 09:24:45 crc kubenswrapper[4861]: I0309 09:24:45.115190 4861 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0e84a83c-79e9-4576-9586-656933127b06-scripts\") on node \"crc\" DevicePath \"\""
Mar 09 09:24:45 crc kubenswrapper[4861]: I0309 09:24:45.115799 4861 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0e84a83c-79e9-4576-9586-656933127b06-httpd-run\") on node \"crc\" DevicePath \"\""
Mar 09 09:24:45 crc kubenswrapper[4861]: I0309 09:24:45.115840 4861 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e84a83c-79e9-4576-9586-656933127b06-logs\") on node \"crc\" DevicePath \"\""
Mar 09 09:24:45 crc kubenswrapper[4861]: I0309 09:24:45.115854 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-prvxq\" (UniqueName: \"kubernetes.io/projected/0e84a83c-79e9-4576-9586-656933127b06-kube-api-access-prvxq\") on node \"crc\" DevicePath \"\""
Mar 09 09:24:45 crc kubenswrapper[4861]: I0309 09:24:45.115890 4861 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" "
Mar 09 09:24:45 crc kubenswrapper[4861]: I0309 09:24:45.115904 4861 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e84a83c-79e9-4576-9586-656933127b06-public-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 09 09:24:45 crc kubenswrapper[4861]: I0309 09:24:45.115916 4861 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e84a83c-79e9-4576-9586-656933127b06-config-data\") on node \"crc\" DevicePath \"\""
Mar 09 09:24:45 crc kubenswrapper[4861]: I0309 09:24:45.126491 4861 generic.go:334] "Generic (PLEG): container finished" podID="0e84a83c-79e9-4576-9586-656933127b06" containerID="891ee311d748d2ad6c414b13478b96e19ee8e6aeba1037c01a21c8326c398466" exitCode=143
Mar 09 09:24:45 crc kubenswrapper[4861]: I0309 09:24:45.126528 4861 generic.go:334] "Generic (PLEG): container finished" podID="0e84a83c-79e9-4576-9586-656933127b06" containerID="85b70180b0939947a13dbd49e81e04864e884ff4dae384fffe528e08033e45ac" exitCode=143
Mar 09 09:24:45 crc kubenswrapper[4861]: I0309 09:24:45.126544 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 09 09:24:45 crc kubenswrapper[4861]: I0309 09:24:45.126578 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0e84a83c-79e9-4576-9586-656933127b06","Type":"ContainerDied","Data":"891ee311d748d2ad6c414b13478b96e19ee8e6aeba1037c01a21c8326c398466"}
Mar 09 09:24:45 crc kubenswrapper[4861]: I0309 09:24:45.126672 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0e84a83c-79e9-4576-9586-656933127b06","Type":"ContainerDied","Data":"85b70180b0939947a13dbd49e81e04864e884ff4dae384fffe528e08033e45ac"}
Mar 09 09:24:45 crc kubenswrapper[4861]: I0309 09:24:45.126690 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0e84a83c-79e9-4576-9586-656933127b06","Type":"ContainerDied","Data":"e2973ff4cb856e7f179ad0fc3bc69aa88a8c21142ec7b5d570029030f597ff4f"}
Mar 09 09:24:45 crc kubenswrapper[4861]: I0309 09:24:45.126737 4861 scope.go:117] "RemoveContainer" containerID="891ee311d748d2ad6c414b13478b96e19ee8e6aeba1037c01a21c8326c398466"
Mar 09 09:24:45 crc kubenswrapper[4861]: I0309 09:24:45.136158 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"868b4cd9-0c47-4173-b2f6-710c28e73f16","Type":"ContainerStarted","Data":"af13b5ea98003838a86bd094b8836c89c496c72416b4d1f88661983ae79d77a4"}
Mar 09 09:24:45 crc kubenswrapper[4861]: I0309 09:24:45.136197 4861 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc"
Mar 09 09:24:45 crc kubenswrapper[4861]: I0309 09:24:45.136335 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="868b4cd9-0c47-4173-b2f6-710c28e73f16" containerName="glance-log" containerID="cri-o://f365affb6c3a106ef2c4bea634d0a820dfa2a72f4e8bb311fdfd9cd6db69b6d5" gracePeriod=30
Mar 09 09:24:45 crc kubenswrapper[4861]: I0309 09:24:45.136566 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="868b4cd9-0c47-4173-b2f6-710c28e73f16" containerName="glance-httpd" containerID="cri-o://af13b5ea98003838a86bd094b8836c89c496c72416b4d1f88661983ae79d77a4" gracePeriod=30
Mar 09 09:24:45 crc kubenswrapper[4861]: I0309 09:24:45.168048 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.16802658 podStartE2EDuration="5.16802658s" podCreationTimestamp="2026-03-09 09:24:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:24:45.159746148 +0000 UTC m=+1128.244785569" watchObservedRunningTime="2026-03-09 09:24:45.16802658 +0000 UTC m=+1128.253065981"
Mar 09 09:24:45 crc kubenswrapper[4861]: I0309 09:24:45.217200 4861 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\""
Mar 09 09:24:45 crc kubenswrapper[4861]: I0309 09:24:45.223403 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 09 09:24:45 crc kubenswrapper[4861]: I0309 09:24:45.255933 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 09 09:24:45 crc kubenswrapper[4861]: I0309 09:24:45.273647 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 09 09:24:45 crc kubenswrapper[4861]: E0309 09:24:45.274185 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e84a83c-79e9-4576-9586-656933127b06" containerName="glance-httpd"
Mar 09 09:24:45 crc kubenswrapper[4861]: I0309 09:24:45.274210 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e84a83c-79e9-4576-9586-656933127b06" containerName="glance-httpd"
Mar 09 09:24:45 crc kubenswrapper[4861]: E0309 09:24:45.274232 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b84f271-fc53-4d64-ae57-acfd2a5b1696" containerName="init"
Mar 09 09:24:45 crc kubenswrapper[4861]: I0309 09:24:45.274241 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b84f271-fc53-4d64-ae57-acfd2a5b1696" containerName="init"
Mar 09 09:24:45 crc kubenswrapper[4861]: E0309 09:24:45.274257 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e84a83c-79e9-4576-9586-656933127b06" containerName="glance-log"
Mar 09 09:24:45 crc kubenswrapper[4861]: I0309 09:24:45.274265 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e84a83c-79e9-4576-9586-656933127b06" containerName="glance-log"
Mar 09 09:24:45 crc kubenswrapper[4861]: I0309 09:24:45.274491 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e84a83c-79e9-4576-9586-656933127b06" containerName="glance-log"
Mar 09 09:24:45 crc kubenswrapper[4861]: I0309 09:24:45.274519 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b84f271-fc53-4d64-ae57-acfd2a5b1696" containerName="init"
Mar 09 09:24:45 crc kubenswrapper[4861]: I0309 09:24:45.274531 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e84a83c-79e9-4576-9586-656933127b06" containerName="glance-httpd"
Mar 09 09:24:45 crc kubenswrapper[4861]: I0309 09:24:45.275593 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 09 09:24:45 crc kubenswrapper[4861]: I0309 09:24:45.279808 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Mar 09 09:24:45 crc kubenswrapper[4861]: I0309 09:24:45.280048 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc"
Mar 09 09:24:45 crc kubenswrapper[4861]: I0309 09:24:45.292638 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 09 09:24:45 crc kubenswrapper[4861]: I0309 09:24:45.420379 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9e5dae99-f3a2-4a51-afff-03137206590b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"9e5dae99-f3a2-4a51-afff-03137206590b\") " pod="openstack/glance-default-external-api-0"
Mar 09 09:24:45 crc kubenswrapper[4861]: I0309 09:24:45.420425 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9e5dae99-f3a2-4a51-afff-03137206590b-scripts\") pod \"glance-default-external-api-0\" (UID: \"9e5dae99-f3a2-4a51-afff-03137206590b\") " pod="openstack/glance-default-external-api-0"
Mar 09 09:24:45 crc kubenswrapper[4861]: I0309 09:24:45.420465 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7nq4\" (UniqueName: \"kubernetes.io/projected/9e5dae99-f3a2-4a51-afff-03137206590b-kube-api-access-m7nq4\") pod \"glance-default-external-api-0\" (UID: \"9e5dae99-f3a2-4a51-afff-03137206590b\") " pod="openstack/glance-default-external-api-0"
Mar 09 09:24:45 crc kubenswrapper[4861]: I0309 09:24:45.420504 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e5dae99-f3a2-4a51-afff-03137206590b-config-data\") pod \"glance-default-external-api-0\" (UID: \"9e5dae99-f3a2-4a51-afff-03137206590b\") " pod="openstack/glance-default-external-api-0"
Mar 09 09:24:45 crc kubenswrapper[4861]: I0309 09:24:45.420557 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e5dae99-f3a2-4a51-afff-03137206590b-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"9e5dae99-f3a2-4a51-afff-03137206590b\") " pod="openstack/glance-default-external-api-0"
Mar 09 09:24:45 crc kubenswrapper[4861]: I0309 09:24:45.420600 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e5dae99-f3a2-4a51-afff-03137206590b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"9e5dae99-f3a2-4a51-afff-03137206590b\") " pod="openstack/glance-default-external-api-0"
Mar 09 09:24:45 crc kubenswrapper[4861]: I0309 09:24:45.420650 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"9e5dae99-f3a2-4a51-afff-03137206590b\") " pod="openstack/glance-default-external-api-0"
Mar 09 09:24:45 crc kubenswrapper[4861]: I0309 09:24:45.420692 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9e5dae99-f3a2-4a51-afff-03137206590b-logs\") pod \"glance-default-external-api-0\" (UID: \"9e5dae99-f3a2-4a51-afff-03137206590b\") " pod="openstack/glance-default-external-api-0"
Mar 09 09:24:45 crc kubenswrapper[4861]: I0309 09:24:45.523479 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9e5dae99-f3a2-4a51-afff-03137206590b-logs\") pod \"glance-default-external-api-0\" (UID: \"9e5dae99-f3a2-4a51-afff-03137206590b\") " pod="openstack/glance-default-external-api-0"
Mar 09 09:24:45 crc kubenswrapper[4861]: I0309 09:24:45.523844 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9e5dae99-f3a2-4a51-afff-03137206590b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"9e5dae99-f3a2-4a51-afff-03137206590b\") " pod="openstack/glance-default-external-api-0"
Mar 09 09:24:45 crc kubenswrapper[4861]: I0309 09:24:45.523867 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9e5dae99-f3a2-4a51-afff-03137206590b-scripts\") pod \"glance-default-external-api-0\" (UID: \"9e5dae99-f3a2-4a51-afff-03137206590b\") " pod="openstack/glance-default-external-api-0"
Mar 09 09:24:45 crc kubenswrapper[4861]: I0309 09:24:45.523898 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7nq4\" (UniqueName: \"kubernetes.io/projected/9e5dae99-f3a2-4a51-afff-03137206590b-kube-api-access-m7nq4\") pod \"glance-default-external-api-0\" (UID: \"9e5dae99-f3a2-4a51-afff-03137206590b\") " pod="openstack/glance-default-external-api-0"
Mar 09 09:24:45 crc kubenswrapper[4861]: I0309 09:24:45.523925 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e5dae99-f3a2-4a51-afff-03137206590b-config-data\") pod \"glance-default-external-api-0\" (UID: \"9e5dae99-f3a2-4a51-afff-03137206590b\") " pod="openstack/glance-default-external-api-0"
Mar 09 09:24:45 crc kubenswrapper[4861]: I0309 09:24:45.523952 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e5dae99-f3a2-4a51-afff-03137206590b-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"9e5dae99-f3a2-4a51-afff-03137206590b\") " pod="openstack/glance-default-external-api-0"
Mar 09 09:24:45 crc kubenswrapper[4861]: I0309 09:24:45.523984 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9e5dae99-f3a2-4a51-afff-03137206590b-logs\") pod \"glance-default-external-api-0\" (UID: \"9e5dae99-f3a2-4a51-afff-03137206590b\") " pod="openstack/glance-default-external-api-0"
Mar 09 09:24:45 crc kubenswrapper[4861]: I0309 09:24:45.523989 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e5dae99-f3a2-4a51-afff-03137206590b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"9e5dae99-f3a2-4a51-afff-03137206590b\") " pod="openstack/glance-default-external-api-0"
Mar 09 09:24:45 crc kubenswrapper[4861]: I0309 09:24:45.524171 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"9e5dae99-f3a2-4a51-afff-03137206590b\") " pod="openstack/glance-default-external-api-0"
Mar 09 09:24:45 crc kubenswrapper[4861]: I0309 09:24:45.524310 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9e5dae99-f3a2-4a51-afff-03137206590b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"9e5dae99-f3a2-4a51-afff-03137206590b\") " pod="openstack/glance-default-external-api-0"
Mar 09 09:24:45 crc kubenswrapper[4861]: I0309 09:24:45.524654 4861 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"9e5dae99-f3a2-4a51-afff-03137206590b\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-external-api-0"
Mar 09 09:24:45 crc kubenswrapper[4861]: I0309 09:24:45.527775 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e5dae99-f3a2-4a51-afff-03137206590b-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"9e5dae99-f3a2-4a51-afff-03137206590b\") " pod="openstack/glance-default-external-api-0"
Mar 09 09:24:45 crc kubenswrapper[4861]: I0309 09:24:45.528807 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e5dae99-f3a2-4a51-afff-03137206590b-config-data\") pod \"glance-default-external-api-0\" (UID: \"9e5dae99-f3a2-4a51-afff-03137206590b\") " pod="openstack/glance-default-external-api-0"
Mar 09 09:24:45 crc kubenswrapper[4861]: I0309 09:24:45.529219 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9e5dae99-f3a2-4a51-afff-03137206590b-scripts\") pod \"glance-default-external-api-0\" (UID: \"9e5dae99-f3a2-4a51-afff-03137206590b\") " pod="openstack/glance-default-external-api-0"
Mar 09 09:24:45 crc kubenswrapper[4861]: I0309 09:24:45.532560 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e5dae99-f3a2-4a51-afff-03137206590b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"9e5dae99-f3a2-4a51-afff-03137206590b\") " pod="openstack/glance-default-external-api-0"
Mar 09 09:24:45 crc kubenswrapper[4861]: I0309 09:24:45.539632 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7nq4\" (UniqueName: \"kubernetes.io/projected/9e5dae99-f3a2-4a51-afff-03137206590b-kube-api-access-m7nq4\") pod \"glance-default-external-api-0\" (UID: \"9e5dae99-f3a2-4a51-afff-03137206590b\") " pod="openstack/glance-default-external-api-0"
Mar 09 09:24:45 crc kubenswrapper[4861]: I0309 09:24:45.560485 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"9e5dae99-f3a2-4a51-afff-03137206590b\") " pod="openstack/glance-default-external-api-0"
Mar 09 09:24:45 crc kubenswrapper[4861]: I0309 09:24:45.641007 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 09 09:24:45 crc kubenswrapper[4861]: I0309 09:24:45.672600 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e84a83c-79e9-4576-9586-656933127b06" path="/var/lib/kubelet/pods/0e84a83c-79e9-4576-9586-656933127b06/volumes"
Mar 09 09:24:46 crc kubenswrapper[4861]: I0309 09:24:46.149571 4861 generic.go:334] "Generic (PLEG): container finished" podID="868b4cd9-0c47-4173-b2f6-710c28e73f16" containerID="af13b5ea98003838a86bd094b8836c89c496c72416b4d1f88661983ae79d77a4" exitCode=0
Mar 09 09:24:46 crc kubenswrapper[4861]: I0309 09:24:46.149605 4861 generic.go:334] "Generic (PLEG): container finished" podID="868b4cd9-0c47-4173-b2f6-710c28e73f16" containerID="f365affb6c3a106ef2c4bea634d0a820dfa2a72f4e8bb311fdfd9cd6db69b6d5" exitCode=143
Mar 09 09:24:46 crc kubenswrapper[4861]: I0309 09:24:46.149650 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"868b4cd9-0c47-4173-b2f6-710c28e73f16","Type":"ContainerDied","Data":"af13b5ea98003838a86bd094b8836c89c496c72416b4d1f88661983ae79d77a4"}
Mar 09 09:24:46 crc kubenswrapper[4861]: I0309 09:24:46.149689 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"868b4cd9-0c47-4173-b2f6-710c28e73f16","Type":"ContainerDied","Data":"f365affb6c3a106ef2c4bea634d0a820dfa2a72f4e8bb311fdfd9cd6db69b6d5"}
Mar 09 09:24:46 crc kubenswrapper[4861]: I0309 09:24:46.155698 4861 generic.go:334] "Generic (PLEG): container finished" podID="35527270-268b-4b4d-864b-de9c9c6182ea" containerID="e72c1d9cb3d3ed802613c080c0934aa922e277a849d6ac09e873332c57084265" exitCode=0
Mar 09 09:24:46 crc kubenswrapper[4861]: I0309 09:24:46.155743 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-ktlzn" event={"ID":"35527270-268b-4b4d-864b-de9c9c6182ea","Type":"ContainerDied","Data":"e72c1d9cb3d3ed802613c080c0934aa922e277a849d6ac09e873332c57084265"}
Mar 09 09:24:48 crc kubenswrapper[4861]: I0309 09:24:48.489543 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7f474b7d57-rss5x"]
Mar 09 09:24:48 crc kubenswrapper[4861]: I0309 09:24:48.509148 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 09 09:24:48 crc kubenswrapper[4861]: I0309 09:24:48.524808 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7bb4db8c4-sxjc7"]
Mar 09 09:24:48 crc kubenswrapper[4861]: I0309 09:24:48.526130 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7bb4db8c4-sxjc7"
Mar 09 09:24:48 crc kubenswrapper[4861]: I0309 09:24:48.530652 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc"
Mar 09 09:24:48 crc kubenswrapper[4861]: I0309 09:24:48.573476 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7bb4db8c4-sxjc7"]
Mar 09 09:24:48 crc kubenswrapper[4861]: I0309 09:24:48.594552 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/71492031-e589-409e-b8c8-c0a1194b97ed-horizon-secret-key\") pod \"horizon-7bb4db8c4-sxjc7\" (UID: \"71492031-e589-409e-b8c8-c0a1194b97ed\") " pod="openstack/horizon-7bb4db8c4-sxjc7"
Mar 09 09:24:48 crc kubenswrapper[4861]: I0309 09:24:48.594654 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/71492031-e589-409e-b8c8-c0a1194b97ed-scripts\") pod \"horizon-7bb4db8c4-sxjc7\" (UID: \"71492031-e589-409e-b8c8-c0a1194b97ed\") " pod="openstack/horizon-7bb4db8c4-sxjc7"
Mar 09 09:24:48 crc kubenswrapper[4861]: I0309 09:24:48.594680 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71492031-e589-409e-b8c8-c0a1194b97ed-combined-ca-bundle\") pod \"horizon-7bb4db8c4-sxjc7\" (UID: \"71492031-e589-409e-b8c8-c0a1194b97ed\") " pod="openstack/horizon-7bb4db8c4-sxjc7"
Mar 09 09:24:48 crc kubenswrapper[4861]: I0309 09:24:48.594706 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/71492031-e589-409e-b8c8-c0a1194b97ed-logs\") pod \"horizon-7bb4db8c4-sxjc7\" (UID: \"71492031-e589-409e-b8c8-c0a1194b97ed\") " pod="openstack/horizon-7bb4db8c4-sxjc7"
Mar 09 09:24:48 crc kubenswrapper[4861]: I0309 09:24:48.594749 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/71492031-e589-409e-b8c8-c0a1194b97ed-config-data\") pod \"horizon-7bb4db8c4-sxjc7\" (UID: \"71492031-e589-409e-b8c8-c0a1194b97ed\") " pod="openstack/horizon-7bb4db8c4-sxjc7"
Mar 09 09:24:48 crc kubenswrapper[4861]: I0309 09:24:48.594812 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jbdt\" (UniqueName: \"kubernetes.io/projected/71492031-e589-409e-b8c8-c0a1194b97ed-kube-api-access-9jbdt\") pod \"horizon-7bb4db8c4-sxjc7\" (UID: \"71492031-e589-409e-b8c8-c0a1194b97ed\") " pod="openstack/horizon-7bb4db8c4-sxjc7"
Mar 09 09:24:48 crc kubenswrapper[4861]: I0309 09:24:48.594840 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/71492031-e589-409e-b8c8-c0a1194b97ed-horizon-tls-certs\") pod \"horizon-7bb4db8c4-sxjc7\" (UID: \"71492031-e589-409e-b8c8-c0a1194b97ed\") " pod="openstack/horizon-7bb4db8c4-sxjc7"
Mar 09 09:24:48 crc kubenswrapper[4861]: I0309 09:24:48.611846 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-679c9c695-9vt85"]
Mar 09 09:24:48 crc kubenswrapper[4861]: I0309 09:24:48.626434 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5487f4d458-lnthc"]
Mar 09 09:24:48 crc kubenswrapper[4861]: I0309 09:24:48.644989 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5487f4d458-lnthc"]
Mar 09 09:24:48 crc kubenswrapper[4861]: I0309 09:24:48.645085 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5487f4d458-lnthc"
Mar 09 09:24:48 crc kubenswrapper[4861]: I0309 09:24:48.696222 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9049886d-2460-47fe-ac82-2dfde4858bd0-config-data\") pod \"horizon-5487f4d458-lnthc\" (UID: \"9049886d-2460-47fe-ac82-2dfde4858bd0\") " pod="openstack/horizon-5487f4d458-lnthc"
Mar 09 09:24:48 crc kubenswrapper[4861]: I0309 09:24:48.696572 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jbdt\" (UniqueName: \"kubernetes.io/projected/71492031-e589-409e-b8c8-c0a1194b97ed-kube-api-access-9jbdt\") pod \"horizon-7bb4db8c4-sxjc7\" (UID: \"71492031-e589-409e-b8c8-c0a1194b97ed\") " pod="openstack/horizon-7bb4db8c4-sxjc7"
Mar 09 09:24:48 crc kubenswrapper[4861]: I0309 09:24:48.696594 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/71492031-e589-409e-b8c8-c0a1194b97ed-horizon-tls-certs\") pod \"horizon-7bb4db8c4-sxjc7\" (UID: \"71492031-e589-409e-b8c8-c0a1194b97ed\") " pod="openstack/horizon-7bb4db8c4-sxjc7"
Mar 09 09:24:48 crc kubenswrapper[4861]: I0309 09:24:48.696616 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9049886d-2460-47fe-ac82-2dfde4858bd0-scripts\") pod \"horizon-5487f4d458-lnthc\" (UID: \"9049886d-2460-47fe-ac82-2dfde4858bd0\") " pod="openstack/horizon-5487f4d458-lnthc"
Mar 09 09:24:48 crc kubenswrapper[4861]: I0309 09:24:48.696660 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/71492031-e589-409e-b8c8-c0a1194b97ed-horizon-secret-key\") pod \"horizon-7bb4db8c4-sxjc7\" (UID: \"71492031-e589-409e-b8c8-c0a1194b97ed\") " pod="openstack/horizon-7bb4db8c4-sxjc7"
Mar 09 09:24:48 crc kubenswrapper[4861]: I0309 09:24:48.696685 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9049886d-2460-47fe-ac82-2dfde4858bd0-logs\") pod \"horizon-5487f4d458-lnthc\" (UID: \"9049886d-2460-47fe-ac82-2dfde4858bd0\") " pod="openstack/horizon-5487f4d458-lnthc"
Mar 09 09:24:48 crc kubenswrapper[4861]: I0309 09:24:48.696710 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9049886d-2460-47fe-ac82-2dfde4858bd0-combined-ca-bundle\") pod \"horizon-5487f4d458-lnthc\" (UID: \"9049886d-2460-47fe-ac82-2dfde4858bd0\") " pod="openstack/horizon-5487f4d458-lnthc"
Mar 09 09:24:48 crc kubenswrapper[4861]: I0309 09:24:48.696735 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4zln\" (UniqueName: \"kubernetes.io/projected/9049886d-2460-47fe-ac82-2dfde4858bd0-kube-api-access-j4zln\") pod \"horizon-5487f4d458-lnthc\" (UID: \"9049886d-2460-47fe-ac82-2dfde4858bd0\") " pod="openstack/horizon-5487f4d458-lnthc"
Mar 09 09:24:48 crc kubenswrapper[4861]: I0309 09:24:48.696759 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/71492031-e589-409e-b8c8-c0a1194b97ed-scripts\") pod \"horizon-7bb4db8c4-sxjc7\" (UID: \"71492031-e589-409e-b8c8-c0a1194b97ed\") " pod="openstack/horizon-7bb4db8c4-sxjc7"
Mar 09 09:24:48 crc kubenswrapper[4861]: I0309 09:24:48.696774 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71492031-e589-409e-b8c8-c0a1194b97ed-combined-ca-bundle\") pod \"horizon-7bb4db8c4-sxjc7\" (UID: \"71492031-e589-409e-b8c8-c0a1194b97ed\") " pod="openstack/horizon-7bb4db8c4-sxjc7"
Mar 09 09:24:48 crc kubenswrapper[4861]: I0309 09:24:48.696793 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/71492031-e589-409e-b8c8-c0a1194b97ed-logs\") pod \"horizon-7bb4db8c4-sxjc7\" (UID: \"71492031-e589-409e-b8c8-c0a1194b97ed\") " pod="openstack/horizon-7bb4db8c4-sxjc7"
Mar 09 09:24:48 crc kubenswrapper[4861]: I0309 09:24:48.696827 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9049886d-2460-47fe-ac82-2dfde4858bd0-horizon-secret-key\") pod \"horizon-5487f4d458-lnthc\" (UID: \"9049886d-2460-47fe-ac82-2dfde4858bd0\") " pod="openstack/horizon-5487f4d458-lnthc"
Mar 09 09:24:48 crc kubenswrapper[4861]: I0309 09:24:48.696851 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/71492031-e589-409e-b8c8-c0a1194b97ed-config-data\") pod \"horizon-7bb4db8c4-sxjc7\" (UID: \"71492031-e589-409e-b8c8-c0a1194b97ed\") " pod="openstack/horizon-7bb4db8c4-sxjc7"
Mar 09 09:24:48 crc kubenswrapper[4861]: I0309 09:24:48.696882 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/9049886d-2460-47fe-ac82-2dfde4858bd0-horizon-tls-certs\") pod \"horizon-5487f4d458-lnthc\" (UID: \"9049886d-2460-47fe-ac82-2dfde4858bd0\") " pod="openstack/horizon-5487f4d458-lnthc"
Mar 09 09:24:48 crc kubenswrapper[4861]: I0309 09:24:48.697480 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/71492031-e589-409e-b8c8-c0a1194b97ed-logs\") pod \"horizon-7bb4db8c4-sxjc7\" (UID: \"71492031-e589-409e-b8c8-c0a1194b97ed\") " pod="openstack/horizon-7bb4db8c4-sxjc7"
Mar 09 09:24:48 crc kubenswrapper[4861]: I0309 09:24:48.698070 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/71492031-e589-409e-b8c8-c0a1194b97ed-scripts\") pod \"horizon-7bb4db8c4-sxjc7\" (UID: \"71492031-e589-409e-b8c8-c0a1194b97ed\") " pod="openstack/horizon-7bb4db8c4-sxjc7"
Mar 09 09:24:48 crc kubenswrapper[4861]: I0309 09:24:48.698719 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/71492031-e589-409e-b8c8-c0a1194b97ed-config-data\") pod \"horizon-7bb4db8c4-sxjc7\" (UID: \"71492031-e589-409e-b8c8-c0a1194b97ed\") " pod="openstack/horizon-7bb4db8c4-sxjc7"
Mar 09 09:24:48 crc kubenswrapper[4861]: I0309 09:24:48.702452 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/71492031-e589-409e-b8c8-c0a1194b97ed-horizon-secret-key\") pod \"horizon-7bb4db8c4-sxjc7\" (UID: \"71492031-e589-409e-b8c8-c0a1194b97ed\") " pod="openstack/horizon-7bb4db8c4-sxjc7"
Mar 09 09:24:48 crc kubenswrapper[4861]: I0309 09:24:48.702686 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71492031-e589-409e-b8c8-c0a1194b97ed-combined-ca-bundle\") pod \"horizon-7bb4db8c4-sxjc7\" (UID: \"71492031-e589-409e-b8c8-c0a1194b97ed\") " pod="openstack/horizon-7bb4db8c4-sxjc7"
Mar 09 09:24:48 crc kubenswrapper[4861]: I0309 09:24:48.713563 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jbdt\" (UniqueName: \"kubernetes.io/projected/71492031-e589-409e-b8c8-c0a1194b97ed-kube-api-access-9jbdt\") pod \"horizon-7bb4db8c4-sxjc7\" (UID: \"71492031-e589-409e-b8c8-c0a1194b97ed\") " pod="openstack/horizon-7bb4db8c4-sxjc7"
Mar 09 09:24:48 crc kubenswrapper[4861]: I0309 09:24:48.720002 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/71492031-e589-409e-b8c8-c0a1194b97ed-horizon-tls-certs\") pod \"horizon-7bb4db8c4-sxjc7\" (UID: \"71492031-e589-409e-b8c8-c0a1194b97ed\") " pod="openstack/horizon-7bb4db8c4-sxjc7"
Mar 09 09:24:48 crc kubenswrapper[4861]: I0309 09:24:48.799295 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9049886d-2460-47fe-ac82-2dfde4858bd0-scripts\") pod \"horizon-5487f4d458-lnthc\" (UID: \"9049886d-2460-47fe-ac82-2dfde4858bd0\") " pod="openstack/horizon-5487f4d458-lnthc"
Mar 09 09:24:48 crc kubenswrapper[4861]: I0309 09:24:48.799407 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9049886d-2460-47fe-ac82-2dfde4858bd0-logs\") pod \"horizon-5487f4d458-lnthc\" (UID: \"9049886d-2460-47fe-ac82-2dfde4858bd0\") " pod="openstack/horizon-5487f4d458-lnthc"
Mar 09 09:24:48 crc kubenswrapper[4861]: I0309 09:24:48.799438 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9049886d-2460-47fe-ac82-2dfde4858bd0-combined-ca-bundle\") pod \"horizon-5487f4d458-lnthc\" (UID: \"9049886d-2460-47fe-ac82-2dfde4858bd0\") " pod="openstack/horizon-5487f4d458-lnthc"
Mar 09 09:24:48 crc kubenswrapper[4861]: I0309 09:24:48.799470 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4zln\" (UniqueName: \"kubernetes.io/projected/9049886d-2460-47fe-ac82-2dfde4858bd0-kube-api-access-j4zln\") pod \"horizon-5487f4d458-lnthc\" (UID: \"9049886d-2460-47fe-ac82-2dfde4858bd0\") " pod="openstack/horizon-5487f4d458-lnthc"
Mar 09 09:24:48 crc kubenswrapper[4861]: I0309 09:24:48.799506 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9049886d-2460-47fe-ac82-2dfde4858bd0-horizon-secret-key\") pod \"horizon-5487f4d458-lnthc\" (UID: \"9049886d-2460-47fe-ac82-2dfde4858bd0\") " pod="openstack/horizon-5487f4d458-lnthc"
Mar 09 09:24:48 crc kubenswrapper[4861]: I0309 09:24:48.799552 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/9049886d-2460-47fe-ac82-2dfde4858bd0-horizon-tls-certs\") pod \"horizon-5487f4d458-lnthc\" (UID: \"9049886d-2460-47fe-ac82-2dfde4858bd0\") " pod="openstack/horizon-5487f4d458-lnthc"
Mar 09 09:24:48 crc kubenswrapper[4861]: I0309 09:24:48.799576 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9049886d-2460-47fe-ac82-2dfde4858bd0-config-data\") pod \"horizon-5487f4d458-lnthc\" (UID: \"9049886d-2460-47fe-ac82-2dfde4858bd0\") " pod="openstack/horizon-5487f4d458-lnthc"
Mar 09 09:24:48 crc kubenswrapper[4861]: I0309 09:24:48.801015 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9049886d-2460-47fe-ac82-2dfde4858bd0-logs\") pod \"horizon-5487f4d458-lnthc\" (UID: \"9049886d-2460-47fe-ac82-2dfde4858bd0\") " pod="openstack/horizon-5487f4d458-lnthc"
Mar 09 09:24:48 crc kubenswrapper[4861]: I0309 09:24:48.804436 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9049886d-2460-47fe-ac82-2dfde4858bd0-horizon-secret-key\") pod \"horizon-5487f4d458-lnthc\" (UID: \"9049886d-2460-47fe-ac82-2dfde4858bd0\") " pod="openstack/horizon-5487f4d458-lnthc"
Mar 09 09:24:48 crc kubenswrapper[4861]: I0309 09:24:48.805019 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9049886d-2460-47fe-ac82-2dfde4858bd0-config-data\") pod \"horizon-5487f4d458-lnthc\" (UID: \"9049886d-2460-47fe-ac82-2dfde4858bd0\") " pod="openstack/horizon-5487f4d458-lnthc"
Mar 09 09:24:48 crc kubenswrapper[4861]: I0309 09:24:48.809306 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9049886d-2460-47fe-ac82-2dfde4858bd0-scripts\") pod \"horizon-5487f4d458-lnthc\" (UID: \"9049886d-2460-47fe-ac82-2dfde4858bd0\") " pod="openstack/horizon-5487f4d458-lnthc"
Mar 09 09:24:48 crc kubenswrapper[4861]: I0309 09:24:48.809430 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9049886d-2460-47fe-ac82-2dfde4858bd0-combined-ca-bundle\") pod \"horizon-5487f4d458-lnthc\" (UID: \"9049886d-2460-47fe-ac82-2dfde4858bd0\") " pod="openstack/horizon-5487f4d458-lnthc"
Mar 09 09:24:48 crc kubenswrapper[4861]: I0309 09:24:48.812299 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/9049886d-2460-47fe-ac82-2dfde4858bd0-horizon-tls-certs\") pod \"horizon-5487f4d458-lnthc\" (UID: \"9049886d-2460-47fe-ac82-2dfde4858bd0\") " pod="openstack/horizon-5487f4d458-lnthc"
Mar 09 09:24:48 crc kubenswrapper[4861]: I0309 09:24:48.817357 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4zln\" (UniqueName: \"kubernetes.io/projected/9049886d-2460-47fe-ac82-2dfde4858bd0-kube-api-access-j4zln\") pod \"horizon-5487f4d458-lnthc\" (UID: \"9049886d-2460-47fe-ac82-2dfde4858bd0\") " pod="openstack/horizon-5487f4d458-lnthc"
Mar 09 09:24:48 crc kubenswrapper[4861]: I0309 09:24:48.852949 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7bb4db8c4-sxjc7"
Mar 09 09:24:48 crc kubenswrapper[4861]: I0309 09:24:48.987799 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5487f4d458-lnthc"
Mar 09 09:24:50 crc kubenswrapper[4861]: I0309 09:24:50.361710 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-ccd7c9f8f-j7b59"
Mar 09 09:24:50 crc kubenswrapper[4861]: I0309 09:24:50.440520 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b684fb9f5-wg2v4"]
Mar 09 09:24:50 crc kubenswrapper[4861]: I0309 09:24:50.440793 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7b684fb9f5-wg2v4" podUID="45ff338f-f912-422e-bb87-1db962b83fc4" containerName="dnsmasq-dns" containerID="cri-o://e52331302f6d484869f3c328b0ea4ef009c57e85ef9a0f6696c7579c7b951bd3" gracePeriod=10
Mar 09 09:24:51 crc kubenswrapper[4861]: I0309 09:24:51.205900 4861 generic.go:334] "Generic (PLEG): container finished" podID="45ff338f-f912-422e-bb87-1db962b83fc4" containerID="e52331302f6d484869f3c328b0ea4ef009c57e85ef9a0f6696c7579c7b951bd3" exitCode=0
Mar 09 09:24:51 crc kubenswrapper[4861]: I0309 09:24:51.205932 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b684fb9f5-wg2v4" event={"ID":"45ff338f-f912-422e-bb87-1db962b83fc4","Type":"ContainerDied","Data":"e52331302f6d484869f3c328b0ea4ef009c57e85ef9a0f6696c7579c7b951bd3"}
Mar 09 09:24:53 crc kubenswrapper[4861]: I0309 09:24:53.202311 4861 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-7b684fb9f5-wg2v4" podUID="45ff338f-f912-422e-bb87-1db962b83fc4" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.130:5353: connect: connection refused"
Mar 09 09:24:54 crc kubenswrapper[4861]: I0309 09:24:54.325948 4861 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/keystone-bootstrap-ktlzn" Mar 09 09:24:54 crc kubenswrapper[4861]: I0309 09:24:54.423552 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/35527270-268b-4b4d-864b-de9c9c6182ea-credential-keys\") pod \"35527270-268b-4b4d-864b-de9c9c6182ea\" (UID: \"35527270-268b-4b4d-864b-de9c9c6182ea\") " Mar 09 09:24:54 crc kubenswrapper[4861]: I0309 09:24:54.423647 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-274cv\" (UniqueName: \"kubernetes.io/projected/35527270-268b-4b4d-864b-de9c9c6182ea-kube-api-access-274cv\") pod \"35527270-268b-4b4d-864b-de9c9c6182ea\" (UID: \"35527270-268b-4b4d-864b-de9c9c6182ea\") " Mar 09 09:24:54 crc kubenswrapper[4861]: I0309 09:24:54.423673 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35527270-268b-4b4d-864b-de9c9c6182ea-config-data\") pod \"35527270-268b-4b4d-864b-de9c9c6182ea\" (UID: \"35527270-268b-4b4d-864b-de9c9c6182ea\") " Mar 09 09:24:54 crc kubenswrapper[4861]: I0309 09:24:54.423692 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/35527270-268b-4b4d-864b-de9c9c6182ea-fernet-keys\") pod \"35527270-268b-4b4d-864b-de9c9c6182ea\" (UID: \"35527270-268b-4b4d-864b-de9c9c6182ea\") " Mar 09 09:24:54 crc kubenswrapper[4861]: I0309 09:24:54.423716 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/35527270-268b-4b4d-864b-de9c9c6182ea-scripts\") pod \"35527270-268b-4b4d-864b-de9c9c6182ea\" (UID: \"35527270-268b-4b4d-864b-de9c9c6182ea\") " Mar 09 09:24:54 crc kubenswrapper[4861]: I0309 09:24:54.423780 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/35527270-268b-4b4d-864b-de9c9c6182ea-combined-ca-bundle\") pod \"35527270-268b-4b4d-864b-de9c9c6182ea\" (UID: \"35527270-268b-4b4d-864b-de9c9c6182ea\") " Mar 09 09:24:54 crc kubenswrapper[4861]: I0309 09:24:54.432604 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35527270-268b-4b4d-864b-de9c9c6182ea-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "35527270-268b-4b4d-864b-de9c9c6182ea" (UID: "35527270-268b-4b4d-864b-de9c9c6182ea"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:24:54 crc kubenswrapper[4861]: I0309 09:24:54.432624 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35527270-268b-4b4d-864b-de9c9c6182ea-kube-api-access-274cv" (OuterVolumeSpecName: "kube-api-access-274cv") pod "35527270-268b-4b4d-864b-de9c9c6182ea" (UID: "35527270-268b-4b4d-864b-de9c9c6182ea"). InnerVolumeSpecName "kube-api-access-274cv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:24:54 crc kubenswrapper[4861]: I0309 09:24:54.434579 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35527270-268b-4b4d-864b-de9c9c6182ea-scripts" (OuterVolumeSpecName: "scripts") pod "35527270-268b-4b4d-864b-de9c9c6182ea" (UID: "35527270-268b-4b4d-864b-de9c9c6182ea"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:24:54 crc kubenswrapper[4861]: I0309 09:24:54.440519 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35527270-268b-4b4d-864b-de9c9c6182ea-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "35527270-268b-4b4d-864b-de9c9c6182ea" (UID: "35527270-268b-4b4d-864b-de9c9c6182ea"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:24:54 crc kubenswrapper[4861]: I0309 09:24:54.479215 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35527270-268b-4b4d-864b-de9c9c6182ea-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "35527270-268b-4b4d-864b-de9c9c6182ea" (UID: "35527270-268b-4b4d-864b-de9c9c6182ea"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:24:54 crc kubenswrapper[4861]: I0309 09:24:54.480724 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35527270-268b-4b4d-864b-de9c9c6182ea-config-data" (OuterVolumeSpecName: "config-data") pod "35527270-268b-4b4d-864b-de9c9c6182ea" (UID: "35527270-268b-4b4d-864b-de9c9c6182ea"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:24:54 crc kubenswrapper[4861]: I0309 09:24:54.525339 4861 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/35527270-268b-4b4d-864b-de9c9c6182ea-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:24:54 crc kubenswrapper[4861]: I0309 09:24:54.525402 4861 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35527270-268b-4b4d-864b-de9c9c6182ea-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 09:24:54 crc kubenswrapper[4861]: I0309 09:24:54.525426 4861 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/35527270-268b-4b4d-864b-de9c9c6182ea-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 09 09:24:54 crc kubenswrapper[4861]: I0309 09:24:54.525440 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-274cv\" (UniqueName: \"kubernetes.io/projected/35527270-268b-4b4d-864b-de9c9c6182ea-kube-api-access-274cv\") on node \"crc\" DevicePath 
\"\"" Mar 09 09:24:54 crc kubenswrapper[4861]: I0309 09:24:54.525454 4861 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35527270-268b-4b4d-864b-de9c9c6182ea-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 09:24:54 crc kubenswrapper[4861]: I0309 09:24:54.525464 4861 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/35527270-268b-4b4d-864b-de9c9c6182ea-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 09 09:24:54 crc kubenswrapper[4861]: I0309 09:24:54.606037 4861 patch_prober.go:28] interesting pod/machine-config-daemon-5g7gc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 09:24:54 crc kubenswrapper[4861]: I0309 09:24:54.606102 4861 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 09:24:55 crc kubenswrapper[4861]: I0309 09:24:55.242122 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-ktlzn" event={"ID":"35527270-268b-4b4d-864b-de9c9c6182ea","Type":"ContainerDied","Data":"da5ab606d7eaee11c4feb596fb941723fa419698c2bc62f5ceff29cf892e29fe"} Mar 09 09:24:55 crc kubenswrapper[4861]: I0309 09:24:55.242402 4861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="da5ab606d7eaee11c4feb596fb941723fa419698c2bc62f5ceff29cf892e29fe" Mar 09 09:24:55 crc kubenswrapper[4861]: I0309 09:24:55.242208 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-ktlzn" Mar 09 09:24:55 crc kubenswrapper[4861]: I0309 09:24:55.474412 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-ktlzn"] Mar 09 09:24:55 crc kubenswrapper[4861]: I0309 09:24:55.482678 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-ktlzn"] Mar 09 09:24:55 crc kubenswrapper[4861]: I0309 09:24:55.579025 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-bcd5c"] Mar 09 09:24:55 crc kubenswrapper[4861]: E0309 09:24:55.580417 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35527270-268b-4b4d-864b-de9c9c6182ea" containerName="keystone-bootstrap" Mar 09 09:24:55 crc kubenswrapper[4861]: I0309 09:24:55.580442 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="35527270-268b-4b4d-864b-de9c9c6182ea" containerName="keystone-bootstrap" Mar 09 09:24:55 crc kubenswrapper[4861]: I0309 09:24:55.580611 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="35527270-268b-4b4d-864b-de9c9c6182ea" containerName="keystone-bootstrap" Mar 09 09:24:55 crc kubenswrapper[4861]: I0309 09:24:55.581983 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-bcd5c" Mar 09 09:24:55 crc kubenswrapper[4861]: I0309 09:24:55.584086 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 09 09:24:55 crc kubenswrapper[4861]: I0309 09:24:55.585903 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 09 09:24:55 crc kubenswrapper[4861]: I0309 09:24:55.586226 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 09 09:24:55 crc kubenswrapper[4861]: I0309 09:24:55.586346 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-sk77l" Mar 09 09:24:55 crc kubenswrapper[4861]: I0309 09:24:55.586900 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 09 09:24:55 crc kubenswrapper[4861]: I0309 09:24:55.594617 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-bcd5c"] Mar 09 09:24:55 crc kubenswrapper[4861]: I0309 09:24:55.670597 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35527270-268b-4b4d-864b-de9c9c6182ea" path="/var/lib/kubelet/pods/35527270-268b-4b4d-864b-de9c9c6182ea/volumes" Mar 09 09:24:55 crc kubenswrapper[4861]: I0309 09:24:55.745610 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/bbd4cfda-d65e-4915-be68-f207820fe15b-credential-keys\") pod \"keystone-bootstrap-bcd5c\" (UID: \"bbd4cfda-d65e-4915-be68-f207820fe15b\") " pod="openstack/keystone-bootstrap-bcd5c" Mar 09 09:24:55 crc kubenswrapper[4861]: I0309 09:24:55.745737 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bbd4cfda-d65e-4915-be68-f207820fe15b-fernet-keys\") pod \"keystone-bootstrap-bcd5c\" (UID: 
\"bbd4cfda-d65e-4915-be68-f207820fe15b\") " pod="openstack/keystone-bootstrap-bcd5c" Mar 09 09:24:55 crc kubenswrapper[4861]: I0309 09:24:55.745801 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bbd4cfda-d65e-4915-be68-f207820fe15b-scripts\") pod \"keystone-bootstrap-bcd5c\" (UID: \"bbd4cfda-d65e-4915-be68-f207820fe15b\") " pod="openstack/keystone-bootstrap-bcd5c" Mar 09 09:24:55 crc kubenswrapper[4861]: I0309 09:24:55.745823 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbd4cfda-d65e-4915-be68-f207820fe15b-config-data\") pod \"keystone-bootstrap-bcd5c\" (UID: \"bbd4cfda-d65e-4915-be68-f207820fe15b\") " pod="openstack/keystone-bootstrap-bcd5c" Mar 09 09:24:55 crc kubenswrapper[4861]: I0309 09:24:55.746018 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbd4cfda-d65e-4915-be68-f207820fe15b-combined-ca-bundle\") pod \"keystone-bootstrap-bcd5c\" (UID: \"bbd4cfda-d65e-4915-be68-f207820fe15b\") " pod="openstack/keystone-bootstrap-bcd5c" Mar 09 09:24:55 crc kubenswrapper[4861]: I0309 09:24:55.746234 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkqzw\" (UniqueName: \"kubernetes.io/projected/bbd4cfda-d65e-4915-be68-f207820fe15b-kube-api-access-wkqzw\") pod \"keystone-bootstrap-bcd5c\" (UID: \"bbd4cfda-d65e-4915-be68-f207820fe15b\") " pod="openstack/keystone-bootstrap-bcd5c" Mar 09 09:24:55 crc kubenswrapper[4861]: I0309 09:24:55.847778 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbd4cfda-d65e-4915-be68-f207820fe15b-combined-ca-bundle\") pod \"keystone-bootstrap-bcd5c\" (UID: 
\"bbd4cfda-d65e-4915-be68-f207820fe15b\") " pod="openstack/keystone-bootstrap-bcd5c" Mar 09 09:24:55 crc kubenswrapper[4861]: I0309 09:24:55.847891 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wkqzw\" (UniqueName: \"kubernetes.io/projected/bbd4cfda-d65e-4915-be68-f207820fe15b-kube-api-access-wkqzw\") pod \"keystone-bootstrap-bcd5c\" (UID: \"bbd4cfda-d65e-4915-be68-f207820fe15b\") " pod="openstack/keystone-bootstrap-bcd5c" Mar 09 09:24:55 crc kubenswrapper[4861]: I0309 09:24:55.847956 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/bbd4cfda-d65e-4915-be68-f207820fe15b-credential-keys\") pod \"keystone-bootstrap-bcd5c\" (UID: \"bbd4cfda-d65e-4915-be68-f207820fe15b\") " pod="openstack/keystone-bootstrap-bcd5c" Mar 09 09:24:55 crc kubenswrapper[4861]: I0309 09:24:55.848076 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bbd4cfda-d65e-4915-be68-f207820fe15b-fernet-keys\") pod \"keystone-bootstrap-bcd5c\" (UID: \"bbd4cfda-d65e-4915-be68-f207820fe15b\") " pod="openstack/keystone-bootstrap-bcd5c" Mar 09 09:24:55 crc kubenswrapper[4861]: I0309 09:24:55.848098 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bbd4cfda-d65e-4915-be68-f207820fe15b-scripts\") pod \"keystone-bootstrap-bcd5c\" (UID: \"bbd4cfda-d65e-4915-be68-f207820fe15b\") " pod="openstack/keystone-bootstrap-bcd5c" Mar 09 09:24:55 crc kubenswrapper[4861]: I0309 09:24:55.848115 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbd4cfda-d65e-4915-be68-f207820fe15b-config-data\") pod \"keystone-bootstrap-bcd5c\" (UID: \"bbd4cfda-d65e-4915-be68-f207820fe15b\") " pod="openstack/keystone-bootstrap-bcd5c" Mar 09 09:24:55 crc 
kubenswrapper[4861]: I0309 09:24:55.854449 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/bbd4cfda-d65e-4915-be68-f207820fe15b-credential-keys\") pod \"keystone-bootstrap-bcd5c\" (UID: \"bbd4cfda-d65e-4915-be68-f207820fe15b\") " pod="openstack/keystone-bootstrap-bcd5c" Mar 09 09:24:55 crc kubenswrapper[4861]: I0309 09:24:55.854726 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbd4cfda-d65e-4915-be68-f207820fe15b-config-data\") pod \"keystone-bootstrap-bcd5c\" (UID: \"bbd4cfda-d65e-4915-be68-f207820fe15b\") " pod="openstack/keystone-bootstrap-bcd5c" Mar 09 09:24:55 crc kubenswrapper[4861]: I0309 09:24:55.855427 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbd4cfda-d65e-4915-be68-f207820fe15b-combined-ca-bundle\") pod \"keystone-bootstrap-bcd5c\" (UID: \"bbd4cfda-d65e-4915-be68-f207820fe15b\") " pod="openstack/keystone-bootstrap-bcd5c" Mar 09 09:24:55 crc kubenswrapper[4861]: I0309 09:24:55.855952 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bbd4cfda-d65e-4915-be68-f207820fe15b-scripts\") pod \"keystone-bootstrap-bcd5c\" (UID: \"bbd4cfda-d65e-4915-be68-f207820fe15b\") " pod="openstack/keystone-bootstrap-bcd5c" Mar 09 09:24:55 crc kubenswrapper[4861]: I0309 09:24:55.862612 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bbd4cfda-d65e-4915-be68-f207820fe15b-fernet-keys\") pod \"keystone-bootstrap-bcd5c\" (UID: \"bbd4cfda-d65e-4915-be68-f207820fe15b\") " pod="openstack/keystone-bootstrap-bcd5c" Mar 09 09:24:55 crc kubenswrapper[4861]: I0309 09:24:55.865946 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkqzw\" (UniqueName: 
\"kubernetes.io/projected/bbd4cfda-d65e-4915-be68-f207820fe15b-kube-api-access-wkqzw\") pod \"keystone-bootstrap-bcd5c\" (UID: \"bbd4cfda-d65e-4915-be68-f207820fe15b\") " pod="openstack/keystone-bootstrap-bcd5c" Mar 09 09:24:55 crc kubenswrapper[4861]: I0309 09:24:55.906727 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-bcd5c" Mar 09 09:24:56 crc kubenswrapper[4861]: E0309 09:24:56.596387 4861 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-placement-api@sha256:11d4431e4af1735fbd9d425596f81dd62b0ca934d84d7c4e67902656c2b688d3" Mar 09 09:24:56 crc kubenswrapper[4861]: E0309 09:24:56.597276 4861 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:placement-db-sync,Image:quay.io/podified-antelope-centos9/openstack-placement-api@sha256:11d4431e4af1735fbd9d425596f81dd62b0ca934d84d7c4e67902656c2b688d3,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/placement,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:placement-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ldwqp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42482,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
placement-db-sync-k46zw_openstack(c9820d89-3a89-4982-8520-f23dd0d099ad): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 09 09:24:56 crc kubenswrapper[4861]: E0309 09:24:56.603498 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/placement-db-sync-k46zw" podUID="c9820d89-3a89-4982-8520-f23dd0d099ad" Mar 09 09:24:56 crc kubenswrapper[4861]: I0309 09:24:56.704818 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 09 09:24:56 crc kubenswrapper[4861]: I0309 09:24:56.766279 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/868b4cd9-0c47-4173-b2f6-710c28e73f16-config-data\") pod \"868b4cd9-0c47-4173-b2f6-710c28e73f16\" (UID: \"868b4cd9-0c47-4173-b2f6-710c28e73f16\") " Mar 09 09:24:56 crc kubenswrapper[4861]: I0309 09:24:56.766334 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/868b4cd9-0c47-4173-b2f6-710c28e73f16-scripts\") pod \"868b4cd9-0c47-4173-b2f6-710c28e73f16\" (UID: \"868b4cd9-0c47-4173-b2f6-710c28e73f16\") " Mar 09 09:24:56 crc kubenswrapper[4861]: I0309 09:24:56.766382 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vww57\" (UniqueName: \"kubernetes.io/projected/868b4cd9-0c47-4173-b2f6-710c28e73f16-kube-api-access-vww57\") pod \"868b4cd9-0c47-4173-b2f6-710c28e73f16\" (UID: \"868b4cd9-0c47-4173-b2f6-710c28e73f16\") " Mar 09 09:24:56 crc kubenswrapper[4861]: I0309 09:24:56.766464 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/868b4cd9-0c47-4173-b2f6-710c28e73f16-combined-ca-bundle\") pod \"868b4cd9-0c47-4173-b2f6-710c28e73f16\" (UID: \"868b4cd9-0c47-4173-b2f6-710c28e73f16\") " Mar 09 09:24:56 crc kubenswrapper[4861]: I0309 09:24:56.766511 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/868b4cd9-0c47-4173-b2f6-710c28e73f16-logs\") pod \"868b4cd9-0c47-4173-b2f6-710c28e73f16\" (UID: \"868b4cd9-0c47-4173-b2f6-710c28e73f16\") " Mar 09 09:24:56 crc kubenswrapper[4861]: I0309 09:24:56.766544 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/868b4cd9-0c47-4173-b2f6-710c28e73f16-httpd-run\") pod \"868b4cd9-0c47-4173-b2f6-710c28e73f16\" (UID: \"868b4cd9-0c47-4173-b2f6-710c28e73f16\") " Mar 09 09:24:56 crc kubenswrapper[4861]: I0309 09:24:56.766777 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/868b4cd9-0c47-4173-b2f6-710c28e73f16-internal-tls-certs\") pod \"868b4cd9-0c47-4173-b2f6-710c28e73f16\" (UID: \"868b4cd9-0c47-4173-b2f6-710c28e73f16\") " Mar 09 09:24:56 crc kubenswrapper[4861]: I0309 09:24:56.766804 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"868b4cd9-0c47-4173-b2f6-710c28e73f16\" (UID: \"868b4cd9-0c47-4173-b2f6-710c28e73f16\") " Mar 09 09:24:56 crc kubenswrapper[4861]: I0309 09:24:56.767398 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/868b4cd9-0c47-4173-b2f6-710c28e73f16-logs" (OuterVolumeSpecName: "logs") pod "868b4cd9-0c47-4173-b2f6-710c28e73f16" (UID: "868b4cd9-0c47-4173-b2f6-710c28e73f16"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:24:56 crc kubenswrapper[4861]: I0309 09:24:56.767453 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/868b4cd9-0c47-4173-b2f6-710c28e73f16-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "868b4cd9-0c47-4173-b2f6-710c28e73f16" (UID: "868b4cd9-0c47-4173-b2f6-710c28e73f16"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:24:56 crc kubenswrapper[4861]: I0309 09:24:56.767938 4861 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/868b4cd9-0c47-4173-b2f6-710c28e73f16-logs\") on node \"crc\" DevicePath \"\"" Mar 09 09:24:56 crc kubenswrapper[4861]: I0309 09:24:56.767964 4861 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/868b4cd9-0c47-4173-b2f6-710c28e73f16-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 09 09:24:56 crc kubenswrapper[4861]: I0309 09:24:56.773052 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/868b4cd9-0c47-4173-b2f6-710c28e73f16-kube-api-access-vww57" (OuterVolumeSpecName: "kube-api-access-vww57") pod "868b4cd9-0c47-4173-b2f6-710c28e73f16" (UID: "868b4cd9-0c47-4173-b2f6-710c28e73f16"). InnerVolumeSpecName "kube-api-access-vww57". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:24:56 crc kubenswrapper[4861]: I0309 09:24:56.773086 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "glance") pod "868b4cd9-0c47-4173-b2f6-710c28e73f16" (UID: "868b4cd9-0c47-4173-b2f6-710c28e73f16"). InnerVolumeSpecName "local-storage02-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 09 09:24:56 crc kubenswrapper[4861]: I0309 09:24:56.773667 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/868b4cd9-0c47-4173-b2f6-710c28e73f16-scripts" (OuterVolumeSpecName: "scripts") pod "868b4cd9-0c47-4173-b2f6-710c28e73f16" (UID: "868b4cd9-0c47-4173-b2f6-710c28e73f16"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:24:56 crc kubenswrapper[4861]: I0309 09:24:56.791792 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/868b4cd9-0c47-4173-b2f6-710c28e73f16-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "868b4cd9-0c47-4173-b2f6-710c28e73f16" (UID: "868b4cd9-0c47-4173-b2f6-710c28e73f16"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:24:56 crc kubenswrapper[4861]: I0309 09:24:56.813765 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/868b4cd9-0c47-4173-b2f6-710c28e73f16-config-data" (OuterVolumeSpecName: "config-data") pod "868b4cd9-0c47-4173-b2f6-710c28e73f16" (UID: "868b4cd9-0c47-4173-b2f6-710c28e73f16"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:24:56 crc kubenswrapper[4861]: I0309 09:24:56.817638 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/868b4cd9-0c47-4173-b2f6-710c28e73f16-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "868b4cd9-0c47-4173-b2f6-710c28e73f16" (UID: "868b4cd9-0c47-4173-b2f6-710c28e73f16"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:24:56 crc kubenswrapper[4861]: I0309 09:24:56.869788 4861 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/868b4cd9-0c47-4173-b2f6-710c28e73f16-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 09:24:56 crc kubenswrapper[4861]: I0309 09:24:56.869839 4861 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Mar 09 09:24:56 crc kubenswrapper[4861]: I0309 09:24:56.869851 4861 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/868b4cd9-0c47-4173-b2f6-710c28e73f16-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 09:24:56 crc kubenswrapper[4861]: I0309 09:24:56.869863 4861 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/868b4cd9-0c47-4173-b2f6-710c28e73f16-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:24:56 crc kubenswrapper[4861]: I0309 09:24:56.869872 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vww57\" (UniqueName: \"kubernetes.io/projected/868b4cd9-0c47-4173-b2f6-710c28e73f16-kube-api-access-vww57\") on node \"crc\" DevicePath \"\"" Mar 09 09:24:56 crc kubenswrapper[4861]: I0309 09:24:56.869882 4861 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/868b4cd9-0c47-4173-b2f6-710c28e73f16-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 09:24:56 crc kubenswrapper[4861]: I0309 09:24:56.886309 4861 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Mar 09 09:24:56 crc kubenswrapper[4861]: I0309 09:24:56.971318 4861 reconciler_common.go:293] "Volume detached for volume 
\"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Mar 09 09:24:57 crc kubenswrapper[4861]: I0309 09:24:57.259256 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 09 09:24:57 crc kubenswrapper[4861]: I0309 09:24:57.259912 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"868b4cd9-0c47-4173-b2f6-710c28e73f16","Type":"ContainerDied","Data":"fb731a0bc40c976812836be5b517850302a682e0f1ed6490242c787936bfcddb"} Mar 09 09:24:57 crc kubenswrapper[4861]: E0309 09:24:57.260095 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-placement-api@sha256:11d4431e4af1735fbd9d425596f81dd62b0ca934d84d7c4e67902656c2b688d3\\\"\"" pod="openstack/placement-db-sync-k46zw" podUID="c9820d89-3a89-4982-8520-f23dd0d099ad" Mar 09 09:24:57 crc kubenswrapper[4861]: I0309 09:24:57.304438 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 09 09:24:57 crc kubenswrapper[4861]: I0309 09:24:57.318388 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 09 09:24:57 crc kubenswrapper[4861]: I0309 09:24:57.328581 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 09 09:24:57 crc kubenswrapper[4861]: E0309 09:24:57.329285 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="868b4cd9-0c47-4173-b2f6-710c28e73f16" containerName="glance-httpd" Mar 09 09:24:57 crc kubenswrapper[4861]: I0309 09:24:57.329307 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="868b4cd9-0c47-4173-b2f6-710c28e73f16" containerName="glance-httpd" Mar 09 09:24:57 crc 
kubenswrapper[4861]: E0309 09:24:57.329323 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="868b4cd9-0c47-4173-b2f6-710c28e73f16" containerName="glance-log" Mar 09 09:24:57 crc kubenswrapper[4861]: I0309 09:24:57.329329 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="868b4cd9-0c47-4173-b2f6-710c28e73f16" containerName="glance-log" Mar 09 09:24:57 crc kubenswrapper[4861]: I0309 09:24:57.329516 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="868b4cd9-0c47-4173-b2f6-710c28e73f16" containerName="glance-log" Mar 09 09:24:57 crc kubenswrapper[4861]: I0309 09:24:57.329538 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="868b4cd9-0c47-4173-b2f6-710c28e73f16" containerName="glance-httpd" Mar 09 09:24:57 crc kubenswrapper[4861]: I0309 09:24:57.330596 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 09 09:24:57 crc kubenswrapper[4861]: I0309 09:24:57.332298 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 09 09:24:57 crc kubenswrapper[4861]: I0309 09:24:57.332630 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 09 09:24:57 crc kubenswrapper[4861]: I0309 09:24:57.349628 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 09 09:24:57 crc kubenswrapper[4861]: I0309 09:24:57.484614 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ace45eda-d816-4b28-9be4-88ba845234d5-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"ace45eda-d816-4b28-9be4-88ba845234d5\") " pod="openstack/glance-default-internal-api-0" Mar 09 09:24:57 crc kubenswrapper[4861]: I0309 09:24:57.484691 4861 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ace45eda-d816-4b28-9be4-88ba845234d5-logs\") pod \"glance-default-internal-api-0\" (UID: \"ace45eda-d816-4b28-9be4-88ba845234d5\") " pod="openstack/glance-default-internal-api-0" Mar 09 09:24:57 crc kubenswrapper[4861]: I0309 09:24:57.484748 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ace45eda-d816-4b28-9be4-88ba845234d5-config-data\") pod \"glance-default-internal-api-0\" (UID: \"ace45eda-d816-4b28-9be4-88ba845234d5\") " pod="openstack/glance-default-internal-api-0" Mar 09 09:24:57 crc kubenswrapper[4861]: I0309 09:24:57.484780 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jt7ng\" (UniqueName: \"kubernetes.io/projected/ace45eda-d816-4b28-9be4-88ba845234d5-kube-api-access-jt7ng\") pod \"glance-default-internal-api-0\" (UID: \"ace45eda-d816-4b28-9be4-88ba845234d5\") " pod="openstack/glance-default-internal-api-0" Mar 09 09:24:57 crc kubenswrapper[4861]: I0309 09:24:57.484841 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ace45eda-d816-4b28-9be4-88ba845234d5-scripts\") pod \"glance-default-internal-api-0\" (UID: \"ace45eda-d816-4b28-9be4-88ba845234d5\") " pod="openstack/glance-default-internal-api-0" Mar 09 09:24:57 crc kubenswrapper[4861]: I0309 09:24:57.484884 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ace45eda-d816-4b28-9be4-88ba845234d5-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"ace45eda-d816-4b28-9be4-88ba845234d5\") " pod="openstack/glance-default-internal-api-0" Mar 09 09:24:57 crc kubenswrapper[4861]: I0309 09:24:57.484908 4861 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"ace45eda-d816-4b28-9be4-88ba845234d5\") " pod="openstack/glance-default-internal-api-0" Mar 09 09:24:57 crc kubenswrapper[4861]: I0309 09:24:57.484936 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ace45eda-d816-4b28-9be4-88ba845234d5-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"ace45eda-d816-4b28-9be4-88ba845234d5\") " pod="openstack/glance-default-internal-api-0" Mar 09 09:24:57 crc kubenswrapper[4861]: I0309 09:24:57.587081 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ace45eda-d816-4b28-9be4-88ba845234d5-logs\") pod \"glance-default-internal-api-0\" (UID: \"ace45eda-d816-4b28-9be4-88ba845234d5\") " pod="openstack/glance-default-internal-api-0" Mar 09 09:24:57 crc kubenswrapper[4861]: I0309 09:24:57.587218 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ace45eda-d816-4b28-9be4-88ba845234d5-config-data\") pod \"glance-default-internal-api-0\" (UID: \"ace45eda-d816-4b28-9be4-88ba845234d5\") " pod="openstack/glance-default-internal-api-0" Mar 09 09:24:57 crc kubenswrapper[4861]: I0309 09:24:57.587295 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jt7ng\" (UniqueName: \"kubernetes.io/projected/ace45eda-d816-4b28-9be4-88ba845234d5-kube-api-access-jt7ng\") pod \"glance-default-internal-api-0\" (UID: \"ace45eda-d816-4b28-9be4-88ba845234d5\") " pod="openstack/glance-default-internal-api-0" Mar 09 09:24:57 crc kubenswrapper[4861]: I0309 09:24:57.587331 4861 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ace45eda-d816-4b28-9be4-88ba845234d5-scripts\") pod \"glance-default-internal-api-0\" (UID: \"ace45eda-d816-4b28-9be4-88ba845234d5\") " pod="openstack/glance-default-internal-api-0" Mar 09 09:24:57 crc kubenswrapper[4861]: I0309 09:24:57.587386 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ace45eda-d816-4b28-9be4-88ba845234d5-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"ace45eda-d816-4b28-9be4-88ba845234d5\") " pod="openstack/glance-default-internal-api-0" Mar 09 09:24:57 crc kubenswrapper[4861]: I0309 09:24:57.587434 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"ace45eda-d816-4b28-9be4-88ba845234d5\") " pod="openstack/glance-default-internal-api-0" Mar 09 09:24:57 crc kubenswrapper[4861]: I0309 09:24:57.587471 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ace45eda-d816-4b28-9be4-88ba845234d5-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"ace45eda-d816-4b28-9be4-88ba845234d5\") " pod="openstack/glance-default-internal-api-0" Mar 09 09:24:57 crc kubenswrapper[4861]: I0309 09:24:57.587535 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ace45eda-d816-4b28-9be4-88ba845234d5-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"ace45eda-d816-4b28-9be4-88ba845234d5\") " pod="openstack/glance-default-internal-api-0" Mar 09 09:24:57 crc kubenswrapper[4861]: I0309 09:24:57.587576 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/ace45eda-d816-4b28-9be4-88ba845234d5-logs\") pod \"glance-default-internal-api-0\" (UID: \"ace45eda-d816-4b28-9be4-88ba845234d5\") " pod="openstack/glance-default-internal-api-0" Mar 09 09:24:57 crc kubenswrapper[4861]: I0309 09:24:57.587626 4861 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"ace45eda-d816-4b28-9be4-88ba845234d5\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/glance-default-internal-api-0" Mar 09 09:24:57 crc kubenswrapper[4861]: I0309 09:24:57.587781 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ace45eda-d816-4b28-9be4-88ba845234d5-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"ace45eda-d816-4b28-9be4-88ba845234d5\") " pod="openstack/glance-default-internal-api-0" Mar 09 09:24:57 crc kubenswrapper[4861]: I0309 09:24:57.592149 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ace45eda-d816-4b28-9be4-88ba845234d5-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"ace45eda-d816-4b28-9be4-88ba845234d5\") " pod="openstack/glance-default-internal-api-0" Mar 09 09:24:57 crc kubenswrapper[4861]: I0309 09:24:57.592710 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ace45eda-d816-4b28-9be4-88ba845234d5-scripts\") pod \"glance-default-internal-api-0\" (UID: \"ace45eda-d816-4b28-9be4-88ba845234d5\") " pod="openstack/glance-default-internal-api-0" Mar 09 09:24:57 crc kubenswrapper[4861]: I0309 09:24:57.593335 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ace45eda-d816-4b28-9be4-88ba845234d5-config-data\") pod 
\"glance-default-internal-api-0\" (UID: \"ace45eda-d816-4b28-9be4-88ba845234d5\") " pod="openstack/glance-default-internal-api-0" Mar 09 09:24:57 crc kubenswrapper[4861]: I0309 09:24:57.598114 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ace45eda-d816-4b28-9be4-88ba845234d5-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"ace45eda-d816-4b28-9be4-88ba845234d5\") " pod="openstack/glance-default-internal-api-0" Mar 09 09:24:57 crc kubenswrapper[4861]: I0309 09:24:57.606866 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jt7ng\" (UniqueName: \"kubernetes.io/projected/ace45eda-d816-4b28-9be4-88ba845234d5-kube-api-access-jt7ng\") pod \"glance-default-internal-api-0\" (UID: \"ace45eda-d816-4b28-9be4-88ba845234d5\") " pod="openstack/glance-default-internal-api-0" Mar 09 09:24:57 crc kubenswrapper[4861]: I0309 09:24:57.615691 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"ace45eda-d816-4b28-9be4-88ba845234d5\") " pod="openstack/glance-default-internal-api-0" Mar 09 09:24:57 crc kubenswrapper[4861]: I0309 09:24:57.651186 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 09 09:24:57 crc kubenswrapper[4861]: I0309 09:24:57.668346 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="868b4cd9-0c47-4173-b2f6-710c28e73f16" path="/var/lib/kubelet/pods/868b4cd9-0c47-4173-b2f6-710c28e73f16/volumes" Mar 09 09:24:58 crc kubenswrapper[4861]: I0309 09:24:58.202737 4861 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-7b684fb9f5-wg2v4" podUID="45ff338f-f912-422e-bb87-1db962b83fc4" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.130:5353: connect: connection refused" Mar 09 09:24:58 crc kubenswrapper[4861]: E0309 09:24:58.885080 4861 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon@sha256:5a7229ed1d162f4f064013951a60f26c51a7f8e4aee40d6ac207e8d0dfd8b79b" Mar 09 09:24:58 crc kubenswrapper[4861]: E0309 09:24:58.885343 4861 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon@sha256:5a7229ed1d162f4f064013951a60f26c51a7f8e4aee40d6ac207e8d0dfd8b79b,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n66dhc5h698h6ch9ch5b7hc8hbfhch57dh65ch5d7h5b7h4h576h8fh64dhf7h68bhffh656h85h65ch8bh5c6h549h5d7h59ch675h667hc5hc6q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zs7g4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-7f474b7d57-rss5x_openstack(2ff07d5b-934f-40e1-a938-4c9c6d6bd846): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 09 09:24:58 crc kubenswrapper[4861]: E0309 
09:24:58.889671 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon@sha256:5a7229ed1d162f4f064013951a60f26c51a7f8e4aee40d6ac207e8d0dfd8b79b\\\"\"]" pod="openstack/horizon-7f474b7d57-rss5x" podUID="2ff07d5b-934f-40e1-a938-4c9c6d6bd846" Mar 09 09:24:58 crc kubenswrapper[4861]: E0309 09:24:58.897529 4861 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon@sha256:5a7229ed1d162f4f064013951a60f26c51a7f8e4aee40d6ac207e8d0dfd8b79b" Mar 09 09:24:58 crc kubenswrapper[4861]: E0309 09:24:58.897683 4861 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon@sha256:5a7229ed1d162f4f064013951a60f26c51a7f8e4aee40d6ac207e8d0dfd8b79b,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n684hc8h659h68dhfh576h84h655hbh669h64bh5dh58h677h77h5dbh54dh59dh5ffh556h586h76h5fhb8h678h79h575h548h57ch87h5bfh57fq,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sgf9l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-648545bc7-vkgcg_openstack(236224b4-5b63-4290-8f78-6367a7567dd5): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 09 09:24:58 crc kubenswrapper[4861]: E0309 
09:24:58.899741 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon@sha256:5a7229ed1d162f4f064013951a60f26c51a7f8e4aee40d6ac207e8d0dfd8b79b\\\"\"]" pod="openstack/horizon-648545bc7-vkgcg" podUID="236224b4-5b63-4290-8f78-6367a7567dd5" Mar 09 09:25:01 crc kubenswrapper[4861]: I0309 09:25:01.288323 4861 generic.go:334] "Generic (PLEG): container finished" podID="4e5455e9-fa02-46bd-8786-1888543b55cc" containerID="50c06f6ec486776fed27be621771092d458b92984683634750db4b2fa7ca73f7" exitCode=0 Mar 09 09:25:01 crc kubenswrapper[4861]: I0309 09:25:01.288433 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-ntvbd" event={"ID":"4e5455e9-fa02-46bd-8786-1888543b55cc","Type":"ContainerDied","Data":"50c06f6ec486776fed27be621771092d458b92984683634750db4b2fa7ca73f7"} Mar 09 09:25:03 crc kubenswrapper[4861]: I0309 09:25:03.203256 4861 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-7b684fb9f5-wg2v4" podUID="45ff338f-f912-422e-bb87-1db962b83fc4" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.130:5353: connect: connection refused" Mar 09 09:25:03 crc kubenswrapper[4861]: I0309 09:25:03.203703 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7b684fb9f5-wg2v4" Mar 09 09:25:07 crc kubenswrapper[4861]: I0309 09:25:07.194689 4861 scope.go:117] "RemoveContainer" containerID="85b70180b0939947a13dbd49e81e04864e884ff4dae384fffe528e08033e45ac" Mar 09 09:25:07 crc kubenswrapper[4861]: E0309 09:25:07.676347 4861 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:2c52c0f4b4baa15796eb284522adff7fa9e5c85a2d77c2e47ef4afdf8e4a7c7d" Mar 09 09:25:07 crc kubenswrapper[4861]: E0309 09:25:07.676768 4861 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:2c52c0f4b4baa15796eb284522adff7fa9e5c85a2d77c2e47ef4afdf8e4a7c7d,Command:[/bin/bash],Args:[-c barbican-manage db upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4drhb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
barbican-db-sync-8wrdd_openstack(44c35b48-50b9-4dd8-846a-99714c14d3ab): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 09 09:25:07 crc kubenswrapper[4861]: E0309 09:25:07.678605 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-8wrdd" podUID="44c35b48-50b9-4dd8-846a-99714c14d3ab" Mar 09 09:25:07 crc kubenswrapper[4861]: I0309 09:25:07.788557 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7f474b7d57-rss5x" Mar 09 09:25:07 crc kubenswrapper[4861]: I0309 09:25:07.795953 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-ntvbd" Mar 09 09:25:07 crc kubenswrapper[4861]: I0309 09:25:07.802535 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-648545bc7-vkgcg" Mar 09 09:25:07 crc kubenswrapper[4861]: I0309 09:25:07.971626 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4e5455e9-fa02-46bd-8786-1888543b55cc-config\") pod \"4e5455e9-fa02-46bd-8786-1888543b55cc\" (UID: \"4e5455e9-fa02-46bd-8786-1888543b55cc\") " Mar 09 09:25:07 crc kubenswrapper[4861]: I0309 09:25:07.971672 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/236224b4-5b63-4290-8f78-6367a7567dd5-logs\") pod \"236224b4-5b63-4290-8f78-6367a7567dd5\" (UID: \"236224b4-5b63-4290-8f78-6367a7567dd5\") " Mar 09 09:25:07 crc kubenswrapper[4861]: I0309 09:25:07.971722 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/236224b4-5b63-4290-8f78-6367a7567dd5-horizon-secret-key\") pod \"236224b4-5b63-4290-8f78-6367a7567dd5\" (UID: \"236224b4-5b63-4290-8f78-6367a7567dd5\") " Mar 09 09:25:07 crc kubenswrapper[4861]: I0309 09:25:07.971755 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2ff07d5b-934f-40e1-a938-4c9c6d6bd846-config-data\") pod \"2ff07d5b-934f-40e1-a938-4c9c6d6bd846\" (UID: \"2ff07d5b-934f-40e1-a938-4c9c6d6bd846\") " Mar 09 09:25:07 crc kubenswrapper[4861]: I0309 09:25:07.971813 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zs7g4\" (UniqueName: \"kubernetes.io/projected/2ff07d5b-934f-40e1-a938-4c9c6d6bd846-kube-api-access-zs7g4\") pod \"2ff07d5b-934f-40e1-a938-4c9c6d6bd846\" (UID: \"2ff07d5b-934f-40e1-a938-4c9c6d6bd846\") " Mar 09 09:25:07 crc kubenswrapper[4861]: I0309 09:25:07.971862 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tqf5r\" (UniqueName: 
\"kubernetes.io/projected/4e5455e9-fa02-46bd-8786-1888543b55cc-kube-api-access-tqf5r\") pod \"4e5455e9-fa02-46bd-8786-1888543b55cc\" (UID: \"4e5455e9-fa02-46bd-8786-1888543b55cc\") " Mar 09 09:25:07 crc kubenswrapper[4861]: I0309 09:25:07.971898 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2ff07d5b-934f-40e1-a938-4c9c6d6bd846-scripts\") pod \"2ff07d5b-934f-40e1-a938-4c9c6d6bd846\" (UID: \"2ff07d5b-934f-40e1-a938-4c9c6d6bd846\") " Mar 09 09:25:07 crc kubenswrapper[4861]: I0309 09:25:07.971945 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/236224b4-5b63-4290-8f78-6367a7567dd5-scripts\") pod \"236224b4-5b63-4290-8f78-6367a7567dd5\" (UID: \"236224b4-5b63-4290-8f78-6367a7567dd5\") " Mar 09 09:25:07 crc kubenswrapper[4861]: I0309 09:25:07.971969 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ff07d5b-934f-40e1-a938-4c9c6d6bd846-logs\") pod \"2ff07d5b-934f-40e1-a938-4c9c6d6bd846\" (UID: \"2ff07d5b-934f-40e1-a938-4c9c6d6bd846\") " Mar 09 09:25:07 crc kubenswrapper[4861]: I0309 09:25:07.971990 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sgf9l\" (UniqueName: \"kubernetes.io/projected/236224b4-5b63-4290-8f78-6367a7567dd5-kube-api-access-sgf9l\") pod \"236224b4-5b63-4290-8f78-6367a7567dd5\" (UID: \"236224b4-5b63-4290-8f78-6367a7567dd5\") " Mar 09 09:25:07 crc kubenswrapper[4861]: I0309 09:25:07.972012 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e5455e9-fa02-46bd-8786-1888543b55cc-combined-ca-bundle\") pod \"4e5455e9-fa02-46bd-8786-1888543b55cc\" (UID: \"4e5455e9-fa02-46bd-8786-1888543b55cc\") " Mar 09 09:25:07 crc kubenswrapper[4861]: I0309 09:25:07.972040 4861 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2ff07d5b-934f-40e1-a938-4c9c6d6bd846-horizon-secret-key\") pod \"2ff07d5b-934f-40e1-a938-4c9c6d6bd846\" (UID: \"2ff07d5b-934f-40e1-a938-4c9c6d6bd846\") "
Mar 09 09:25:07 crc kubenswrapper[4861]: I0309 09:25:07.972072 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/236224b4-5b63-4290-8f78-6367a7567dd5-config-data\") pod \"236224b4-5b63-4290-8f78-6367a7567dd5\" (UID: \"236224b4-5b63-4290-8f78-6367a7567dd5\") "
Mar 09 09:25:07 crc kubenswrapper[4861]: I0309 09:25:07.972091 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/236224b4-5b63-4290-8f78-6367a7567dd5-logs" (OuterVolumeSpecName: "logs") pod "236224b4-5b63-4290-8f78-6367a7567dd5" (UID: "236224b4-5b63-4290-8f78-6367a7567dd5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 09:25:07 crc kubenswrapper[4861]: I0309 09:25:07.972423 4861 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/236224b4-5b63-4290-8f78-6367a7567dd5-logs\") on node \"crc\" DevicePath \"\""
Mar 09 09:25:07 crc kubenswrapper[4861]: I0309 09:25:07.972484 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ff07d5b-934f-40e1-a938-4c9c6d6bd846-scripts" (OuterVolumeSpecName: "scripts") pod "2ff07d5b-934f-40e1-a938-4c9c6d6bd846" (UID: "2ff07d5b-934f-40e1-a938-4c9c6d6bd846"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:25:07 crc kubenswrapper[4861]: I0309 09:25:07.972924 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ff07d5b-934f-40e1-a938-4c9c6d6bd846-config-data" (OuterVolumeSpecName: "config-data") pod "2ff07d5b-934f-40e1-a938-4c9c6d6bd846" (UID: "2ff07d5b-934f-40e1-a938-4c9c6d6bd846"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:25:07 crc kubenswrapper[4861]: I0309 09:25:07.973044 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/236224b4-5b63-4290-8f78-6367a7567dd5-config-data" (OuterVolumeSpecName: "config-data") pod "236224b4-5b63-4290-8f78-6367a7567dd5" (UID: "236224b4-5b63-4290-8f78-6367a7567dd5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:25:07 crc kubenswrapper[4861]: I0309 09:25:07.974202 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ff07d5b-934f-40e1-a938-4c9c6d6bd846-logs" (OuterVolumeSpecName: "logs") pod "2ff07d5b-934f-40e1-a938-4c9c6d6bd846" (UID: "2ff07d5b-934f-40e1-a938-4c9c6d6bd846"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 09:25:07 crc kubenswrapper[4861]: I0309 09:25:07.974321 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/236224b4-5b63-4290-8f78-6367a7567dd5-scripts" (OuterVolumeSpecName: "scripts") pod "236224b4-5b63-4290-8f78-6367a7567dd5" (UID: "236224b4-5b63-4290-8f78-6367a7567dd5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:25:07 crc kubenswrapper[4861]: I0309 09:25:07.978552 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ff07d5b-934f-40e1-a938-4c9c6d6bd846-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "2ff07d5b-934f-40e1-a938-4c9c6d6bd846" (UID: "2ff07d5b-934f-40e1-a938-4c9c6d6bd846"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:25:07 crc kubenswrapper[4861]: I0309 09:25:07.978639 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/236224b4-5b63-4290-8f78-6367a7567dd5-kube-api-access-sgf9l" (OuterVolumeSpecName: "kube-api-access-sgf9l") pod "236224b4-5b63-4290-8f78-6367a7567dd5" (UID: "236224b4-5b63-4290-8f78-6367a7567dd5"). InnerVolumeSpecName "kube-api-access-sgf9l". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:25:07 crc kubenswrapper[4861]: I0309 09:25:07.978987 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e5455e9-fa02-46bd-8786-1888543b55cc-kube-api-access-tqf5r" (OuterVolumeSpecName: "kube-api-access-tqf5r") pod "4e5455e9-fa02-46bd-8786-1888543b55cc" (UID: "4e5455e9-fa02-46bd-8786-1888543b55cc"). InnerVolumeSpecName "kube-api-access-tqf5r". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:25:07 crc kubenswrapper[4861]: I0309 09:25:07.979170 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ff07d5b-934f-40e1-a938-4c9c6d6bd846-kube-api-access-zs7g4" (OuterVolumeSpecName: "kube-api-access-zs7g4") pod "2ff07d5b-934f-40e1-a938-4c9c6d6bd846" (UID: "2ff07d5b-934f-40e1-a938-4c9c6d6bd846"). InnerVolumeSpecName "kube-api-access-zs7g4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:25:07 crc kubenswrapper[4861]: I0309 09:25:07.980473 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/236224b4-5b63-4290-8f78-6367a7567dd5-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "236224b4-5b63-4290-8f78-6367a7567dd5" (UID: "236224b4-5b63-4290-8f78-6367a7567dd5"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:25:08 crc kubenswrapper[4861]: I0309 09:25:08.000450 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e5455e9-fa02-46bd-8786-1888543b55cc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4e5455e9-fa02-46bd-8786-1888543b55cc" (UID: "4e5455e9-fa02-46bd-8786-1888543b55cc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:25:08 crc kubenswrapper[4861]: I0309 09:25:08.001183 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e5455e9-fa02-46bd-8786-1888543b55cc-config" (OuterVolumeSpecName: "config") pod "4e5455e9-fa02-46bd-8786-1888543b55cc" (UID: "4e5455e9-fa02-46bd-8786-1888543b55cc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:25:08 crc kubenswrapper[4861]: I0309 09:25:08.074691 4861 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2ff07d5b-934f-40e1-a938-4c9c6d6bd846-config-data\") on node \"crc\" DevicePath \"\""
Mar 09 09:25:08 crc kubenswrapper[4861]: I0309 09:25:08.074730 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zs7g4\" (UniqueName: \"kubernetes.io/projected/2ff07d5b-934f-40e1-a938-4c9c6d6bd846-kube-api-access-zs7g4\") on node \"crc\" DevicePath \"\""
Mar 09 09:25:08 crc kubenswrapper[4861]: I0309 09:25:08.074744 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tqf5r\" (UniqueName: \"kubernetes.io/projected/4e5455e9-fa02-46bd-8786-1888543b55cc-kube-api-access-tqf5r\") on node \"crc\" DevicePath \"\""
Mar 09 09:25:08 crc kubenswrapper[4861]: I0309 09:25:08.074756 4861 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2ff07d5b-934f-40e1-a938-4c9c6d6bd846-scripts\") on node \"crc\" DevicePath \"\""
Mar 09 09:25:08 crc kubenswrapper[4861]: I0309 09:25:08.074768 4861 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/236224b4-5b63-4290-8f78-6367a7567dd5-scripts\") on node \"crc\" DevicePath \"\""
Mar 09 09:25:08 crc kubenswrapper[4861]: I0309 09:25:08.074776 4861 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2ff07d5b-934f-40e1-a938-4c9c6d6bd846-logs\") on node \"crc\" DevicePath \"\""
Mar 09 09:25:08 crc kubenswrapper[4861]: I0309 09:25:08.074784 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sgf9l\" (UniqueName: \"kubernetes.io/projected/236224b4-5b63-4290-8f78-6367a7567dd5-kube-api-access-sgf9l\") on node \"crc\" DevicePath \"\""
Mar 09 09:25:08 crc kubenswrapper[4861]: I0309 09:25:08.074792 4861 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e5455e9-fa02-46bd-8786-1888543b55cc-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 09 09:25:08 crc kubenswrapper[4861]: I0309 09:25:08.074800 4861 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2ff07d5b-934f-40e1-a938-4c9c6d6bd846-horizon-secret-key\") on node \"crc\" DevicePath \"\""
Mar 09 09:25:08 crc kubenswrapper[4861]: I0309 09:25:08.074807 4861 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/236224b4-5b63-4290-8f78-6367a7567dd5-config-data\") on node \"crc\" DevicePath \"\""
Mar 09 09:25:08 crc kubenswrapper[4861]: I0309 09:25:08.074817 4861 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/4e5455e9-fa02-46bd-8786-1888543b55cc-config\") on node \"crc\" DevicePath \"\""
Mar 09 09:25:08 crc kubenswrapper[4861]: I0309 09:25:08.074828 4861 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/236224b4-5b63-4290-8f78-6367a7567dd5-horizon-secret-key\") on node \"crc\" DevicePath \"\""
Mar 09 09:25:08 crc kubenswrapper[4861]: I0309 09:25:08.355288 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7f474b7d57-rss5x" event={"ID":"2ff07d5b-934f-40e1-a938-4c9c6d6bd846","Type":"ContainerDied","Data":"e0cb35cb9a56e24e4200a0ede4e63a9d715328089533f5024ebcfa5561ba6114"}
Mar 09 09:25:08 crc kubenswrapper[4861]: I0309 09:25:08.355286 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7f474b7d57-rss5x"
Mar 09 09:25:08 crc kubenswrapper[4861]: I0309 09:25:08.357024 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-648545bc7-vkgcg"
Mar 09 09:25:08 crc kubenswrapper[4861]: I0309 09:25:08.357067 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-648545bc7-vkgcg" event={"ID":"236224b4-5b63-4290-8f78-6367a7567dd5","Type":"ContainerDied","Data":"0435b434ad60e57ebfd94b6e8a8c4c7c1dd396652614f429f0b01e627e333c6a"}
Mar 09 09:25:08 crc kubenswrapper[4861]: I0309 09:25:08.360621 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-ntvbd"
Mar 09 09:25:08 crc kubenswrapper[4861]: I0309 09:25:08.360746 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-ntvbd" event={"ID":"4e5455e9-fa02-46bd-8786-1888543b55cc","Type":"ContainerDied","Data":"b0b2e31f7e30f05d77af04aafb944fe11be8dbc9e906baaef16860bcd9981d52"}
Mar 09 09:25:08 crc kubenswrapper[4861]: I0309 09:25:08.360776 4861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b0b2e31f7e30f05d77af04aafb944fe11be8dbc9e906baaef16860bcd9981d52"
Mar 09 09:25:08 crc kubenswrapper[4861]: E0309 09:25:08.362567 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:2c52c0f4b4baa15796eb284522adff7fa9e5c85a2d77c2e47ef4afdf8e4a7c7d\\\"\"" pod="openstack/barbican-db-sync-8wrdd" podUID="44c35b48-50b9-4dd8-846a-99714c14d3ab"
Mar 09 09:25:08 crc kubenswrapper[4861]: I0309 09:25:08.406334 4861 scope.go:117] "RemoveContainer" containerID="ad8e73462be5f0224a635ca2365c40f471e78fa9cded994a064d5de52f0a2d76"
Mar 09 09:25:08 crc kubenswrapper[4861]: I0309 09:25:08.448383 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7f474b7d57-rss5x"]
Mar 09 09:25:08 crc kubenswrapper[4861]: I0309 09:25:08.460853 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-7f474b7d57-rss5x"]
Mar 09 09:25:08 crc kubenswrapper[4861]: I0309 09:25:08.478772 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-648545bc7-vkgcg"]
Mar 09 09:25:08 crc kubenswrapper[4861]: I0309 09:25:08.486406 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-648545bc7-vkgcg"]
Mar 09 09:25:08 crc kubenswrapper[4861]: I0309 09:25:08.965508 4861 scope.go:117] "RemoveContainer" containerID="891ee311d748d2ad6c414b13478b96e19ee8e6aeba1037c01a21c8326c398466"
Mar 09 09:25:08 crc kubenswrapper[4861]: E0309 09:25:08.965979 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"891ee311d748d2ad6c414b13478b96e19ee8e6aeba1037c01a21c8326c398466\": container with ID starting with 891ee311d748d2ad6c414b13478b96e19ee8e6aeba1037c01a21c8326c398466 not found: ID does not exist" containerID="891ee311d748d2ad6c414b13478b96e19ee8e6aeba1037c01a21c8326c398466"
Mar 09 09:25:08 crc kubenswrapper[4861]: I0309 09:25:08.966012 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"891ee311d748d2ad6c414b13478b96e19ee8e6aeba1037c01a21c8326c398466"} err="failed to get container status \"891ee311d748d2ad6c414b13478b96e19ee8e6aeba1037c01a21c8326c398466\": rpc error: code = NotFound desc = could not find container \"891ee311d748d2ad6c414b13478b96e19ee8e6aeba1037c01a21c8326c398466\": container with ID starting with 891ee311d748d2ad6c414b13478b96e19ee8e6aeba1037c01a21c8326c398466 not found: ID does not exist"
Mar 09 09:25:08 crc kubenswrapper[4861]: I0309 09:25:08.966034 4861 scope.go:117] "RemoveContainer" containerID="85b70180b0939947a13dbd49e81e04864e884ff4dae384fffe528e08033e45ac"
Mar 09 09:25:08 crc kubenswrapper[4861]: E0309 09:25:08.966388 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85b70180b0939947a13dbd49e81e04864e884ff4dae384fffe528e08033e45ac\": container with ID starting with 85b70180b0939947a13dbd49e81e04864e884ff4dae384fffe528e08033e45ac not found: ID does not exist" containerID="85b70180b0939947a13dbd49e81e04864e884ff4dae384fffe528e08033e45ac"
Mar 09 09:25:08 crc kubenswrapper[4861]: I0309 09:25:08.966416 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85b70180b0939947a13dbd49e81e04864e884ff4dae384fffe528e08033e45ac"} err="failed to get container status \"85b70180b0939947a13dbd49e81e04864e884ff4dae384fffe528e08033e45ac\": rpc error: code = NotFound desc = could not find container \"85b70180b0939947a13dbd49e81e04864e884ff4dae384fffe528e08033e45ac\": container with ID starting with 85b70180b0939947a13dbd49e81e04864e884ff4dae384fffe528e08033e45ac not found: ID does not exist"
Mar 09 09:25:08 crc kubenswrapper[4861]: I0309 09:25:08.966433 4861 scope.go:117] "RemoveContainer" containerID="891ee311d748d2ad6c414b13478b96e19ee8e6aeba1037c01a21c8326c398466"
Mar 09 09:25:08 crc kubenswrapper[4861]: I0309 09:25:08.966675 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"891ee311d748d2ad6c414b13478b96e19ee8e6aeba1037c01a21c8326c398466"} err="failed to get container status \"891ee311d748d2ad6c414b13478b96e19ee8e6aeba1037c01a21c8326c398466\": rpc error: code = NotFound desc = could not find container \"891ee311d748d2ad6c414b13478b96e19ee8e6aeba1037c01a21c8326c398466\": container with ID starting with 891ee311d748d2ad6c414b13478b96e19ee8e6aeba1037c01a21c8326c398466 not found: ID does not exist"
Mar 09 09:25:08 crc kubenswrapper[4861]: I0309 09:25:08.966699 4861 scope.go:117] "RemoveContainer" containerID="85b70180b0939947a13dbd49e81e04864e884ff4dae384fffe528e08033e45ac"
Mar 09 09:25:08 crc kubenswrapper[4861]: I0309 09:25:08.966968 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85b70180b0939947a13dbd49e81e04864e884ff4dae384fffe528e08033e45ac"} err="failed to get container status \"85b70180b0939947a13dbd49e81e04864e884ff4dae384fffe528e08033e45ac\": rpc error: code = NotFound desc = could not find container \"85b70180b0939947a13dbd49e81e04864e884ff4dae384fffe528e08033e45ac\": container with ID starting with 85b70180b0939947a13dbd49e81e04864e884ff4dae384fffe528e08033e45ac not found: ID does not exist"
Mar 09 09:25:08 crc kubenswrapper[4861]: I0309 09:25:08.966991 4861 scope.go:117] "RemoveContainer" containerID="af13b5ea98003838a86bd094b8836c89c496c72416b4d1f88661983ae79d77a4"
Mar 09 09:25:08 crc kubenswrapper[4861]: E0309 09:25:08.977956 4861 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:44ed1ca84e17bd0f004cfbdc3c0827d767daba52abb8e83e076bfd0e6c02f838"
Mar 09 09:25:08 crc kubenswrapper[4861]: E0309 09:25:08.978148 4861 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:44ed1ca84e17bd0f004cfbdc3c0827d767daba52abb8e83e076bfd0e6c02f838,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vtc75,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-sr27s_openstack(deb8e24b-1a6f-4173-9a5f-62974b0331a5): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Mar 09 09:25:08 crc kubenswrapper[4861]: E0309 09:25:08.980648 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-sr27s" podUID="deb8e24b-1a6f-4173-9a5f-62974b0331a5"
Mar 09 09:25:09 crc kubenswrapper[4861]: I0309 09:25:09.128811 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7859c7799c-cztvv"]
Mar 09 09:25:09 crc kubenswrapper[4861]: E0309 09:25:09.129569 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e5455e9-fa02-46bd-8786-1888543b55cc" containerName="neutron-db-sync"
Mar 09 09:25:09 crc kubenswrapper[4861]: I0309 09:25:09.129587 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e5455e9-fa02-46bd-8786-1888543b55cc" containerName="neutron-db-sync"
Mar 09 09:25:09 crc kubenswrapper[4861]: I0309 09:25:09.129784 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e5455e9-fa02-46bd-8786-1888543b55cc" containerName="neutron-db-sync"
Mar 09 09:25:09 crc kubenswrapper[4861]: I0309 09:25:09.131071 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7859c7799c-cztvv"
Mar 09 09:25:09 crc kubenswrapper[4861]: I0309 09:25:09.142701 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b684fb9f5-wg2v4"
Mar 09 09:25:09 crc kubenswrapper[4861]: I0309 09:25:09.149167 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7859c7799c-cztvv"]
Mar 09 09:25:09 crc kubenswrapper[4861]: I0309 09:25:09.221092 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2d07e6c0-377a-44f4-a1d8-f984474200f6-ovsdbserver-nb\") pod \"dnsmasq-dns-7859c7799c-cztvv\" (UID: \"2d07e6c0-377a-44f4-a1d8-f984474200f6\") " pod="openstack/dnsmasq-dns-7859c7799c-cztvv"
Mar 09 09:25:09 crc kubenswrapper[4861]: I0309 09:25:09.221437 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2d07e6c0-377a-44f4-a1d8-f984474200f6-dns-svc\") pod \"dnsmasq-dns-7859c7799c-cztvv\" (UID: \"2d07e6c0-377a-44f4-a1d8-f984474200f6\") " pod="openstack/dnsmasq-dns-7859c7799c-cztvv"
Mar 09 09:25:09 crc kubenswrapper[4861]: I0309 09:25:09.221474 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2d07e6c0-377a-44f4-a1d8-f984474200f6-ovsdbserver-sb\") pod \"dnsmasq-dns-7859c7799c-cztvv\" (UID: \"2d07e6c0-377a-44f4-a1d8-f984474200f6\") " pod="openstack/dnsmasq-dns-7859c7799c-cztvv"
Mar 09 09:25:09 crc kubenswrapper[4861]: I0309 09:25:09.221497 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d07e6c0-377a-44f4-a1d8-f984474200f6-config\") pod \"dnsmasq-dns-7859c7799c-cztvv\" (UID: \"2d07e6c0-377a-44f4-a1d8-f984474200f6\") " pod="openstack/dnsmasq-dns-7859c7799c-cztvv"
Mar 09 09:25:09 crc kubenswrapper[4861]: I0309 09:25:09.221565 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2d07e6c0-377a-44f4-a1d8-f984474200f6-dns-swift-storage-0\") pod \"dnsmasq-dns-7859c7799c-cztvv\" (UID: \"2d07e6c0-377a-44f4-a1d8-f984474200f6\") " pod="openstack/dnsmasq-dns-7859c7799c-cztvv"
Mar 09 09:25:09 crc kubenswrapper[4861]: I0309 09:25:09.221608 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxfl7\" (UniqueName: \"kubernetes.io/projected/2d07e6c0-377a-44f4-a1d8-f984474200f6-kube-api-access-xxfl7\") pod \"dnsmasq-dns-7859c7799c-cztvv\" (UID: \"2d07e6c0-377a-44f4-a1d8-f984474200f6\") " pod="openstack/dnsmasq-dns-7859c7799c-cztvv"
Mar 09 09:25:09 crc kubenswrapper[4861]: I0309 09:25:09.235634 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-7fb7d5546d-n665d"]
Mar 09 09:25:09 crc kubenswrapper[4861]: E0309 09:25:09.265457 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45ff338f-f912-422e-bb87-1db962b83fc4" containerName="init"
Mar 09 09:25:09 crc kubenswrapper[4861]: I0309 09:25:09.265503 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="45ff338f-f912-422e-bb87-1db962b83fc4" containerName="init"
Mar 09 09:25:09 crc kubenswrapper[4861]: E0309 09:25:09.265527 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45ff338f-f912-422e-bb87-1db962b83fc4" containerName="dnsmasq-dns"
Mar 09 09:25:09 crc kubenswrapper[4861]: I0309 09:25:09.265537 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="45ff338f-f912-422e-bb87-1db962b83fc4" containerName="dnsmasq-dns"
Mar 09 09:25:09 crc kubenswrapper[4861]: I0309 09:25:09.266022 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="45ff338f-f912-422e-bb87-1db962b83fc4" containerName="dnsmasq-dns"
Mar 09 09:25:09 crc kubenswrapper[4861]: I0309 09:25:09.291511 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7fb7d5546d-n665d"
Mar 09 09:25:09 crc kubenswrapper[4861]: I0309 09:25:09.297142 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config"
Mar 09 09:25:09 crc kubenswrapper[4861]: I0309 09:25:09.297625 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs"
Mar 09 09:25:09 crc kubenswrapper[4861]: I0309 09:25:09.297818 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config"
Mar 09 09:25:09 crc kubenswrapper[4861]: I0309 09:25:09.297994 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-mhz5j"
Mar 09 09:25:09 crc kubenswrapper[4861]: I0309 09:25:09.309439 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7fb7d5546d-n665d"]
Mar 09 09:25:09 crc kubenswrapper[4861]: I0309 09:25:09.326240 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/45ff338f-f912-422e-bb87-1db962b83fc4-ovsdbserver-nb\") pod \"45ff338f-f912-422e-bb87-1db962b83fc4\" (UID: \"45ff338f-f912-422e-bb87-1db962b83fc4\") "
Mar 09 09:25:09 crc kubenswrapper[4861]: I0309 09:25:09.326358 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45ff338f-f912-422e-bb87-1db962b83fc4-config\") pod \"45ff338f-f912-422e-bb87-1db962b83fc4\" (UID: \"45ff338f-f912-422e-bb87-1db962b83fc4\") "
Mar 09 09:25:09 crc kubenswrapper[4861]: I0309 09:25:09.326403 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/45ff338f-f912-422e-bb87-1db962b83fc4-dns-swift-storage-0\") pod \"45ff338f-f912-422e-bb87-1db962b83fc4\" (UID: \"45ff338f-f912-422e-bb87-1db962b83fc4\") "
Mar 09 09:25:09 crc kubenswrapper[4861]: I0309 09:25:09.326448 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/45ff338f-f912-422e-bb87-1db962b83fc4-dns-svc\") pod \"45ff338f-f912-422e-bb87-1db962b83fc4\" (UID: \"45ff338f-f912-422e-bb87-1db962b83fc4\") "
Mar 09 09:25:09 crc kubenswrapper[4861]: I0309 09:25:09.326519 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k87xj\" (UniqueName: \"kubernetes.io/projected/45ff338f-f912-422e-bb87-1db962b83fc4-kube-api-access-k87xj\") pod \"45ff338f-f912-422e-bb87-1db962b83fc4\" (UID: \"45ff338f-f912-422e-bb87-1db962b83fc4\") "
Mar 09 09:25:09 crc kubenswrapper[4861]: I0309 09:25:09.326559 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/45ff338f-f912-422e-bb87-1db962b83fc4-ovsdbserver-sb\") pod \"45ff338f-f912-422e-bb87-1db962b83fc4\" (UID: \"45ff338f-f912-422e-bb87-1db962b83fc4\") "
Mar 09 09:25:09 crc kubenswrapper[4861]: I0309 09:25:09.326926 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2d07e6c0-377a-44f4-a1d8-f984474200f6-ovsdbserver-nb\") pod \"dnsmasq-dns-7859c7799c-cztvv\" (UID: \"2d07e6c0-377a-44f4-a1d8-f984474200f6\") " pod="openstack/dnsmasq-dns-7859c7799c-cztvv"
Mar 09 09:25:09 crc kubenswrapper[4861]: I0309 09:25:09.326980 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2d07e6c0-377a-44f4-a1d8-f984474200f6-dns-svc\") pod \"dnsmasq-dns-7859c7799c-cztvv\" (UID: \"2d07e6c0-377a-44f4-a1d8-f984474200f6\") " pod="openstack/dnsmasq-dns-7859c7799c-cztvv"
Mar 09 09:25:09 crc kubenswrapper[4861]: I0309 09:25:09.327012 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2d07e6c0-377a-44f4-a1d8-f984474200f6-ovsdbserver-sb\") pod \"dnsmasq-dns-7859c7799c-cztvv\" (UID: \"2d07e6c0-377a-44f4-a1d8-f984474200f6\") " pod="openstack/dnsmasq-dns-7859c7799c-cztvv"
Mar 09 09:25:09 crc kubenswrapper[4861]: I0309 09:25:09.327037 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d07e6c0-377a-44f4-a1d8-f984474200f6-config\") pod \"dnsmasq-dns-7859c7799c-cztvv\" (UID: \"2d07e6c0-377a-44f4-a1d8-f984474200f6\") " pod="openstack/dnsmasq-dns-7859c7799c-cztvv"
Mar 09 09:25:09 crc kubenswrapper[4861]: I0309 09:25:09.327132 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2d07e6c0-377a-44f4-a1d8-f984474200f6-dns-swift-storage-0\") pod \"dnsmasq-dns-7859c7799c-cztvv\" (UID: \"2d07e6c0-377a-44f4-a1d8-f984474200f6\") " pod="openstack/dnsmasq-dns-7859c7799c-cztvv"
Mar 09 09:25:09 crc kubenswrapper[4861]: I0309 09:25:09.327163 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxfl7\" (UniqueName: \"kubernetes.io/projected/2d07e6c0-377a-44f4-a1d8-f984474200f6-kube-api-access-xxfl7\") pod \"dnsmasq-dns-7859c7799c-cztvv\" (UID: \"2d07e6c0-377a-44f4-a1d8-f984474200f6\") " pod="openstack/dnsmasq-dns-7859c7799c-cztvv"
Mar 09 09:25:09 crc kubenswrapper[4861]: I0309 09:25:09.328526 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2d07e6c0-377a-44f4-a1d8-f984474200f6-ovsdbserver-nb\") pod \"dnsmasq-dns-7859c7799c-cztvv\" (UID: \"2d07e6c0-377a-44f4-a1d8-f984474200f6\") " pod="openstack/dnsmasq-dns-7859c7799c-cztvv"
Mar 09 09:25:09 crc kubenswrapper[4861]: I0309 09:25:09.329332 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2d07e6c0-377a-44f4-a1d8-f984474200f6-dns-svc\") pod \"dnsmasq-dns-7859c7799c-cztvv\" (UID: \"2d07e6c0-377a-44f4-a1d8-f984474200f6\") " pod="openstack/dnsmasq-dns-7859c7799c-cztvv"
Mar 09 09:25:09 crc kubenswrapper[4861]: I0309 09:25:09.329946 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2d07e6c0-377a-44f4-a1d8-f984474200f6-ovsdbserver-sb\") pod \"dnsmasq-dns-7859c7799c-cztvv\" (UID: \"2d07e6c0-377a-44f4-a1d8-f984474200f6\") " pod="openstack/dnsmasq-dns-7859c7799c-cztvv"
Mar 09 09:25:09 crc kubenswrapper[4861]: I0309 09:25:09.330725 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d07e6c0-377a-44f4-a1d8-f984474200f6-config\") pod \"dnsmasq-dns-7859c7799c-cztvv\" (UID: \"2d07e6c0-377a-44f4-a1d8-f984474200f6\") " pod="openstack/dnsmasq-dns-7859c7799c-cztvv"
Mar 09 09:25:09 crc kubenswrapper[4861]: I0309 09:25:09.332655 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2d07e6c0-377a-44f4-a1d8-f984474200f6-dns-swift-storage-0\") pod \"dnsmasq-dns-7859c7799c-cztvv\" (UID: \"2d07e6c0-377a-44f4-a1d8-f984474200f6\") " pod="openstack/dnsmasq-dns-7859c7799c-cztvv"
Mar 09 09:25:09 crc kubenswrapper[4861]: I0309 09:25:09.352839 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45ff338f-f912-422e-bb87-1db962b83fc4-kube-api-access-k87xj" (OuterVolumeSpecName: "kube-api-access-k87xj") pod "45ff338f-f912-422e-bb87-1db962b83fc4" (UID: "45ff338f-f912-422e-bb87-1db962b83fc4"). InnerVolumeSpecName "kube-api-access-k87xj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:25:09 crc kubenswrapper[4861]: I0309 09:25:09.354597 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxfl7\" (UniqueName: \"kubernetes.io/projected/2d07e6c0-377a-44f4-a1d8-f984474200f6-kube-api-access-xxfl7\") pod \"dnsmasq-dns-7859c7799c-cztvv\" (UID: \"2d07e6c0-377a-44f4-a1d8-f984474200f6\") " pod="openstack/dnsmasq-dns-7859c7799c-cztvv"
Mar 09 09:25:09 crc kubenswrapper[4861]: I0309 09:25:09.371459 4861 scope.go:117] "RemoveContainer" containerID="f365affb6c3a106ef2c4bea634d0a820dfa2a72f4e8bb311fdfd9cd6db69b6d5"
Mar 09 09:25:09 crc kubenswrapper[4861]: I0309 09:25:09.384062 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45ff338f-f912-422e-bb87-1db962b83fc4-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "45ff338f-f912-422e-bb87-1db962b83fc4" (UID: "45ff338f-f912-422e-bb87-1db962b83fc4"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:25:09 crc kubenswrapper[4861]: I0309 09:25:09.386167 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b684fb9f5-wg2v4"
Mar 09 09:25:09 crc kubenswrapper[4861]: I0309 09:25:09.386172 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b684fb9f5-wg2v4" event={"ID":"45ff338f-f912-422e-bb87-1db962b83fc4","Type":"ContainerDied","Data":"ff7a90ae1a764e1edeab3f7a6760f2f83ea9deef382ab60da6b9cf73efcb3cbb"}
Mar 09 09:25:09 crc kubenswrapper[4861]: I0309 09:25:09.393282 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45ff338f-f912-422e-bb87-1db962b83fc4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "45ff338f-f912-422e-bb87-1db962b83fc4" (UID: "45ff338f-f912-422e-bb87-1db962b83fc4"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:25:09 crc kubenswrapper[4861]: E0309 09:25:09.408893 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:44ed1ca84e17bd0f004cfbdc3c0827d767daba52abb8e83e076bfd0e6c02f838\\\"\"" pod="openstack/cinder-db-sync-sr27s" podUID="deb8e24b-1a6f-4173-9a5f-62974b0331a5"
Mar 09 09:25:09 crc kubenswrapper[4861]: I0309 09:25:09.430271 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/30e45abc-d44d-4e9a-8478-b562905ee7c2-ovndb-tls-certs\") pod \"neutron-7fb7d5546d-n665d\" (UID: \"30e45abc-d44d-4e9a-8478-b562905ee7c2\") " pod="openstack/neutron-7fb7d5546d-n665d"
Mar 09 09:25:09 crc kubenswrapper[4861]: I0309 09:25:09.430330 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/30e45abc-d44d-4e9a-8478-b562905ee7c2-httpd-config\") pod \"neutron-7fb7d5546d-n665d\" (UID: \"30e45abc-d44d-4e9a-8478-b562905ee7c2\") " pod="openstack/neutron-7fb7d5546d-n665d"
Mar 09 09:25:09 crc kubenswrapper[4861]: I0309 09:25:09.430362 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/30e45abc-d44d-4e9a-8478-b562905ee7c2-config\") pod \"neutron-7fb7d5546d-n665d\" (UID: \"30e45abc-d44d-4e9a-8478-b562905ee7c2\") " pod="openstack/neutron-7fb7d5546d-n665d"
Mar 09 09:25:09 crc kubenswrapper[4861]: I0309 09:25:09.430422 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30e45abc-d44d-4e9a-8478-b562905ee7c2-combined-ca-bundle\") pod \"neutron-7fb7d5546d-n665d\" (UID: \"30e45abc-d44d-4e9a-8478-b562905ee7c2\") " pod="openstack/neutron-7fb7d5546d-n665d"
Mar 09 09:25:09 crc kubenswrapper[4861]: I0309 09:25:09.430443 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbj8m\" (UniqueName: \"kubernetes.io/projected/30e45abc-d44d-4e9a-8478-b562905ee7c2-kube-api-access-cbj8m\") pod \"neutron-7fb7d5546d-n665d\" (UID: \"30e45abc-d44d-4e9a-8478-b562905ee7c2\") " pod="openstack/neutron-7fb7d5546d-n665d"
Mar 09 09:25:09 crc kubenswrapper[4861]: I0309 09:25:09.430539 4861 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/45ff338f-f912-422e-bb87-1db962b83fc4-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Mar 09 09:25:09 crc kubenswrapper[4861]: I0309 09:25:09.430551 4861 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/45ff338f-f912-422e-bb87-1db962b83fc4-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 09 09:25:09 crc kubenswrapper[4861]: I0309 09:25:09.430562 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k87xj\" (UniqueName: \"kubernetes.io/projected/45ff338f-f912-422e-bb87-1db962b83fc4-kube-api-access-k87xj\") on node \"crc\" DevicePath \"\""
Mar 09 09:25:09 crc kubenswrapper[4861]: I0309 09:25:09.455840 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45ff338f-f912-422e-bb87-1db962b83fc4-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "45ff338f-f912-422e-bb87-1db962b83fc4" (UID: "45ff338f-f912-422e-bb87-1db962b83fc4"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:25:09 crc kubenswrapper[4861]: I0309 09:25:09.463388 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45ff338f-f912-422e-bb87-1db962b83fc4-config" (OuterVolumeSpecName: "config") pod "45ff338f-f912-422e-bb87-1db962b83fc4" (UID: "45ff338f-f912-422e-bb87-1db962b83fc4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:25:09 crc kubenswrapper[4861]: I0309 09:25:09.483605 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45ff338f-f912-422e-bb87-1db962b83fc4-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "45ff338f-f912-422e-bb87-1db962b83fc4" (UID: "45ff338f-f912-422e-bb87-1db962b83fc4"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:25:09 crc kubenswrapper[4861]: I0309 09:25:09.483665 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7859c7799c-cztvv" Mar 09 09:25:09 crc kubenswrapper[4861]: I0309 09:25:09.532533 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/30e45abc-d44d-4e9a-8478-b562905ee7c2-ovndb-tls-certs\") pod \"neutron-7fb7d5546d-n665d\" (UID: \"30e45abc-d44d-4e9a-8478-b562905ee7c2\") " pod="openstack/neutron-7fb7d5546d-n665d" Mar 09 09:25:09 crc kubenswrapper[4861]: I0309 09:25:09.532888 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/30e45abc-d44d-4e9a-8478-b562905ee7c2-httpd-config\") pod \"neutron-7fb7d5546d-n665d\" (UID: \"30e45abc-d44d-4e9a-8478-b562905ee7c2\") " pod="openstack/neutron-7fb7d5546d-n665d" Mar 09 09:25:09 crc kubenswrapper[4861]: I0309 09:25:09.532934 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/30e45abc-d44d-4e9a-8478-b562905ee7c2-config\") pod \"neutron-7fb7d5546d-n665d\" (UID: \"30e45abc-d44d-4e9a-8478-b562905ee7c2\") " pod="openstack/neutron-7fb7d5546d-n665d" Mar 09 09:25:09 crc kubenswrapper[4861]: I0309 09:25:09.532992 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30e45abc-d44d-4e9a-8478-b562905ee7c2-combined-ca-bundle\") pod \"neutron-7fb7d5546d-n665d\" (UID: \"30e45abc-d44d-4e9a-8478-b562905ee7c2\") " pod="openstack/neutron-7fb7d5546d-n665d" Mar 09 09:25:09 crc kubenswrapper[4861]: I0309 09:25:09.533020 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cbj8m\" (UniqueName: \"kubernetes.io/projected/30e45abc-d44d-4e9a-8478-b562905ee7c2-kube-api-access-cbj8m\") pod \"neutron-7fb7d5546d-n665d\" (UID: \"30e45abc-d44d-4e9a-8478-b562905ee7c2\") " pod="openstack/neutron-7fb7d5546d-n665d" Mar 09 09:25:09 crc 
kubenswrapper[4861]: I0309 09:25:09.533566 4861 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/45ff338f-f912-422e-bb87-1db962b83fc4-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 09 09:25:09 crc kubenswrapper[4861]: I0309 09:25:09.533584 4861 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/45ff338f-f912-422e-bb87-1db962b83fc4-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 09 09:25:09 crc kubenswrapper[4861]: I0309 09:25:09.533597 4861 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45ff338f-f912-422e-bb87-1db962b83fc4-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:25:09 crc kubenswrapper[4861]: I0309 09:25:09.536907 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30e45abc-d44d-4e9a-8478-b562905ee7c2-combined-ca-bundle\") pod \"neutron-7fb7d5546d-n665d\" (UID: \"30e45abc-d44d-4e9a-8478-b562905ee7c2\") " pod="openstack/neutron-7fb7d5546d-n665d" Mar 09 09:25:09 crc kubenswrapper[4861]: I0309 09:25:09.537946 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/30e45abc-d44d-4e9a-8478-b562905ee7c2-config\") pod \"neutron-7fb7d5546d-n665d\" (UID: \"30e45abc-d44d-4e9a-8478-b562905ee7c2\") " pod="openstack/neutron-7fb7d5546d-n665d" Mar 09 09:25:09 crc kubenswrapper[4861]: I0309 09:25:09.539309 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/30e45abc-d44d-4e9a-8478-b562905ee7c2-httpd-config\") pod \"neutron-7fb7d5546d-n665d\" (UID: \"30e45abc-d44d-4e9a-8478-b562905ee7c2\") " pod="openstack/neutron-7fb7d5546d-n665d" Mar 09 09:25:09 crc kubenswrapper[4861]: I0309 09:25:09.544529 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/30e45abc-d44d-4e9a-8478-b562905ee7c2-ovndb-tls-certs\") pod \"neutron-7fb7d5546d-n665d\" (UID: \"30e45abc-d44d-4e9a-8478-b562905ee7c2\") " pod="openstack/neutron-7fb7d5546d-n665d" Mar 09 09:25:09 crc kubenswrapper[4861]: I0309 09:25:09.550817 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbj8m\" (UniqueName: \"kubernetes.io/projected/30e45abc-d44d-4e9a-8478-b562905ee7c2-kube-api-access-cbj8m\") pod \"neutron-7fb7d5546d-n665d\" (UID: \"30e45abc-d44d-4e9a-8478-b562905ee7c2\") " pod="openstack/neutron-7fb7d5546d-n665d" Mar 09 09:25:09 crc kubenswrapper[4861]: I0309 09:25:09.551259 4861 scope.go:117] "RemoveContainer" containerID="e52331302f6d484869f3c328b0ea4ef009c57e85ef9a0f6696c7579c7b951bd3" Mar 09 09:25:09 crc kubenswrapper[4861]: I0309 09:25:09.602525 4861 scope.go:117] "RemoveContainer" containerID="26543cef1088372389596a46fd4618f07a932da0e0015f8554a64e7090e44470" Mar 09 09:25:09 crc kubenswrapper[4861]: I0309 09:25:09.627604 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7fb7d5546d-n665d" Mar 09 09:25:09 crc kubenswrapper[4861]: I0309 09:25:09.682667 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="236224b4-5b63-4290-8f78-6367a7567dd5" path="/var/lib/kubelet/pods/236224b4-5b63-4290-8f78-6367a7567dd5/volumes" Mar 09 09:25:09 crc kubenswrapper[4861]: I0309 09:25:09.683192 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ff07d5b-934f-40e1-a938-4c9c6d6bd846" path="/var/lib/kubelet/pods/2ff07d5b-934f-40e1-a938-4c9c6d6bd846/volumes" Mar 09 09:25:09 crc kubenswrapper[4861]: I0309 09:25:09.761481 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7bb4db8c4-sxjc7"] Mar 09 09:25:09 crc kubenswrapper[4861]: I0309 09:25:09.787606 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-bcd5c"] Mar 09 09:25:09 crc kubenswrapper[4861]: I0309 09:25:09.799283 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b684fb9f5-wg2v4"] Mar 09 09:25:09 crc kubenswrapper[4861]: I0309 09:25:09.808929 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7b684fb9f5-wg2v4"] Mar 09 09:25:09 crc kubenswrapper[4861]: I0309 09:25:09.820648 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 09 09:25:09 crc kubenswrapper[4861]: I0309 09:25:09.893550 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 09 09:25:09 crc kubenswrapper[4861]: W0309 09:25:09.914930 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podace45eda_d816_4b28_9be4_88ba845234d5.slice/crio-7ebabdce2450c29dc4efb5ee8ae6dff72e56adf264714a1a9088428f0d2ca244 WatchSource:0}: Error finding container 7ebabdce2450c29dc4efb5ee8ae6dff72e56adf264714a1a9088428f0d2ca244: Status 404 returned error can't find the 
container with id 7ebabdce2450c29dc4efb5ee8ae6dff72e56adf264714a1a9088428f0d2ca244 Mar 09 09:25:09 crc kubenswrapper[4861]: I0309 09:25:09.931977 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5487f4d458-lnthc"] Mar 09 09:25:10 crc kubenswrapper[4861]: I0309 09:25:10.174505 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7859c7799c-cztvv"] Mar 09 09:25:10 crc kubenswrapper[4861]: I0309 09:25:10.383456 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7fb7d5546d-n665d"] Mar 09 09:25:10 crc kubenswrapper[4861]: W0309 09:25:10.389655 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod30e45abc_d44d_4e9a_8478_b562905ee7c2.slice/crio-be97584258dd26010d40b9cabf3562f147ae461b745ab9360373a4d97795f376 WatchSource:0}: Error finding container be97584258dd26010d40b9cabf3562f147ae461b745ab9360373a4d97795f376: Status 404 returned error can't find the container with id be97584258dd26010d40b9cabf3562f147ae461b745ab9360373a4d97795f376 Mar 09 09:25:10 crc kubenswrapper[4861]: I0309 09:25:10.419562 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ace45eda-d816-4b28-9be4-88ba845234d5","Type":"ContainerStarted","Data":"7ebabdce2450c29dc4efb5ee8ae6dff72e56adf264714a1a9088428f0d2ca244"} Mar 09 09:25:10 crc kubenswrapper[4861]: I0309 09:25:10.425778 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5487f4d458-lnthc" event={"ID":"9049886d-2460-47fe-ac82-2dfde4858bd0","Type":"ContainerStarted","Data":"d632ee2e9b84d3a1d37a928cc972ef4d83cb04f89d65b4cae4496bf7f018907a"} Mar 09 09:25:10 crc kubenswrapper[4861]: I0309 09:25:10.425826 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5487f4d458-lnthc" 
event={"ID":"9049886d-2460-47fe-ac82-2dfde4858bd0","Type":"ContainerStarted","Data":"45bb98871a41b0cf0615ca1161de6673b67afc9e5a10c158657fb62dd8385c30"} Mar 09 09:25:10 crc kubenswrapper[4861]: I0309 09:25:10.429915 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3b893f89-a9bc-4a39-bd26-b394cbb0a374","Type":"ContainerStarted","Data":"7c1078a757c55399b615079346367a3c7063e060d616a6c7295df26e889b255e"} Mar 09 09:25:10 crc kubenswrapper[4861]: I0309 09:25:10.432306 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-k46zw" event={"ID":"c9820d89-3a89-4982-8520-f23dd0d099ad","Type":"ContainerStarted","Data":"5c1402523e38b6a059b942534e04454c0efa21ed13ded8af6b33ce288f6f6dbc"} Mar 09 09:25:10 crc kubenswrapper[4861]: I0309 09:25:10.448930 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-679c9c695-9vt85" event={"ID":"906444a4-92dd-48ac-931d-0799f1256e9b","Type":"ContainerStarted","Data":"63fca8073164eb35d747996d9451264d49b4dbd0ca09db4d4dbcbecc2c70924d"} Mar 09 09:25:10 crc kubenswrapper[4861]: I0309 09:25:10.448983 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-679c9c695-9vt85" event={"ID":"906444a4-92dd-48ac-931d-0799f1256e9b","Type":"ContainerStarted","Data":"7411e0031187d25c4f45f41d34ceadea9cb54445acc1c97f27bac54fcb423e4d"} Mar 09 09:25:10 crc kubenswrapper[4861]: I0309 09:25:10.449155 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-679c9c695-9vt85" podUID="906444a4-92dd-48ac-931d-0799f1256e9b" containerName="horizon-log" containerID="cri-o://7411e0031187d25c4f45f41d34ceadea9cb54445acc1c97f27bac54fcb423e4d" gracePeriod=30 Mar 09 09:25:10 crc kubenswrapper[4861]: I0309 09:25:10.449485 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-679c9c695-9vt85" podUID="906444a4-92dd-48ac-931d-0799f1256e9b" containerName="horizon" 
containerID="cri-o://63fca8073164eb35d747996d9451264d49b4dbd0ca09db4d4dbcbecc2c70924d" gracePeriod=30 Mar 09 09:25:10 crc kubenswrapper[4861]: I0309 09:25:10.450962 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-k46zw" podStartSLOduration=3.169126152 podStartE2EDuration="31.450947621s" podCreationTimestamp="2026-03-09 09:24:39 +0000 UTC" firstStartedPulling="2026-03-09 09:24:41.324817064 +0000 UTC m=+1124.409856465" lastFinishedPulling="2026-03-09 09:25:09.606638533 +0000 UTC m=+1152.691677934" observedRunningTime="2026-03-09 09:25:10.449658733 +0000 UTC m=+1153.534698154" watchObservedRunningTime="2026-03-09 09:25:10.450947621 +0000 UTC m=+1153.535987032" Mar 09 09:25:10 crc kubenswrapper[4861]: I0309 09:25:10.455449 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7fb7d5546d-n665d" event={"ID":"30e45abc-d44d-4e9a-8478-b562905ee7c2","Type":"ContainerStarted","Data":"be97584258dd26010d40b9cabf3562f147ae461b745ab9360373a4d97795f376"} Mar 09 09:25:10 crc kubenswrapper[4861]: I0309 09:25:10.458094 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7bb4db8c4-sxjc7" event={"ID":"71492031-e589-409e-b8c8-c0a1194b97ed","Type":"ContainerStarted","Data":"f789d6c7cb2c90f537f7edf8f51e1f3aa6d4ed13c1a60ecf2646f0e1dd6894ae"} Mar 09 09:25:10 crc kubenswrapper[4861]: I0309 09:25:10.458123 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7bb4db8c4-sxjc7" event={"ID":"71492031-e589-409e-b8c8-c0a1194b97ed","Type":"ContainerStarted","Data":"fdb5e36ba3482853e01391e95b3d25523a66bad351b0b4394ede23ef70f0a36c"} Mar 09 09:25:10 crc kubenswrapper[4861]: I0309 09:25:10.458969 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7859c7799c-cztvv" event={"ID":"2d07e6c0-377a-44f4-a1d8-f984474200f6","Type":"ContainerStarted","Data":"39f09b76e20369d596b9a0075258f17a2d7eecbf7179ae95d31a8c4a7717605e"} Mar 09 09:25:10 crc 
kubenswrapper[4861]: I0309 09:25:10.462327 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-bcd5c" event={"ID":"bbd4cfda-d65e-4915-be68-f207820fe15b","Type":"ContainerStarted","Data":"c6030e0696081dc572663a630ff4a6ab639c92c512e4f4ee1d80fd3780338ec9"} Mar 09 09:25:10 crc kubenswrapper[4861]: I0309 09:25:10.462394 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-bcd5c" event={"ID":"bbd4cfda-d65e-4915-be68-f207820fe15b","Type":"ContainerStarted","Data":"aa465b4db79ae0fd4dc729905d48e8b93afb616aa69f4df6f25e4d3febb78146"} Mar 09 09:25:10 crc kubenswrapper[4861]: I0309 09:25:10.476752 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-679c9c695-9vt85" podStartSLOduration=4.728198053 podStartE2EDuration="29.476734625s" podCreationTimestamp="2026-03-09 09:24:41 +0000 UTC" firstStartedPulling="2026-03-09 09:24:42.930039625 +0000 UTC m=+1126.015079026" lastFinishedPulling="2026-03-09 09:25:07.678576196 +0000 UTC m=+1150.763615598" observedRunningTime="2026-03-09 09:25:10.47418708 +0000 UTC m=+1153.559226481" watchObservedRunningTime="2026-03-09 09:25:10.476734625 +0000 UTC m=+1153.561774026" Mar 09 09:25:10 crc kubenswrapper[4861]: I0309 09:25:10.544383 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-bcd5c" podStartSLOduration=15.544348108 podStartE2EDuration="15.544348108s" podCreationTimestamp="2026-03-09 09:24:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:25:10.531800982 +0000 UTC m=+1153.616840393" watchObservedRunningTime="2026-03-09 09:25:10.544348108 +0000 UTC m=+1153.629387499" Mar 09 09:25:10 crc kubenswrapper[4861]: I0309 09:25:10.747902 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 09 09:25:11 crc kubenswrapper[4861]: 
I0309 09:25:11.485348 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7fb7d5546d-n665d" event={"ID":"30e45abc-d44d-4e9a-8478-b562905ee7c2","Type":"ContainerStarted","Data":"3fd703ecac672d85228b81f88612761d619524529973bcd35d35b8dbfa04b92c"} Mar 09 09:25:11 crc kubenswrapper[4861]: I0309 09:25:11.485933 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7fb7d5546d-n665d" event={"ID":"30e45abc-d44d-4e9a-8478-b562905ee7c2","Type":"ContainerStarted","Data":"b5be7ddaf2cc488012e82daab81d04dc4edccf05a70400a457f9705e7d7229b6"} Mar 09 09:25:11 crc kubenswrapper[4861]: I0309 09:25:11.486605 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-7fb7d5546d-n665d" Mar 09 09:25:11 crc kubenswrapper[4861]: I0309 09:25:11.490837 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7bb4db8c4-sxjc7" event={"ID":"71492031-e589-409e-b8c8-c0a1194b97ed","Type":"ContainerStarted","Data":"c579881c05bf1dff8d11a8b2bc2f24f89a926610877fc04c23a9f98aa2682890"} Mar 09 09:25:11 crc kubenswrapper[4861]: I0309 09:25:11.492905 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ace45eda-d816-4b28-9be4-88ba845234d5","Type":"ContainerStarted","Data":"7f31414b1c79420e6a1f936e6325a3fd7e3796d6f4c036328d8a8a3e0db5bc87"} Mar 09 09:25:11 crc kubenswrapper[4861]: I0309 09:25:11.494635 4861 generic.go:334] "Generic (PLEG): container finished" podID="2d07e6c0-377a-44f4-a1d8-f984474200f6" containerID="fb25124f5deabeeac87d8a8c675a366d1d834083aef6ac71a805107146b18406" exitCode=0 Mar 09 09:25:11 crc kubenswrapper[4861]: I0309 09:25:11.494679 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7859c7799c-cztvv" event={"ID":"2d07e6c0-377a-44f4-a1d8-f984474200f6","Type":"ContainerDied","Data":"fb25124f5deabeeac87d8a8c675a366d1d834083aef6ac71a805107146b18406"} Mar 09 09:25:11 crc kubenswrapper[4861]: I0309 
09:25:11.522489 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-7fb7d5546d-n665d" podStartSLOduration=2.522469552 podStartE2EDuration="2.522469552s" podCreationTimestamp="2026-03-09 09:25:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:25:11.510865054 +0000 UTC m=+1154.595904455" watchObservedRunningTime="2026-03-09 09:25:11.522469552 +0000 UTC m=+1154.607508953" Mar 09 09:25:11 crc kubenswrapper[4861]: I0309 09:25:11.529605 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5487f4d458-lnthc" event={"ID":"9049886d-2460-47fe-ac82-2dfde4858bd0","Type":"ContainerStarted","Data":"4972a210b4a9e9ebb4003f6daf11cd3fe61b26f4ef01a38a9149c5a6fae5c266"} Mar 09 09:25:11 crc kubenswrapper[4861]: I0309 09:25:11.550592 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9e5dae99-f3a2-4a51-afff-03137206590b","Type":"ContainerStarted","Data":"2deec7142447462bd112a4138018b5f2d9c271dae8d6f5b8dff5b6643785c500"} Mar 09 09:25:11 crc kubenswrapper[4861]: I0309 09:25:11.564955 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-7bb4db8c4-sxjc7" podStartSLOduration=23.564941652999998 podStartE2EDuration="23.564941653s" podCreationTimestamp="2026-03-09 09:24:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:25:11.562855951 +0000 UTC m=+1154.647895362" watchObservedRunningTime="2026-03-09 09:25:11.564941653 +0000 UTC m=+1154.649981054" Mar 09 09:25:11 crc kubenswrapper[4861]: I0309 09:25:11.593689 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-5487f4d458-lnthc" podStartSLOduration=23.593668831 podStartE2EDuration="23.593668831s" podCreationTimestamp="2026-03-09 
09:24:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:25:11.587502761 +0000 UTC m=+1154.672542152" watchObservedRunningTime="2026-03-09 09:25:11.593668831 +0000 UTC m=+1154.678708232" Mar 09 09:25:11 crc kubenswrapper[4861]: I0309 09:25:11.672154 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45ff338f-f912-422e-bb87-1db962b83fc4" path="/var/lib/kubelet/pods/45ff338f-f912-422e-bb87-1db962b83fc4/volumes" Mar 09 09:25:11 crc kubenswrapper[4861]: I0309 09:25:11.767950 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-8dfc7bfd5-qgpjh"] Mar 09 09:25:11 crc kubenswrapper[4861]: I0309 09:25:11.769675 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-8dfc7bfd5-qgpjh" Mar 09 09:25:11 crc kubenswrapper[4861]: I0309 09:25:11.775613 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Mar 09 09:25:11 crc kubenswrapper[4861]: I0309 09:25:11.780945 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-8dfc7bfd5-qgpjh"] Mar 09 09:25:11 crc kubenswrapper[4861]: I0309 09:25:11.800353 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Mar 09 09:25:11 crc kubenswrapper[4861]: I0309 09:25:11.890108 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bbbb2b98-1a7f-45f2-999d-8c4d1d879c01-public-tls-certs\") pod \"neutron-8dfc7bfd5-qgpjh\" (UID: \"bbbb2b98-1a7f-45f2-999d-8c4d1d879c01\") " pod="openstack/neutron-8dfc7bfd5-qgpjh" Mar 09 09:25:11 crc kubenswrapper[4861]: I0309 09:25:11.890486 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/bbbb2b98-1a7f-45f2-999d-8c4d1d879c01-combined-ca-bundle\") pod \"neutron-8dfc7bfd5-qgpjh\" (UID: \"bbbb2b98-1a7f-45f2-999d-8c4d1d879c01\") " pod="openstack/neutron-8dfc7bfd5-qgpjh" Mar 09 09:25:11 crc kubenswrapper[4861]: I0309 09:25:11.890510 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/bbbb2b98-1a7f-45f2-999d-8c4d1d879c01-httpd-config\") pod \"neutron-8dfc7bfd5-qgpjh\" (UID: \"bbbb2b98-1a7f-45f2-999d-8c4d1d879c01\") " pod="openstack/neutron-8dfc7bfd5-qgpjh" Mar 09 09:25:11 crc kubenswrapper[4861]: I0309 09:25:11.890621 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/bbbb2b98-1a7f-45f2-999d-8c4d1d879c01-config\") pod \"neutron-8dfc7bfd5-qgpjh\" (UID: \"bbbb2b98-1a7f-45f2-999d-8c4d1d879c01\") " pod="openstack/neutron-8dfc7bfd5-qgpjh" Mar 09 09:25:11 crc kubenswrapper[4861]: I0309 09:25:11.890670 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bbbb2b98-1a7f-45f2-999d-8c4d1d879c01-ovndb-tls-certs\") pod \"neutron-8dfc7bfd5-qgpjh\" (UID: \"bbbb2b98-1a7f-45f2-999d-8c4d1d879c01\") " pod="openstack/neutron-8dfc7bfd5-qgpjh" Mar 09 09:25:11 crc kubenswrapper[4861]: I0309 09:25:11.890740 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bbbb2b98-1a7f-45f2-999d-8c4d1d879c01-internal-tls-certs\") pod \"neutron-8dfc7bfd5-qgpjh\" (UID: \"bbbb2b98-1a7f-45f2-999d-8c4d1d879c01\") " pod="openstack/neutron-8dfc7bfd5-qgpjh" Mar 09 09:25:11 crc kubenswrapper[4861]: I0309 09:25:11.890765 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6pl2\" (UniqueName: 
\"kubernetes.io/projected/bbbb2b98-1a7f-45f2-999d-8c4d1d879c01-kube-api-access-c6pl2\") pod \"neutron-8dfc7bfd5-qgpjh\" (UID: \"bbbb2b98-1a7f-45f2-999d-8c4d1d879c01\") " pod="openstack/neutron-8dfc7bfd5-qgpjh" Mar 09 09:25:11 crc kubenswrapper[4861]: I0309 09:25:11.992856 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bbbb2b98-1a7f-45f2-999d-8c4d1d879c01-ovndb-tls-certs\") pod \"neutron-8dfc7bfd5-qgpjh\" (UID: \"bbbb2b98-1a7f-45f2-999d-8c4d1d879c01\") " pod="openstack/neutron-8dfc7bfd5-qgpjh" Mar 09 09:25:11 crc kubenswrapper[4861]: I0309 09:25:11.992934 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bbbb2b98-1a7f-45f2-999d-8c4d1d879c01-internal-tls-certs\") pod \"neutron-8dfc7bfd5-qgpjh\" (UID: \"bbbb2b98-1a7f-45f2-999d-8c4d1d879c01\") " pod="openstack/neutron-8dfc7bfd5-qgpjh" Mar 09 09:25:11 crc kubenswrapper[4861]: I0309 09:25:11.992965 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6pl2\" (UniqueName: \"kubernetes.io/projected/bbbb2b98-1a7f-45f2-999d-8c4d1d879c01-kube-api-access-c6pl2\") pod \"neutron-8dfc7bfd5-qgpjh\" (UID: \"bbbb2b98-1a7f-45f2-999d-8c4d1d879c01\") " pod="openstack/neutron-8dfc7bfd5-qgpjh" Mar 09 09:25:11 crc kubenswrapper[4861]: I0309 09:25:11.993023 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bbbb2b98-1a7f-45f2-999d-8c4d1d879c01-public-tls-certs\") pod \"neutron-8dfc7bfd5-qgpjh\" (UID: \"bbbb2b98-1a7f-45f2-999d-8c4d1d879c01\") " pod="openstack/neutron-8dfc7bfd5-qgpjh" Mar 09 09:25:11 crc kubenswrapper[4861]: I0309 09:25:11.993085 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/bbbb2b98-1a7f-45f2-999d-8c4d1d879c01-combined-ca-bundle\") pod \"neutron-8dfc7bfd5-qgpjh\" (UID: \"bbbb2b98-1a7f-45f2-999d-8c4d1d879c01\") " pod="openstack/neutron-8dfc7bfd5-qgpjh" Mar 09 09:25:11 crc kubenswrapper[4861]: I0309 09:25:11.993108 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/bbbb2b98-1a7f-45f2-999d-8c4d1d879c01-httpd-config\") pod \"neutron-8dfc7bfd5-qgpjh\" (UID: \"bbbb2b98-1a7f-45f2-999d-8c4d1d879c01\") " pod="openstack/neutron-8dfc7bfd5-qgpjh" Mar 09 09:25:11 crc kubenswrapper[4861]: I0309 09:25:11.993190 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/bbbb2b98-1a7f-45f2-999d-8c4d1d879c01-config\") pod \"neutron-8dfc7bfd5-qgpjh\" (UID: \"bbbb2b98-1a7f-45f2-999d-8c4d1d879c01\") " pod="openstack/neutron-8dfc7bfd5-qgpjh" Mar 09 09:25:11 crc kubenswrapper[4861]: I0309 09:25:11.998914 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/bbbb2b98-1a7f-45f2-999d-8c4d1d879c01-config\") pod \"neutron-8dfc7bfd5-qgpjh\" (UID: \"bbbb2b98-1a7f-45f2-999d-8c4d1d879c01\") " pod="openstack/neutron-8dfc7bfd5-qgpjh" Mar 09 09:25:12 crc kubenswrapper[4861]: I0309 09:25:11.999777 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bbbb2b98-1a7f-45f2-999d-8c4d1d879c01-ovndb-tls-certs\") pod \"neutron-8dfc7bfd5-qgpjh\" (UID: \"bbbb2b98-1a7f-45f2-999d-8c4d1d879c01\") " pod="openstack/neutron-8dfc7bfd5-qgpjh" Mar 09 09:25:12 crc kubenswrapper[4861]: I0309 09:25:12.002894 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bbbb2b98-1a7f-45f2-999d-8c4d1d879c01-public-tls-certs\") pod \"neutron-8dfc7bfd5-qgpjh\" (UID: \"bbbb2b98-1a7f-45f2-999d-8c4d1d879c01\") " 
pod="openstack/neutron-8dfc7bfd5-qgpjh" Mar 09 09:25:12 crc kubenswrapper[4861]: I0309 09:25:12.005056 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbbb2b98-1a7f-45f2-999d-8c4d1d879c01-combined-ca-bundle\") pod \"neutron-8dfc7bfd5-qgpjh\" (UID: \"bbbb2b98-1a7f-45f2-999d-8c4d1d879c01\") " pod="openstack/neutron-8dfc7bfd5-qgpjh" Mar 09 09:25:12 crc kubenswrapper[4861]: I0309 09:25:12.005655 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/bbbb2b98-1a7f-45f2-999d-8c4d1d879c01-httpd-config\") pod \"neutron-8dfc7bfd5-qgpjh\" (UID: \"bbbb2b98-1a7f-45f2-999d-8c4d1d879c01\") " pod="openstack/neutron-8dfc7bfd5-qgpjh" Mar 09 09:25:12 crc kubenswrapper[4861]: I0309 09:25:12.009920 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bbbb2b98-1a7f-45f2-999d-8c4d1d879c01-internal-tls-certs\") pod \"neutron-8dfc7bfd5-qgpjh\" (UID: \"bbbb2b98-1a7f-45f2-999d-8c4d1d879c01\") " pod="openstack/neutron-8dfc7bfd5-qgpjh" Mar 09 09:25:12 crc kubenswrapper[4861]: I0309 09:25:12.019981 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6pl2\" (UniqueName: \"kubernetes.io/projected/bbbb2b98-1a7f-45f2-999d-8c4d1d879c01-kube-api-access-c6pl2\") pod \"neutron-8dfc7bfd5-qgpjh\" (UID: \"bbbb2b98-1a7f-45f2-999d-8c4d1d879c01\") " pod="openstack/neutron-8dfc7bfd5-qgpjh" Mar 09 09:25:12 crc kubenswrapper[4861]: I0309 09:25:12.086116 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-8dfc7bfd5-qgpjh" Mar 09 09:25:12 crc kubenswrapper[4861]: I0309 09:25:12.086177 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-679c9c695-9vt85" Mar 09 09:25:12 crc kubenswrapper[4861]: I0309 09:25:12.567537 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ace45eda-d816-4b28-9be4-88ba845234d5","Type":"ContainerStarted","Data":"2c32fdf4263718449b53d53ce307c950e72cf7461ab5c95d7dad1d0a312e2f13"} Mar 09 09:25:12 crc kubenswrapper[4861]: I0309 09:25:12.578322 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9e5dae99-f3a2-4a51-afff-03137206590b","Type":"ContainerStarted","Data":"a17bbbb0f7b68ee288f2295b49a11ef2e38b14a8f66bfadd653ef247d1307af2"} Mar 09 09:25:12 crc kubenswrapper[4861]: I0309 09:25:12.610701 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=15.610657901 podStartE2EDuration="15.610657901s" podCreationTimestamp="2026-03-09 09:24:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:25:12.59626096 +0000 UTC m=+1155.681300371" watchObservedRunningTime="2026-03-09 09:25:12.610657901 +0000 UTC m=+1155.695697312" Mar 09 09:25:13 crc kubenswrapper[4861]: I0309 09:25:13.202971 4861 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-7b684fb9f5-wg2v4" podUID="45ff338f-f912-422e-bb87-1db962b83fc4" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.130:5353: i/o timeout" Mar 09 09:25:13 crc kubenswrapper[4861]: I0309 09:25:13.291960 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-8dfc7bfd5-qgpjh"] Mar 09 09:25:13 crc kubenswrapper[4861]: I0309 09:25:13.588335 4861 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9e5dae99-f3a2-4a51-afff-03137206590b","Type":"ContainerStarted","Data":"7c80a17f9d55ef9b8146e4fefece494e587566ef43b98f62834271c346cdc933"} Mar 09 09:25:13 crc kubenswrapper[4861]: I0309 09:25:13.591652 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3b893f89-a9bc-4a39-bd26-b394cbb0a374","Type":"ContainerStarted","Data":"e15fa3a3dc10d161f1fc033b6f91e18abcb8ed972cc20b55f2cdece20ece59af"} Mar 09 09:25:13 crc kubenswrapper[4861]: I0309 09:25:13.593238 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-8dfc7bfd5-qgpjh" event={"ID":"bbbb2b98-1a7f-45f2-999d-8c4d1d879c01","Type":"ContainerStarted","Data":"178eefe1348bf47a8a4ff069f73aee67d2a9d41c9f6f1b2a8fc901ce1b9cb77c"} Mar 09 09:25:13 crc kubenswrapper[4861]: I0309 09:25:13.595792 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7859c7799c-cztvv" event={"ID":"2d07e6c0-377a-44f4-a1d8-f984474200f6","Type":"ContainerStarted","Data":"5c0fdd5e3bf5d45ccd03d9486f5f6cd479ed4ab1c3641e8d2275e0f89a4d8465"} Mar 09 09:25:13 crc kubenswrapper[4861]: I0309 09:25:13.615060 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7859c7799c-cztvv" podStartSLOduration=4.615043182 podStartE2EDuration="4.615043182s" podCreationTimestamp="2026-03-09 09:25:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:25:13.613516497 +0000 UTC m=+1156.698555898" watchObservedRunningTime="2026-03-09 09:25:13.615043182 +0000 UTC m=+1156.700082583" Mar 09 09:25:14 crc kubenswrapper[4861]: I0309 09:25:14.484091 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7859c7799c-cztvv" Mar 09 09:25:14 crc kubenswrapper[4861]: I0309 09:25:14.605821 4861 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/neutron-8dfc7bfd5-qgpjh" event={"ID":"bbbb2b98-1a7f-45f2-999d-8c4d1d879c01","Type":"ContainerStarted","Data":"2acb7c830dfedd1fba55a973d93130f09024212c84ce99b6a2152b544d5cbd8a"} Mar 09 09:25:14 crc kubenswrapper[4861]: I0309 09:25:14.605875 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-8dfc7bfd5-qgpjh" event={"ID":"bbbb2b98-1a7f-45f2-999d-8c4d1d879c01","Type":"ContainerStarted","Data":"62d9f61865c0aba71cc143cf38566dfd023bf5ccc20494aec11ffb3b58f48655"} Mar 09 09:25:14 crc kubenswrapper[4861]: I0309 09:25:14.606143 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-8dfc7bfd5-qgpjh" Mar 09 09:25:14 crc kubenswrapper[4861]: I0309 09:25:14.606209 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="9e5dae99-f3a2-4a51-afff-03137206590b" containerName="glance-log" containerID="cri-o://a17bbbb0f7b68ee288f2295b49a11ef2e38b14a8f66bfadd653ef247d1307af2" gracePeriod=30 Mar 09 09:25:14 crc kubenswrapper[4861]: I0309 09:25:14.606248 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="9e5dae99-f3a2-4a51-afff-03137206590b" containerName="glance-httpd" containerID="cri-o://7c80a17f9d55ef9b8146e4fefece494e587566ef43b98f62834271c346cdc933" gracePeriod=30 Mar 09 09:25:14 crc kubenswrapper[4861]: I0309 09:25:14.659158 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-8dfc7bfd5-qgpjh" podStartSLOduration=3.659136413 podStartE2EDuration="3.659136413s" podCreationTimestamp="2026-03-09 09:25:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:25:14.634501263 +0000 UTC m=+1157.719540694" watchObservedRunningTime="2026-03-09 09:25:14.659136413 +0000 UTC m=+1157.744175834" Mar 09 09:25:14 
crc kubenswrapper[4861]: I0309 09:25:14.667563 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=29.667542388 podStartE2EDuration="29.667542388s" podCreationTimestamp="2026-03-09 09:24:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:25:14.662901592 +0000 UTC m=+1157.747941013" watchObservedRunningTime="2026-03-09 09:25:14.667542388 +0000 UTC m=+1157.752581789" Mar 09 09:25:15 crc kubenswrapper[4861]: I0309 09:25:15.616149 4861 generic.go:334] "Generic (PLEG): container finished" podID="9e5dae99-f3a2-4a51-afff-03137206590b" containerID="a17bbbb0f7b68ee288f2295b49a11ef2e38b14a8f66bfadd653ef247d1307af2" exitCode=143 Mar 09 09:25:15 crc kubenswrapper[4861]: I0309 09:25:15.616595 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9e5dae99-f3a2-4a51-afff-03137206590b","Type":"ContainerDied","Data":"a17bbbb0f7b68ee288f2295b49a11ef2e38b14a8f66bfadd653ef247d1307af2"} Mar 09 09:25:15 crc kubenswrapper[4861]: I0309 09:25:15.642246 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 09 09:25:15 crc kubenswrapper[4861]: I0309 09:25:15.642299 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 09 09:25:16 crc kubenswrapper[4861]: I0309 09:25:16.457399 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 09 09:25:16 crc kubenswrapper[4861]: I0309 09:25:16.526546 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e5dae99-f3a2-4a51-afff-03137206590b-config-data\") pod \"9e5dae99-f3a2-4a51-afff-03137206590b\" (UID: \"9e5dae99-f3a2-4a51-afff-03137206590b\") " Mar 09 09:25:16 crc kubenswrapper[4861]: I0309 09:25:16.526706 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e5dae99-f3a2-4a51-afff-03137206590b-combined-ca-bundle\") pod \"9e5dae99-f3a2-4a51-afff-03137206590b\" (UID: \"9e5dae99-f3a2-4a51-afff-03137206590b\") " Mar 09 09:25:16 crc kubenswrapper[4861]: I0309 09:25:16.526733 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"9e5dae99-f3a2-4a51-afff-03137206590b\" (UID: \"9e5dae99-f3a2-4a51-afff-03137206590b\") " Mar 09 09:25:16 crc kubenswrapper[4861]: I0309 09:25:16.526779 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9e5dae99-f3a2-4a51-afff-03137206590b-logs\") pod \"9e5dae99-f3a2-4a51-afff-03137206590b\" (UID: \"9e5dae99-f3a2-4a51-afff-03137206590b\") " Mar 09 09:25:16 crc kubenswrapper[4861]: I0309 09:25:16.526811 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9e5dae99-f3a2-4a51-afff-03137206590b-scripts\") pod \"9e5dae99-f3a2-4a51-afff-03137206590b\" (UID: \"9e5dae99-f3a2-4a51-afff-03137206590b\") " Mar 09 09:25:16 crc kubenswrapper[4861]: I0309 09:25:16.526832 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m7nq4\" (UniqueName: 
\"kubernetes.io/projected/9e5dae99-f3a2-4a51-afff-03137206590b-kube-api-access-m7nq4\") pod \"9e5dae99-f3a2-4a51-afff-03137206590b\" (UID: \"9e5dae99-f3a2-4a51-afff-03137206590b\") " Mar 09 09:25:16 crc kubenswrapper[4861]: I0309 09:25:16.526888 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e5dae99-f3a2-4a51-afff-03137206590b-public-tls-certs\") pod \"9e5dae99-f3a2-4a51-afff-03137206590b\" (UID: \"9e5dae99-f3a2-4a51-afff-03137206590b\") " Mar 09 09:25:16 crc kubenswrapper[4861]: I0309 09:25:16.526931 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9e5dae99-f3a2-4a51-afff-03137206590b-httpd-run\") pod \"9e5dae99-f3a2-4a51-afff-03137206590b\" (UID: \"9e5dae99-f3a2-4a51-afff-03137206590b\") " Mar 09 09:25:16 crc kubenswrapper[4861]: I0309 09:25:16.529545 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e5dae99-f3a2-4a51-afff-03137206590b-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "9e5dae99-f3a2-4a51-afff-03137206590b" (UID: "9e5dae99-f3a2-4a51-afff-03137206590b"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:25:16 crc kubenswrapper[4861]: I0309 09:25:16.530224 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e5dae99-f3a2-4a51-afff-03137206590b-logs" (OuterVolumeSpecName: "logs") pod "9e5dae99-f3a2-4a51-afff-03137206590b" (UID: "9e5dae99-f3a2-4a51-afff-03137206590b"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:25:16 crc kubenswrapper[4861]: I0309 09:25:16.537433 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance") pod "9e5dae99-f3a2-4a51-afff-03137206590b" (UID: "9e5dae99-f3a2-4a51-afff-03137206590b"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 09 09:25:16 crc kubenswrapper[4861]: I0309 09:25:16.537767 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e5dae99-f3a2-4a51-afff-03137206590b-scripts" (OuterVolumeSpecName: "scripts") pod "9e5dae99-f3a2-4a51-afff-03137206590b" (UID: "9e5dae99-f3a2-4a51-afff-03137206590b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:25:16 crc kubenswrapper[4861]: I0309 09:25:16.560677 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e5dae99-f3a2-4a51-afff-03137206590b-kube-api-access-m7nq4" (OuterVolumeSpecName: "kube-api-access-m7nq4") pod "9e5dae99-f3a2-4a51-afff-03137206590b" (UID: "9e5dae99-f3a2-4a51-afff-03137206590b"). InnerVolumeSpecName "kube-api-access-m7nq4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:25:16 crc kubenswrapper[4861]: I0309 09:25:16.609214 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e5dae99-f3a2-4a51-afff-03137206590b-config-data" (OuterVolumeSpecName: "config-data") pod "9e5dae99-f3a2-4a51-afff-03137206590b" (UID: "9e5dae99-f3a2-4a51-afff-03137206590b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:25:16 crc kubenswrapper[4861]: I0309 09:25:16.610785 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e5dae99-f3a2-4a51-afff-03137206590b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9e5dae99-f3a2-4a51-afff-03137206590b" (UID: "9e5dae99-f3a2-4a51-afff-03137206590b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:25:16 crc kubenswrapper[4861]: I0309 09:25:16.611520 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e5dae99-f3a2-4a51-afff-03137206590b-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "9e5dae99-f3a2-4a51-afff-03137206590b" (UID: "9e5dae99-f3a2-4a51-afff-03137206590b"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:25:16 crc kubenswrapper[4861]: I0309 09:25:16.631537 4861 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e5dae99-f3a2-4a51-afff-03137206590b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 09:25:16 crc kubenswrapper[4861]: I0309 09:25:16.631593 4861 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Mar 09 09:25:16 crc kubenswrapper[4861]: I0309 09:25:16.631604 4861 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9e5dae99-f3a2-4a51-afff-03137206590b-logs\") on node \"crc\" DevicePath \"\"" Mar 09 09:25:16 crc kubenswrapper[4861]: I0309 09:25:16.631615 4861 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9e5dae99-f3a2-4a51-afff-03137206590b-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:25:16 crc 
kubenswrapper[4861]: I0309 09:25:16.631623 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m7nq4\" (UniqueName: \"kubernetes.io/projected/9e5dae99-f3a2-4a51-afff-03137206590b-kube-api-access-m7nq4\") on node \"crc\" DevicePath \"\"" Mar 09 09:25:16 crc kubenswrapper[4861]: I0309 09:25:16.631635 4861 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9e5dae99-f3a2-4a51-afff-03137206590b-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 09:25:16 crc kubenswrapper[4861]: I0309 09:25:16.631643 4861 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9e5dae99-f3a2-4a51-afff-03137206590b-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 09 09:25:16 crc kubenswrapper[4861]: I0309 09:25:16.631652 4861 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e5dae99-f3a2-4a51-afff-03137206590b-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 09:25:16 crc kubenswrapper[4861]: I0309 09:25:16.649139 4861 generic.go:334] "Generic (PLEG): container finished" podID="9e5dae99-f3a2-4a51-afff-03137206590b" containerID="7c80a17f9d55ef9b8146e4fefece494e587566ef43b98f62834271c346cdc933" exitCode=0 Mar 09 09:25:16 crc kubenswrapper[4861]: I0309 09:25:16.649187 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9e5dae99-f3a2-4a51-afff-03137206590b","Type":"ContainerDied","Data":"7c80a17f9d55ef9b8146e4fefece494e587566ef43b98f62834271c346cdc933"} Mar 09 09:25:16 crc kubenswrapper[4861]: I0309 09:25:16.649218 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9e5dae99-f3a2-4a51-afff-03137206590b","Type":"ContainerDied","Data":"2deec7142447462bd112a4138018b5f2d9c271dae8d6f5b8dff5b6643785c500"} Mar 09 09:25:16 crc kubenswrapper[4861]: I0309 09:25:16.649239 4861 
scope.go:117] "RemoveContainer" containerID="7c80a17f9d55ef9b8146e4fefece494e587566ef43b98f62834271c346cdc933" Mar 09 09:25:16 crc kubenswrapper[4861]: I0309 09:25:16.649390 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 09 09:25:16 crc kubenswrapper[4861]: I0309 09:25:16.673556 4861 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Mar 09 09:25:16 crc kubenswrapper[4861]: I0309 09:25:16.717939 4861 scope.go:117] "RemoveContainer" containerID="a17bbbb0f7b68ee288f2295b49a11ef2e38b14a8f66bfadd653ef247d1307af2" Mar 09 09:25:16 crc kubenswrapper[4861]: I0309 09:25:16.725626 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 09 09:25:16 crc kubenswrapper[4861]: I0309 09:25:16.735337 4861 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Mar 09 09:25:16 crc kubenswrapper[4861]: I0309 09:25:16.779668 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 09 09:25:16 crc kubenswrapper[4861]: I0309 09:25:16.803571 4861 scope.go:117] "RemoveContainer" containerID="7c80a17f9d55ef9b8146e4fefece494e587566ef43b98f62834271c346cdc933" Mar 09 09:25:16 crc kubenswrapper[4861]: E0309 09:25:16.804005 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c80a17f9d55ef9b8146e4fefece494e587566ef43b98f62834271c346cdc933\": container with ID starting with 7c80a17f9d55ef9b8146e4fefece494e587566ef43b98f62834271c346cdc933 not found: ID does not exist" containerID="7c80a17f9d55ef9b8146e4fefece494e587566ef43b98f62834271c346cdc933" Mar 09 09:25:16 crc kubenswrapper[4861]: I0309 09:25:16.804047 4861 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c80a17f9d55ef9b8146e4fefece494e587566ef43b98f62834271c346cdc933"} err="failed to get container status \"7c80a17f9d55ef9b8146e4fefece494e587566ef43b98f62834271c346cdc933\": rpc error: code = NotFound desc = could not find container \"7c80a17f9d55ef9b8146e4fefece494e587566ef43b98f62834271c346cdc933\": container with ID starting with 7c80a17f9d55ef9b8146e4fefece494e587566ef43b98f62834271c346cdc933 not found: ID does not exist" Mar 09 09:25:16 crc kubenswrapper[4861]: I0309 09:25:16.804073 4861 scope.go:117] "RemoveContainer" containerID="a17bbbb0f7b68ee288f2295b49a11ef2e38b14a8f66bfadd653ef247d1307af2" Mar 09 09:25:16 crc kubenswrapper[4861]: E0309 09:25:16.804356 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a17bbbb0f7b68ee288f2295b49a11ef2e38b14a8f66bfadd653ef247d1307af2\": container with ID starting with a17bbbb0f7b68ee288f2295b49a11ef2e38b14a8f66bfadd653ef247d1307af2 not found: ID does not exist" containerID="a17bbbb0f7b68ee288f2295b49a11ef2e38b14a8f66bfadd653ef247d1307af2" Mar 09 09:25:16 crc kubenswrapper[4861]: I0309 09:25:16.804458 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a17bbbb0f7b68ee288f2295b49a11ef2e38b14a8f66bfadd653ef247d1307af2"} err="failed to get container status \"a17bbbb0f7b68ee288f2295b49a11ef2e38b14a8f66bfadd653ef247d1307af2\": rpc error: code = NotFound desc = could not find container \"a17bbbb0f7b68ee288f2295b49a11ef2e38b14a8f66bfadd653ef247d1307af2\": container with ID starting with a17bbbb0f7b68ee288f2295b49a11ef2e38b14a8f66bfadd653ef247d1307af2 not found: ID does not exist" Mar 09 09:25:16 crc kubenswrapper[4861]: I0309 09:25:16.815445 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 09 09:25:16 crc kubenswrapper[4861]: E0309 09:25:16.815825 4861 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e5dae99-f3a2-4a51-afff-03137206590b" containerName="glance-log" Mar 09 09:25:16 crc kubenswrapper[4861]: I0309 09:25:16.815837 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e5dae99-f3a2-4a51-afff-03137206590b" containerName="glance-log" Mar 09 09:25:16 crc kubenswrapper[4861]: E0309 09:25:16.815847 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e5dae99-f3a2-4a51-afff-03137206590b" containerName="glance-httpd" Mar 09 09:25:16 crc kubenswrapper[4861]: I0309 09:25:16.815854 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e5dae99-f3a2-4a51-afff-03137206590b" containerName="glance-httpd" Mar 09 09:25:16 crc kubenswrapper[4861]: I0309 09:25:16.816011 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e5dae99-f3a2-4a51-afff-03137206590b" containerName="glance-log" Mar 09 09:25:16 crc kubenswrapper[4861]: I0309 09:25:16.816047 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e5dae99-f3a2-4a51-afff-03137206590b" containerName="glance-httpd" Mar 09 09:25:16 crc kubenswrapper[4861]: I0309 09:25:16.816979 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 09 09:25:16 crc kubenswrapper[4861]: I0309 09:25:16.823691 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 09 09:25:16 crc kubenswrapper[4861]: I0309 09:25:16.823817 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 09 09:25:16 crc kubenswrapper[4861]: I0309 09:25:16.832111 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 09 09:25:16 crc kubenswrapper[4861]: I0309 09:25:16.948944 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hv4jh\" (UniqueName: \"kubernetes.io/projected/628d8d0c-948a-4878-ac3f-d1c35befe1d0-kube-api-access-hv4jh\") pod \"glance-default-external-api-0\" (UID: \"628d8d0c-948a-4878-ac3f-d1c35befe1d0\") " pod="openstack/glance-default-external-api-0" Mar 09 09:25:16 crc kubenswrapper[4861]: I0309 09:25:16.949050 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"628d8d0c-948a-4878-ac3f-d1c35befe1d0\") " pod="openstack/glance-default-external-api-0" Mar 09 09:25:16 crc kubenswrapper[4861]: I0309 09:25:16.949118 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/628d8d0c-948a-4878-ac3f-d1c35befe1d0-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"628d8d0c-948a-4878-ac3f-d1c35befe1d0\") " pod="openstack/glance-default-external-api-0" Mar 09 09:25:16 crc kubenswrapper[4861]: I0309 09:25:16.949176 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/628d8d0c-948a-4878-ac3f-d1c35befe1d0-logs\") pod \"glance-default-external-api-0\" (UID: \"628d8d0c-948a-4878-ac3f-d1c35befe1d0\") " pod="openstack/glance-default-external-api-0" Mar 09 09:25:16 crc kubenswrapper[4861]: I0309 09:25:16.949217 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/628d8d0c-948a-4878-ac3f-d1c35befe1d0-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"628d8d0c-948a-4878-ac3f-d1c35befe1d0\") " pod="openstack/glance-default-external-api-0" Mar 09 09:25:16 crc kubenswrapper[4861]: I0309 09:25:16.949303 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/628d8d0c-948a-4878-ac3f-d1c35befe1d0-scripts\") pod \"glance-default-external-api-0\" (UID: \"628d8d0c-948a-4878-ac3f-d1c35befe1d0\") " pod="openstack/glance-default-external-api-0" Mar 09 09:25:16 crc kubenswrapper[4861]: I0309 09:25:16.949509 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/628d8d0c-948a-4878-ac3f-d1c35befe1d0-config-data\") pod \"glance-default-external-api-0\" (UID: \"628d8d0c-948a-4878-ac3f-d1c35befe1d0\") " pod="openstack/glance-default-external-api-0" Mar 09 09:25:16 crc kubenswrapper[4861]: I0309 09:25:16.949625 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/628d8d0c-948a-4878-ac3f-d1c35befe1d0-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"628d8d0c-948a-4878-ac3f-d1c35befe1d0\") " pod="openstack/glance-default-external-api-0" Mar 09 09:25:17 crc kubenswrapper[4861]: I0309 09:25:17.051583 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hv4jh\" 
(UniqueName: \"kubernetes.io/projected/628d8d0c-948a-4878-ac3f-d1c35befe1d0-kube-api-access-hv4jh\") pod \"glance-default-external-api-0\" (UID: \"628d8d0c-948a-4878-ac3f-d1c35befe1d0\") " pod="openstack/glance-default-external-api-0" Mar 09 09:25:17 crc kubenswrapper[4861]: I0309 09:25:17.051668 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"628d8d0c-948a-4878-ac3f-d1c35befe1d0\") " pod="openstack/glance-default-external-api-0" Mar 09 09:25:17 crc kubenswrapper[4861]: I0309 09:25:17.051740 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/628d8d0c-948a-4878-ac3f-d1c35befe1d0-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"628d8d0c-948a-4878-ac3f-d1c35befe1d0\") " pod="openstack/glance-default-external-api-0" Mar 09 09:25:17 crc kubenswrapper[4861]: I0309 09:25:17.051777 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/628d8d0c-948a-4878-ac3f-d1c35befe1d0-logs\") pod \"glance-default-external-api-0\" (UID: \"628d8d0c-948a-4878-ac3f-d1c35befe1d0\") " pod="openstack/glance-default-external-api-0" Mar 09 09:25:17 crc kubenswrapper[4861]: I0309 09:25:17.051913 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/628d8d0c-948a-4878-ac3f-d1c35befe1d0-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"628d8d0c-948a-4878-ac3f-d1c35befe1d0\") " pod="openstack/glance-default-external-api-0" Mar 09 09:25:17 crc kubenswrapper[4861]: I0309 09:25:17.051987 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/628d8d0c-948a-4878-ac3f-d1c35befe1d0-scripts\") pod 
\"glance-default-external-api-0\" (UID: \"628d8d0c-948a-4878-ac3f-d1c35befe1d0\") " pod="openstack/glance-default-external-api-0" Mar 09 09:25:17 crc kubenswrapper[4861]: I0309 09:25:17.052048 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/628d8d0c-948a-4878-ac3f-d1c35befe1d0-config-data\") pod \"glance-default-external-api-0\" (UID: \"628d8d0c-948a-4878-ac3f-d1c35befe1d0\") " pod="openstack/glance-default-external-api-0" Mar 09 09:25:17 crc kubenswrapper[4861]: I0309 09:25:17.052119 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/628d8d0c-948a-4878-ac3f-d1c35befe1d0-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"628d8d0c-948a-4878-ac3f-d1c35befe1d0\") " pod="openstack/glance-default-external-api-0" Mar 09 09:25:17 crc kubenswrapper[4861]: I0309 09:25:17.052283 4861 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"628d8d0c-948a-4878-ac3f-d1c35befe1d0\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-external-api-0" Mar 09 09:25:17 crc kubenswrapper[4861]: I0309 09:25:17.052538 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/628d8d0c-948a-4878-ac3f-d1c35befe1d0-logs\") pod \"glance-default-external-api-0\" (UID: \"628d8d0c-948a-4878-ac3f-d1c35befe1d0\") " pod="openstack/glance-default-external-api-0" Mar 09 09:25:17 crc kubenswrapper[4861]: I0309 09:25:17.054571 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/628d8d0c-948a-4878-ac3f-d1c35befe1d0-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"628d8d0c-948a-4878-ac3f-d1c35befe1d0\") " 
pod="openstack/glance-default-external-api-0"
Mar 09 09:25:17 crc kubenswrapper[4861]: I0309 09:25:17.062908 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/628d8d0c-948a-4878-ac3f-d1c35befe1d0-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"628d8d0c-948a-4878-ac3f-d1c35befe1d0\") " pod="openstack/glance-default-external-api-0"
Mar 09 09:25:17 crc kubenswrapper[4861]: I0309 09:25:17.065133 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/628d8d0c-948a-4878-ac3f-d1c35befe1d0-scripts\") pod \"glance-default-external-api-0\" (UID: \"628d8d0c-948a-4878-ac3f-d1c35befe1d0\") " pod="openstack/glance-default-external-api-0"
Mar 09 09:25:17 crc kubenswrapper[4861]: I0309 09:25:17.067415 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/628d8d0c-948a-4878-ac3f-d1c35befe1d0-config-data\") pod \"glance-default-external-api-0\" (UID: \"628d8d0c-948a-4878-ac3f-d1c35befe1d0\") " pod="openstack/glance-default-external-api-0"
Mar 09 09:25:17 crc kubenswrapper[4861]: I0309 09:25:17.069328 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/628d8d0c-948a-4878-ac3f-d1c35befe1d0-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"628d8d0c-948a-4878-ac3f-d1c35befe1d0\") " pod="openstack/glance-default-external-api-0"
Mar 09 09:25:17 crc kubenswrapper[4861]: I0309 09:25:17.074207 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hv4jh\" (UniqueName: \"kubernetes.io/projected/628d8d0c-948a-4878-ac3f-d1c35befe1d0-kube-api-access-hv4jh\") pod \"glance-default-external-api-0\" (UID: \"628d8d0c-948a-4878-ac3f-d1c35befe1d0\") " pod="openstack/glance-default-external-api-0"
Mar 09 09:25:17 crc kubenswrapper[4861]: I0309 09:25:17.098801 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"628d8d0c-948a-4878-ac3f-d1c35befe1d0\") " pod="openstack/glance-default-external-api-0"
Mar 09 09:25:17 crc kubenswrapper[4861]: I0309 09:25:17.148822 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 09 09:25:17 crc kubenswrapper[4861]: I0309 09:25:17.515510 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 09 09:25:17 crc kubenswrapper[4861]: I0309 09:25:17.652629 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Mar 09 09:25:17 crc kubenswrapper[4861]: I0309 09:25:17.652680 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Mar 09 09:25:17 crc kubenswrapper[4861]: I0309 09:25:17.675823 4861 generic.go:334] "Generic (PLEG): container finished" podID="c9820d89-3a89-4982-8520-f23dd0d099ad" containerID="5c1402523e38b6a059b942534e04454c0efa21ed13ded8af6b33ce288f6f6dbc" exitCode=0
Mar 09 09:25:17 crc kubenswrapper[4861]: I0309 09:25:17.678992 4861 generic.go:334] "Generic (PLEG): container finished" podID="bbd4cfda-d65e-4915-be68-f207820fe15b" containerID="c6030e0696081dc572663a630ff4a6ab639c92c512e4f4ee1d80fd3780338ec9" exitCode=0
Mar 09 09:25:17 crc kubenswrapper[4861]: I0309 09:25:17.711068 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e5dae99-f3a2-4a51-afff-03137206590b" path="/var/lib/kubelet/pods/9e5dae99-f3a2-4a51-afff-03137206590b/volumes"
Mar 09 09:25:17 crc kubenswrapper[4861]: I0309 09:25:17.712011 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-k46zw" event={"ID":"c9820d89-3a89-4982-8520-f23dd0d099ad","Type":"ContainerDied","Data":"5c1402523e38b6a059b942534e04454c0efa21ed13ded8af6b33ce288f6f6dbc"}
Mar 09 09:25:17 crc kubenswrapper[4861]: I0309 09:25:17.712052 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-bcd5c" event={"ID":"bbd4cfda-d65e-4915-be68-f207820fe15b","Type":"ContainerDied","Data":"c6030e0696081dc572663a630ff4a6ab639c92c512e4f4ee1d80fd3780338ec9"}
Mar 09 09:25:17 crc kubenswrapper[4861]: I0309 09:25:17.712069 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"628d8d0c-948a-4878-ac3f-d1c35befe1d0","Type":"ContainerStarted","Data":"29982d9b7b3a7bb1491811d7268ce1c3239582d260e5cf3190922ddb419fa90c"}
Mar 09 09:25:17 crc kubenswrapper[4861]: I0309 09:25:17.720614 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Mar 09 09:25:17 crc kubenswrapper[4861]: I0309 09:25:17.723281 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Mar 09 09:25:18 crc kubenswrapper[4861]: I0309 09:25:18.709695 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"628d8d0c-948a-4878-ac3f-d1c35befe1d0","Type":"ContainerStarted","Data":"66cc06ff1175b455f0e6e180f5bb9076e45beb7156969699f7fcd6f894ab0395"}
Mar 09 09:25:18 crc kubenswrapper[4861]: I0309 09:25:18.710097 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Mar 09 09:25:18 crc kubenswrapper[4861]: I0309 09:25:18.710115 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Mar 09 09:25:18 crc kubenswrapper[4861]: I0309 09:25:18.854648 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-7bb4db8c4-sxjc7"
Mar 09 09:25:18 crc kubenswrapper[4861]: I0309 09:25:18.855044 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7bb4db8c4-sxjc7"
Mar 09 09:25:18 crc kubenswrapper[4861]: I0309 09:25:18.989294 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-5487f4d458-lnthc"
Mar 09 09:25:18 crc kubenswrapper[4861]: I0309 09:25:18.989782 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5487f4d458-lnthc"
Mar 09 09:25:19 crc kubenswrapper[4861]: I0309 09:25:19.486614 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7859c7799c-cztvv"
Mar 09 09:25:19 crc kubenswrapper[4861]: I0309 09:25:19.557799 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-ccd7c9f8f-j7b59"]
Mar 09 09:25:19 crc kubenswrapper[4861]: I0309 09:25:19.558185 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-ccd7c9f8f-j7b59" podUID="ebf4ebf5-adc0-48b4-b80b-0b0d88e64910" containerName="dnsmasq-dns" containerID="cri-o://2f04fc77f99198d1100f898ca09e71671b5dce06e42013b8980d3efc2cff2c97" gracePeriod=10
Mar 09 09:25:19 crc kubenswrapper[4861]: I0309 09:25:19.731135 4861 generic.go:334] "Generic (PLEG): container finished" podID="ebf4ebf5-adc0-48b4-b80b-0b0d88e64910" containerID="2f04fc77f99198d1100f898ca09e71671b5dce06e42013b8980d3efc2cff2c97" exitCode=0
Mar 09 09:25:19 crc kubenswrapper[4861]: I0309 09:25:19.731486 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-ccd7c9f8f-j7b59" event={"ID":"ebf4ebf5-adc0-48b4-b80b-0b0d88e64910","Type":"ContainerDied","Data":"2f04fc77f99198d1100f898ca09e71671b5dce06e42013b8980d3efc2cff2c97"}
Mar 09 09:25:20 crc kubenswrapper[4861]: I0309 09:25:20.360529 4861 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-ccd7c9f8f-j7b59" podUID="ebf4ebf5-adc0-48b4-b80b-0b0d88e64910" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.148:5353: connect: connection refused"
Mar 09 09:25:20 crc kubenswrapper[4861]: I0309 09:25:20.830389 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-bcd5c"
Mar 09 09:25:20 crc kubenswrapper[4861]: I0309 09:25:20.931415 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/bbd4cfda-d65e-4915-be68-f207820fe15b-credential-keys\") pod \"bbd4cfda-d65e-4915-be68-f207820fe15b\" (UID: \"bbd4cfda-d65e-4915-be68-f207820fe15b\") "
Mar 09 09:25:20 crc kubenswrapper[4861]: I0309 09:25:20.931471 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bbd4cfda-d65e-4915-be68-f207820fe15b-fernet-keys\") pod \"bbd4cfda-d65e-4915-be68-f207820fe15b\" (UID: \"bbd4cfda-d65e-4915-be68-f207820fe15b\") "
Mar 09 09:25:20 crc kubenswrapper[4861]: I0309 09:25:20.931510 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bbd4cfda-d65e-4915-be68-f207820fe15b-scripts\") pod \"bbd4cfda-d65e-4915-be68-f207820fe15b\" (UID: \"bbd4cfda-d65e-4915-be68-f207820fe15b\") "
Mar 09 09:25:20 crc kubenswrapper[4861]: I0309 09:25:20.931582 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbd4cfda-d65e-4915-be68-f207820fe15b-config-data\") pod \"bbd4cfda-d65e-4915-be68-f207820fe15b\" (UID: \"bbd4cfda-d65e-4915-be68-f207820fe15b\") "
Mar 09 09:25:20 crc kubenswrapper[4861]: I0309 09:25:20.931659 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wkqzw\" (UniqueName: \"kubernetes.io/projected/bbd4cfda-d65e-4915-be68-f207820fe15b-kube-api-access-wkqzw\") pod \"bbd4cfda-d65e-4915-be68-f207820fe15b\" (UID: \"bbd4cfda-d65e-4915-be68-f207820fe15b\") "
Mar 09 09:25:20 crc kubenswrapper[4861]: I0309 09:25:20.931758 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbd4cfda-d65e-4915-be68-f207820fe15b-combined-ca-bundle\") pod \"bbd4cfda-d65e-4915-be68-f207820fe15b\" (UID: \"bbd4cfda-d65e-4915-be68-f207820fe15b\") "
Mar 09 09:25:20 crc kubenswrapper[4861]: I0309 09:25:20.937685 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbd4cfda-d65e-4915-be68-f207820fe15b-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "bbd4cfda-d65e-4915-be68-f207820fe15b" (UID: "bbd4cfda-d65e-4915-be68-f207820fe15b"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:25:20 crc kubenswrapper[4861]: I0309 09:25:20.938534 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbd4cfda-d65e-4915-be68-f207820fe15b-scripts" (OuterVolumeSpecName: "scripts") pod "bbd4cfda-d65e-4915-be68-f207820fe15b" (UID: "bbd4cfda-d65e-4915-be68-f207820fe15b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:25:20 crc kubenswrapper[4861]: I0309 09:25:20.946754 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bbd4cfda-d65e-4915-be68-f207820fe15b-kube-api-access-wkqzw" (OuterVolumeSpecName: "kube-api-access-wkqzw") pod "bbd4cfda-d65e-4915-be68-f207820fe15b" (UID: "bbd4cfda-d65e-4915-be68-f207820fe15b"). InnerVolumeSpecName "kube-api-access-wkqzw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:25:20 crc kubenswrapper[4861]: I0309 09:25:20.957806 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbd4cfda-d65e-4915-be68-f207820fe15b-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "bbd4cfda-d65e-4915-be68-f207820fe15b" (UID: "bbd4cfda-d65e-4915-be68-f207820fe15b"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:25:20 crc kubenswrapper[4861]: I0309 09:25:20.987773 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbd4cfda-d65e-4915-be68-f207820fe15b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bbd4cfda-d65e-4915-be68-f207820fe15b" (UID: "bbd4cfda-d65e-4915-be68-f207820fe15b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:25:21 crc kubenswrapper[4861]: I0309 09:25:21.014113 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbd4cfda-d65e-4915-be68-f207820fe15b-config-data" (OuterVolumeSpecName: "config-data") pod "bbd4cfda-d65e-4915-be68-f207820fe15b" (UID: "bbd4cfda-d65e-4915-be68-f207820fe15b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:25:21 crc kubenswrapper[4861]: I0309 09:25:21.033199 4861 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbd4cfda-d65e-4915-be68-f207820fe15b-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 09 09:25:21 crc kubenswrapper[4861]: I0309 09:25:21.033226 4861 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/bbd4cfda-d65e-4915-be68-f207820fe15b-credential-keys\") on node \"crc\" DevicePath \"\""
Mar 09 09:25:21 crc kubenswrapper[4861]: I0309 09:25:21.033236 4861 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bbd4cfda-d65e-4915-be68-f207820fe15b-fernet-keys\") on node \"crc\" DevicePath \"\""
Mar 09 09:25:21 crc kubenswrapper[4861]: I0309 09:25:21.033245 4861 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bbd4cfda-d65e-4915-be68-f207820fe15b-scripts\") on node \"crc\" DevicePath \"\""
Mar 09 09:25:21 crc kubenswrapper[4861]: I0309 09:25:21.033254 4861 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbd4cfda-d65e-4915-be68-f207820fe15b-config-data\") on node \"crc\" DevicePath \"\""
Mar 09 09:25:21 crc kubenswrapper[4861]: I0309 09:25:21.033262 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wkqzw\" (UniqueName: \"kubernetes.io/projected/bbd4cfda-d65e-4915-be68-f207820fe15b-kube-api-access-wkqzw\") on node \"crc\" DevicePath \"\""
Mar 09 09:25:21 crc kubenswrapper[4861]: I0309 09:25:21.471458 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Mar 09 09:25:21 crc kubenswrapper[4861]: I0309 09:25:21.471604 4861 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 09 09:25:21 crc kubenswrapper[4861]: I0309 09:25:21.750738 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-bcd5c" event={"ID":"bbd4cfda-d65e-4915-be68-f207820fe15b","Type":"ContainerDied","Data":"aa465b4db79ae0fd4dc729905d48e8b93afb616aa69f4df6f25e4d3febb78146"}
Mar 09 09:25:21 crc kubenswrapper[4861]: I0309 09:25:21.750831 4861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aa465b4db79ae0fd4dc729905d48e8b93afb616aa69f4df6f25e4d3febb78146"
Mar 09 09:25:21 crc kubenswrapper[4861]: I0309 09:25:21.750785 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-bcd5c"
Mar 09 09:25:21 crc kubenswrapper[4861]: I0309 09:25:21.948966 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-5cc5bc567f-7k86v"]
Mar 09 09:25:21 crc kubenswrapper[4861]: E0309 09:25:21.949348 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbd4cfda-d65e-4915-be68-f207820fe15b" containerName="keystone-bootstrap"
Mar 09 09:25:21 crc kubenswrapper[4861]: I0309 09:25:21.949361 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbd4cfda-d65e-4915-be68-f207820fe15b" containerName="keystone-bootstrap"
Mar 09 09:25:21 crc kubenswrapper[4861]: I0309 09:25:21.949617 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="bbd4cfda-d65e-4915-be68-f207820fe15b" containerName="keystone-bootstrap"
Mar 09 09:25:21 crc kubenswrapper[4861]: I0309 09:25:21.950158 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5cc5bc567f-7k86v"
Mar 09 09:25:21 crc kubenswrapper[4861]: I0309 09:25:21.952065 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc"
Mar 09 09:25:21 crc kubenswrapper[4861]: I0309 09:25:21.953223 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Mar 09 09:25:21 crc kubenswrapper[4861]: I0309 09:25:21.953433 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Mar 09 09:25:21 crc kubenswrapper[4861]: I0309 09:25:21.954284 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc"
Mar 09 09:25:21 crc kubenswrapper[4861]: I0309 09:25:21.954517 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Mar 09 09:25:21 crc kubenswrapper[4861]: I0309 09:25:21.956076 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-sk77l"
Mar 09 09:25:21 crc kubenswrapper[4861]: I0309 09:25:21.969209 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5cc5bc567f-7k86v"]
Mar 09 09:25:22 crc kubenswrapper[4861]: I0309 09:25:22.055001 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/596fb22d-649e-4e00-b847-71b506786832-config-data\") pod \"keystone-5cc5bc567f-7k86v\" (UID: \"596fb22d-649e-4e00-b847-71b506786832\") " pod="openstack/keystone-5cc5bc567f-7k86v"
Mar 09 09:25:22 crc kubenswrapper[4861]: I0309 09:25:22.055056 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9wst\" (UniqueName: \"kubernetes.io/projected/596fb22d-649e-4e00-b847-71b506786832-kube-api-access-p9wst\") pod \"keystone-5cc5bc567f-7k86v\" (UID: \"596fb22d-649e-4e00-b847-71b506786832\") " pod="openstack/keystone-5cc5bc567f-7k86v"
Mar 09 09:25:22 crc kubenswrapper[4861]: I0309 09:25:22.055084 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/596fb22d-649e-4e00-b847-71b506786832-fernet-keys\") pod \"keystone-5cc5bc567f-7k86v\" (UID: \"596fb22d-649e-4e00-b847-71b506786832\") " pod="openstack/keystone-5cc5bc567f-7k86v"
Mar 09 09:25:22 crc kubenswrapper[4861]: I0309 09:25:22.055152 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/596fb22d-649e-4e00-b847-71b506786832-scripts\") pod \"keystone-5cc5bc567f-7k86v\" (UID: \"596fb22d-649e-4e00-b847-71b506786832\") " pod="openstack/keystone-5cc5bc567f-7k86v"
Mar 09 09:25:22 crc kubenswrapper[4861]: I0309 09:25:22.055198 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/596fb22d-649e-4e00-b847-71b506786832-internal-tls-certs\") pod \"keystone-5cc5bc567f-7k86v\" (UID: \"596fb22d-649e-4e00-b847-71b506786832\") " pod="openstack/keystone-5cc5bc567f-7k86v"
Mar 09 09:25:22 crc kubenswrapper[4861]: I0309 09:25:22.055292 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/596fb22d-649e-4e00-b847-71b506786832-public-tls-certs\") pod \"keystone-5cc5bc567f-7k86v\" (UID: \"596fb22d-649e-4e00-b847-71b506786832\") " pod="openstack/keystone-5cc5bc567f-7k86v"
Mar 09 09:25:22 crc kubenswrapper[4861]: I0309 09:25:22.055611 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/596fb22d-649e-4e00-b847-71b506786832-combined-ca-bundle\") pod \"keystone-5cc5bc567f-7k86v\" (UID: \"596fb22d-649e-4e00-b847-71b506786832\") " pod="openstack/keystone-5cc5bc567f-7k86v"
Mar 09 09:25:22 crc kubenswrapper[4861]: I0309 09:25:22.055717 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/596fb22d-649e-4e00-b847-71b506786832-credential-keys\") pod \"keystone-5cc5bc567f-7k86v\" (UID: \"596fb22d-649e-4e00-b847-71b506786832\") " pod="openstack/keystone-5cc5bc567f-7k86v"
Mar 09 09:25:22 crc kubenswrapper[4861]: I0309 09:25:22.128857 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Mar 09 09:25:22 crc kubenswrapper[4861]: I0309 09:25:22.159698 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/596fb22d-649e-4e00-b847-71b506786832-config-data\") pod \"keystone-5cc5bc567f-7k86v\" (UID: \"596fb22d-649e-4e00-b847-71b506786832\") " pod="openstack/keystone-5cc5bc567f-7k86v"
Mar 09 09:25:22 crc kubenswrapper[4861]: I0309 09:25:22.159875 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p9wst\" (UniqueName: \"kubernetes.io/projected/596fb22d-649e-4e00-b847-71b506786832-kube-api-access-p9wst\") pod \"keystone-5cc5bc567f-7k86v\" (UID: \"596fb22d-649e-4e00-b847-71b506786832\") " pod="openstack/keystone-5cc5bc567f-7k86v"
Mar 09 09:25:22 crc kubenswrapper[4861]: I0309 09:25:22.159974 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/596fb22d-649e-4e00-b847-71b506786832-fernet-keys\") pod \"keystone-5cc5bc567f-7k86v\" (UID: \"596fb22d-649e-4e00-b847-71b506786832\") " pod="openstack/keystone-5cc5bc567f-7k86v"
Mar 09 09:25:22 crc kubenswrapper[4861]: I0309 09:25:22.167176 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/596fb22d-649e-4e00-b847-71b506786832-fernet-keys\") pod \"keystone-5cc5bc567f-7k86v\" (UID: \"596fb22d-649e-4e00-b847-71b506786832\") " pod="openstack/keystone-5cc5bc567f-7k86v"
Mar 09 09:25:22 crc kubenswrapper[4861]: I0309 09:25:22.169579 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/596fb22d-649e-4e00-b847-71b506786832-scripts\") pod \"keystone-5cc5bc567f-7k86v\" (UID: \"596fb22d-649e-4e00-b847-71b506786832\") " pod="openstack/keystone-5cc5bc567f-7k86v"
Mar 09 09:25:22 crc kubenswrapper[4861]: I0309 09:25:22.169640 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/596fb22d-649e-4e00-b847-71b506786832-internal-tls-certs\") pod \"keystone-5cc5bc567f-7k86v\" (UID: \"596fb22d-649e-4e00-b847-71b506786832\") " pod="openstack/keystone-5cc5bc567f-7k86v"
Mar 09 09:25:22 crc kubenswrapper[4861]: I0309 09:25:22.169786 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/596fb22d-649e-4e00-b847-71b506786832-public-tls-certs\") pod \"keystone-5cc5bc567f-7k86v\" (UID: \"596fb22d-649e-4e00-b847-71b506786832\") " pod="openstack/keystone-5cc5bc567f-7k86v"
Mar 09 09:25:22 crc kubenswrapper[4861]: I0309 09:25:22.169956 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/596fb22d-649e-4e00-b847-71b506786832-combined-ca-bundle\") pod \"keystone-5cc5bc567f-7k86v\" (UID: \"596fb22d-649e-4e00-b847-71b506786832\") " pod="openstack/keystone-5cc5bc567f-7k86v"
Mar 09 09:25:22 crc kubenswrapper[4861]: I0309 09:25:22.170023 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/596fb22d-649e-4e00-b847-71b506786832-credential-keys\") pod \"keystone-5cc5bc567f-7k86v\" (UID: \"596fb22d-649e-4e00-b847-71b506786832\") " pod="openstack/keystone-5cc5bc567f-7k86v"
Mar 09 09:25:22 crc kubenswrapper[4861]: I0309 09:25:22.177348 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/596fb22d-649e-4e00-b847-71b506786832-combined-ca-bundle\") pod \"keystone-5cc5bc567f-7k86v\" (UID: \"596fb22d-649e-4e00-b847-71b506786832\") " pod="openstack/keystone-5cc5bc567f-7k86v"
Mar 09 09:25:22 crc kubenswrapper[4861]: I0309 09:25:22.189013 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9wst\" (UniqueName: \"kubernetes.io/projected/596fb22d-649e-4e00-b847-71b506786832-kube-api-access-p9wst\") pod \"keystone-5cc5bc567f-7k86v\" (UID: \"596fb22d-649e-4e00-b847-71b506786832\") " pod="openstack/keystone-5cc5bc567f-7k86v"
Mar 09 09:25:22 crc kubenswrapper[4861]: I0309 09:25:22.193077 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/596fb22d-649e-4e00-b847-71b506786832-internal-tls-certs\") pod \"keystone-5cc5bc567f-7k86v\" (UID: \"596fb22d-649e-4e00-b847-71b506786832\") " pod="openstack/keystone-5cc5bc567f-7k86v"
Mar 09 09:25:22 crc kubenswrapper[4861]: I0309 09:25:22.206796 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/596fb22d-649e-4e00-b847-71b506786832-scripts\") pod \"keystone-5cc5bc567f-7k86v\" (UID: \"596fb22d-649e-4e00-b847-71b506786832\") " pod="openstack/keystone-5cc5bc567f-7k86v"
Mar 09 09:25:22 crc kubenswrapper[4861]: I0309 09:25:22.213317 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/596fb22d-649e-4e00-b847-71b506786832-public-tls-certs\") pod \"keystone-5cc5bc567f-7k86v\" (UID: \"596fb22d-649e-4e00-b847-71b506786832\") " pod="openstack/keystone-5cc5bc567f-7k86v"
Mar 09 09:25:22 crc kubenswrapper[4861]: I0309 09:25:22.214012 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/596fb22d-649e-4e00-b847-71b506786832-credential-keys\") pod \"keystone-5cc5bc567f-7k86v\" (UID: \"596fb22d-649e-4e00-b847-71b506786832\") " pod="openstack/keystone-5cc5bc567f-7k86v"
Mar 09 09:25:22 crc kubenswrapper[4861]: I0309 09:25:22.217576 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/596fb22d-649e-4e00-b847-71b506786832-config-data\") pod \"keystone-5cc5bc567f-7k86v\" (UID: \"596fb22d-649e-4e00-b847-71b506786832\") " pod="openstack/keystone-5cc5bc567f-7k86v"
Mar 09 09:25:22 crc kubenswrapper[4861]: I0309 09:25:22.278518 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5cc5bc567f-7k86v"
Mar 09 09:25:23 crc kubenswrapper[4861]: I0309 09:25:23.164165 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-k46zw"
Mar 09 09:25:23 crc kubenswrapper[4861]: I0309 09:25:23.190710 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9820d89-3a89-4982-8520-f23dd0d099ad-scripts\") pod \"c9820d89-3a89-4982-8520-f23dd0d099ad\" (UID: \"c9820d89-3a89-4982-8520-f23dd0d099ad\") "
Mar 09 09:25:23 crc kubenswrapper[4861]: I0309 09:25:23.191313 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9820d89-3a89-4982-8520-f23dd0d099ad-logs\") pod \"c9820d89-3a89-4982-8520-f23dd0d099ad\" (UID: \"c9820d89-3a89-4982-8520-f23dd0d099ad\") "
Mar 09 09:25:23 crc kubenswrapper[4861]: I0309 09:25:23.191394 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9820d89-3a89-4982-8520-f23dd0d099ad-config-data\") pod \"c9820d89-3a89-4982-8520-f23dd0d099ad\" (UID: \"c9820d89-3a89-4982-8520-f23dd0d099ad\") "
Mar 09 09:25:23 crc kubenswrapper[4861]: I0309 09:25:23.191419 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9820d89-3a89-4982-8520-f23dd0d099ad-combined-ca-bundle\") pod \"c9820d89-3a89-4982-8520-f23dd0d099ad\" (UID: \"c9820d89-3a89-4982-8520-f23dd0d099ad\") "
Mar 09 09:25:23 crc kubenswrapper[4861]: I0309 09:25:23.191457 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ldwqp\" (UniqueName: \"kubernetes.io/projected/c9820d89-3a89-4982-8520-f23dd0d099ad-kube-api-access-ldwqp\") pod \"c9820d89-3a89-4982-8520-f23dd0d099ad\" (UID: \"c9820d89-3a89-4982-8520-f23dd0d099ad\") "
Mar 09 09:25:23 crc kubenswrapper[4861]: I0309 09:25:23.192980 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9820d89-3a89-4982-8520-f23dd0d099ad-logs" (OuterVolumeSpecName: "logs") pod "c9820d89-3a89-4982-8520-f23dd0d099ad" (UID: "c9820d89-3a89-4982-8520-f23dd0d099ad"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 09:25:23 crc kubenswrapper[4861]: I0309 09:25:23.197684 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9820d89-3a89-4982-8520-f23dd0d099ad-scripts" (OuterVolumeSpecName: "scripts") pod "c9820d89-3a89-4982-8520-f23dd0d099ad" (UID: "c9820d89-3a89-4982-8520-f23dd0d099ad"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:25:23 crc kubenswrapper[4861]: I0309 09:25:23.269561 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9820d89-3a89-4982-8520-f23dd0d099ad-config-data" (OuterVolumeSpecName: "config-data") pod "c9820d89-3a89-4982-8520-f23dd0d099ad" (UID: "c9820d89-3a89-4982-8520-f23dd0d099ad"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:25:23 crc kubenswrapper[4861]: I0309 09:25:23.294637 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9820d89-3a89-4982-8520-f23dd0d099ad-kube-api-access-ldwqp" (OuterVolumeSpecName: "kube-api-access-ldwqp") pod "c9820d89-3a89-4982-8520-f23dd0d099ad" (UID: "c9820d89-3a89-4982-8520-f23dd0d099ad"). InnerVolumeSpecName "kube-api-access-ldwqp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:25:23 crc kubenswrapper[4861]: I0309 09:25:23.295963 4861 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9820d89-3a89-4982-8520-f23dd0d099ad-logs\") on node \"crc\" DevicePath \"\""
Mar 09 09:25:23 crc kubenswrapper[4861]: I0309 09:25:23.296072 4861 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9820d89-3a89-4982-8520-f23dd0d099ad-config-data\") on node \"crc\" DevicePath \"\""
Mar 09 09:25:23 crc kubenswrapper[4861]: I0309 09:25:23.296789 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ldwqp\" (UniqueName: \"kubernetes.io/projected/c9820d89-3a89-4982-8520-f23dd0d099ad-kube-api-access-ldwqp\") on node \"crc\" DevicePath \"\""
Mar 09 09:25:23 crc kubenswrapper[4861]: I0309 09:25:23.296853 4861 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9820d89-3a89-4982-8520-f23dd0d099ad-scripts\") on node \"crc\" DevicePath \"\""
Mar 09 09:25:23 crc kubenswrapper[4861]: I0309 09:25:23.322867 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9820d89-3a89-4982-8520-f23dd0d099ad-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c9820d89-3a89-4982-8520-f23dd0d099ad" (UID: "c9820d89-3a89-4982-8520-f23dd0d099ad"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:25:23 crc kubenswrapper[4861]: I0309 09:25:23.401016 4861 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9820d89-3a89-4982-8520-f23dd0d099ad-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 09 09:25:23 crc kubenswrapper[4861]: I0309 09:25:23.499880 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5cc5bc567f-7k86v"]
Mar 09 09:25:23 crc kubenswrapper[4861]: I0309 09:25:23.504573 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-ccd7c9f8f-j7b59"
Mar 09 09:25:23 crc kubenswrapper[4861]: I0309 09:25:23.605540 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qhcns\" (UniqueName: \"kubernetes.io/projected/ebf4ebf5-adc0-48b4-b80b-0b0d88e64910-kube-api-access-qhcns\") pod \"ebf4ebf5-adc0-48b4-b80b-0b0d88e64910\" (UID: \"ebf4ebf5-adc0-48b4-b80b-0b0d88e64910\") "
Mar 09 09:25:23 crc kubenswrapper[4861]: I0309 09:25:23.605672 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ebf4ebf5-adc0-48b4-b80b-0b0d88e64910-ovsdbserver-sb\") pod \"ebf4ebf5-adc0-48b4-b80b-0b0d88e64910\" (UID: \"ebf4ebf5-adc0-48b4-b80b-0b0d88e64910\") "
Mar 09 09:25:23 crc kubenswrapper[4861]: I0309 09:25:23.605733 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebf4ebf5-adc0-48b4-b80b-0b0d88e64910-config\") pod \"ebf4ebf5-adc0-48b4-b80b-0b0d88e64910\" (UID: \"ebf4ebf5-adc0-48b4-b80b-0b0d88e64910\") "
Mar 09 09:25:23 crc kubenswrapper[4861]: I0309 09:25:23.605754 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ebf4ebf5-adc0-48b4-b80b-0b0d88e64910-ovsdbserver-nb\") pod \"ebf4ebf5-adc0-48b4-b80b-0b0d88e64910\" (UID: \"ebf4ebf5-adc0-48b4-b80b-0b0d88e64910\") "
Mar 09 09:25:23 crc kubenswrapper[4861]: I0309 09:25:23.605833 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ebf4ebf5-adc0-48b4-b80b-0b0d88e64910-dns-swift-storage-0\") pod \"ebf4ebf5-adc0-48b4-b80b-0b0d88e64910\" (UID: \"ebf4ebf5-adc0-48b4-b80b-0b0d88e64910\") "
Mar 09 09:25:23 crc kubenswrapper[4861]: I0309 09:25:23.605892 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ebf4ebf5-adc0-48b4-b80b-0b0d88e64910-dns-svc\") pod \"ebf4ebf5-adc0-48b4-b80b-0b0d88e64910\" (UID: \"ebf4ebf5-adc0-48b4-b80b-0b0d88e64910\") "
Mar 09 09:25:23 crc kubenswrapper[4861]: I0309 09:25:23.645967 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebf4ebf5-adc0-48b4-b80b-0b0d88e64910-kube-api-access-qhcns" (OuterVolumeSpecName: "kube-api-access-qhcns") pod "ebf4ebf5-adc0-48b4-b80b-0b0d88e64910" (UID: "ebf4ebf5-adc0-48b4-b80b-0b0d88e64910"). InnerVolumeSpecName "kube-api-access-qhcns". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:25:23 crc kubenswrapper[4861]: I0309 09:25:23.709185 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qhcns\" (UniqueName: \"kubernetes.io/projected/ebf4ebf5-adc0-48b4-b80b-0b0d88e64910-kube-api-access-qhcns\") on node \"crc\" DevicePath \"\""
Mar 09 09:25:23 crc kubenswrapper[4861]: I0309 09:25:23.796980 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-k46zw"
Mar 09 09:25:23 crc kubenswrapper[4861]: I0309 09:25:23.801388 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-ccd7c9f8f-j7b59"
Mar 09 09:25:23 crc kubenswrapper[4861]: I0309 09:25:23.883249 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5cc5bc567f-7k86v" event={"ID":"596fb22d-649e-4e00-b847-71b506786832","Type":"ContainerStarted","Data":"838a9b9bc47e3187d006a5de27c27c93c21223223a8e5d0132e973398b2ae949"}
Mar 09 09:25:23 crc kubenswrapper[4861]: I0309 09:25:23.883304 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-k46zw" event={"ID":"c9820d89-3a89-4982-8520-f23dd0d099ad","Type":"ContainerDied","Data":"86fea03e1d5fad3480ce069e34284c1552eba9c6e60d62cca203f18c0a193adf"}
Mar 09 09:25:23 crc kubenswrapper[4861]: I0309 09:25:23.883320 4861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="86fea03e1d5fad3480ce069e34284c1552eba9c6e60d62cca203f18c0a193adf"
Mar 09 09:25:23 crc kubenswrapper[4861]: I0309 09:25:23.883329 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-ccd7c9f8f-j7b59" event={"ID":"ebf4ebf5-adc0-48b4-b80b-0b0d88e64910","Type":"ContainerDied","Data":"2f89109730897ac555a52f2c10697385d9bd7efabbb87dd18441d5e0f0ce854b"}
Mar 09 09:25:23 crc kubenswrapper[4861]: I0309 09:25:23.883362 4861 scope.go:117] "RemoveContainer" containerID="2f04fc77f99198d1100f898ca09e71671b5dce06e42013b8980d3efc2cff2c97"
Mar 09 09:25:23 crc kubenswrapper[4861]: I0309 09:25:23.934551 4861 scope.go:117] "RemoveContainer" containerID="9a8ce14b29f228323a6005799d32951adce6e6653badb4b08b86fea8ceed6624"
Mar 09 09:25:23 crc kubenswrapper[4861]: I0309 09:25:23.940937 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ebf4ebf5-adc0-48b4-b80b-0b0d88e64910-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ebf4ebf5-adc0-48b4-b80b-0b0d88e64910" (UID: "ebf4ebf5-adc0-48b4-b80b-0b0d88e64910"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:25:23 crc kubenswrapper[4861]: I0309 09:25:23.959313 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ebf4ebf5-adc0-48b4-b80b-0b0d88e64910-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ebf4ebf5-adc0-48b4-b80b-0b0d88e64910" (UID: "ebf4ebf5-adc0-48b4-b80b-0b0d88e64910"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:25:24 crc kubenswrapper[4861]: I0309 09:25:24.018848 4861 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ebf4ebf5-adc0-48b4-b80b-0b0d88e64910-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 09 09:25:24 crc kubenswrapper[4861]: I0309 09:25:24.019096 4861 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ebf4ebf5-adc0-48b4-b80b-0b0d88e64910-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 09 09:25:24 crc kubenswrapper[4861]: I0309 09:25:24.047786 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ebf4ebf5-adc0-48b4-b80b-0b0d88e64910-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "ebf4ebf5-adc0-48b4-b80b-0b0d88e64910" (UID: "ebf4ebf5-adc0-48b4-b80b-0b0d88e64910"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:25:24 crc kubenswrapper[4861]: I0309 09:25:24.054027 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ebf4ebf5-adc0-48b4-b80b-0b0d88e64910-config" (OuterVolumeSpecName: "config") pod "ebf4ebf5-adc0-48b4-b80b-0b0d88e64910" (UID: "ebf4ebf5-adc0-48b4-b80b-0b0d88e64910"). InnerVolumeSpecName "config".
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:25:24 crc kubenswrapper[4861]: I0309 09:25:24.055563 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ebf4ebf5-adc0-48b4-b80b-0b0d88e64910-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ebf4ebf5-adc0-48b4-b80b-0b0d88e64910" (UID: "ebf4ebf5-adc0-48b4-b80b-0b0d88e64910"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:25:24 crc kubenswrapper[4861]: I0309 09:25:24.120531 4861 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebf4ebf5-adc0-48b4-b80b-0b0d88e64910-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:25:24 crc kubenswrapper[4861]: I0309 09:25:24.120931 4861 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ebf4ebf5-adc0-48b4-b80b-0b0d88e64910-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 09 09:25:24 crc kubenswrapper[4861]: I0309 09:25:24.120946 4861 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ebf4ebf5-adc0-48b4-b80b-0b0d88e64910-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 09 09:25:24 crc kubenswrapper[4861]: I0309 09:25:24.147840 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-ccd7c9f8f-j7b59"] Mar 09 09:25:24 crc kubenswrapper[4861]: I0309 09:25:24.155443 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-ccd7c9f8f-j7b59"] Mar 09 09:25:24 crc kubenswrapper[4861]: I0309 09:25:24.390818 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-678fb94c4b-9x5d2"] Mar 09 09:25:24 crc kubenswrapper[4861]: E0309 09:25:24.391222 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebf4ebf5-adc0-48b4-b80b-0b0d88e64910" containerName="dnsmasq-dns" Mar 09 09:25:24 
crc kubenswrapper[4861]: I0309 09:25:24.391241 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebf4ebf5-adc0-48b4-b80b-0b0d88e64910" containerName="dnsmasq-dns" Mar 09 09:25:24 crc kubenswrapper[4861]: E0309 09:25:24.391278 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9820d89-3a89-4982-8520-f23dd0d099ad" containerName="placement-db-sync" Mar 09 09:25:24 crc kubenswrapper[4861]: I0309 09:25:24.391288 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9820d89-3a89-4982-8520-f23dd0d099ad" containerName="placement-db-sync" Mar 09 09:25:24 crc kubenswrapper[4861]: E0309 09:25:24.391315 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebf4ebf5-adc0-48b4-b80b-0b0d88e64910" containerName="init" Mar 09 09:25:24 crc kubenswrapper[4861]: I0309 09:25:24.391325 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebf4ebf5-adc0-48b4-b80b-0b0d88e64910" containerName="init" Mar 09 09:25:24 crc kubenswrapper[4861]: I0309 09:25:24.395766 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebf4ebf5-adc0-48b4-b80b-0b0d88e64910" containerName="dnsmasq-dns" Mar 09 09:25:24 crc kubenswrapper[4861]: I0309 09:25:24.395822 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9820d89-3a89-4982-8520-f23dd0d099ad" containerName="placement-db-sync" Mar 09 09:25:24 crc kubenswrapper[4861]: I0309 09:25:24.397310 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-678fb94c4b-9x5d2" Mar 09 09:25:24 crc kubenswrapper[4861]: I0309 09:25:24.402807 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Mar 09 09:25:24 crc kubenswrapper[4861]: I0309 09:25:24.403669 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 09 09:25:24 crc kubenswrapper[4861]: I0309 09:25:24.403777 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 09 09:25:24 crc kubenswrapper[4861]: I0309 09:25:24.403920 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-6mklb" Mar 09 09:25:24 crc kubenswrapper[4861]: I0309 09:25:24.404038 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Mar 09 09:25:24 crc kubenswrapper[4861]: I0309 09:25:24.404569 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-678fb94c4b-9x5d2"] Mar 09 09:25:24 crc kubenswrapper[4861]: I0309 09:25:24.429801 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80b2797f-285c-4a23-9385-b4845acb2820-combined-ca-bundle\") pod \"placement-678fb94c4b-9x5d2\" (UID: \"80b2797f-285c-4a23-9385-b4845acb2820\") " pod="openstack/placement-678fb94c4b-9x5d2" Mar 09 09:25:24 crc kubenswrapper[4861]: I0309 09:25:24.429854 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80b2797f-285c-4a23-9385-b4845acb2820-config-data\") pod \"placement-678fb94c4b-9x5d2\" (UID: \"80b2797f-285c-4a23-9385-b4845acb2820\") " pod="openstack/placement-678fb94c4b-9x5d2" Mar 09 09:25:24 crc kubenswrapper[4861]: I0309 09:25:24.430042 4861 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/80b2797f-285c-4a23-9385-b4845acb2820-scripts\") pod \"placement-678fb94c4b-9x5d2\" (UID: \"80b2797f-285c-4a23-9385-b4845acb2820\") " pod="openstack/placement-678fb94c4b-9x5d2" Mar 09 09:25:24 crc kubenswrapper[4861]: I0309 09:25:24.430147 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/80b2797f-285c-4a23-9385-b4845acb2820-internal-tls-certs\") pod \"placement-678fb94c4b-9x5d2\" (UID: \"80b2797f-285c-4a23-9385-b4845acb2820\") " pod="openstack/placement-678fb94c4b-9x5d2" Mar 09 09:25:24 crc kubenswrapper[4861]: I0309 09:25:24.430282 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/80b2797f-285c-4a23-9385-b4845acb2820-logs\") pod \"placement-678fb94c4b-9x5d2\" (UID: \"80b2797f-285c-4a23-9385-b4845acb2820\") " pod="openstack/placement-678fb94c4b-9x5d2" Mar 09 09:25:24 crc kubenswrapper[4861]: I0309 09:25:24.430360 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/80b2797f-285c-4a23-9385-b4845acb2820-public-tls-certs\") pod \"placement-678fb94c4b-9x5d2\" (UID: \"80b2797f-285c-4a23-9385-b4845acb2820\") " pod="openstack/placement-678fb94c4b-9x5d2" Mar 09 09:25:24 crc kubenswrapper[4861]: I0309 09:25:24.431009 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btgpk\" (UniqueName: \"kubernetes.io/projected/80b2797f-285c-4a23-9385-b4845acb2820-kube-api-access-btgpk\") pod \"placement-678fb94c4b-9x5d2\" (UID: \"80b2797f-285c-4a23-9385-b4845acb2820\") " pod="openstack/placement-678fb94c4b-9x5d2" Mar 09 09:25:24 crc kubenswrapper[4861]: I0309 09:25:24.532590 4861 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/80b2797f-285c-4a23-9385-b4845acb2820-public-tls-certs\") pod \"placement-678fb94c4b-9x5d2\" (UID: \"80b2797f-285c-4a23-9385-b4845acb2820\") " pod="openstack/placement-678fb94c4b-9x5d2" Mar 09 09:25:24 crc kubenswrapper[4861]: I0309 09:25:24.532661 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btgpk\" (UniqueName: \"kubernetes.io/projected/80b2797f-285c-4a23-9385-b4845acb2820-kube-api-access-btgpk\") pod \"placement-678fb94c4b-9x5d2\" (UID: \"80b2797f-285c-4a23-9385-b4845acb2820\") " pod="openstack/placement-678fb94c4b-9x5d2" Mar 09 09:25:24 crc kubenswrapper[4861]: I0309 09:25:24.532706 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80b2797f-285c-4a23-9385-b4845acb2820-combined-ca-bundle\") pod \"placement-678fb94c4b-9x5d2\" (UID: \"80b2797f-285c-4a23-9385-b4845acb2820\") " pod="openstack/placement-678fb94c4b-9x5d2" Mar 09 09:25:24 crc kubenswrapper[4861]: I0309 09:25:24.532732 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80b2797f-285c-4a23-9385-b4845acb2820-config-data\") pod \"placement-678fb94c4b-9x5d2\" (UID: \"80b2797f-285c-4a23-9385-b4845acb2820\") " pod="openstack/placement-678fb94c4b-9x5d2" Mar 09 09:25:24 crc kubenswrapper[4861]: I0309 09:25:24.532777 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/80b2797f-285c-4a23-9385-b4845acb2820-scripts\") pod \"placement-678fb94c4b-9x5d2\" (UID: \"80b2797f-285c-4a23-9385-b4845acb2820\") " pod="openstack/placement-678fb94c4b-9x5d2" Mar 09 09:25:24 crc kubenswrapper[4861]: I0309 09:25:24.532815 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/80b2797f-285c-4a23-9385-b4845acb2820-internal-tls-certs\") pod \"placement-678fb94c4b-9x5d2\" (UID: \"80b2797f-285c-4a23-9385-b4845acb2820\") " pod="openstack/placement-678fb94c4b-9x5d2" Mar 09 09:25:24 crc kubenswrapper[4861]: I0309 09:25:24.532863 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/80b2797f-285c-4a23-9385-b4845acb2820-logs\") pod \"placement-678fb94c4b-9x5d2\" (UID: \"80b2797f-285c-4a23-9385-b4845acb2820\") " pod="openstack/placement-678fb94c4b-9x5d2" Mar 09 09:25:24 crc kubenswrapper[4861]: I0309 09:25:24.533295 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/80b2797f-285c-4a23-9385-b4845acb2820-logs\") pod \"placement-678fb94c4b-9x5d2\" (UID: \"80b2797f-285c-4a23-9385-b4845acb2820\") " pod="openstack/placement-678fb94c4b-9x5d2" Mar 09 09:25:24 crc kubenswrapper[4861]: I0309 09:25:24.538557 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/80b2797f-285c-4a23-9385-b4845acb2820-public-tls-certs\") pod \"placement-678fb94c4b-9x5d2\" (UID: \"80b2797f-285c-4a23-9385-b4845acb2820\") " pod="openstack/placement-678fb94c4b-9x5d2" Mar 09 09:25:24 crc kubenswrapper[4861]: I0309 09:25:24.539877 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/80b2797f-285c-4a23-9385-b4845acb2820-internal-tls-certs\") pod \"placement-678fb94c4b-9x5d2\" (UID: \"80b2797f-285c-4a23-9385-b4845acb2820\") " pod="openstack/placement-678fb94c4b-9x5d2" Mar 09 09:25:24 crc kubenswrapper[4861]: I0309 09:25:24.541318 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80b2797f-285c-4a23-9385-b4845acb2820-combined-ca-bundle\") pod 
\"placement-678fb94c4b-9x5d2\" (UID: \"80b2797f-285c-4a23-9385-b4845acb2820\") " pod="openstack/placement-678fb94c4b-9x5d2" Mar 09 09:25:24 crc kubenswrapper[4861]: I0309 09:25:24.542089 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80b2797f-285c-4a23-9385-b4845acb2820-config-data\") pod \"placement-678fb94c4b-9x5d2\" (UID: \"80b2797f-285c-4a23-9385-b4845acb2820\") " pod="openstack/placement-678fb94c4b-9x5d2" Mar 09 09:25:24 crc kubenswrapper[4861]: I0309 09:25:24.546987 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/80b2797f-285c-4a23-9385-b4845acb2820-scripts\") pod \"placement-678fb94c4b-9x5d2\" (UID: \"80b2797f-285c-4a23-9385-b4845acb2820\") " pod="openstack/placement-678fb94c4b-9x5d2" Mar 09 09:25:24 crc kubenswrapper[4861]: I0309 09:25:24.552931 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btgpk\" (UniqueName: \"kubernetes.io/projected/80b2797f-285c-4a23-9385-b4845acb2820-kube-api-access-btgpk\") pod \"placement-678fb94c4b-9x5d2\" (UID: \"80b2797f-285c-4a23-9385-b4845acb2820\") " pod="openstack/placement-678fb94c4b-9x5d2" Mar 09 09:25:24 crc kubenswrapper[4861]: I0309 09:25:24.606128 4861 patch_prober.go:28] interesting pod/machine-config-daemon-5g7gc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 09:25:24 crc kubenswrapper[4861]: I0309 09:25:24.606476 4861 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 
09:25:24 crc kubenswrapper[4861]: I0309 09:25:24.771543 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-678fb94c4b-9x5d2" Mar 09 09:25:24 crc kubenswrapper[4861]: I0309 09:25:24.846937 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3b893f89-a9bc-4a39-bd26-b394cbb0a374","Type":"ContainerStarted","Data":"aaf3ac9de5dfcab3f64973950fb19a6ea2cf57e74602e7bedfbcec4ae460e46e"} Mar 09 09:25:24 crc kubenswrapper[4861]: I0309 09:25:24.858175 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"628d8d0c-948a-4878-ac3f-d1c35befe1d0","Type":"ContainerStarted","Data":"69366dc012006a67b0c8071919516dc593bdd15df6f0b37444ccb5fd57bd1e59"} Mar 09 09:25:24 crc kubenswrapper[4861]: I0309 09:25:24.862040 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5cc5bc567f-7k86v" event={"ID":"596fb22d-649e-4e00-b847-71b506786832","Type":"ContainerStarted","Data":"f6e5ec2495d448b1017e2e47acd7c40f91a4bba4cefe29caca735cf2e8ac4d80"} Mar 09 09:25:24 crc kubenswrapper[4861]: I0309 09:25:24.862216 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-5cc5bc567f-7k86v" Mar 09 09:25:24 crc kubenswrapper[4861]: I0309 09:25:24.867276 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-8wrdd" event={"ID":"44c35b48-50b9-4dd8-846a-99714c14d3ab","Type":"ContainerStarted","Data":"29b2b6555a7395967ad28cf5a8555c4db17c2735decaf0c6d5b435f52f5c0ac1"} Mar 09 09:25:24 crc kubenswrapper[4861]: I0309 09:25:24.883981 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=8.883964409 podStartE2EDuration="8.883964409s" podCreationTimestamp="2026-03-09 09:25:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-09 09:25:24.88294375 +0000 UTC m=+1167.967983161" watchObservedRunningTime="2026-03-09 09:25:24.883964409 +0000 UTC m=+1167.969003810" Mar 09 09:25:24 crc kubenswrapper[4861]: I0309 09:25:24.933598 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-8wrdd" podStartSLOduration=4.190227402 podStartE2EDuration="45.933578288s" podCreationTimestamp="2026-03-09 09:24:39 +0000 UTC" firstStartedPulling="2026-03-09 09:24:41.324481684 +0000 UTC m=+1124.409521085" lastFinishedPulling="2026-03-09 09:25:23.06783257 +0000 UTC m=+1166.152871971" observedRunningTime="2026-03-09 09:25:24.914595833 +0000 UTC m=+1167.999635244" watchObservedRunningTime="2026-03-09 09:25:24.933578288 +0000 UTC m=+1168.018617689" Mar 09 09:25:24 crc kubenswrapper[4861]: I0309 09:25:24.955708 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-5cc5bc567f-7k86v" podStartSLOduration=3.955690463 podStartE2EDuration="3.955690463s" podCreationTimestamp="2026-03-09 09:25:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:25:24.942732304 +0000 UTC m=+1168.027771725" watchObservedRunningTime="2026-03-09 09:25:24.955690463 +0000 UTC m=+1168.040729864" Mar 09 09:25:25 crc kubenswrapper[4861]: I0309 09:25:25.297738 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-678fb94c4b-9x5d2"] Mar 09 09:25:25 crc kubenswrapper[4861]: I0309 09:25:25.674897 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ebf4ebf5-adc0-48b4-b80b-0b0d88e64910" path="/var/lib/kubelet/pods/ebf4ebf5-adc0-48b4-b80b-0b0d88e64910/volumes" Mar 09 09:25:25 crc kubenswrapper[4861]: I0309 09:25:25.878174 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-sr27s" 
event={"ID":"deb8e24b-1a6f-4173-9a5f-62974b0331a5","Type":"ContainerStarted","Data":"a1b48cb736714ed720e92ad5dd691e482e8fb6a283685c7dcd91aabfe74ed413"} Mar 09 09:25:25 crc kubenswrapper[4861]: I0309 09:25:25.880799 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-678fb94c4b-9x5d2" event={"ID":"80b2797f-285c-4a23-9385-b4845acb2820","Type":"ContainerStarted","Data":"304f29203288f519c318547b054654530d46f5440bce1f8faca431fd2db37581"} Mar 09 09:25:25 crc kubenswrapper[4861]: I0309 09:25:25.880847 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-678fb94c4b-9x5d2" event={"ID":"80b2797f-285c-4a23-9385-b4845acb2820","Type":"ContainerStarted","Data":"945011aca918c5871e9e2d8feb605009f04f18e753b83bf22102745cc2896094"} Mar 09 09:25:25 crc kubenswrapper[4861]: I0309 09:25:25.880859 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-678fb94c4b-9x5d2" event={"ID":"80b2797f-285c-4a23-9385-b4845acb2820","Type":"ContainerStarted","Data":"5c818059086488a516772903b3f3992a57d4f63a0d0b5af335a4ada3fdf94562"} Mar 09 09:25:25 crc kubenswrapper[4861]: I0309 09:25:25.882211 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-678fb94c4b-9x5d2" Mar 09 09:25:25 crc kubenswrapper[4861]: I0309 09:25:25.882328 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-678fb94c4b-9x5d2" Mar 09 09:25:25 crc kubenswrapper[4861]: I0309 09:25:25.920146 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-sr27s" podStartSLOduration=2.503629954 podStartE2EDuration="46.920128478s" podCreationTimestamp="2026-03-09 09:24:39 +0000 UTC" firstStartedPulling="2026-03-09 09:24:40.701527588 +0000 UTC m=+1123.786566989" lastFinishedPulling="2026-03-09 09:25:25.118026112 +0000 UTC m=+1168.203065513" observedRunningTime="2026-03-09 09:25:25.897016093 +0000 UTC m=+1168.982055484" 
watchObservedRunningTime="2026-03-09 09:25:25.920128478 +0000 UTC m=+1169.005167889" Mar 09 09:25:25 crc kubenswrapper[4861]: I0309 09:25:25.923445 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-678fb94c4b-9x5d2" podStartSLOduration=1.923427855 podStartE2EDuration="1.923427855s" podCreationTimestamp="2026-03-09 09:25:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:25:25.914446782 +0000 UTC m=+1168.999486203" watchObservedRunningTime="2026-03-09 09:25:25.923427855 +0000 UTC m=+1169.008467266" Mar 09 09:25:27 crc kubenswrapper[4861]: I0309 09:25:27.150026 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 09 09:25:27 crc kubenswrapper[4861]: I0309 09:25:27.150495 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 09 09:25:27 crc kubenswrapper[4861]: I0309 09:25:27.195743 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 09 09:25:27 crc kubenswrapper[4861]: I0309 09:25:27.199645 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 09 09:25:27 crc kubenswrapper[4861]: I0309 09:25:27.901965 4861 generic.go:334] "Generic (PLEG): container finished" podID="44c35b48-50b9-4dd8-846a-99714c14d3ab" containerID="29b2b6555a7395967ad28cf5a8555c4db17c2735decaf0c6d5b435f52f5c0ac1" exitCode=0 Mar 09 09:25:27 crc kubenswrapper[4861]: I0309 09:25:27.902064 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-8wrdd" event={"ID":"44c35b48-50b9-4dd8-846a-99714c14d3ab","Type":"ContainerDied","Data":"29b2b6555a7395967ad28cf5a8555c4db17c2735decaf0c6d5b435f52f5c0ac1"} Mar 09 09:25:27 crc kubenswrapper[4861]: I0309 
09:25:27.902386 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 09 09:25:27 crc kubenswrapper[4861]: I0309 09:25:27.902411 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 09 09:25:28 crc kubenswrapper[4861]: I0309 09:25:28.855017 4861 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-7bb4db8c4-sxjc7" podUID="71492031-e589-409e-b8c8-c0a1194b97ed" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.153:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.153:8443: connect: connection refused" Mar 09 09:25:28 crc kubenswrapper[4861]: I0309 09:25:28.990933 4861 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-5487f4d458-lnthc" podUID="9049886d-2460-47fe-ac82-2dfde4858bd0" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.154:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.154:8443: connect: connection refused" Mar 09 09:25:29 crc kubenswrapper[4861]: I0309 09:25:29.946803 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 09 09:25:30 crc kubenswrapper[4861]: I0309 09:25:30.932050 4861 generic.go:334] "Generic (PLEG): container finished" podID="deb8e24b-1a6f-4173-9a5f-62974b0331a5" containerID="a1b48cb736714ed720e92ad5dd691e482e8fb6a283685c7dcd91aabfe74ed413" exitCode=0 Mar 09 09:25:30 crc kubenswrapper[4861]: I0309 09:25:30.932104 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-sr27s" event={"ID":"deb8e24b-1a6f-4173-9a5f-62974b0331a5","Type":"ContainerDied","Data":"a1b48cb736714ed720e92ad5dd691e482e8fb6a283685c7dcd91aabfe74ed413"} Mar 09 09:25:32 crc kubenswrapper[4861]: I0309 09:25:32.038131 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/glance-default-external-api-0" Mar 09 09:25:33 crc kubenswrapper[4861]: I0309 09:25:33.275278 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-sr27s" Mar 09 09:25:33 crc kubenswrapper[4861]: I0309 09:25:33.281526 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-8wrdd" Mar 09 09:25:33 crc kubenswrapper[4861]: I0309 09:25:33.425021 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/deb8e24b-1a6f-4173-9a5f-62974b0331a5-scripts\") pod \"deb8e24b-1a6f-4173-9a5f-62974b0331a5\" (UID: \"deb8e24b-1a6f-4173-9a5f-62974b0331a5\") " Mar 09 09:25:33 crc kubenswrapper[4861]: I0309 09:25:33.425111 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/deb8e24b-1a6f-4173-9a5f-62974b0331a5-db-sync-config-data\") pod \"deb8e24b-1a6f-4173-9a5f-62974b0331a5\" (UID: \"deb8e24b-1a6f-4173-9a5f-62974b0331a5\") " Mar 09 09:25:33 crc kubenswrapper[4861]: I0309 09:25:33.425162 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/44c35b48-50b9-4dd8-846a-99714c14d3ab-db-sync-config-data\") pod \"44c35b48-50b9-4dd8-846a-99714c14d3ab\" (UID: \"44c35b48-50b9-4dd8-846a-99714c14d3ab\") " Mar 09 09:25:33 crc kubenswrapper[4861]: I0309 09:25:33.425236 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/deb8e24b-1a6f-4173-9a5f-62974b0331a5-etc-machine-id\") pod \"deb8e24b-1a6f-4173-9a5f-62974b0331a5\" (UID: \"deb8e24b-1a6f-4173-9a5f-62974b0331a5\") " Mar 09 09:25:33 crc kubenswrapper[4861]: I0309 09:25:33.425275 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/deb8e24b-1a6f-4173-9a5f-62974b0331a5-combined-ca-bundle\") pod \"deb8e24b-1a6f-4173-9a5f-62974b0331a5\" (UID: \"deb8e24b-1a6f-4173-9a5f-62974b0331a5\") " Mar 09 09:25:33 crc kubenswrapper[4861]: I0309 09:25:33.425350 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44c35b48-50b9-4dd8-846a-99714c14d3ab-combined-ca-bundle\") pod \"44c35b48-50b9-4dd8-846a-99714c14d3ab\" (UID: \"44c35b48-50b9-4dd8-846a-99714c14d3ab\") " Mar 09 09:25:33 crc kubenswrapper[4861]: I0309 09:25:33.425480 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4drhb\" (UniqueName: \"kubernetes.io/projected/44c35b48-50b9-4dd8-846a-99714c14d3ab-kube-api-access-4drhb\") pod \"44c35b48-50b9-4dd8-846a-99714c14d3ab\" (UID: \"44c35b48-50b9-4dd8-846a-99714c14d3ab\") " Mar 09 09:25:33 crc kubenswrapper[4861]: I0309 09:25:33.425548 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vtc75\" (UniqueName: \"kubernetes.io/projected/deb8e24b-1a6f-4173-9a5f-62974b0331a5-kube-api-access-vtc75\") pod \"deb8e24b-1a6f-4173-9a5f-62974b0331a5\" (UID: \"deb8e24b-1a6f-4173-9a5f-62974b0331a5\") " Mar 09 09:25:33 crc kubenswrapper[4861]: I0309 09:25:33.425581 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/deb8e24b-1a6f-4173-9a5f-62974b0331a5-config-data\") pod \"deb8e24b-1a6f-4173-9a5f-62974b0331a5\" (UID: \"deb8e24b-1a6f-4173-9a5f-62974b0331a5\") " Mar 09 09:25:33 crc kubenswrapper[4861]: I0309 09:25:33.426024 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/deb8e24b-1a6f-4173-9a5f-62974b0331a5-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "deb8e24b-1a6f-4173-9a5f-62974b0331a5" (UID: "deb8e24b-1a6f-4173-9a5f-62974b0331a5"). 
InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 09:25:33 crc kubenswrapper[4861]: I0309 09:25:33.427948 4861 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/deb8e24b-1a6f-4173-9a5f-62974b0331a5-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 09 09:25:33 crc kubenswrapper[4861]: I0309 09:25:33.431251 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/deb8e24b-1a6f-4173-9a5f-62974b0331a5-kube-api-access-vtc75" (OuterVolumeSpecName: "kube-api-access-vtc75") pod "deb8e24b-1a6f-4173-9a5f-62974b0331a5" (UID: "deb8e24b-1a6f-4173-9a5f-62974b0331a5"). InnerVolumeSpecName "kube-api-access-vtc75". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:25:33 crc kubenswrapper[4861]: I0309 09:25:33.431596 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44c35b48-50b9-4dd8-846a-99714c14d3ab-kube-api-access-4drhb" (OuterVolumeSpecName: "kube-api-access-4drhb") pod "44c35b48-50b9-4dd8-846a-99714c14d3ab" (UID: "44c35b48-50b9-4dd8-846a-99714c14d3ab"). InnerVolumeSpecName "kube-api-access-4drhb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:25:33 crc kubenswrapper[4861]: I0309 09:25:33.432233 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44c35b48-50b9-4dd8-846a-99714c14d3ab-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "44c35b48-50b9-4dd8-846a-99714c14d3ab" (UID: "44c35b48-50b9-4dd8-846a-99714c14d3ab"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:25:33 crc kubenswrapper[4861]: I0309 09:25:33.432501 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/deb8e24b-1a6f-4173-9a5f-62974b0331a5-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "deb8e24b-1a6f-4173-9a5f-62974b0331a5" (UID: "deb8e24b-1a6f-4173-9a5f-62974b0331a5"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:25:33 crc kubenswrapper[4861]: I0309 09:25:33.436768 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/deb8e24b-1a6f-4173-9a5f-62974b0331a5-scripts" (OuterVolumeSpecName: "scripts") pod "deb8e24b-1a6f-4173-9a5f-62974b0331a5" (UID: "deb8e24b-1a6f-4173-9a5f-62974b0331a5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:25:33 crc kubenswrapper[4861]: I0309 09:25:33.459625 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44c35b48-50b9-4dd8-846a-99714c14d3ab-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "44c35b48-50b9-4dd8-846a-99714c14d3ab" (UID: "44c35b48-50b9-4dd8-846a-99714c14d3ab"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:25:33 crc kubenswrapper[4861]: I0309 09:25:33.467107 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/deb8e24b-1a6f-4173-9a5f-62974b0331a5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "deb8e24b-1a6f-4173-9a5f-62974b0331a5" (UID: "deb8e24b-1a6f-4173-9a5f-62974b0331a5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:25:33 crc kubenswrapper[4861]: I0309 09:25:33.517382 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/deb8e24b-1a6f-4173-9a5f-62974b0331a5-config-data" (OuterVolumeSpecName: "config-data") pod "deb8e24b-1a6f-4173-9a5f-62974b0331a5" (UID: "deb8e24b-1a6f-4173-9a5f-62974b0331a5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:25:33 crc kubenswrapper[4861]: I0309 09:25:33.530483 4861 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/deb8e24b-1a6f-4173-9a5f-62974b0331a5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 09:25:33 crc kubenswrapper[4861]: I0309 09:25:33.530540 4861 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44c35b48-50b9-4dd8-846a-99714c14d3ab-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 09:25:33 crc kubenswrapper[4861]: I0309 09:25:33.530554 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4drhb\" (UniqueName: \"kubernetes.io/projected/44c35b48-50b9-4dd8-846a-99714c14d3ab-kube-api-access-4drhb\") on node \"crc\" DevicePath \"\"" Mar 09 09:25:33 crc kubenswrapper[4861]: I0309 09:25:33.530567 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vtc75\" (UniqueName: \"kubernetes.io/projected/deb8e24b-1a6f-4173-9a5f-62974b0331a5-kube-api-access-vtc75\") on node \"crc\" DevicePath \"\"" Mar 09 09:25:33 crc kubenswrapper[4861]: I0309 09:25:33.530578 4861 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/deb8e24b-1a6f-4173-9a5f-62974b0331a5-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 09:25:33 crc kubenswrapper[4861]: I0309 09:25:33.530585 4861 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/deb8e24b-1a6f-4173-9a5f-62974b0331a5-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:25:33 crc kubenswrapper[4861]: I0309 09:25:33.530593 4861 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/deb8e24b-1a6f-4173-9a5f-62974b0331a5-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 09:25:33 crc kubenswrapper[4861]: I0309 09:25:33.530601 4861 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/44c35b48-50b9-4dd8-846a-99714c14d3ab-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 09:25:33 crc kubenswrapper[4861]: I0309 09:25:33.957991 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-sr27s" event={"ID":"deb8e24b-1a6f-4173-9a5f-62974b0331a5","Type":"ContainerDied","Data":"d1519d1ea655a3e17773abb735d5d6d4849433a94b524594ee840073b82eb9fa"} Mar 09 09:25:33 crc kubenswrapper[4861]: I0309 09:25:33.958054 4861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d1519d1ea655a3e17773abb735d5d6d4849433a94b524594ee840073b82eb9fa" Mar 09 09:25:33 crc kubenswrapper[4861]: I0309 09:25:33.958013 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-sr27s" Mar 09 09:25:33 crc kubenswrapper[4861]: I0309 09:25:33.959643 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-8wrdd" event={"ID":"44c35b48-50b9-4dd8-846a-99714c14d3ab","Type":"ContainerDied","Data":"81e0a3d1569509e91b9bda5c83c35a42e9a081d1d969caa44317edbe2b0b1cb1"} Mar 09 09:25:33 crc kubenswrapper[4861]: I0309 09:25:33.959676 4861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="81e0a3d1569509e91b9bda5c83c35a42e9a081d1d969caa44317edbe2b0b1cb1" Mar 09 09:25:33 crc kubenswrapper[4861]: I0309 09:25:33.959718 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-8wrdd" Mar 09 09:25:33 crc kubenswrapper[4861]: I0309 09:25:33.963156 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3b893f89-a9bc-4a39-bd26-b394cbb0a374","Type":"ContainerStarted","Data":"c24b27beb8691b93bafbdd946919b6e0c5ec4a90322fb39f8123df16c4a121c2"} Mar 09 09:25:33 crc kubenswrapper[4861]: I0309 09:25:33.963321 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3b893f89-a9bc-4a39-bd26-b394cbb0a374" containerName="ceilometer-central-agent" containerID="cri-o://7c1078a757c55399b615079346367a3c7063e060d616a6c7295df26e889b255e" gracePeriod=30 Mar 09 09:25:33 crc kubenswrapper[4861]: I0309 09:25:33.963474 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 09 09:25:33 crc kubenswrapper[4861]: I0309 09:25:33.963549 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3b893f89-a9bc-4a39-bd26-b394cbb0a374" containerName="proxy-httpd" containerID="cri-o://c24b27beb8691b93bafbdd946919b6e0c5ec4a90322fb39f8123df16c4a121c2" gracePeriod=30 Mar 09 09:25:33 crc kubenswrapper[4861]: I0309 09:25:33.963615 4861 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3b893f89-a9bc-4a39-bd26-b394cbb0a374" containerName="sg-core" containerID="cri-o://aaf3ac9de5dfcab3f64973950fb19a6ea2cf57e74602e7bedfbcec4ae460e46e" gracePeriod=30 Mar 09 09:25:33 crc kubenswrapper[4861]: I0309 09:25:33.963687 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3b893f89-a9bc-4a39-bd26-b394cbb0a374" containerName="ceilometer-notification-agent" containerID="cri-o://e15fa3a3dc10d161f1fc033b6f91e18abcb8ed972cc20b55f2cdece20ece59af" gracePeriod=30 Mar 09 09:25:33 crc kubenswrapper[4861]: I0309 09:25:33.997161 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.00772131 podStartE2EDuration="54.997142993s" podCreationTimestamp="2026-03-09 09:24:39 +0000 UTC" firstStartedPulling="2026-03-09 09:24:41.327334647 +0000 UTC m=+1124.412374048" lastFinishedPulling="2026-03-09 09:25:33.31675633 +0000 UTC m=+1176.401795731" observedRunningTime="2026-03-09 09:25:33.996931707 +0000 UTC m=+1177.081971108" watchObservedRunningTime="2026-03-09 09:25:33.997142993 +0000 UTC m=+1177.082182394" Mar 09 09:25:34 crc kubenswrapper[4861]: I0309 09:25:34.599305 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 09 09:25:34 crc kubenswrapper[4861]: E0309 09:25:34.599916 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44c35b48-50b9-4dd8-846a-99714c14d3ab" containerName="barbican-db-sync" Mar 09 09:25:34 crc kubenswrapper[4861]: I0309 09:25:34.599928 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="44c35b48-50b9-4dd8-846a-99714c14d3ab" containerName="barbican-db-sync" Mar 09 09:25:34 crc kubenswrapper[4861]: E0309 09:25:34.599950 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="deb8e24b-1a6f-4173-9a5f-62974b0331a5" containerName="cinder-db-sync" Mar 
09 09:25:34 crc kubenswrapper[4861]: I0309 09:25:34.599956 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="deb8e24b-1a6f-4173-9a5f-62974b0331a5" containerName="cinder-db-sync" Mar 09 09:25:34 crc kubenswrapper[4861]: I0309 09:25:34.600117 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="44c35b48-50b9-4dd8-846a-99714c14d3ab" containerName="barbican-db-sync" Mar 09 09:25:34 crc kubenswrapper[4861]: I0309 09:25:34.600136 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="deb8e24b-1a6f-4173-9a5f-62974b0331a5" containerName="cinder-db-sync" Mar 09 09:25:34 crc kubenswrapper[4861]: I0309 09:25:34.601014 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 09 09:25:34 crc kubenswrapper[4861]: I0309 09:25:34.602965 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 09 09:25:34 crc kubenswrapper[4861]: I0309 09:25:34.603203 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 09 09:25:34 crc kubenswrapper[4861]: I0309 09:25:34.603458 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 09 09:25:34 crc kubenswrapper[4861]: I0309 09:25:34.603698 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-w94sx" Mar 09 09:25:34 crc kubenswrapper[4861]: I0309 09:25:34.659312 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-7b6689bdbc-t6phd"] Mar 09 09:25:34 crc kubenswrapper[4861]: I0309 09:25:34.667977 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-7b6689bdbc-t6phd" Mar 09 09:25:34 crc kubenswrapper[4861]: I0309 09:25:34.673004 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 09 09:25:34 crc kubenswrapper[4861]: I0309 09:25:34.673252 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Mar 09 09:25:34 crc kubenswrapper[4861]: I0309 09:25:34.673418 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-rqcfw" Mar 09 09:25:34 crc kubenswrapper[4861]: I0309 09:25:34.690258 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7b6689bdbc-t6phd"] Mar 09 09:25:34 crc kubenswrapper[4861]: I0309 09:25:34.704930 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 09 09:25:34 crc kubenswrapper[4861]: I0309 09:25:34.742412 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-7f66785d8-vkcmq"] Mar 09 09:25:34 crc kubenswrapper[4861]: I0309 09:25:34.744322 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-7f66785d8-vkcmq" Mar 09 09:25:34 crc kubenswrapper[4861]: I0309 09:25:34.751225 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Mar 09 09:25:34 crc kubenswrapper[4861]: I0309 09:25:34.754681 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31e86fa6-cfa9-488f-b87f-ea95b05f8531-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"31e86fa6-cfa9-488f-b87f-ea95b05f8531\") " pod="openstack/cinder-scheduler-0" Mar 09 09:25:34 crc kubenswrapper[4861]: I0309 09:25:34.754803 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/31e86fa6-cfa9-488f-b87f-ea95b05f8531-scripts\") pod \"cinder-scheduler-0\" (UID: \"31e86fa6-cfa9-488f-b87f-ea95b05f8531\") " pod="openstack/cinder-scheduler-0" Mar 09 09:25:34 crc kubenswrapper[4861]: I0309 09:25:34.754872 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42q9t\" (UniqueName: \"kubernetes.io/projected/31e86fa6-cfa9-488f-b87f-ea95b05f8531-kube-api-access-42q9t\") pod \"cinder-scheduler-0\" (UID: \"31e86fa6-cfa9-488f-b87f-ea95b05f8531\") " pod="openstack/cinder-scheduler-0" Mar 09 09:25:34 crc kubenswrapper[4861]: I0309 09:25:34.754921 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31e86fa6-cfa9-488f-b87f-ea95b05f8531-config-data\") pod \"cinder-scheduler-0\" (UID: \"31e86fa6-cfa9-488f-b87f-ea95b05f8531\") " pod="openstack/cinder-scheduler-0" Mar 09 09:25:34 crc kubenswrapper[4861]: I0309 09:25:34.754956 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" 
(UniqueName: \"kubernetes.io/host-path/31e86fa6-cfa9-488f-b87f-ea95b05f8531-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"31e86fa6-cfa9-488f-b87f-ea95b05f8531\") " pod="openstack/cinder-scheduler-0" Mar 09 09:25:34 crc kubenswrapper[4861]: I0309 09:25:34.754976 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/31e86fa6-cfa9-488f-b87f-ea95b05f8531-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"31e86fa6-cfa9-488f-b87f-ea95b05f8531\") " pod="openstack/cinder-scheduler-0" Mar 09 09:25:34 crc kubenswrapper[4861]: I0309 09:25:34.812509 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-7f66785d8-vkcmq"] Mar 09 09:25:34 crc kubenswrapper[4861]: I0309 09:25:34.832142 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-69579b58d9-pzk8q"] Mar 09 09:25:34 crc kubenswrapper[4861]: I0309 09:25:34.833610 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-69579b58d9-pzk8q" Mar 09 09:25:34 crc kubenswrapper[4861]: I0309 09:25:34.851590 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-69579b58d9-pzk8q"] Mar 09 09:25:34 crc kubenswrapper[4861]: I0309 09:25:34.856991 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/31e86fa6-cfa9-488f-b87f-ea95b05f8531-scripts\") pod \"cinder-scheduler-0\" (UID: \"31e86fa6-cfa9-488f-b87f-ea95b05f8531\") " pod="openstack/cinder-scheduler-0" Mar 09 09:25:34 crc kubenswrapper[4861]: I0309 09:25:34.857067 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98c58\" (UniqueName: \"kubernetes.io/projected/8bc3e378-d567-4ba4-b135-1393faa1dbc6-kube-api-access-98c58\") pod \"barbican-keystone-listener-7f66785d8-vkcmq\" (UID: \"8bc3e378-d567-4ba4-b135-1393faa1dbc6\") " pod="openstack/barbican-keystone-listener-7f66785d8-vkcmq" Mar 09 09:25:34 crc kubenswrapper[4861]: I0309 09:25:34.857097 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/82a35d2d-6934-4c56-a62d-db22ac36a6be-logs\") pod \"barbican-worker-7b6689bdbc-t6phd\" (UID: \"82a35d2d-6934-4c56-a62d-db22ac36a6be\") " pod="openstack/barbican-worker-7b6689bdbc-t6phd" Mar 09 09:25:34 crc kubenswrapper[4861]: I0309 09:25:34.857141 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8bc3e378-d567-4ba4-b135-1393faa1dbc6-config-data-custom\") pod \"barbican-keystone-listener-7f66785d8-vkcmq\" (UID: \"8bc3e378-d567-4ba4-b135-1393faa1dbc6\") " pod="openstack/barbican-keystone-listener-7f66785d8-vkcmq" Mar 09 09:25:34 crc kubenswrapper[4861]: I0309 09:25:34.857204 4861 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-42q9t\" (UniqueName: \"kubernetes.io/projected/31e86fa6-cfa9-488f-b87f-ea95b05f8531-kube-api-access-42q9t\") pod \"cinder-scheduler-0\" (UID: \"31e86fa6-cfa9-488f-b87f-ea95b05f8531\") " pod="openstack/cinder-scheduler-0" Mar 09 09:25:34 crc kubenswrapper[4861]: I0309 09:25:34.857231 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bc3e378-d567-4ba4-b135-1393faa1dbc6-combined-ca-bundle\") pod \"barbican-keystone-listener-7f66785d8-vkcmq\" (UID: \"8bc3e378-d567-4ba4-b135-1393faa1dbc6\") " pod="openstack/barbican-keystone-listener-7f66785d8-vkcmq" Mar 09 09:25:34 crc kubenswrapper[4861]: I0309 09:25:34.857259 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82a35d2d-6934-4c56-a62d-db22ac36a6be-config-data\") pod \"barbican-worker-7b6689bdbc-t6phd\" (UID: \"82a35d2d-6934-4c56-a62d-db22ac36a6be\") " pod="openstack/barbican-worker-7b6689bdbc-t6phd" Mar 09 09:25:34 crc kubenswrapper[4861]: I0309 09:25:34.857284 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31e86fa6-cfa9-488f-b87f-ea95b05f8531-config-data\") pod \"cinder-scheduler-0\" (UID: \"31e86fa6-cfa9-488f-b87f-ea95b05f8531\") " pod="openstack/cinder-scheduler-0" Mar 09 09:25:34 crc kubenswrapper[4861]: I0309 09:25:34.857301 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82a35d2d-6934-4c56-a62d-db22ac36a6be-combined-ca-bundle\") pod \"barbican-worker-7b6689bdbc-t6phd\" (UID: \"82a35d2d-6934-4c56-a62d-db22ac36a6be\") " pod="openstack/barbican-worker-7b6689bdbc-t6phd" Mar 09 09:25:34 crc kubenswrapper[4861]: I0309 09:25:34.857335 4861 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/31e86fa6-cfa9-488f-b87f-ea95b05f8531-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"31e86fa6-cfa9-488f-b87f-ea95b05f8531\") " pod="openstack/cinder-scheduler-0" Mar 09 09:25:34 crc kubenswrapper[4861]: I0309 09:25:34.857350 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/31e86fa6-cfa9-488f-b87f-ea95b05f8531-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"31e86fa6-cfa9-488f-b87f-ea95b05f8531\") " pod="openstack/cinder-scheduler-0" Mar 09 09:25:34 crc kubenswrapper[4861]: I0309 09:25:34.857407 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31e86fa6-cfa9-488f-b87f-ea95b05f8531-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"31e86fa6-cfa9-488f-b87f-ea95b05f8531\") " pod="openstack/cinder-scheduler-0" Mar 09 09:25:34 crc kubenswrapper[4861]: I0309 09:25:34.857426 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qx25j\" (UniqueName: \"kubernetes.io/projected/82a35d2d-6934-4c56-a62d-db22ac36a6be-kube-api-access-qx25j\") pod \"barbican-worker-7b6689bdbc-t6phd\" (UID: \"82a35d2d-6934-4c56-a62d-db22ac36a6be\") " pod="openstack/barbican-worker-7b6689bdbc-t6phd" Mar 09 09:25:34 crc kubenswrapper[4861]: I0309 09:25:34.857451 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/82a35d2d-6934-4c56-a62d-db22ac36a6be-config-data-custom\") pod \"barbican-worker-7b6689bdbc-t6phd\" (UID: \"82a35d2d-6934-4c56-a62d-db22ac36a6be\") " pod="openstack/barbican-worker-7b6689bdbc-t6phd" Mar 09 09:25:34 crc kubenswrapper[4861]: I0309 09:25:34.857473 4861 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8bc3e378-d567-4ba4-b135-1393faa1dbc6-logs\") pod \"barbican-keystone-listener-7f66785d8-vkcmq\" (UID: \"8bc3e378-d567-4ba4-b135-1393faa1dbc6\") " pod="openstack/barbican-keystone-listener-7f66785d8-vkcmq" Mar 09 09:25:34 crc kubenswrapper[4861]: I0309 09:25:34.857498 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8bc3e378-d567-4ba4-b135-1393faa1dbc6-config-data\") pod \"barbican-keystone-listener-7f66785d8-vkcmq\" (UID: \"8bc3e378-d567-4ba4-b135-1393faa1dbc6\") " pod="openstack/barbican-keystone-listener-7f66785d8-vkcmq" Mar 09 09:25:34 crc kubenswrapper[4861]: I0309 09:25:34.858907 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/31e86fa6-cfa9-488f-b87f-ea95b05f8531-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"31e86fa6-cfa9-488f-b87f-ea95b05f8531\") " pod="openstack/cinder-scheduler-0" Mar 09 09:25:34 crc kubenswrapper[4861]: I0309 09:25:34.876946 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/31e86fa6-cfa9-488f-b87f-ea95b05f8531-scripts\") pod \"cinder-scheduler-0\" (UID: \"31e86fa6-cfa9-488f-b87f-ea95b05f8531\") " pod="openstack/cinder-scheduler-0" Mar 09 09:25:34 crc kubenswrapper[4861]: I0309 09:25:34.877495 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31e86fa6-cfa9-488f-b87f-ea95b05f8531-config-data\") pod \"cinder-scheduler-0\" (UID: \"31e86fa6-cfa9-488f-b87f-ea95b05f8531\") " pod="openstack/cinder-scheduler-0" Mar 09 09:25:34 crc kubenswrapper[4861]: I0309 09:25:34.886092 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/31e86fa6-cfa9-488f-b87f-ea95b05f8531-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"31e86fa6-cfa9-488f-b87f-ea95b05f8531\") " pod="openstack/cinder-scheduler-0" Mar 09 09:25:34 crc kubenswrapper[4861]: I0309 09:25:34.893068 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-69579b58d9-pzk8q"] Mar 09 09:25:34 crc kubenswrapper[4861]: E0309 09:25:34.894169 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dns-svc dns-swift-storage-0 kube-api-access-wf7dm ovsdbserver-nb ovsdbserver-sb], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-69579b58d9-pzk8q" podUID="e9567f8e-d157-445e-9370-1152c4923621" Mar 09 09:25:34 crc kubenswrapper[4861]: I0309 09:25:34.895266 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42q9t\" (UniqueName: \"kubernetes.io/projected/31e86fa6-cfa9-488f-b87f-ea95b05f8531-kube-api-access-42q9t\") pod \"cinder-scheduler-0\" (UID: \"31e86fa6-cfa9-488f-b87f-ea95b05f8531\") " pod="openstack/cinder-scheduler-0" Mar 09 09:25:34 crc kubenswrapper[4861]: I0309 09:25:34.897069 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31e86fa6-cfa9-488f-b87f-ea95b05f8531-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"31e86fa6-cfa9-488f-b87f-ea95b05f8531\") " pod="openstack/cinder-scheduler-0" Mar 09 09:25:34 crc kubenswrapper[4861]: I0309 09:25:34.904430 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7b8fcc65cc-466c6"] Mar 09 09:25:34 crc kubenswrapper[4861]: I0309 09:25:34.906032 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b8fcc65cc-466c6" Mar 09 09:25:34 crc kubenswrapper[4861]: I0309 09:25:34.916811 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 09 09:25:34 crc kubenswrapper[4861]: I0309 09:25:34.958513 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e9567f8e-d157-445e-9370-1152c4923621-dns-svc\") pod \"dnsmasq-dns-69579b58d9-pzk8q\" (UID: \"e9567f8e-d157-445e-9370-1152c4923621\") " pod="openstack/dnsmasq-dns-69579b58d9-pzk8q" Mar 09 09:25:34 crc kubenswrapper[4861]: I0309 09:25:34.965745 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e9567f8e-d157-445e-9370-1152c4923621-ovsdbserver-nb\") pod \"dnsmasq-dns-69579b58d9-pzk8q\" (UID: \"e9567f8e-d157-445e-9370-1152c4923621\") " pod="openstack/dnsmasq-dns-69579b58d9-pzk8q" Mar 09 09:25:34 crc kubenswrapper[4861]: I0309 09:25:34.965800 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e9567f8e-d157-445e-9370-1152c4923621-ovsdbserver-sb\") pod \"dnsmasq-dns-69579b58d9-pzk8q\" (UID: \"e9567f8e-d157-445e-9370-1152c4923621\") " pod="openstack/dnsmasq-dns-69579b58d9-pzk8q" Mar 09 09:25:34 crc kubenswrapper[4861]: I0309 09:25:34.965836 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98c58\" (UniqueName: \"kubernetes.io/projected/8bc3e378-d567-4ba4-b135-1393faa1dbc6-kube-api-access-98c58\") pod \"barbican-keystone-listener-7f66785d8-vkcmq\" (UID: \"8bc3e378-d567-4ba4-b135-1393faa1dbc6\") " pod="openstack/barbican-keystone-listener-7f66785d8-vkcmq" Mar 09 09:25:34 crc kubenswrapper[4861]: I0309 09:25:34.965861 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/82a35d2d-6934-4c56-a62d-db22ac36a6be-logs\") pod \"barbican-worker-7b6689bdbc-t6phd\" (UID: 
\"82a35d2d-6934-4c56-a62d-db22ac36a6be\") " pod="openstack/barbican-worker-7b6689bdbc-t6phd" Mar 09 09:25:34 crc kubenswrapper[4861]: I0309 09:25:34.965907 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8bc3e378-d567-4ba4-b135-1393faa1dbc6-config-data-custom\") pod \"barbican-keystone-listener-7f66785d8-vkcmq\" (UID: \"8bc3e378-d567-4ba4-b135-1393faa1dbc6\") " pod="openstack/barbican-keystone-listener-7f66785d8-vkcmq" Mar 09 09:25:34 crc kubenswrapper[4861]: I0309 09:25:34.965976 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bc3e378-d567-4ba4-b135-1393faa1dbc6-combined-ca-bundle\") pod \"barbican-keystone-listener-7f66785d8-vkcmq\" (UID: \"8bc3e378-d567-4ba4-b135-1393faa1dbc6\") " pod="openstack/barbican-keystone-listener-7f66785d8-vkcmq" Mar 09 09:25:34 crc kubenswrapper[4861]: I0309 09:25:34.965994 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e9567f8e-d157-445e-9370-1152c4923621-config\") pod \"dnsmasq-dns-69579b58d9-pzk8q\" (UID: \"e9567f8e-d157-445e-9370-1152c4923621\") " pod="openstack/dnsmasq-dns-69579b58d9-pzk8q" Mar 09 09:25:34 crc kubenswrapper[4861]: I0309 09:25:34.966023 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e9567f8e-d157-445e-9370-1152c4923621-dns-swift-storage-0\") pod \"dnsmasq-dns-69579b58d9-pzk8q\" (UID: \"e9567f8e-d157-445e-9370-1152c4923621\") " pod="openstack/dnsmasq-dns-69579b58d9-pzk8q" Mar 09 09:25:34 crc kubenswrapper[4861]: I0309 09:25:34.966044 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82a35d2d-6934-4c56-a62d-db22ac36a6be-config-data\") pod 
\"barbican-worker-7b6689bdbc-t6phd\" (UID: \"82a35d2d-6934-4c56-a62d-db22ac36a6be\") " pod="openstack/barbican-worker-7b6689bdbc-t6phd" Mar 09 09:25:34 crc kubenswrapper[4861]: I0309 09:25:34.966100 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82a35d2d-6934-4c56-a62d-db22ac36a6be-combined-ca-bundle\") pod \"barbican-worker-7b6689bdbc-t6phd\" (UID: \"82a35d2d-6934-4c56-a62d-db22ac36a6be\") " pod="openstack/barbican-worker-7b6689bdbc-t6phd" Mar 09 09:25:34 crc kubenswrapper[4861]: I0309 09:25:34.966177 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qx25j\" (UniqueName: \"kubernetes.io/projected/82a35d2d-6934-4c56-a62d-db22ac36a6be-kube-api-access-qx25j\") pod \"barbican-worker-7b6689bdbc-t6phd\" (UID: \"82a35d2d-6934-4c56-a62d-db22ac36a6be\") " pod="openstack/barbican-worker-7b6689bdbc-t6phd" Mar 09 09:25:34 crc kubenswrapper[4861]: I0309 09:25:34.966212 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/82a35d2d-6934-4c56-a62d-db22ac36a6be-config-data-custom\") pod \"barbican-worker-7b6689bdbc-t6phd\" (UID: \"82a35d2d-6934-4c56-a62d-db22ac36a6be\") " pod="openstack/barbican-worker-7b6689bdbc-t6phd" Mar 09 09:25:34 crc kubenswrapper[4861]: I0309 09:25:34.966237 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8bc3e378-d567-4ba4-b135-1393faa1dbc6-logs\") pod \"barbican-keystone-listener-7f66785d8-vkcmq\" (UID: \"8bc3e378-d567-4ba4-b135-1393faa1dbc6\") " pod="openstack/barbican-keystone-listener-7f66785d8-vkcmq" Mar 09 09:25:34 crc kubenswrapper[4861]: I0309 09:25:34.966269 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wf7dm\" (UniqueName: 
\"kubernetes.io/projected/e9567f8e-d157-445e-9370-1152c4923621-kube-api-access-wf7dm\") pod \"dnsmasq-dns-69579b58d9-pzk8q\" (UID: \"e9567f8e-d157-445e-9370-1152c4923621\") " pod="openstack/dnsmasq-dns-69579b58d9-pzk8q" Mar 09 09:25:34 crc kubenswrapper[4861]: I0309 09:25:34.966303 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8bc3e378-d567-4ba4-b135-1393faa1dbc6-config-data\") pod \"barbican-keystone-listener-7f66785d8-vkcmq\" (UID: \"8bc3e378-d567-4ba4-b135-1393faa1dbc6\") " pod="openstack/barbican-keystone-listener-7f66785d8-vkcmq" Mar 09 09:25:34 crc kubenswrapper[4861]: I0309 09:25:34.977272 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b8fcc65cc-466c6"] Mar 09 09:25:34 crc kubenswrapper[4861]: I0309 09:25:34.981273 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 09 09:25:34 crc kubenswrapper[4861]: I0309 09:25:34.995456 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8bc3e378-d567-4ba4-b135-1393faa1dbc6-logs\") pod \"barbican-keystone-listener-7f66785d8-vkcmq\" (UID: \"8bc3e378-d567-4ba4-b135-1393faa1dbc6\") " pod="openstack/barbican-keystone-listener-7f66785d8-vkcmq" Mar 09 09:25:34 crc kubenswrapper[4861]: I0309 09:25:34.995791 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/82a35d2d-6934-4c56-a62d-db22ac36a6be-logs\") pod \"barbican-worker-7b6689bdbc-t6phd\" (UID: \"82a35d2d-6934-4c56-a62d-db22ac36a6be\") " pod="openstack/barbican-worker-7b6689bdbc-t6phd" Mar 09 09:25:35 crc kubenswrapper[4861]: I0309 09:25:35.009422 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8bc3e378-d567-4ba4-b135-1393faa1dbc6-config-data-custom\") pod \"barbican-keystone-listener-7f66785d8-vkcmq\" 
(UID: \"8bc3e378-d567-4ba4-b135-1393faa1dbc6\") " pod="openstack/barbican-keystone-listener-7f66785d8-vkcmq" Mar 09 09:25:35 crc kubenswrapper[4861]: I0309 09:25:35.011193 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qx25j\" (UniqueName: \"kubernetes.io/projected/82a35d2d-6934-4c56-a62d-db22ac36a6be-kube-api-access-qx25j\") pod \"barbican-worker-7b6689bdbc-t6phd\" (UID: \"82a35d2d-6934-4c56-a62d-db22ac36a6be\") " pod="openstack/barbican-worker-7b6689bdbc-t6phd" Mar 09 09:25:35 crc kubenswrapper[4861]: I0309 09:25:35.011742 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bc3e378-d567-4ba4-b135-1393faa1dbc6-combined-ca-bundle\") pod \"barbican-keystone-listener-7f66785d8-vkcmq\" (UID: \"8bc3e378-d567-4ba4-b135-1393faa1dbc6\") " pod="openstack/barbican-keystone-listener-7f66785d8-vkcmq" Mar 09 09:25:35 crc kubenswrapper[4861]: I0309 09:25:35.012066 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82a35d2d-6934-4c56-a62d-db22ac36a6be-config-data\") pod \"barbican-worker-7b6689bdbc-t6phd\" (UID: \"82a35d2d-6934-4c56-a62d-db22ac36a6be\") " pod="openstack/barbican-worker-7b6689bdbc-t6phd" Mar 09 09:25:35 crc kubenswrapper[4861]: I0309 09:25:35.018032 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/82a35d2d-6934-4c56-a62d-db22ac36a6be-config-data-custom\") pod \"barbican-worker-7b6689bdbc-t6phd\" (UID: \"82a35d2d-6934-4c56-a62d-db22ac36a6be\") " pod="openstack/barbican-worker-7b6689bdbc-t6phd" Mar 09 09:25:35 crc kubenswrapper[4861]: I0309 09:25:35.018696 4861 generic.go:334] "Generic (PLEG): container finished" podID="3b893f89-a9bc-4a39-bd26-b394cbb0a374" containerID="c24b27beb8691b93bafbdd946919b6e0c5ec4a90322fb39f8123df16c4a121c2" exitCode=0 Mar 09 09:25:35 crc kubenswrapper[4861]: 
I0309 09:25:35.018730 4861 generic.go:334] "Generic (PLEG): container finished" podID="3b893f89-a9bc-4a39-bd26-b394cbb0a374" containerID="aaf3ac9de5dfcab3f64973950fb19a6ea2cf57e74602e7bedfbcec4ae460e46e" exitCode=2 Mar 09 09:25:35 crc kubenswrapper[4861]: I0309 09:25:35.018742 4861 generic.go:334] "Generic (PLEG): container finished" podID="3b893f89-a9bc-4a39-bd26-b394cbb0a374" containerID="7c1078a757c55399b615079346367a3c7063e060d616a6c7295df26e889b255e" exitCode=0 Mar 09 09:25:35 crc kubenswrapper[4861]: I0309 09:25:35.018818 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-69579b58d9-pzk8q" Mar 09 09:25:35 crc kubenswrapper[4861]: I0309 09:25:35.028767 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 09 09:25:35 crc kubenswrapper[4861]: I0309 09:25:35.028816 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3b893f89-a9bc-4a39-bd26-b394cbb0a374","Type":"ContainerDied","Data":"c24b27beb8691b93bafbdd946919b6e0c5ec4a90322fb39f8123df16c4a121c2"} Mar 09 09:25:35 crc kubenswrapper[4861]: I0309 09:25:35.028852 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3b893f89-a9bc-4a39-bd26-b394cbb0a374","Type":"ContainerDied","Data":"aaf3ac9de5dfcab3f64973950fb19a6ea2cf57e74602e7bedfbcec4ae460e46e"} Mar 09 09:25:35 crc kubenswrapper[4861]: I0309 09:25:35.028867 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3b893f89-a9bc-4a39-bd26-b394cbb0a374","Type":"ContainerDied","Data":"7c1078a757c55399b615079346367a3c7063e060d616a6c7295df26e889b255e"} Mar 09 09:25:35 crc kubenswrapper[4861]: I0309 09:25:35.028984 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 09 09:25:35 crc kubenswrapper[4861]: I0309 09:25:35.034036 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 09 09:25:35 crc kubenswrapper[4861]: I0309 09:25:35.038397 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-69579b58d9-pzk8q" Mar 09 09:25:35 crc kubenswrapper[4861]: I0309 09:25:35.040246 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8bc3e378-d567-4ba4-b135-1393faa1dbc6-config-data\") pod \"barbican-keystone-listener-7f66785d8-vkcmq\" (UID: \"8bc3e378-d567-4ba4-b135-1393faa1dbc6\") " pod="openstack/barbican-keystone-listener-7f66785d8-vkcmq" Mar 09 09:25:35 crc kubenswrapper[4861]: I0309 09:25:35.040315 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82a35d2d-6934-4c56-a62d-db22ac36a6be-combined-ca-bundle\") pod \"barbican-worker-7b6689bdbc-t6phd\" (UID: \"82a35d2d-6934-4c56-a62d-db22ac36a6be\") " pod="openstack/barbican-worker-7b6689bdbc-t6phd" Mar 09 09:25:35 crc kubenswrapper[4861]: I0309 09:25:35.059754 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-764b486b9b-tk6j5"] Mar 09 09:25:35 crc kubenswrapper[4861]: I0309 09:25:35.063417 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-764b486b9b-tk6j5" Mar 09 09:25:35 crc kubenswrapper[4861]: I0309 09:25:35.065606 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Mar 09 09:25:35 crc kubenswrapper[4861]: I0309 09:25:35.068802 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e9567f8e-d157-445e-9370-1152c4923621-dns-svc\") pod \"dnsmasq-dns-69579b58d9-pzk8q\" (UID: \"e9567f8e-d157-445e-9370-1152c4923621\") " pod="openstack/dnsmasq-dns-69579b58d9-pzk8q" Mar 09 09:25:35 crc kubenswrapper[4861]: I0309 09:25:35.069623 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bldjp\" (UniqueName: \"kubernetes.io/projected/cc825c65-a951-464c-87d3-3e3bedee3e50-kube-api-access-bldjp\") pod \"dnsmasq-dns-7b8fcc65cc-466c6\" (UID: \"cc825c65-a951-464c-87d3-3e3bedee3e50\") " pod="openstack/dnsmasq-dns-7b8fcc65cc-466c6" Mar 09 09:25:35 crc kubenswrapper[4861]: I0309 09:25:35.069667 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e9567f8e-d157-445e-9370-1152c4923621-ovsdbserver-nb\") pod \"dnsmasq-dns-69579b58d9-pzk8q\" (UID: \"e9567f8e-d157-445e-9370-1152c4923621\") " pod="openstack/dnsmasq-dns-69579b58d9-pzk8q" Mar 09 09:25:35 crc kubenswrapper[4861]: I0309 09:25:35.069692 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e9567f8e-d157-445e-9370-1152c4923621-ovsdbserver-sb\") pod \"dnsmasq-dns-69579b58d9-pzk8q\" (UID: \"e9567f8e-d157-445e-9370-1152c4923621\") " pod="openstack/dnsmasq-dns-69579b58d9-pzk8q" Mar 09 09:25:35 crc kubenswrapper[4861]: I0309 09:25:35.069741 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" 
(UniqueName: \"kubernetes.io/configmap/cc825c65-a951-464c-87d3-3e3bedee3e50-dns-svc\") pod \"dnsmasq-dns-7b8fcc65cc-466c6\" (UID: \"cc825c65-a951-464c-87d3-3e3bedee3e50\") " pod="openstack/dnsmasq-dns-7b8fcc65cc-466c6" Mar 09 09:25:35 crc kubenswrapper[4861]: I0309 09:25:35.069759 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cc825c65-a951-464c-87d3-3e3bedee3e50-ovsdbserver-nb\") pod \"dnsmasq-dns-7b8fcc65cc-466c6\" (UID: \"cc825c65-a951-464c-87d3-3e3bedee3e50\") " pod="openstack/dnsmasq-dns-7b8fcc65cc-466c6" Mar 09 09:25:35 crc kubenswrapper[4861]: I0309 09:25:35.069795 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e9567f8e-d157-445e-9370-1152c4923621-config\") pod \"dnsmasq-dns-69579b58d9-pzk8q\" (UID: \"e9567f8e-d157-445e-9370-1152c4923621\") " pod="openstack/dnsmasq-dns-69579b58d9-pzk8q" Mar 09 09:25:35 crc kubenswrapper[4861]: I0309 09:25:35.069821 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e9567f8e-d157-445e-9370-1152c4923621-dns-swift-storage-0\") pod \"dnsmasq-dns-69579b58d9-pzk8q\" (UID: \"e9567f8e-d157-445e-9370-1152c4923621\") " pod="openstack/dnsmasq-dns-69579b58d9-pzk8q" Mar 09 09:25:35 crc kubenswrapper[4861]: I0309 09:25:35.069854 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cc825c65-a951-464c-87d3-3e3bedee3e50-dns-swift-storage-0\") pod \"dnsmasq-dns-7b8fcc65cc-466c6\" (UID: \"cc825c65-a951-464c-87d3-3e3bedee3e50\") " pod="openstack/dnsmasq-dns-7b8fcc65cc-466c6" Mar 09 09:25:35 crc kubenswrapper[4861]: I0309 09:25:35.069881 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/cc825c65-a951-464c-87d3-3e3bedee3e50-config\") pod \"dnsmasq-dns-7b8fcc65cc-466c6\" (UID: \"cc825c65-a951-464c-87d3-3e3bedee3e50\") " pod="openstack/dnsmasq-dns-7b8fcc65cc-466c6" Mar 09 09:25:35 crc kubenswrapper[4861]: I0309 09:25:35.069901 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cc825c65-a951-464c-87d3-3e3bedee3e50-ovsdbserver-sb\") pod \"dnsmasq-dns-7b8fcc65cc-466c6\" (UID: \"cc825c65-a951-464c-87d3-3e3bedee3e50\") " pod="openstack/dnsmasq-dns-7b8fcc65cc-466c6" Mar 09 09:25:35 crc kubenswrapper[4861]: I0309 09:25:35.069955 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wf7dm\" (UniqueName: \"kubernetes.io/projected/e9567f8e-d157-445e-9370-1152c4923621-kube-api-access-wf7dm\") pod \"dnsmasq-dns-69579b58d9-pzk8q\" (UID: \"e9567f8e-d157-445e-9370-1152c4923621\") " pod="openstack/dnsmasq-dns-69579b58d9-pzk8q" Mar 09 09:25:35 crc kubenswrapper[4861]: I0309 09:25:35.070941 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e9567f8e-d157-445e-9370-1152c4923621-ovsdbserver-nb\") pod \"dnsmasq-dns-69579b58d9-pzk8q\" (UID: \"e9567f8e-d157-445e-9370-1152c4923621\") " pod="openstack/dnsmasq-dns-69579b58d9-pzk8q" Mar 09 09:25:35 crc kubenswrapper[4861]: I0309 09:25:35.071465 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e9567f8e-d157-445e-9370-1152c4923621-ovsdbserver-sb\") pod \"dnsmasq-dns-69579b58d9-pzk8q\" (UID: \"e9567f8e-d157-445e-9370-1152c4923621\") " pod="openstack/dnsmasq-dns-69579b58d9-pzk8q" Mar 09 09:25:35 crc kubenswrapper[4861]: I0309 09:25:35.071974 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/e9567f8e-d157-445e-9370-1152c4923621-config\") pod \"dnsmasq-dns-69579b58d9-pzk8q\" (UID: \"e9567f8e-d157-445e-9370-1152c4923621\") " pod="openstack/dnsmasq-dns-69579b58d9-pzk8q" Mar 09 09:25:35 crc kubenswrapper[4861]: I0309 09:25:35.072500 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e9567f8e-d157-445e-9370-1152c4923621-dns-swift-storage-0\") pod \"dnsmasq-dns-69579b58d9-pzk8q\" (UID: \"e9567f8e-d157-445e-9370-1152c4923621\") " pod="openstack/dnsmasq-dns-69579b58d9-pzk8q" Mar 09 09:25:35 crc kubenswrapper[4861]: I0309 09:25:35.073001 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e9567f8e-d157-445e-9370-1152c4923621-dns-svc\") pod \"dnsmasq-dns-69579b58d9-pzk8q\" (UID: \"e9567f8e-d157-445e-9370-1152c4923621\") " pod="openstack/dnsmasq-dns-69579b58d9-pzk8q" Mar 09 09:25:35 crc kubenswrapper[4861]: I0309 09:25:35.074040 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98c58\" (UniqueName: \"kubernetes.io/projected/8bc3e378-d567-4ba4-b135-1393faa1dbc6-kube-api-access-98c58\") pod \"barbican-keystone-listener-7f66785d8-vkcmq\" (UID: \"8bc3e378-d567-4ba4-b135-1393faa1dbc6\") " pod="openstack/barbican-keystone-listener-7f66785d8-vkcmq" Mar 09 09:25:35 crc kubenswrapper[4861]: I0309 09:25:35.075020 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-7f66785d8-vkcmq" Mar 09 09:25:35 crc kubenswrapper[4861]: I0309 09:25:35.121187 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wf7dm\" (UniqueName: \"kubernetes.io/projected/e9567f8e-d157-445e-9370-1152c4923621-kube-api-access-wf7dm\") pod \"dnsmasq-dns-69579b58d9-pzk8q\" (UID: \"e9567f8e-d157-445e-9370-1152c4923621\") " pod="openstack/dnsmasq-dns-69579b58d9-pzk8q" Mar 09 09:25:35 crc kubenswrapper[4861]: I0309 09:25:35.123840 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-764b486b9b-tk6j5"] Mar 09 09:25:35 crc kubenswrapper[4861]: I0309 09:25:35.171339 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e9567f8e-d157-445e-9370-1152c4923621-dns-swift-storage-0\") pod \"e9567f8e-d157-445e-9370-1152c4923621\" (UID: \"e9567f8e-d157-445e-9370-1152c4923621\") " Mar 09 09:25:35 crc kubenswrapper[4861]: I0309 09:25:35.174786 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e9567f8e-d157-445e-9370-1152c4923621-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "e9567f8e-d157-445e-9370-1152c4923621" (UID: "e9567f8e-d157-445e-9370-1152c4923621"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:25:35 crc kubenswrapper[4861]: I0309 09:25:35.179743 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wf7dm\" (UniqueName: \"kubernetes.io/projected/e9567f8e-d157-445e-9370-1152c4923621-kube-api-access-wf7dm\") pod \"e9567f8e-d157-445e-9370-1152c4923621\" (UID: \"e9567f8e-d157-445e-9370-1152c4923621\") " Mar 09 09:25:35 crc kubenswrapper[4861]: I0309 09:25:35.179944 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e9567f8e-d157-445e-9370-1152c4923621-dns-svc\") pod \"e9567f8e-d157-445e-9370-1152c4923621\" (UID: \"e9567f8e-d157-445e-9370-1152c4923621\") " Mar 09 09:25:35 crc kubenswrapper[4861]: I0309 09:25:35.179982 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e9567f8e-d157-445e-9370-1152c4923621-config\") pod \"e9567f8e-d157-445e-9370-1152c4923621\" (UID: \"e9567f8e-d157-445e-9370-1152c4923621\") " Mar 09 09:25:35 crc kubenswrapper[4861]: I0309 09:25:35.180011 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e9567f8e-d157-445e-9370-1152c4923621-ovsdbserver-nb\") pod \"e9567f8e-d157-445e-9370-1152c4923621\" (UID: \"e9567f8e-d157-445e-9370-1152c4923621\") " Mar 09 09:25:35 crc kubenswrapper[4861]: I0309 09:25:35.180061 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e9567f8e-d157-445e-9370-1152c4923621-ovsdbserver-sb\") pod \"e9567f8e-d157-445e-9370-1152c4923621\" (UID: \"e9567f8e-d157-445e-9370-1152c4923621\") " Mar 09 09:25:35 crc kubenswrapper[4861]: I0309 09:25:35.180360 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/3d0b7d4f-14a7-4c8a-b6e6-b98cce229f7a-config-data\") pod \"cinder-api-0\" (UID: \"3d0b7d4f-14a7-4c8a-b6e6-b98cce229f7a\") " pod="openstack/cinder-api-0" Mar 09 09:25:35 crc kubenswrapper[4861]: I0309 09:25:35.180449 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8k2kc\" (UniqueName: \"kubernetes.io/projected/3d0b7d4f-14a7-4c8a-b6e6-b98cce229f7a-kube-api-access-8k2kc\") pod \"cinder-api-0\" (UID: \"3d0b7d4f-14a7-4c8a-b6e6-b98cce229f7a\") " pod="openstack/cinder-api-0" Mar 09 09:25:35 crc kubenswrapper[4861]: I0309 09:25:35.180471 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3d0b7d4f-14a7-4c8a-b6e6-b98cce229f7a-etc-machine-id\") pod \"cinder-api-0\" (UID: \"3d0b7d4f-14a7-4c8a-b6e6-b98cce229f7a\") " pod="openstack/cinder-api-0" Mar 09 09:25:35 crc kubenswrapper[4861]: I0309 09:25:35.180502 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48fbe4a1-81ab-4a46-8150-821bc8afa220-logs\") pod \"barbican-api-764b486b9b-tk6j5\" (UID: \"48fbe4a1-81ab-4a46-8150-821bc8afa220\") " pod="openstack/barbican-api-764b486b9b-tk6j5" Mar 09 09:25:35 crc kubenswrapper[4861]: I0309 09:25:35.180534 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d0b7d4f-14a7-4c8a-b6e6-b98cce229f7a-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"3d0b7d4f-14a7-4c8a-b6e6-b98cce229f7a\") " pod="openstack/cinder-api-0" Mar 09 09:25:35 crc kubenswrapper[4861]: I0309 09:25:35.180586 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bldjp\" (UniqueName: \"kubernetes.io/projected/cc825c65-a951-464c-87d3-3e3bedee3e50-kube-api-access-bldjp\") pod 
\"dnsmasq-dns-7b8fcc65cc-466c6\" (UID: \"cc825c65-a951-464c-87d3-3e3bedee3e50\") " pod="openstack/dnsmasq-dns-7b8fcc65cc-466c6" Mar 09 09:25:35 crc kubenswrapper[4861]: I0309 09:25:35.180618 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/48fbe4a1-81ab-4a46-8150-821bc8afa220-config-data-custom\") pod \"barbican-api-764b486b9b-tk6j5\" (UID: \"48fbe4a1-81ab-4a46-8150-821bc8afa220\") " pod="openstack/barbican-api-764b486b9b-tk6j5" Mar 09 09:25:35 crc kubenswrapper[4861]: I0309 09:25:35.180722 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cc825c65-a951-464c-87d3-3e3bedee3e50-dns-svc\") pod \"dnsmasq-dns-7b8fcc65cc-466c6\" (UID: \"cc825c65-a951-464c-87d3-3e3bedee3e50\") " pod="openstack/dnsmasq-dns-7b8fcc65cc-466c6" Mar 09 09:25:35 crc kubenswrapper[4861]: I0309 09:25:35.180753 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cc825c65-a951-464c-87d3-3e3bedee3e50-ovsdbserver-nb\") pod \"dnsmasq-dns-7b8fcc65cc-466c6\" (UID: \"cc825c65-a951-464c-87d3-3e3bedee3e50\") " pod="openstack/dnsmasq-dns-7b8fcc65cc-466c6" Mar 09 09:25:35 crc kubenswrapper[4861]: I0309 09:25:35.180786 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d0b7d4f-14a7-4c8a-b6e6-b98cce229f7a-logs\") pod \"cinder-api-0\" (UID: \"3d0b7d4f-14a7-4c8a-b6e6-b98cce229f7a\") " pod="openstack/cinder-api-0" Mar 09 09:25:35 crc kubenswrapper[4861]: I0309 09:25:35.180857 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3d0b7d4f-14a7-4c8a-b6e6-b98cce229f7a-config-data-custom\") pod \"cinder-api-0\" (UID: 
\"3d0b7d4f-14a7-4c8a-b6e6-b98cce229f7a\") " pod="openstack/cinder-api-0" Mar 09 09:25:35 crc kubenswrapper[4861]: I0309 09:25:35.180905 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cc825c65-a951-464c-87d3-3e3bedee3e50-dns-swift-storage-0\") pod \"dnsmasq-dns-7b8fcc65cc-466c6\" (UID: \"cc825c65-a951-464c-87d3-3e3bedee3e50\") " pod="openstack/dnsmasq-dns-7b8fcc65cc-466c6" Mar 09 09:25:35 crc kubenswrapper[4861]: I0309 09:25:35.180941 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d0b7d4f-14a7-4c8a-b6e6-b98cce229f7a-scripts\") pod \"cinder-api-0\" (UID: \"3d0b7d4f-14a7-4c8a-b6e6-b98cce229f7a\") " pod="openstack/cinder-api-0" Mar 09 09:25:35 crc kubenswrapper[4861]: I0309 09:25:35.180967 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48fbe4a1-81ab-4a46-8150-821bc8afa220-combined-ca-bundle\") pod \"barbican-api-764b486b9b-tk6j5\" (UID: \"48fbe4a1-81ab-4a46-8150-821bc8afa220\") " pod="openstack/barbican-api-764b486b9b-tk6j5" Mar 09 09:25:35 crc kubenswrapper[4861]: I0309 09:25:35.180998 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc825c65-a951-464c-87d3-3e3bedee3e50-config\") pod \"dnsmasq-dns-7b8fcc65cc-466c6\" (UID: \"cc825c65-a951-464c-87d3-3e3bedee3e50\") " pod="openstack/dnsmasq-dns-7b8fcc65cc-466c6" Mar 09 09:25:35 crc kubenswrapper[4861]: I0309 09:25:35.181031 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cc825c65-a951-464c-87d3-3e3bedee3e50-ovsdbserver-sb\") pod \"dnsmasq-dns-7b8fcc65cc-466c6\" (UID: \"cc825c65-a951-464c-87d3-3e3bedee3e50\") " 
pod="openstack/dnsmasq-dns-7b8fcc65cc-466c6" Mar 09 09:25:35 crc kubenswrapper[4861]: I0309 09:25:35.181061 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2m78c\" (UniqueName: \"kubernetes.io/projected/48fbe4a1-81ab-4a46-8150-821bc8afa220-kube-api-access-2m78c\") pod \"barbican-api-764b486b9b-tk6j5\" (UID: \"48fbe4a1-81ab-4a46-8150-821bc8afa220\") " pod="openstack/barbican-api-764b486b9b-tk6j5" Mar 09 09:25:35 crc kubenswrapper[4861]: I0309 09:25:35.181100 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48fbe4a1-81ab-4a46-8150-821bc8afa220-config-data\") pod \"barbican-api-764b486b9b-tk6j5\" (UID: \"48fbe4a1-81ab-4a46-8150-821bc8afa220\") " pod="openstack/barbican-api-764b486b9b-tk6j5" Mar 09 09:25:35 crc kubenswrapper[4861]: I0309 09:25:35.181195 4861 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e9567f8e-d157-445e-9370-1152c4923621-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 09 09:25:35 crc kubenswrapper[4861]: I0309 09:25:35.183757 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e9567f8e-d157-445e-9370-1152c4923621-config" (OuterVolumeSpecName: "config") pod "e9567f8e-d157-445e-9370-1152c4923621" (UID: "e9567f8e-d157-445e-9370-1152c4923621"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:25:35 crc kubenswrapper[4861]: I0309 09:25:35.183921 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e9567f8e-d157-445e-9370-1152c4923621-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e9567f8e-d157-445e-9370-1152c4923621" (UID: "e9567f8e-d157-445e-9370-1152c4923621"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:25:35 crc kubenswrapper[4861]: I0309 09:25:35.185000 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cc825c65-a951-464c-87d3-3e3bedee3e50-ovsdbserver-nb\") pod \"dnsmasq-dns-7b8fcc65cc-466c6\" (UID: \"cc825c65-a951-464c-87d3-3e3bedee3e50\") " pod="openstack/dnsmasq-dns-7b8fcc65cc-466c6" Mar 09 09:25:35 crc kubenswrapper[4861]: I0309 09:25:35.185036 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cc825c65-a951-464c-87d3-3e3bedee3e50-dns-svc\") pod \"dnsmasq-dns-7b8fcc65cc-466c6\" (UID: \"cc825c65-a951-464c-87d3-3e3bedee3e50\") " pod="openstack/dnsmasq-dns-7b8fcc65cc-466c6" Mar 09 09:25:35 crc kubenswrapper[4861]: I0309 09:25:35.185532 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc825c65-a951-464c-87d3-3e3bedee3e50-config\") pod \"dnsmasq-dns-7b8fcc65cc-466c6\" (UID: \"cc825c65-a951-464c-87d3-3e3bedee3e50\") " pod="openstack/dnsmasq-dns-7b8fcc65cc-466c6" Mar 09 09:25:35 crc kubenswrapper[4861]: I0309 09:25:35.185546 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e9567f8e-d157-445e-9370-1152c4923621-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e9567f8e-d157-445e-9370-1152c4923621" (UID: "e9567f8e-d157-445e-9370-1152c4923621"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:25:35 crc kubenswrapper[4861]: I0309 09:25:35.187358 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cc825c65-a951-464c-87d3-3e3bedee3e50-dns-swift-storage-0\") pod \"dnsmasq-dns-7b8fcc65cc-466c6\" (UID: \"cc825c65-a951-464c-87d3-3e3bedee3e50\") " pod="openstack/dnsmasq-dns-7b8fcc65cc-466c6" Mar 09 09:25:35 crc kubenswrapper[4861]: I0309 09:25:35.188006 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cc825c65-a951-464c-87d3-3e3bedee3e50-ovsdbserver-sb\") pod \"dnsmasq-dns-7b8fcc65cc-466c6\" (UID: \"cc825c65-a951-464c-87d3-3e3bedee3e50\") " pod="openstack/dnsmasq-dns-7b8fcc65cc-466c6" Mar 09 09:25:35 crc kubenswrapper[4861]: I0309 09:25:35.195109 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e9567f8e-d157-445e-9370-1152c4923621-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e9567f8e-d157-445e-9370-1152c4923621" (UID: "e9567f8e-d157-445e-9370-1152c4923621"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:25:35 crc kubenswrapper[4861]: I0309 09:25:35.200677 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bldjp\" (UniqueName: \"kubernetes.io/projected/cc825c65-a951-464c-87d3-3e3bedee3e50-kube-api-access-bldjp\") pod \"dnsmasq-dns-7b8fcc65cc-466c6\" (UID: \"cc825c65-a951-464c-87d3-3e3bedee3e50\") " pod="openstack/dnsmasq-dns-7b8fcc65cc-466c6" Mar 09 09:25:35 crc kubenswrapper[4861]: I0309 09:25:35.216028 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9567f8e-d157-445e-9370-1152c4923621-kube-api-access-wf7dm" (OuterVolumeSpecName: "kube-api-access-wf7dm") pod "e9567f8e-d157-445e-9370-1152c4923621" (UID: "e9567f8e-d157-445e-9370-1152c4923621"). InnerVolumeSpecName "kube-api-access-wf7dm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:25:35 crc kubenswrapper[4861]: I0309 09:25:35.283426 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48fbe4a1-81ab-4a46-8150-821bc8afa220-config-data\") pod \"barbican-api-764b486b9b-tk6j5\" (UID: \"48fbe4a1-81ab-4a46-8150-821bc8afa220\") " pod="openstack/barbican-api-764b486b9b-tk6j5" Mar 09 09:25:35 crc kubenswrapper[4861]: I0309 09:25:35.283494 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d0b7d4f-14a7-4c8a-b6e6-b98cce229f7a-config-data\") pod \"cinder-api-0\" (UID: \"3d0b7d4f-14a7-4c8a-b6e6-b98cce229f7a\") " pod="openstack/cinder-api-0" Mar 09 09:25:35 crc kubenswrapper[4861]: I0309 09:25:35.283530 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8k2kc\" (UniqueName: \"kubernetes.io/projected/3d0b7d4f-14a7-4c8a-b6e6-b98cce229f7a-kube-api-access-8k2kc\") pod \"cinder-api-0\" (UID: \"3d0b7d4f-14a7-4c8a-b6e6-b98cce229f7a\") " 
pod="openstack/cinder-api-0" Mar 09 09:25:35 crc kubenswrapper[4861]: I0309 09:25:35.283547 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3d0b7d4f-14a7-4c8a-b6e6-b98cce229f7a-etc-machine-id\") pod \"cinder-api-0\" (UID: \"3d0b7d4f-14a7-4c8a-b6e6-b98cce229f7a\") " pod="openstack/cinder-api-0" Mar 09 09:25:35 crc kubenswrapper[4861]: I0309 09:25:35.283565 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48fbe4a1-81ab-4a46-8150-821bc8afa220-logs\") pod \"barbican-api-764b486b9b-tk6j5\" (UID: \"48fbe4a1-81ab-4a46-8150-821bc8afa220\") " pod="openstack/barbican-api-764b486b9b-tk6j5" Mar 09 09:25:35 crc kubenswrapper[4861]: I0309 09:25:35.283582 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d0b7d4f-14a7-4c8a-b6e6-b98cce229f7a-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"3d0b7d4f-14a7-4c8a-b6e6-b98cce229f7a\") " pod="openstack/cinder-api-0" Mar 09 09:25:35 crc kubenswrapper[4861]: I0309 09:25:35.283612 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/48fbe4a1-81ab-4a46-8150-821bc8afa220-config-data-custom\") pod \"barbican-api-764b486b9b-tk6j5\" (UID: \"48fbe4a1-81ab-4a46-8150-821bc8afa220\") " pod="openstack/barbican-api-764b486b9b-tk6j5" Mar 09 09:25:35 crc kubenswrapper[4861]: I0309 09:25:35.283660 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d0b7d4f-14a7-4c8a-b6e6-b98cce229f7a-logs\") pod \"cinder-api-0\" (UID: \"3d0b7d4f-14a7-4c8a-b6e6-b98cce229f7a\") " pod="openstack/cinder-api-0" Mar 09 09:25:35 crc kubenswrapper[4861]: I0309 09:25:35.283691 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3d0b7d4f-14a7-4c8a-b6e6-b98cce229f7a-config-data-custom\") pod \"cinder-api-0\" (UID: \"3d0b7d4f-14a7-4c8a-b6e6-b98cce229f7a\") " pod="openstack/cinder-api-0" Mar 09 09:25:35 crc kubenswrapper[4861]: I0309 09:25:35.283720 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d0b7d4f-14a7-4c8a-b6e6-b98cce229f7a-scripts\") pod \"cinder-api-0\" (UID: \"3d0b7d4f-14a7-4c8a-b6e6-b98cce229f7a\") " pod="openstack/cinder-api-0" Mar 09 09:25:35 crc kubenswrapper[4861]: I0309 09:25:35.283736 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48fbe4a1-81ab-4a46-8150-821bc8afa220-combined-ca-bundle\") pod \"barbican-api-764b486b9b-tk6j5\" (UID: \"48fbe4a1-81ab-4a46-8150-821bc8afa220\") " pod="openstack/barbican-api-764b486b9b-tk6j5" Mar 09 09:25:35 crc kubenswrapper[4861]: I0309 09:25:35.283762 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2m78c\" (UniqueName: \"kubernetes.io/projected/48fbe4a1-81ab-4a46-8150-821bc8afa220-kube-api-access-2m78c\") pod \"barbican-api-764b486b9b-tk6j5\" (UID: \"48fbe4a1-81ab-4a46-8150-821bc8afa220\") " pod="openstack/barbican-api-764b486b9b-tk6j5" Mar 09 09:25:35 crc kubenswrapper[4861]: I0309 09:25:35.283806 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wf7dm\" (UniqueName: \"kubernetes.io/projected/e9567f8e-d157-445e-9370-1152c4923621-kube-api-access-wf7dm\") on node \"crc\" DevicePath \"\"" Mar 09 09:25:35 crc kubenswrapper[4861]: I0309 09:25:35.283817 4861 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e9567f8e-d157-445e-9370-1152c4923621-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 09 09:25:35 crc kubenswrapper[4861]: I0309 09:25:35.283826 4861 reconciler_common.go:293] "Volume 
detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e9567f8e-d157-445e-9370-1152c4923621-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:25:35 crc kubenswrapper[4861]: I0309 09:25:35.283834 4861 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e9567f8e-d157-445e-9370-1152c4923621-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 09 09:25:35 crc kubenswrapper[4861]: I0309 09:25:35.283843 4861 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e9567f8e-d157-445e-9370-1152c4923621-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 09 09:25:35 crc kubenswrapper[4861]: I0309 09:25:35.283979 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3d0b7d4f-14a7-4c8a-b6e6-b98cce229f7a-etc-machine-id\") pod \"cinder-api-0\" (UID: \"3d0b7d4f-14a7-4c8a-b6e6-b98cce229f7a\") " pod="openstack/cinder-api-0" Mar 09 09:25:35 crc kubenswrapper[4861]: I0309 09:25:35.284485 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48fbe4a1-81ab-4a46-8150-821bc8afa220-logs\") pod \"barbican-api-764b486b9b-tk6j5\" (UID: \"48fbe4a1-81ab-4a46-8150-821bc8afa220\") " pod="openstack/barbican-api-764b486b9b-tk6j5" Mar 09 09:25:35 crc kubenswrapper[4861]: I0309 09:25:35.284757 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d0b7d4f-14a7-4c8a-b6e6-b98cce229f7a-logs\") pod \"cinder-api-0\" (UID: \"3d0b7d4f-14a7-4c8a-b6e6-b98cce229f7a\") " pod="openstack/cinder-api-0" Mar 09 09:25:35 crc kubenswrapper[4861]: I0309 09:25:35.293327 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d0b7d4f-14a7-4c8a-b6e6-b98cce229f7a-config-data\") pod \"cinder-api-0\" (UID: 
\"3d0b7d4f-14a7-4c8a-b6e6-b98cce229f7a\") " pod="openstack/cinder-api-0" Mar 09 09:25:35 crc kubenswrapper[4861]: I0309 09:25:35.295707 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48fbe4a1-81ab-4a46-8150-821bc8afa220-config-data\") pod \"barbican-api-764b486b9b-tk6j5\" (UID: \"48fbe4a1-81ab-4a46-8150-821bc8afa220\") " pod="openstack/barbican-api-764b486b9b-tk6j5" Mar 09 09:25:35 crc kubenswrapper[4861]: I0309 09:25:35.296310 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/48fbe4a1-81ab-4a46-8150-821bc8afa220-config-data-custom\") pod \"barbican-api-764b486b9b-tk6j5\" (UID: \"48fbe4a1-81ab-4a46-8150-821bc8afa220\") " pod="openstack/barbican-api-764b486b9b-tk6j5" Mar 09 09:25:35 crc kubenswrapper[4861]: I0309 09:25:35.299626 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d0b7d4f-14a7-4c8a-b6e6-b98cce229f7a-scripts\") pod \"cinder-api-0\" (UID: \"3d0b7d4f-14a7-4c8a-b6e6-b98cce229f7a\") " pod="openstack/cinder-api-0" Mar 09 09:25:35 crc kubenswrapper[4861]: I0309 09:25:35.299916 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d0b7d4f-14a7-4c8a-b6e6-b98cce229f7a-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"3d0b7d4f-14a7-4c8a-b6e6-b98cce229f7a\") " pod="openstack/cinder-api-0" Mar 09 09:25:35 crc kubenswrapper[4861]: I0309 09:25:35.300106 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48fbe4a1-81ab-4a46-8150-821bc8afa220-combined-ca-bundle\") pod \"barbican-api-764b486b9b-tk6j5\" (UID: \"48fbe4a1-81ab-4a46-8150-821bc8afa220\") " pod="openstack/barbican-api-764b486b9b-tk6j5" Mar 09 09:25:35 crc kubenswrapper[4861]: I0309 09:25:35.302067 4861 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3d0b7d4f-14a7-4c8a-b6e6-b98cce229f7a-config-data-custom\") pod \"cinder-api-0\" (UID: \"3d0b7d4f-14a7-4c8a-b6e6-b98cce229f7a\") " pod="openstack/cinder-api-0" Mar 09 09:25:35 crc kubenswrapper[4861]: I0309 09:25:35.302499 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2m78c\" (UniqueName: \"kubernetes.io/projected/48fbe4a1-81ab-4a46-8150-821bc8afa220-kube-api-access-2m78c\") pod \"barbican-api-764b486b9b-tk6j5\" (UID: \"48fbe4a1-81ab-4a46-8150-821bc8afa220\") " pod="openstack/barbican-api-764b486b9b-tk6j5" Mar 09 09:25:35 crc kubenswrapper[4861]: I0309 09:25:35.310784 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8k2kc\" (UniqueName: \"kubernetes.io/projected/3d0b7d4f-14a7-4c8a-b6e6-b98cce229f7a-kube-api-access-8k2kc\") pod \"cinder-api-0\" (UID: \"3d0b7d4f-14a7-4c8a-b6e6-b98cce229f7a\") " pod="openstack/cinder-api-0" Mar 09 09:25:35 crc kubenswrapper[4861]: I0309 09:25:35.326843 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-7b6689bdbc-t6phd" Mar 09 09:25:35 crc kubenswrapper[4861]: I0309 09:25:35.330982 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b8fcc65cc-466c6" Mar 09 09:25:35 crc kubenswrapper[4861]: I0309 09:25:35.365154 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 09 09:25:35 crc kubenswrapper[4861]: I0309 09:25:35.462494 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-764b486b9b-tk6j5" Mar 09 09:25:35 crc kubenswrapper[4861]: I0309 09:25:35.557242 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 09 09:25:35 crc kubenswrapper[4861]: I0309 09:25:35.692944 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-7f66785d8-vkcmq"] Mar 09 09:25:35 crc kubenswrapper[4861]: I0309 09:25:35.905767 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b8fcc65cc-466c6"] Mar 09 09:25:35 crc kubenswrapper[4861]: W0309 09:25:35.914879 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcc825c65_a951_464c_87d3_3e3bedee3e50.slice/crio-2d5edf0d00d8c4bf40fb34ce56be026dce7ad18836868225a59e3de054ee7146 WatchSource:0}: Error finding container 2d5edf0d00d8c4bf40fb34ce56be026dce7ad18836868225a59e3de054ee7146: Status 404 returned error can't find the container with id 2d5edf0d00d8c4bf40fb34ce56be026dce7ad18836868225a59e3de054ee7146 Mar 09 09:25:36 crc kubenswrapper[4861]: I0309 09:25:36.029590 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7b6689bdbc-t6phd"] Mar 09 09:25:36 crc kubenswrapper[4861]: W0309 09:25:36.030549 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod82a35d2d_6934_4c56_a62d_db22ac36a6be.slice/crio-8b4a3676da6f202fd54fb859b6d8183ebed6a0a85afda29e960a1d883cd488a8 WatchSource:0}: Error finding container 8b4a3676da6f202fd54fb859b6d8183ebed6a0a85afda29e960a1d883cd488a8: Status 404 returned error can't find the container with id 8b4a3676da6f202fd54fb859b6d8183ebed6a0a85afda29e960a1d883cd488a8 Mar 09 09:25:36 crc kubenswrapper[4861]: I0309 09:25:36.035299 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b8fcc65cc-466c6" 
event={"ID":"cc825c65-a951-464c-87d3-3e3bedee3e50","Type":"ContainerStarted","Data":"2d5edf0d00d8c4bf40fb34ce56be026dce7ad18836868225a59e3de054ee7146"} Mar 09 09:25:36 crc kubenswrapper[4861]: I0309 09:25:36.039316 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7f66785d8-vkcmq" event={"ID":"8bc3e378-d567-4ba4-b135-1393faa1dbc6","Type":"ContainerStarted","Data":"bcca2260a33ec3050e8e1d508be541ffeef6289224115bffe0a3078aeeff1308"} Mar 09 09:25:36 crc kubenswrapper[4861]: I0309 09:25:36.041564 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-764b486b9b-tk6j5"] Mar 09 09:25:36 crc kubenswrapper[4861]: I0309 09:25:36.044049 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"31e86fa6-cfa9-488f-b87f-ea95b05f8531","Type":"ContainerStarted","Data":"d304f13f1ce386d3891eec10d13001ba01107ff8c2ea4da0ef9b8320624c256f"} Mar 09 09:25:36 crc kubenswrapper[4861]: I0309 09:25:36.044088 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-69579b58d9-pzk8q" Mar 09 09:25:36 crc kubenswrapper[4861]: I0309 09:25:36.117809 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-69579b58d9-pzk8q"] Mar 09 09:25:36 crc kubenswrapper[4861]: I0309 09:25:36.131443 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-69579b58d9-pzk8q"] Mar 09 09:25:36 crc kubenswrapper[4861]: I0309 09:25:36.148816 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 09 09:25:36 crc kubenswrapper[4861]: W0309 09:25:36.316716 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3d0b7d4f_14a7_4c8a_b6e6_b98cce229f7a.slice/crio-d4a623eedaab959360b90da881bac9206fbe0adfd2d7532b203502c47fcdb022 WatchSource:0}: Error finding container d4a623eedaab959360b90da881bac9206fbe0adfd2d7532b203502c47fcdb022: Status 404 returned error can't find the container with id d4a623eedaab959360b90da881bac9206fbe0adfd2d7532b203502c47fcdb022 Mar 09 09:25:37 crc kubenswrapper[4861]: I0309 09:25:37.075120 4861 generic.go:334] "Generic (PLEG): container finished" podID="cc825c65-a951-464c-87d3-3e3bedee3e50" containerID="76fdc8d787e68e9675eed7b6f837ce9957491cc93adce7380f304fc1866ed124" exitCode=0 Mar 09 09:25:37 crc kubenswrapper[4861]: I0309 09:25:37.075512 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b8fcc65cc-466c6" event={"ID":"cc825c65-a951-464c-87d3-3e3bedee3e50","Type":"ContainerDied","Data":"76fdc8d787e68e9675eed7b6f837ce9957491cc93adce7380f304fc1866ed124"} Mar 09 09:25:37 crc kubenswrapper[4861]: I0309 09:25:37.079188 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3d0b7d4f-14a7-4c8a-b6e6-b98cce229f7a","Type":"ContainerStarted","Data":"d4a623eedaab959360b90da881bac9206fbe0adfd2d7532b203502c47fcdb022"} Mar 09 09:25:37 crc kubenswrapper[4861]: 
I0309 09:25:37.080688 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 09 09:25:37 crc kubenswrapper[4861]: I0309 09:25:37.090014 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-764b486b9b-tk6j5" event={"ID":"48fbe4a1-81ab-4a46-8150-821bc8afa220","Type":"ContainerStarted","Data":"9eadb20d6f8b86d83be9007e709be1785f3dbbf280c5db1b3a3ee672323d43c0"} Mar 09 09:25:37 crc kubenswrapper[4861]: I0309 09:25:37.090078 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-764b486b9b-tk6j5" Mar 09 09:25:37 crc kubenswrapper[4861]: I0309 09:25:37.090116 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-764b486b9b-tk6j5" event={"ID":"48fbe4a1-81ab-4a46-8150-821bc8afa220","Type":"ContainerStarted","Data":"c72c0b34b04fcf6e7685f1ef59cd6abe37a5f56a9418f9ff457e6d47203e3c14"} Mar 09 09:25:37 crc kubenswrapper[4861]: I0309 09:25:37.090135 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-764b486b9b-tk6j5" Mar 09 09:25:37 crc kubenswrapper[4861]: I0309 09:25:37.093084 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-764b486b9b-tk6j5" event={"ID":"48fbe4a1-81ab-4a46-8150-821bc8afa220","Type":"ContainerStarted","Data":"3cace9b3826297e13b51f776c4a63465d7f30becfc9c4dce3918a85756fc792e"} Mar 09 09:25:37 crc kubenswrapper[4861]: I0309 09:25:37.108036 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7b6689bdbc-t6phd" event={"ID":"82a35d2d-6934-4c56-a62d-db22ac36a6be","Type":"ContainerStarted","Data":"8b4a3676da6f202fd54fb859b6d8183ebed6a0a85afda29e960a1d883cd488a8"} Mar 09 09:25:37 crc kubenswrapper[4861]: I0309 09:25:37.126526 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-764b486b9b-tk6j5" podStartSLOduration=3.126508339 podStartE2EDuration="3.126508339s" 
podCreationTimestamp="2026-03-09 09:25:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:25:37.115644442 +0000 UTC m=+1180.200683853" watchObservedRunningTime="2026-03-09 09:25:37.126508339 +0000 UTC m=+1180.211547740" Mar 09 09:25:37 crc kubenswrapper[4861]: I0309 09:25:37.674388 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9567f8e-d157-445e-9370-1152c4923621" path="/var/lib/kubelet/pods/e9567f8e-d157-445e-9370-1152c4923621/volumes" Mar 09 09:25:37 crc kubenswrapper[4861]: I0309 09:25:37.959330 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 09 09:25:38 crc kubenswrapper[4861]: I0309 09:25:38.051161 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3b893f89-a9bc-4a39-bd26-b394cbb0a374-scripts\") pod \"3b893f89-a9bc-4a39-bd26-b394cbb0a374\" (UID: \"3b893f89-a9bc-4a39-bd26-b394cbb0a374\") " Mar 09 09:25:38 crc kubenswrapper[4861]: I0309 09:25:38.051267 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b893f89-a9bc-4a39-bd26-b394cbb0a374-config-data\") pod \"3b893f89-a9bc-4a39-bd26-b394cbb0a374\" (UID: \"3b893f89-a9bc-4a39-bd26-b394cbb0a374\") " Mar 09 09:25:38 crc kubenswrapper[4861]: I0309 09:25:38.051301 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b893f89-a9bc-4a39-bd26-b394cbb0a374-combined-ca-bundle\") pod \"3b893f89-a9bc-4a39-bd26-b394cbb0a374\" (UID: \"3b893f89-a9bc-4a39-bd26-b394cbb0a374\") " Mar 09 09:25:38 crc kubenswrapper[4861]: I0309 09:25:38.051431 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jf4rm\" (UniqueName: 
\"kubernetes.io/projected/3b893f89-a9bc-4a39-bd26-b394cbb0a374-kube-api-access-jf4rm\") pod \"3b893f89-a9bc-4a39-bd26-b394cbb0a374\" (UID: \"3b893f89-a9bc-4a39-bd26-b394cbb0a374\") " Mar 09 09:25:38 crc kubenswrapper[4861]: I0309 09:25:38.051479 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3b893f89-a9bc-4a39-bd26-b394cbb0a374-log-httpd\") pod \"3b893f89-a9bc-4a39-bd26-b394cbb0a374\" (UID: \"3b893f89-a9bc-4a39-bd26-b394cbb0a374\") " Mar 09 09:25:38 crc kubenswrapper[4861]: I0309 09:25:38.051586 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3b893f89-a9bc-4a39-bd26-b394cbb0a374-sg-core-conf-yaml\") pod \"3b893f89-a9bc-4a39-bd26-b394cbb0a374\" (UID: \"3b893f89-a9bc-4a39-bd26-b394cbb0a374\") " Mar 09 09:25:38 crc kubenswrapper[4861]: I0309 09:25:38.051606 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3b893f89-a9bc-4a39-bd26-b394cbb0a374-run-httpd\") pod \"3b893f89-a9bc-4a39-bd26-b394cbb0a374\" (UID: \"3b893f89-a9bc-4a39-bd26-b394cbb0a374\") " Mar 09 09:25:38 crc kubenswrapper[4861]: I0309 09:25:38.052257 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b893f89-a9bc-4a39-bd26-b394cbb0a374-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "3b893f89-a9bc-4a39-bd26-b394cbb0a374" (UID: "3b893f89-a9bc-4a39-bd26-b394cbb0a374"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:25:38 crc kubenswrapper[4861]: I0309 09:25:38.052332 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b893f89-a9bc-4a39-bd26-b394cbb0a374-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "3b893f89-a9bc-4a39-bd26-b394cbb0a374" (UID: "3b893f89-a9bc-4a39-bd26-b394cbb0a374"). 
InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:25:38 crc kubenswrapper[4861]: I0309 09:25:38.060547 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b893f89-a9bc-4a39-bd26-b394cbb0a374-scripts" (OuterVolumeSpecName: "scripts") pod "3b893f89-a9bc-4a39-bd26-b394cbb0a374" (UID: "3b893f89-a9bc-4a39-bd26-b394cbb0a374"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:25:38 crc kubenswrapper[4861]: I0309 09:25:38.060701 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b893f89-a9bc-4a39-bd26-b394cbb0a374-kube-api-access-jf4rm" (OuterVolumeSpecName: "kube-api-access-jf4rm") pod "3b893f89-a9bc-4a39-bd26-b394cbb0a374" (UID: "3b893f89-a9bc-4a39-bd26-b394cbb0a374"). InnerVolumeSpecName "kube-api-access-jf4rm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:25:38 crc kubenswrapper[4861]: I0309 09:25:38.088811 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b893f89-a9bc-4a39-bd26-b394cbb0a374-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "3b893f89-a9bc-4a39-bd26-b394cbb0a374" (UID: "3b893f89-a9bc-4a39-bd26-b394cbb0a374"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:25:38 crc kubenswrapper[4861]: I0309 09:25:38.124893 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"31e86fa6-cfa9-488f-b87f-ea95b05f8531","Type":"ContainerStarted","Data":"89ee5ee2a106daae472411239bf503ecb869ad61b83c5951d1bc49b5fedc1312"} Mar 09 09:25:38 crc kubenswrapper[4861]: I0309 09:25:38.131630 4861 generic.go:334] "Generic (PLEG): container finished" podID="3b893f89-a9bc-4a39-bd26-b394cbb0a374" containerID="e15fa3a3dc10d161f1fc033b6f91e18abcb8ed972cc20b55f2cdece20ece59af" exitCode=0 Mar 09 09:25:38 crc kubenswrapper[4861]: I0309 09:25:38.131704 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3b893f89-a9bc-4a39-bd26-b394cbb0a374","Type":"ContainerDied","Data":"e15fa3a3dc10d161f1fc033b6f91e18abcb8ed972cc20b55f2cdece20ece59af"} Mar 09 09:25:38 crc kubenswrapper[4861]: I0309 09:25:38.131740 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3b893f89-a9bc-4a39-bd26-b394cbb0a374","Type":"ContainerDied","Data":"5c137fca2c97e53ec76c975c0721b95dead1743dd631b64a64772a241c8817b5"} Mar 09 09:25:38 crc kubenswrapper[4861]: I0309 09:25:38.131761 4861 scope.go:117] "RemoveContainer" containerID="c24b27beb8691b93bafbdd946919b6e0c5ec4a90322fb39f8123df16c4a121c2" Mar 09 09:25:38 crc kubenswrapper[4861]: I0309 09:25:38.131954 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 09 09:25:38 crc kubenswrapper[4861]: I0309 09:25:38.145099 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b8fcc65cc-466c6" event={"ID":"cc825c65-a951-464c-87d3-3e3bedee3e50","Type":"ContainerStarted","Data":"0e949aca604757214fec9b436182444696e848dd9f9110091d4fb4c98eca8660"} Mar 09 09:25:38 crc kubenswrapper[4861]: I0309 09:25:38.146437 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7b8fcc65cc-466c6" Mar 09 09:25:38 crc kubenswrapper[4861]: I0309 09:25:38.149512 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3d0b7d4f-14a7-4c8a-b6e6-b98cce229f7a","Type":"ContainerStarted","Data":"2c79b4b0ca27df0f16a48c63dea824fd62d21f4930a8301169db649d33109e66"} Mar 09 09:25:38 crc kubenswrapper[4861]: I0309 09:25:38.154549 4861 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3b893f89-a9bc-4a39-bd26-b394cbb0a374-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 09 09:25:38 crc kubenswrapper[4861]: I0309 09:25:38.154573 4861 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3b893f89-a9bc-4a39-bd26-b394cbb0a374-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 09 09:25:38 crc kubenswrapper[4861]: I0309 09:25:38.154582 4861 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3b893f89-a9bc-4a39-bd26-b394cbb0a374-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:25:38 crc kubenswrapper[4861]: I0309 09:25:38.154591 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jf4rm\" (UniqueName: \"kubernetes.io/projected/3b893f89-a9bc-4a39-bd26-b394cbb0a374-kube-api-access-jf4rm\") on node \"crc\" DevicePath \"\"" Mar 09 09:25:38 crc kubenswrapper[4861]: I0309 09:25:38.154600 4861 
reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3b893f89-a9bc-4a39-bd26-b394cbb0a374-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 09 09:25:38 crc kubenswrapper[4861]: I0309 09:25:38.173164 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7b8fcc65cc-466c6" podStartSLOduration=4.173146214 podStartE2EDuration="4.173146214s" podCreationTimestamp="2026-03-09 09:25:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:25:38.168431986 +0000 UTC m=+1181.253471377" watchObservedRunningTime="2026-03-09 09:25:38.173146214 +0000 UTC m=+1181.258185615" Mar 09 09:25:38 crc kubenswrapper[4861]: I0309 09:25:38.182795 4861 scope.go:117] "RemoveContainer" containerID="aaf3ac9de5dfcab3f64973950fb19a6ea2cf57e74602e7bedfbcec4ae460e46e" Mar 09 09:25:38 crc kubenswrapper[4861]: I0309 09:25:38.193484 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b893f89-a9bc-4a39-bd26-b394cbb0a374-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3b893f89-a9bc-4a39-bd26-b394cbb0a374" (UID: "3b893f89-a9bc-4a39-bd26-b394cbb0a374"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:25:38 crc kubenswrapper[4861]: I0309 09:25:38.197510 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b893f89-a9bc-4a39-bd26-b394cbb0a374-config-data" (OuterVolumeSpecName: "config-data") pod "3b893f89-a9bc-4a39-bd26-b394cbb0a374" (UID: "3b893f89-a9bc-4a39-bd26-b394cbb0a374"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:25:38 crc kubenswrapper[4861]: I0309 09:25:38.255882 4861 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b893f89-a9bc-4a39-bd26-b394cbb0a374-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 09:25:38 crc kubenswrapper[4861]: I0309 09:25:38.255914 4861 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b893f89-a9bc-4a39-bd26-b394cbb0a374-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 09:25:38 crc kubenswrapper[4861]: I0309 09:25:38.380805 4861 scope.go:117] "RemoveContainer" containerID="e15fa3a3dc10d161f1fc033b6f91e18abcb8ed972cc20b55f2cdece20ece59af" Mar 09 09:25:38 crc kubenswrapper[4861]: I0309 09:25:38.411078 4861 scope.go:117] "RemoveContainer" containerID="7c1078a757c55399b615079346367a3c7063e060d616a6c7295df26e889b255e" Mar 09 09:25:38 crc kubenswrapper[4861]: I0309 09:25:38.476400 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 09 09:25:38 crc kubenswrapper[4861]: I0309 09:25:38.479780 4861 scope.go:117] "RemoveContainer" containerID="c24b27beb8691b93bafbdd946919b6e0c5ec4a90322fb39f8123df16c4a121c2" Mar 09 09:25:38 crc kubenswrapper[4861]: E0309 09:25:38.480824 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c24b27beb8691b93bafbdd946919b6e0c5ec4a90322fb39f8123df16c4a121c2\": container with ID starting with c24b27beb8691b93bafbdd946919b6e0c5ec4a90322fb39f8123df16c4a121c2 not found: ID does not exist" containerID="c24b27beb8691b93bafbdd946919b6e0c5ec4a90322fb39f8123df16c4a121c2" Mar 09 09:25:38 crc kubenswrapper[4861]: I0309 09:25:38.480881 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c24b27beb8691b93bafbdd946919b6e0c5ec4a90322fb39f8123df16c4a121c2"} err="failed to get container 
status \"c24b27beb8691b93bafbdd946919b6e0c5ec4a90322fb39f8123df16c4a121c2\": rpc error: code = NotFound desc = could not find container \"c24b27beb8691b93bafbdd946919b6e0c5ec4a90322fb39f8123df16c4a121c2\": container with ID starting with c24b27beb8691b93bafbdd946919b6e0c5ec4a90322fb39f8123df16c4a121c2 not found: ID does not exist" Mar 09 09:25:38 crc kubenswrapper[4861]: I0309 09:25:38.480917 4861 scope.go:117] "RemoveContainer" containerID="aaf3ac9de5dfcab3f64973950fb19a6ea2cf57e74602e7bedfbcec4ae460e46e" Mar 09 09:25:38 crc kubenswrapper[4861]: E0309 09:25:38.484020 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aaf3ac9de5dfcab3f64973950fb19a6ea2cf57e74602e7bedfbcec4ae460e46e\": container with ID starting with aaf3ac9de5dfcab3f64973950fb19a6ea2cf57e74602e7bedfbcec4ae460e46e not found: ID does not exist" containerID="aaf3ac9de5dfcab3f64973950fb19a6ea2cf57e74602e7bedfbcec4ae460e46e" Mar 09 09:25:38 crc kubenswrapper[4861]: I0309 09:25:38.484087 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aaf3ac9de5dfcab3f64973950fb19a6ea2cf57e74602e7bedfbcec4ae460e46e"} err="failed to get container status \"aaf3ac9de5dfcab3f64973950fb19a6ea2cf57e74602e7bedfbcec4ae460e46e\": rpc error: code = NotFound desc = could not find container \"aaf3ac9de5dfcab3f64973950fb19a6ea2cf57e74602e7bedfbcec4ae460e46e\": container with ID starting with aaf3ac9de5dfcab3f64973950fb19a6ea2cf57e74602e7bedfbcec4ae460e46e not found: ID does not exist" Mar 09 09:25:38 crc kubenswrapper[4861]: I0309 09:25:38.484115 4861 scope.go:117] "RemoveContainer" containerID="e15fa3a3dc10d161f1fc033b6f91e18abcb8ed972cc20b55f2cdece20ece59af" Mar 09 09:25:38 crc kubenswrapper[4861]: I0309 09:25:38.485364 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 09 09:25:38 crc kubenswrapper[4861]: E0309 09:25:38.494695 4861 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e15fa3a3dc10d161f1fc033b6f91e18abcb8ed972cc20b55f2cdece20ece59af\": container with ID starting with e15fa3a3dc10d161f1fc033b6f91e18abcb8ed972cc20b55f2cdece20ece59af not found: ID does not exist" containerID="e15fa3a3dc10d161f1fc033b6f91e18abcb8ed972cc20b55f2cdece20ece59af" Mar 09 09:25:38 crc kubenswrapper[4861]: I0309 09:25:38.494742 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e15fa3a3dc10d161f1fc033b6f91e18abcb8ed972cc20b55f2cdece20ece59af"} err="failed to get container status \"e15fa3a3dc10d161f1fc033b6f91e18abcb8ed972cc20b55f2cdece20ece59af\": rpc error: code = NotFound desc = could not find container \"e15fa3a3dc10d161f1fc033b6f91e18abcb8ed972cc20b55f2cdece20ece59af\": container with ID starting with e15fa3a3dc10d161f1fc033b6f91e18abcb8ed972cc20b55f2cdece20ece59af not found: ID does not exist" Mar 09 09:25:38 crc kubenswrapper[4861]: I0309 09:25:38.494768 4861 scope.go:117] "RemoveContainer" containerID="7c1078a757c55399b615079346367a3c7063e060d616a6c7295df26e889b255e" Mar 09 09:25:38 crc kubenswrapper[4861]: E0309 09:25:38.497027 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c1078a757c55399b615079346367a3c7063e060d616a6c7295df26e889b255e\": container with ID starting with 7c1078a757c55399b615079346367a3c7063e060d616a6c7295df26e889b255e not found: ID does not exist" containerID="7c1078a757c55399b615079346367a3c7063e060d616a6c7295df26e889b255e" Mar 09 09:25:38 crc kubenswrapper[4861]: I0309 09:25:38.497067 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c1078a757c55399b615079346367a3c7063e060d616a6c7295df26e889b255e"} err="failed to get container status \"7c1078a757c55399b615079346367a3c7063e060d616a6c7295df26e889b255e\": rpc error: code = NotFound desc = could not find container 
\"7c1078a757c55399b615079346367a3c7063e060d616a6c7295df26e889b255e\": container with ID starting with 7c1078a757c55399b615079346367a3c7063e060d616a6c7295df26e889b255e not found: ID does not exist" Mar 09 09:25:38 crc kubenswrapper[4861]: I0309 09:25:38.516245 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 09 09:25:38 crc kubenswrapper[4861]: E0309 09:25:38.516713 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b893f89-a9bc-4a39-bd26-b394cbb0a374" containerName="sg-core" Mar 09 09:25:38 crc kubenswrapper[4861]: I0309 09:25:38.516741 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b893f89-a9bc-4a39-bd26-b394cbb0a374" containerName="sg-core" Mar 09 09:25:38 crc kubenswrapper[4861]: E0309 09:25:38.516758 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b893f89-a9bc-4a39-bd26-b394cbb0a374" containerName="ceilometer-central-agent" Mar 09 09:25:38 crc kubenswrapper[4861]: I0309 09:25:38.516765 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b893f89-a9bc-4a39-bd26-b394cbb0a374" containerName="ceilometer-central-agent" Mar 09 09:25:38 crc kubenswrapper[4861]: E0309 09:25:38.516801 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b893f89-a9bc-4a39-bd26-b394cbb0a374" containerName="proxy-httpd" Mar 09 09:25:38 crc kubenswrapper[4861]: I0309 09:25:38.516807 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b893f89-a9bc-4a39-bd26-b394cbb0a374" containerName="proxy-httpd" Mar 09 09:25:38 crc kubenswrapper[4861]: E0309 09:25:38.516817 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b893f89-a9bc-4a39-bd26-b394cbb0a374" containerName="ceilometer-notification-agent" Mar 09 09:25:38 crc kubenswrapper[4861]: I0309 09:25:38.516823 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b893f89-a9bc-4a39-bd26-b394cbb0a374" containerName="ceilometer-notification-agent" Mar 09 09:25:38 crc kubenswrapper[4861]: I0309 
09:25:38.517066 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b893f89-a9bc-4a39-bd26-b394cbb0a374" containerName="ceilometer-notification-agent" Mar 09 09:25:38 crc kubenswrapper[4861]: I0309 09:25:38.517091 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b893f89-a9bc-4a39-bd26-b394cbb0a374" containerName="sg-core" Mar 09 09:25:38 crc kubenswrapper[4861]: I0309 09:25:38.517104 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b893f89-a9bc-4a39-bd26-b394cbb0a374" containerName="proxy-httpd" Mar 09 09:25:38 crc kubenswrapper[4861]: I0309 09:25:38.517122 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b893f89-a9bc-4a39-bd26-b394cbb0a374" containerName="ceilometer-central-agent" Mar 09 09:25:38 crc kubenswrapper[4861]: I0309 09:25:38.525185 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 09 09:25:38 crc kubenswrapper[4861]: I0309 09:25:38.532679 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 09 09:25:38 crc kubenswrapper[4861]: I0309 09:25:38.532870 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 09 09:25:38 crc kubenswrapper[4861]: I0309 09:25:38.537275 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 09 09:25:38 crc kubenswrapper[4861]: I0309 09:25:38.560961 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8af4e6e3-c7cc-4e2f-b3bf-48461f86a1f1-log-httpd\") pod \"ceilometer-0\" (UID: \"8af4e6e3-c7cc-4e2f-b3bf-48461f86a1f1\") " pod="openstack/ceilometer-0" Mar 09 09:25:38 crc kubenswrapper[4861]: I0309 09:25:38.561029 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/8af4e6e3-c7cc-4e2f-b3bf-48461f86a1f1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8af4e6e3-c7cc-4e2f-b3bf-48461f86a1f1\") " pod="openstack/ceilometer-0" Mar 09 09:25:38 crc kubenswrapper[4861]: I0309 09:25:38.561102 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsnsk\" (UniqueName: \"kubernetes.io/projected/8af4e6e3-c7cc-4e2f-b3bf-48461f86a1f1-kube-api-access-jsnsk\") pod \"ceilometer-0\" (UID: \"8af4e6e3-c7cc-4e2f-b3bf-48461f86a1f1\") " pod="openstack/ceilometer-0" Mar 09 09:25:38 crc kubenswrapper[4861]: I0309 09:25:38.561136 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8af4e6e3-c7cc-4e2f-b3bf-48461f86a1f1-scripts\") pod \"ceilometer-0\" (UID: \"8af4e6e3-c7cc-4e2f-b3bf-48461f86a1f1\") " pod="openstack/ceilometer-0" Mar 09 09:25:38 crc kubenswrapper[4861]: I0309 09:25:38.561152 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8af4e6e3-c7cc-4e2f-b3bf-48461f86a1f1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8af4e6e3-c7cc-4e2f-b3bf-48461f86a1f1\") " pod="openstack/ceilometer-0" Mar 09 09:25:38 crc kubenswrapper[4861]: I0309 09:25:38.561170 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8af4e6e3-c7cc-4e2f-b3bf-48461f86a1f1-config-data\") pod \"ceilometer-0\" (UID: \"8af4e6e3-c7cc-4e2f-b3bf-48461f86a1f1\") " pod="openstack/ceilometer-0" Mar 09 09:25:38 crc kubenswrapper[4861]: I0309 09:25:38.561198 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8af4e6e3-c7cc-4e2f-b3bf-48461f86a1f1-run-httpd\") pod \"ceilometer-0\" (UID: 
\"8af4e6e3-c7cc-4e2f-b3bf-48461f86a1f1\") " pod="openstack/ceilometer-0" Mar 09 09:25:38 crc kubenswrapper[4861]: I0309 09:25:38.662929 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8af4e6e3-c7cc-4e2f-b3bf-48461f86a1f1-log-httpd\") pod \"ceilometer-0\" (UID: \"8af4e6e3-c7cc-4e2f-b3bf-48461f86a1f1\") " pod="openstack/ceilometer-0" Mar 09 09:25:38 crc kubenswrapper[4861]: I0309 09:25:38.662992 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8af4e6e3-c7cc-4e2f-b3bf-48461f86a1f1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8af4e6e3-c7cc-4e2f-b3bf-48461f86a1f1\") " pod="openstack/ceilometer-0" Mar 09 09:25:38 crc kubenswrapper[4861]: I0309 09:25:38.663057 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jsnsk\" (UniqueName: \"kubernetes.io/projected/8af4e6e3-c7cc-4e2f-b3bf-48461f86a1f1-kube-api-access-jsnsk\") pod \"ceilometer-0\" (UID: \"8af4e6e3-c7cc-4e2f-b3bf-48461f86a1f1\") " pod="openstack/ceilometer-0" Mar 09 09:25:38 crc kubenswrapper[4861]: I0309 09:25:38.663089 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8af4e6e3-c7cc-4e2f-b3bf-48461f86a1f1-scripts\") pod \"ceilometer-0\" (UID: \"8af4e6e3-c7cc-4e2f-b3bf-48461f86a1f1\") " pod="openstack/ceilometer-0" Mar 09 09:25:38 crc kubenswrapper[4861]: I0309 09:25:38.663105 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8af4e6e3-c7cc-4e2f-b3bf-48461f86a1f1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8af4e6e3-c7cc-4e2f-b3bf-48461f86a1f1\") " pod="openstack/ceilometer-0" Mar 09 09:25:38 crc kubenswrapper[4861]: I0309 09:25:38.663124 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/8af4e6e3-c7cc-4e2f-b3bf-48461f86a1f1-config-data\") pod \"ceilometer-0\" (UID: \"8af4e6e3-c7cc-4e2f-b3bf-48461f86a1f1\") " pod="openstack/ceilometer-0" Mar 09 09:25:38 crc kubenswrapper[4861]: I0309 09:25:38.663177 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8af4e6e3-c7cc-4e2f-b3bf-48461f86a1f1-run-httpd\") pod \"ceilometer-0\" (UID: \"8af4e6e3-c7cc-4e2f-b3bf-48461f86a1f1\") " pod="openstack/ceilometer-0" Mar 09 09:25:38 crc kubenswrapper[4861]: I0309 09:25:38.664057 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8af4e6e3-c7cc-4e2f-b3bf-48461f86a1f1-run-httpd\") pod \"ceilometer-0\" (UID: \"8af4e6e3-c7cc-4e2f-b3bf-48461f86a1f1\") " pod="openstack/ceilometer-0" Mar 09 09:25:38 crc kubenswrapper[4861]: I0309 09:25:38.664279 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8af4e6e3-c7cc-4e2f-b3bf-48461f86a1f1-log-httpd\") pod \"ceilometer-0\" (UID: \"8af4e6e3-c7cc-4e2f-b3bf-48461f86a1f1\") " pod="openstack/ceilometer-0" Mar 09 09:25:38 crc kubenswrapper[4861]: I0309 09:25:38.671225 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8af4e6e3-c7cc-4e2f-b3bf-48461f86a1f1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8af4e6e3-c7cc-4e2f-b3bf-48461f86a1f1\") " pod="openstack/ceilometer-0" Mar 09 09:25:38 crc kubenswrapper[4861]: I0309 09:25:38.672901 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8af4e6e3-c7cc-4e2f-b3bf-48461f86a1f1-config-data\") pod \"ceilometer-0\" (UID: \"8af4e6e3-c7cc-4e2f-b3bf-48461f86a1f1\") " pod="openstack/ceilometer-0" Mar 09 09:25:38 crc kubenswrapper[4861]: I0309 09:25:38.673587 4861 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8af4e6e3-c7cc-4e2f-b3bf-48461f86a1f1-scripts\") pod \"ceilometer-0\" (UID: \"8af4e6e3-c7cc-4e2f-b3bf-48461f86a1f1\") " pod="openstack/ceilometer-0" Mar 09 09:25:38 crc kubenswrapper[4861]: I0309 09:25:38.679868 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8af4e6e3-c7cc-4e2f-b3bf-48461f86a1f1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8af4e6e3-c7cc-4e2f-b3bf-48461f86a1f1\") " pod="openstack/ceilometer-0" Mar 09 09:25:38 crc kubenswrapper[4861]: I0309 09:25:38.702619 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jsnsk\" (UniqueName: \"kubernetes.io/projected/8af4e6e3-c7cc-4e2f-b3bf-48461f86a1f1-kube-api-access-jsnsk\") pod \"ceilometer-0\" (UID: \"8af4e6e3-c7cc-4e2f-b3bf-48461f86a1f1\") " pod="openstack/ceilometer-0" Mar 09 09:25:38 crc kubenswrapper[4861]: I0309 09:25:38.845904 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 09 09:25:39 crc kubenswrapper[4861]: I0309 09:25:39.214074 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3d0b7d4f-14a7-4c8a-b6e6-b98cce229f7a","Type":"ContainerStarted","Data":"687d779e14aa31b5a5bfd8d961aa81863f1af619b3ce31c1804ce1b27e08e60d"} Mar 09 09:25:39 crc kubenswrapper[4861]: I0309 09:25:39.214316 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="3d0b7d4f-14a7-4c8a-b6e6-b98cce229f7a" containerName="cinder-api-log" containerID="cri-o://2c79b4b0ca27df0f16a48c63dea824fd62d21f4930a8301169db649d33109e66" gracePeriod=30 Mar 09 09:25:39 crc kubenswrapper[4861]: I0309 09:25:39.214483 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 09 09:25:39 crc kubenswrapper[4861]: I0309 09:25:39.214580 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="3d0b7d4f-14a7-4c8a-b6e6-b98cce229f7a" containerName="cinder-api" containerID="cri-o://687d779e14aa31b5a5bfd8d961aa81863f1af619b3ce31c1804ce1b27e08e60d" gracePeriod=30 Mar 09 09:25:39 crc kubenswrapper[4861]: I0309 09:25:39.247855 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7b6689bdbc-t6phd" event={"ID":"82a35d2d-6934-4c56-a62d-db22ac36a6be","Type":"ContainerStarted","Data":"eb12429930f757ebe14c7dc6296ef325845128846bc988eea6d244955066fc6b"} Mar 09 09:25:39 crc kubenswrapper[4861]: I0309 09:25:39.247909 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7b6689bdbc-t6phd" event={"ID":"82a35d2d-6934-4c56-a62d-db22ac36a6be","Type":"ContainerStarted","Data":"010afb1580a1657c6ef684effe9d30e3802211cfb3dbf3f421578b823e1b59ea"} Mar 09 09:25:39 crc kubenswrapper[4861]: I0309 09:25:39.270318 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/cinder-api-0" podStartSLOduration=5.270301113 podStartE2EDuration="5.270301113s" podCreationTimestamp="2026-03-09 09:25:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:25:39.270145859 +0000 UTC m=+1182.355185280" watchObservedRunningTime="2026-03-09 09:25:39.270301113 +0000 UTC m=+1182.355340514" Mar 09 09:25:39 crc kubenswrapper[4861]: I0309 09:25:39.272731 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7f66785d8-vkcmq" event={"ID":"8bc3e378-d567-4ba4-b135-1393faa1dbc6","Type":"ContainerStarted","Data":"5d990d7e740ec67763ce6d9681961ae4023c305686a03ecf346097bd9d16413c"} Mar 09 09:25:39 crc kubenswrapper[4861]: I0309 09:25:39.272775 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7f66785d8-vkcmq" event={"ID":"8bc3e378-d567-4ba4-b135-1393faa1dbc6","Type":"ContainerStarted","Data":"b8cba48ff68e277cf6a2a88d38f10dbd16adcecdf0f9f90440819a13349842df"} Mar 09 09:25:39 crc kubenswrapper[4861]: I0309 09:25:39.321918 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"31e86fa6-cfa9-488f-b87f-ea95b05f8531","Type":"ContainerStarted","Data":"60e7294b8c9dd84642ac16dc7db39a6608018b69c3bca361fc06584d76e3e486"} Mar 09 09:25:39 crc kubenswrapper[4861]: I0309 09:25:39.437834 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-7b6689bdbc-t6phd" podStartSLOduration=3.526399243 podStartE2EDuration="5.437814734s" podCreationTimestamp="2026-03-09 09:25:34 +0000 UTC" firstStartedPulling="2026-03-09 09:25:36.034718296 +0000 UTC m=+1179.119757687" lastFinishedPulling="2026-03-09 09:25:37.946133777 +0000 UTC m=+1181.031173178" observedRunningTime="2026-03-09 09:25:39.33563078 +0000 UTC m=+1182.420670191" watchObservedRunningTime="2026-03-09 09:25:39.437814734 +0000 UTC 
m=+1182.522854135" Mar 09 09:25:39 crc kubenswrapper[4861]: I0309 09:25:39.461827 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.666227099 podStartE2EDuration="5.461806824s" podCreationTimestamp="2026-03-09 09:25:34 +0000 UTC" firstStartedPulling="2026-03-09 09:25:35.576446248 +0000 UTC m=+1178.661485649" lastFinishedPulling="2026-03-09 09:25:36.372025963 +0000 UTC m=+1179.457065374" observedRunningTime="2026-03-09 09:25:39.384694853 +0000 UTC m=+1182.469734254" watchObservedRunningTime="2026-03-09 09:25:39.461806824 +0000 UTC m=+1182.546846215" Mar 09 09:25:39 crc kubenswrapper[4861]: I0309 09:25:39.471259 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-7f66785d8-vkcmq" podStartSLOduration=3.277356224 podStartE2EDuration="5.47124153s" podCreationTimestamp="2026-03-09 09:25:34 +0000 UTC" firstStartedPulling="2026-03-09 09:25:35.708930766 +0000 UTC m=+1178.793970167" lastFinishedPulling="2026-03-09 09:25:37.902816072 +0000 UTC m=+1180.987855473" observedRunningTime="2026-03-09 09:25:39.435103064 +0000 UTC m=+1182.520142465" watchObservedRunningTime="2026-03-09 09:25:39.47124153 +0000 UTC m=+1182.556280931" Mar 09 09:25:39 crc kubenswrapper[4861]: I0309 09:25:39.498117 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 09 09:25:39 crc kubenswrapper[4861]: I0309 09:25:39.637958 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-7fb7d5546d-n665d" Mar 09 09:25:39 crc kubenswrapper[4861]: I0309 09:25:39.680603 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b893f89-a9bc-4a39-bd26-b394cbb0a374" path="/var/lib/kubelet/pods/3b893f89-a9bc-4a39-bd26-b394cbb0a374/volumes" Mar 09 09:25:39 crc kubenswrapper[4861]: I0309 09:25:39.917770 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/cinder-scheduler-0" Mar 09 09:25:40 crc kubenswrapper[4861]: I0309 09:25:40.055512 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-8dfc7bfd5-qgpjh"] Mar 09 09:25:40 crc kubenswrapper[4861]: I0309 09:25:40.056167 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-8dfc7bfd5-qgpjh" podUID="bbbb2b98-1a7f-45f2-999d-8c4d1d879c01" containerName="neutron-api" containerID="cri-o://62d9f61865c0aba71cc143cf38566dfd023bf5ccc20494aec11ffb3b58f48655" gracePeriod=30 Mar 09 09:25:40 crc kubenswrapper[4861]: I0309 09:25:40.056531 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-8dfc7bfd5-qgpjh" podUID="bbbb2b98-1a7f-45f2-999d-8c4d1d879c01" containerName="neutron-httpd" containerID="cri-o://2acb7c830dfedd1fba55a973d93130f09024212c84ce99b6a2152b544d5cbd8a" gracePeriod=30 Mar 09 09:25:40 crc kubenswrapper[4861]: I0309 09:25:40.068042 4861 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-8dfc7bfd5-qgpjh" podUID="bbbb2b98-1a7f-45f2-999d-8c4d1d879c01" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.159:9696/\": read tcp 10.217.0.2:42138->10.217.0.159:9696: read: connection reset by peer" Mar 09 09:25:40 crc kubenswrapper[4861]: I0309 09:25:40.116435 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-665f4b6689-tfdk9"] Mar 09 09:25:40 crc kubenswrapper[4861]: I0309 09:25:40.123322 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-665f4b6689-tfdk9" Mar 09 09:25:40 crc kubenswrapper[4861]: I0309 09:25:40.142558 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-665f4b6689-tfdk9"] Mar 09 09:25:40 crc kubenswrapper[4861]: I0309 09:25:40.215667 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/db5464b8-011f-4569-a47e-36766fa6c72e-config\") pod \"neutron-665f4b6689-tfdk9\" (UID: \"db5464b8-011f-4569-a47e-36766fa6c72e\") " pod="openstack/neutron-665f4b6689-tfdk9" Mar 09 09:25:40 crc kubenswrapper[4861]: I0309 09:25:40.216661 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/db5464b8-011f-4569-a47e-36766fa6c72e-httpd-config\") pod \"neutron-665f4b6689-tfdk9\" (UID: \"db5464b8-011f-4569-a47e-36766fa6c72e\") " pod="openstack/neutron-665f4b6689-tfdk9" Mar 09 09:25:40 crc kubenswrapper[4861]: I0309 09:25:40.216774 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db5464b8-011f-4569-a47e-36766fa6c72e-combined-ca-bundle\") pod \"neutron-665f4b6689-tfdk9\" (UID: \"db5464b8-011f-4569-a47e-36766fa6c72e\") " pod="openstack/neutron-665f4b6689-tfdk9" Mar 09 09:25:40 crc kubenswrapper[4861]: I0309 09:25:40.216860 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/db5464b8-011f-4569-a47e-36766fa6c72e-public-tls-certs\") pod \"neutron-665f4b6689-tfdk9\" (UID: \"db5464b8-011f-4569-a47e-36766fa6c72e\") " pod="openstack/neutron-665f4b6689-tfdk9" Mar 09 09:25:40 crc kubenswrapper[4861]: I0309 09:25:40.217047 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shwh5\" 
(UniqueName: \"kubernetes.io/projected/db5464b8-011f-4569-a47e-36766fa6c72e-kube-api-access-shwh5\") pod \"neutron-665f4b6689-tfdk9\" (UID: \"db5464b8-011f-4569-a47e-36766fa6c72e\") " pod="openstack/neutron-665f4b6689-tfdk9" Mar 09 09:25:40 crc kubenswrapper[4861]: I0309 09:25:40.217242 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/db5464b8-011f-4569-a47e-36766fa6c72e-internal-tls-certs\") pod \"neutron-665f4b6689-tfdk9\" (UID: \"db5464b8-011f-4569-a47e-36766fa6c72e\") " pod="openstack/neutron-665f4b6689-tfdk9" Mar 09 09:25:40 crc kubenswrapper[4861]: I0309 09:25:40.217265 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/db5464b8-011f-4569-a47e-36766fa6c72e-ovndb-tls-certs\") pod \"neutron-665f4b6689-tfdk9\" (UID: \"db5464b8-011f-4569-a47e-36766fa6c72e\") " pod="openstack/neutron-665f4b6689-tfdk9" Mar 09 09:25:40 crc kubenswrapper[4861]: I0309 09:25:40.319408 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/db5464b8-011f-4569-a47e-36766fa6c72e-internal-tls-certs\") pod \"neutron-665f4b6689-tfdk9\" (UID: \"db5464b8-011f-4569-a47e-36766fa6c72e\") " pod="openstack/neutron-665f4b6689-tfdk9" Mar 09 09:25:40 crc kubenswrapper[4861]: I0309 09:25:40.319452 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/db5464b8-011f-4569-a47e-36766fa6c72e-ovndb-tls-certs\") pod \"neutron-665f4b6689-tfdk9\" (UID: \"db5464b8-011f-4569-a47e-36766fa6c72e\") " pod="openstack/neutron-665f4b6689-tfdk9" Mar 09 09:25:40 crc kubenswrapper[4861]: I0309 09:25:40.319563 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/db5464b8-011f-4569-a47e-36766fa6c72e-config\") pod \"neutron-665f4b6689-tfdk9\" (UID: \"db5464b8-011f-4569-a47e-36766fa6c72e\") " pod="openstack/neutron-665f4b6689-tfdk9" Mar 09 09:25:40 crc kubenswrapper[4861]: I0309 09:25:40.319581 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/db5464b8-011f-4569-a47e-36766fa6c72e-httpd-config\") pod \"neutron-665f4b6689-tfdk9\" (UID: \"db5464b8-011f-4569-a47e-36766fa6c72e\") " pod="openstack/neutron-665f4b6689-tfdk9" Mar 09 09:25:40 crc kubenswrapper[4861]: I0309 09:25:40.319600 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db5464b8-011f-4569-a47e-36766fa6c72e-combined-ca-bundle\") pod \"neutron-665f4b6689-tfdk9\" (UID: \"db5464b8-011f-4569-a47e-36766fa6c72e\") " pod="openstack/neutron-665f4b6689-tfdk9" Mar 09 09:25:40 crc kubenswrapper[4861]: I0309 09:25:40.319623 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/db5464b8-011f-4569-a47e-36766fa6c72e-public-tls-certs\") pod \"neutron-665f4b6689-tfdk9\" (UID: \"db5464b8-011f-4569-a47e-36766fa6c72e\") " pod="openstack/neutron-665f4b6689-tfdk9" Mar 09 09:25:40 crc kubenswrapper[4861]: I0309 09:25:40.319648 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shwh5\" (UniqueName: \"kubernetes.io/projected/db5464b8-011f-4569-a47e-36766fa6c72e-kube-api-access-shwh5\") pod \"neutron-665f4b6689-tfdk9\" (UID: \"db5464b8-011f-4569-a47e-36766fa6c72e\") " pod="openstack/neutron-665f4b6689-tfdk9" Mar 09 09:25:40 crc kubenswrapper[4861]: I0309 09:25:40.325691 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/db5464b8-011f-4569-a47e-36766fa6c72e-internal-tls-certs\") pod 
\"neutron-665f4b6689-tfdk9\" (UID: \"db5464b8-011f-4569-a47e-36766fa6c72e\") " pod="openstack/neutron-665f4b6689-tfdk9" Mar 09 09:25:40 crc kubenswrapper[4861]: I0309 09:25:40.328645 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/db5464b8-011f-4569-a47e-36766fa6c72e-public-tls-certs\") pod \"neutron-665f4b6689-tfdk9\" (UID: \"db5464b8-011f-4569-a47e-36766fa6c72e\") " pod="openstack/neutron-665f4b6689-tfdk9" Mar 09 09:25:40 crc kubenswrapper[4861]: I0309 09:25:40.337318 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db5464b8-011f-4569-a47e-36766fa6c72e-combined-ca-bundle\") pod \"neutron-665f4b6689-tfdk9\" (UID: \"db5464b8-011f-4569-a47e-36766fa6c72e\") " pod="openstack/neutron-665f4b6689-tfdk9" Mar 09 09:25:40 crc kubenswrapper[4861]: I0309 09:25:40.345194 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/db5464b8-011f-4569-a47e-36766fa6c72e-httpd-config\") pod \"neutron-665f4b6689-tfdk9\" (UID: \"db5464b8-011f-4569-a47e-36766fa6c72e\") " pod="openstack/neutron-665f4b6689-tfdk9" Mar 09 09:25:40 crc kubenswrapper[4861]: I0309 09:25:40.345744 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/db5464b8-011f-4569-a47e-36766fa6c72e-ovndb-tls-certs\") pod \"neutron-665f4b6689-tfdk9\" (UID: \"db5464b8-011f-4569-a47e-36766fa6c72e\") " pod="openstack/neutron-665f4b6689-tfdk9" Mar 09 09:25:40 crc kubenswrapper[4861]: I0309 09:25:40.348086 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/db5464b8-011f-4569-a47e-36766fa6c72e-config\") pod \"neutron-665f4b6689-tfdk9\" (UID: \"db5464b8-011f-4569-a47e-36766fa6c72e\") " pod="openstack/neutron-665f4b6689-tfdk9" Mar 09 09:25:40 crc 
kubenswrapper[4861]: I0309 09:25:40.352486 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shwh5\" (UniqueName: \"kubernetes.io/projected/db5464b8-011f-4569-a47e-36766fa6c72e-kube-api-access-shwh5\") pod \"neutron-665f4b6689-tfdk9\" (UID: \"db5464b8-011f-4569-a47e-36766fa6c72e\") " pod="openstack/neutron-665f4b6689-tfdk9" Mar 09 09:25:40 crc kubenswrapper[4861]: I0309 09:25:40.357904 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8af4e6e3-c7cc-4e2f-b3bf-48461f86a1f1","Type":"ContainerStarted","Data":"7876f246bbc8d15cd13ada8a14e25713502882d6b23c20b00c4b0ddde4ebdc34"} Mar 09 09:25:40 crc kubenswrapper[4861]: I0309 09:25:40.360111 4861 generic.go:334] "Generic (PLEG): container finished" podID="3d0b7d4f-14a7-4c8a-b6e6-b98cce229f7a" containerID="687d779e14aa31b5a5bfd8d961aa81863f1af619b3ce31c1804ce1b27e08e60d" exitCode=0 Mar 09 09:25:40 crc kubenswrapper[4861]: I0309 09:25:40.360135 4861 generic.go:334] "Generic (PLEG): container finished" podID="3d0b7d4f-14a7-4c8a-b6e6-b98cce229f7a" containerID="2c79b4b0ca27df0f16a48c63dea824fd62d21f4930a8301169db649d33109e66" exitCode=143 Mar 09 09:25:40 crc kubenswrapper[4861]: I0309 09:25:40.360175 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3d0b7d4f-14a7-4c8a-b6e6-b98cce229f7a","Type":"ContainerDied","Data":"687d779e14aa31b5a5bfd8d961aa81863f1af619b3ce31c1804ce1b27e08e60d"} Mar 09 09:25:40 crc kubenswrapper[4861]: I0309 09:25:40.360189 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3d0b7d4f-14a7-4c8a-b6e6-b98cce229f7a","Type":"ContainerDied","Data":"2c79b4b0ca27df0f16a48c63dea824fd62d21f4930a8301169db649d33109e66"} Mar 09 09:25:40 crc kubenswrapper[4861]: I0309 09:25:40.360198 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"3d0b7d4f-14a7-4c8a-b6e6-b98cce229f7a","Type":"ContainerDied","Data":"d4a623eedaab959360b90da881bac9206fbe0adfd2d7532b203502c47fcdb022"} Mar 09 09:25:40 crc kubenswrapper[4861]: I0309 09:25:40.360208 4861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d4a623eedaab959360b90da881bac9206fbe0adfd2d7532b203502c47fcdb022" Mar 09 09:25:40 crc kubenswrapper[4861]: I0309 09:25:40.362187 4861 generic.go:334] "Generic (PLEG): container finished" podID="bbbb2b98-1a7f-45f2-999d-8c4d1d879c01" containerID="2acb7c830dfedd1fba55a973d93130f09024212c84ce99b6a2152b544d5cbd8a" exitCode=0 Mar 09 09:25:40 crc kubenswrapper[4861]: I0309 09:25:40.362226 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-8dfc7bfd5-qgpjh" event={"ID":"bbbb2b98-1a7f-45f2-999d-8c4d1d879c01","Type":"ContainerDied","Data":"2acb7c830dfedd1fba55a973d93130f09024212c84ce99b6a2152b544d5cbd8a"} Mar 09 09:25:40 crc kubenswrapper[4861]: I0309 09:25:40.416470 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 09 09:25:40 crc kubenswrapper[4861]: I0309 09:25:40.447587 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-665f4b6689-tfdk9" Mar 09 09:25:40 crc kubenswrapper[4861]: I0309 09:25:40.524048 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d0b7d4f-14a7-4c8a-b6e6-b98cce229f7a-logs\") pod \"3d0b7d4f-14a7-4c8a-b6e6-b98cce229f7a\" (UID: \"3d0b7d4f-14a7-4c8a-b6e6-b98cce229f7a\") " Mar 09 09:25:40 crc kubenswrapper[4861]: I0309 09:25:40.524123 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d0b7d4f-14a7-4c8a-b6e6-b98cce229f7a-combined-ca-bundle\") pod \"3d0b7d4f-14a7-4c8a-b6e6-b98cce229f7a\" (UID: \"3d0b7d4f-14a7-4c8a-b6e6-b98cce229f7a\") " Mar 09 09:25:40 crc kubenswrapper[4861]: I0309 09:25:40.524198 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8k2kc\" (UniqueName: \"kubernetes.io/projected/3d0b7d4f-14a7-4c8a-b6e6-b98cce229f7a-kube-api-access-8k2kc\") pod \"3d0b7d4f-14a7-4c8a-b6e6-b98cce229f7a\" (UID: \"3d0b7d4f-14a7-4c8a-b6e6-b98cce229f7a\") " Mar 09 09:25:40 crc kubenswrapper[4861]: I0309 09:25:40.524227 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d0b7d4f-14a7-4c8a-b6e6-b98cce229f7a-scripts\") pod \"3d0b7d4f-14a7-4c8a-b6e6-b98cce229f7a\" (UID: \"3d0b7d4f-14a7-4c8a-b6e6-b98cce229f7a\") " Mar 09 09:25:40 crc kubenswrapper[4861]: I0309 09:25:40.524248 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3d0b7d4f-14a7-4c8a-b6e6-b98cce229f7a-etc-machine-id\") pod \"3d0b7d4f-14a7-4c8a-b6e6-b98cce229f7a\" (UID: \"3d0b7d4f-14a7-4c8a-b6e6-b98cce229f7a\") " Mar 09 09:25:40 crc kubenswrapper[4861]: I0309 09:25:40.524285 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/3d0b7d4f-14a7-4c8a-b6e6-b98cce229f7a-config-data\") pod \"3d0b7d4f-14a7-4c8a-b6e6-b98cce229f7a\" (UID: \"3d0b7d4f-14a7-4c8a-b6e6-b98cce229f7a\") " Mar 09 09:25:40 crc kubenswrapper[4861]: I0309 09:25:40.524345 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3d0b7d4f-14a7-4c8a-b6e6-b98cce229f7a-config-data-custom\") pod \"3d0b7d4f-14a7-4c8a-b6e6-b98cce229f7a\" (UID: \"3d0b7d4f-14a7-4c8a-b6e6-b98cce229f7a\") " Mar 09 09:25:40 crc kubenswrapper[4861]: I0309 09:25:40.524791 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d0b7d4f-14a7-4c8a-b6e6-b98cce229f7a-logs" (OuterVolumeSpecName: "logs") pod "3d0b7d4f-14a7-4c8a-b6e6-b98cce229f7a" (UID: "3d0b7d4f-14a7-4c8a-b6e6-b98cce229f7a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:25:40 crc kubenswrapper[4861]: I0309 09:25:40.530488 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3d0b7d4f-14a7-4c8a-b6e6-b98cce229f7a-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "3d0b7d4f-14a7-4c8a-b6e6-b98cce229f7a" (UID: "3d0b7d4f-14a7-4c8a-b6e6-b98cce229f7a"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 09:25:40 crc kubenswrapper[4861]: I0309 09:25:40.536949 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d0b7d4f-14a7-4c8a-b6e6-b98cce229f7a-scripts" (OuterVolumeSpecName: "scripts") pod "3d0b7d4f-14a7-4c8a-b6e6-b98cce229f7a" (UID: "3d0b7d4f-14a7-4c8a-b6e6-b98cce229f7a"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:25:40 crc kubenswrapper[4861]: I0309 09:25:40.541102 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d0b7d4f-14a7-4c8a-b6e6-b98cce229f7a-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "3d0b7d4f-14a7-4c8a-b6e6-b98cce229f7a" (UID: "3d0b7d4f-14a7-4c8a-b6e6-b98cce229f7a"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:25:40 crc kubenswrapper[4861]: I0309 09:25:40.543037 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d0b7d4f-14a7-4c8a-b6e6-b98cce229f7a-kube-api-access-8k2kc" (OuterVolumeSpecName: "kube-api-access-8k2kc") pod "3d0b7d4f-14a7-4c8a-b6e6-b98cce229f7a" (UID: "3d0b7d4f-14a7-4c8a-b6e6-b98cce229f7a"). InnerVolumeSpecName "kube-api-access-8k2kc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:25:40 crc kubenswrapper[4861]: I0309 09:25:40.577152 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d0b7d4f-14a7-4c8a-b6e6-b98cce229f7a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3d0b7d4f-14a7-4c8a-b6e6-b98cce229f7a" (UID: "3d0b7d4f-14a7-4c8a-b6e6-b98cce229f7a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:25:40 crc kubenswrapper[4861]: I0309 09:25:40.631552 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d0b7d4f-14a7-4c8a-b6e6-b98cce229f7a-config-data" (OuterVolumeSpecName: "config-data") pod "3d0b7d4f-14a7-4c8a-b6e6-b98cce229f7a" (UID: "3d0b7d4f-14a7-4c8a-b6e6-b98cce229f7a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:25:40 crc kubenswrapper[4861]: I0309 09:25:40.639586 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8k2kc\" (UniqueName: \"kubernetes.io/projected/3d0b7d4f-14a7-4c8a-b6e6-b98cce229f7a-kube-api-access-8k2kc\") on node \"crc\" DevicePath \"\"" Mar 09 09:25:40 crc kubenswrapper[4861]: I0309 09:25:40.639621 4861 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d0b7d4f-14a7-4c8a-b6e6-b98cce229f7a-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:25:40 crc kubenswrapper[4861]: I0309 09:25:40.639637 4861 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3d0b7d4f-14a7-4c8a-b6e6-b98cce229f7a-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 09 09:25:40 crc kubenswrapper[4861]: I0309 09:25:40.639648 4861 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d0b7d4f-14a7-4c8a-b6e6-b98cce229f7a-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 09:25:40 crc kubenswrapper[4861]: I0309 09:25:40.639660 4861 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3d0b7d4f-14a7-4c8a-b6e6-b98cce229f7a-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 09 09:25:40 crc kubenswrapper[4861]: I0309 09:25:40.639670 4861 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d0b7d4f-14a7-4c8a-b6e6-b98cce229f7a-logs\") on node \"crc\" DevicePath \"\"" Mar 09 09:25:40 crc kubenswrapper[4861]: I0309 09:25:40.639679 4861 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d0b7d4f-14a7-4c8a-b6e6-b98cce229f7a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 09:25:40 crc kubenswrapper[4861]: I0309 09:25:40.968808 4861 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-679c9c695-9vt85" Mar 09 09:25:41 crc kubenswrapper[4861]: I0309 09:25:41.103963 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-665f4b6689-tfdk9"] Mar 09 09:25:41 crc kubenswrapper[4861]: I0309 09:25:41.152131 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/906444a4-92dd-48ac-931d-0799f1256e9b-scripts\") pod \"906444a4-92dd-48ac-931d-0799f1256e9b\" (UID: \"906444a4-92dd-48ac-931d-0799f1256e9b\") " Mar 09 09:25:41 crc kubenswrapper[4861]: I0309 09:25:41.152331 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/906444a4-92dd-48ac-931d-0799f1256e9b-horizon-secret-key\") pod \"906444a4-92dd-48ac-931d-0799f1256e9b\" (UID: \"906444a4-92dd-48ac-931d-0799f1256e9b\") " Mar 09 09:25:41 crc kubenswrapper[4861]: I0309 09:25:41.152379 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/906444a4-92dd-48ac-931d-0799f1256e9b-logs\") pod \"906444a4-92dd-48ac-931d-0799f1256e9b\" (UID: \"906444a4-92dd-48ac-931d-0799f1256e9b\") " Mar 09 09:25:41 crc kubenswrapper[4861]: I0309 09:25:41.152418 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6vl74\" (UniqueName: \"kubernetes.io/projected/906444a4-92dd-48ac-931d-0799f1256e9b-kube-api-access-6vl74\") pod \"906444a4-92dd-48ac-931d-0799f1256e9b\" (UID: \"906444a4-92dd-48ac-931d-0799f1256e9b\") " Mar 09 09:25:41 crc kubenswrapper[4861]: I0309 09:25:41.152496 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/906444a4-92dd-48ac-931d-0799f1256e9b-config-data\") pod \"906444a4-92dd-48ac-931d-0799f1256e9b\" (UID: 
\"906444a4-92dd-48ac-931d-0799f1256e9b\") " Mar 09 09:25:41 crc kubenswrapper[4861]: I0309 09:25:41.155720 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/906444a4-92dd-48ac-931d-0799f1256e9b-logs" (OuterVolumeSpecName: "logs") pod "906444a4-92dd-48ac-931d-0799f1256e9b" (UID: "906444a4-92dd-48ac-931d-0799f1256e9b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:25:41 crc kubenswrapper[4861]: I0309 09:25:41.158895 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/906444a4-92dd-48ac-931d-0799f1256e9b-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "906444a4-92dd-48ac-931d-0799f1256e9b" (UID: "906444a4-92dd-48ac-931d-0799f1256e9b"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:25:41 crc kubenswrapper[4861]: I0309 09:25:41.164538 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/906444a4-92dd-48ac-931d-0799f1256e9b-kube-api-access-6vl74" (OuterVolumeSpecName: "kube-api-access-6vl74") pod "906444a4-92dd-48ac-931d-0799f1256e9b" (UID: "906444a4-92dd-48ac-931d-0799f1256e9b"). InnerVolumeSpecName "kube-api-access-6vl74". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:25:41 crc kubenswrapper[4861]: I0309 09:25:41.180040 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/906444a4-92dd-48ac-931d-0799f1256e9b-config-data" (OuterVolumeSpecName: "config-data") pod "906444a4-92dd-48ac-931d-0799f1256e9b" (UID: "906444a4-92dd-48ac-931d-0799f1256e9b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:25:41 crc kubenswrapper[4861]: I0309 09:25:41.189829 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/906444a4-92dd-48ac-931d-0799f1256e9b-scripts" (OuterVolumeSpecName: "scripts") pod "906444a4-92dd-48ac-931d-0799f1256e9b" (UID: "906444a4-92dd-48ac-931d-0799f1256e9b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:25:41 crc kubenswrapper[4861]: I0309 09:25:41.257925 4861 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/906444a4-92dd-48ac-931d-0799f1256e9b-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 09:25:41 crc kubenswrapper[4861]: I0309 09:25:41.257957 4861 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/906444a4-92dd-48ac-931d-0799f1256e9b-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:25:41 crc kubenswrapper[4861]: I0309 09:25:41.257966 4861 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/906444a4-92dd-48ac-931d-0799f1256e9b-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 09 09:25:41 crc kubenswrapper[4861]: I0309 09:25:41.257975 4861 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/906444a4-92dd-48ac-931d-0799f1256e9b-logs\") on node \"crc\" DevicePath \"\"" Mar 09 09:25:41 crc kubenswrapper[4861]: I0309 09:25:41.257984 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6vl74\" (UniqueName: \"kubernetes.io/projected/906444a4-92dd-48ac-931d-0799f1256e9b-kube-api-access-6vl74\") on node \"crc\" DevicePath \"\"" Mar 09 09:25:41 crc kubenswrapper[4861]: I0309 09:25:41.326925 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-5b9d58d97d-h7pvq"] Mar 09 09:25:41 crc kubenswrapper[4861]: 
E0309 09:25:41.327610 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d0b7d4f-14a7-4c8a-b6e6-b98cce229f7a" containerName="cinder-api-log" Mar 09 09:25:41 crc kubenswrapper[4861]: I0309 09:25:41.327630 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d0b7d4f-14a7-4c8a-b6e6-b98cce229f7a" containerName="cinder-api-log" Mar 09 09:25:41 crc kubenswrapper[4861]: E0309 09:25:41.327651 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="906444a4-92dd-48ac-931d-0799f1256e9b" containerName="horizon-log" Mar 09 09:25:41 crc kubenswrapper[4861]: I0309 09:25:41.327660 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="906444a4-92dd-48ac-931d-0799f1256e9b" containerName="horizon-log" Mar 09 09:25:41 crc kubenswrapper[4861]: E0309 09:25:41.327679 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d0b7d4f-14a7-4c8a-b6e6-b98cce229f7a" containerName="cinder-api" Mar 09 09:25:41 crc kubenswrapper[4861]: I0309 09:25:41.327684 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d0b7d4f-14a7-4c8a-b6e6-b98cce229f7a" containerName="cinder-api" Mar 09 09:25:41 crc kubenswrapper[4861]: E0309 09:25:41.327700 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="906444a4-92dd-48ac-931d-0799f1256e9b" containerName="horizon" Mar 09 09:25:41 crc kubenswrapper[4861]: I0309 09:25:41.327707 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="906444a4-92dd-48ac-931d-0799f1256e9b" containerName="horizon" Mar 09 09:25:41 crc kubenswrapper[4861]: I0309 09:25:41.327871 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="906444a4-92dd-48ac-931d-0799f1256e9b" containerName="horizon-log" Mar 09 09:25:41 crc kubenswrapper[4861]: I0309 09:25:41.327886 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="906444a4-92dd-48ac-931d-0799f1256e9b" containerName="horizon" Mar 09 09:25:41 crc kubenswrapper[4861]: I0309 09:25:41.327899 4861 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="3d0b7d4f-14a7-4c8a-b6e6-b98cce229f7a" containerName="cinder-api" Mar 09 09:25:41 crc kubenswrapper[4861]: I0309 09:25:41.327908 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d0b7d4f-14a7-4c8a-b6e6-b98cce229f7a" containerName="cinder-api-log" Mar 09 09:25:41 crc kubenswrapper[4861]: I0309 09:25:41.328792 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5b9d58d97d-h7pvq" Mar 09 09:25:41 crc kubenswrapper[4861]: I0309 09:25:41.331068 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Mar 09 09:25:41 crc kubenswrapper[4861]: I0309 09:25:41.331802 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Mar 09 09:25:41 crc kubenswrapper[4861]: I0309 09:25:41.342203 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5b9d58d97d-h7pvq"] Mar 09 09:25:41 crc kubenswrapper[4861]: I0309 09:25:41.408900 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-665f4b6689-tfdk9" event={"ID":"db5464b8-011f-4569-a47e-36766fa6c72e","Type":"ContainerStarted","Data":"1cf4e6102de86b8c88eaac2b0a370e4d7006f3052dc32e2512a936df1a71fefc"} Mar 09 09:25:41 crc kubenswrapper[4861]: I0309 09:25:41.417174 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8af4e6e3-c7cc-4e2f-b3bf-48461f86a1f1","Type":"ContainerStarted","Data":"75a923c67628e7a6170db32559a1aab94be16069f2bfa80eb9fe6cbde9ed6208"} Mar 09 09:25:41 crc kubenswrapper[4861]: I0309 09:25:41.421856 4861 generic.go:334] "Generic (PLEG): container finished" podID="906444a4-92dd-48ac-931d-0799f1256e9b" containerID="63fca8073164eb35d747996d9451264d49b4dbd0ca09db4d4dbcbecc2c70924d" exitCode=137 Mar 09 09:25:41 crc kubenswrapper[4861]: I0309 09:25:41.421894 4861 generic.go:334] "Generic (PLEG): container finished" 
podID="906444a4-92dd-48ac-931d-0799f1256e9b" containerID="7411e0031187d25c4f45f41d34ceadea9cb54445acc1c97f27bac54fcb423e4d" exitCode=137 Mar 09 09:25:41 crc kubenswrapper[4861]: I0309 09:25:41.423246 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-679c9c695-9vt85" Mar 09 09:25:41 crc kubenswrapper[4861]: I0309 09:25:41.425473 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-679c9c695-9vt85" event={"ID":"906444a4-92dd-48ac-931d-0799f1256e9b","Type":"ContainerDied","Data":"63fca8073164eb35d747996d9451264d49b4dbd0ca09db4d4dbcbecc2c70924d"} Mar 09 09:25:41 crc kubenswrapper[4861]: I0309 09:25:41.425585 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-679c9c695-9vt85" event={"ID":"906444a4-92dd-48ac-931d-0799f1256e9b","Type":"ContainerDied","Data":"7411e0031187d25c4f45f41d34ceadea9cb54445acc1c97f27bac54fcb423e4d"} Mar 09 09:25:41 crc kubenswrapper[4861]: I0309 09:25:41.425605 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-679c9c695-9vt85" event={"ID":"906444a4-92dd-48ac-931d-0799f1256e9b","Type":"ContainerDied","Data":"56eaea71ed937670a5cc2a4657b82d4019963b984dba0e1625515f13dafb8335"} Mar 09 09:25:41 crc kubenswrapper[4861]: I0309 09:25:41.425619 4861 scope.go:117] "RemoveContainer" containerID="63fca8073164eb35d747996d9451264d49b4dbd0ca09db4d4dbcbecc2c70924d" Mar 09 09:25:41 crc kubenswrapper[4861]: I0309 09:25:41.425532 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 09 09:25:41 crc kubenswrapper[4861]: I0309 09:25:41.466935 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23b061c3-2bd5-4b7c-bdf6-76da2791cc8e-config-data\") pod \"barbican-api-5b9d58d97d-h7pvq\" (UID: \"23b061c3-2bd5-4b7c-bdf6-76da2791cc8e\") " pod="openstack/barbican-api-5b9d58d97d-h7pvq" Mar 09 09:25:41 crc kubenswrapper[4861]: I0309 09:25:41.467008 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/23b061c3-2bd5-4b7c-bdf6-76da2791cc8e-config-data-custom\") pod \"barbican-api-5b9d58d97d-h7pvq\" (UID: \"23b061c3-2bd5-4b7c-bdf6-76da2791cc8e\") " pod="openstack/barbican-api-5b9d58d97d-h7pvq" Mar 09 09:25:41 crc kubenswrapper[4861]: I0309 09:25:41.467067 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/23b061c3-2bd5-4b7c-bdf6-76da2791cc8e-internal-tls-certs\") pod \"barbican-api-5b9d58d97d-h7pvq\" (UID: \"23b061c3-2bd5-4b7c-bdf6-76da2791cc8e\") " pod="openstack/barbican-api-5b9d58d97d-h7pvq" Mar 09 09:25:41 crc kubenswrapper[4861]: I0309 09:25:41.467183 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/23b061c3-2bd5-4b7c-bdf6-76da2791cc8e-logs\") pod \"barbican-api-5b9d58d97d-h7pvq\" (UID: \"23b061c3-2bd5-4b7c-bdf6-76da2791cc8e\") " pod="openstack/barbican-api-5b9d58d97d-h7pvq" Mar 09 09:25:41 crc kubenswrapper[4861]: I0309 09:25:41.467234 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23b061c3-2bd5-4b7c-bdf6-76da2791cc8e-combined-ca-bundle\") pod \"barbican-api-5b9d58d97d-h7pvq\" (UID: 
\"23b061c3-2bd5-4b7c-bdf6-76da2791cc8e\") " pod="openstack/barbican-api-5b9d58d97d-h7pvq" Mar 09 09:25:41 crc kubenswrapper[4861]: I0309 09:25:41.467264 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5qx7\" (UniqueName: \"kubernetes.io/projected/23b061c3-2bd5-4b7c-bdf6-76da2791cc8e-kube-api-access-r5qx7\") pod \"barbican-api-5b9d58d97d-h7pvq\" (UID: \"23b061c3-2bd5-4b7c-bdf6-76da2791cc8e\") " pod="openstack/barbican-api-5b9d58d97d-h7pvq" Mar 09 09:25:41 crc kubenswrapper[4861]: I0309 09:25:41.467382 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/23b061c3-2bd5-4b7c-bdf6-76da2791cc8e-public-tls-certs\") pod \"barbican-api-5b9d58d97d-h7pvq\" (UID: \"23b061c3-2bd5-4b7c-bdf6-76da2791cc8e\") " pod="openstack/barbican-api-5b9d58d97d-h7pvq" Mar 09 09:25:41 crc kubenswrapper[4861]: I0309 09:25:41.518571 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 09 09:25:41 crc kubenswrapper[4861]: I0309 09:25:41.538682 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Mar 09 09:25:41 crc kubenswrapper[4861]: I0309 09:25:41.562458 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-679c9c695-9vt85"] Mar 09 09:25:41 crc kubenswrapper[4861]: I0309 09:25:41.579043 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-679c9c695-9vt85"] Mar 09 09:25:41 crc kubenswrapper[4861]: I0309 09:25:41.580046 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23b061c3-2bd5-4b7c-bdf6-76da2791cc8e-config-data\") pod \"barbican-api-5b9d58d97d-h7pvq\" (UID: \"23b061c3-2bd5-4b7c-bdf6-76da2791cc8e\") " pod="openstack/barbican-api-5b9d58d97d-h7pvq" Mar 09 09:25:41 crc kubenswrapper[4861]: I0309 09:25:41.580082 4861 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/23b061c3-2bd5-4b7c-bdf6-76da2791cc8e-config-data-custom\") pod \"barbican-api-5b9d58d97d-h7pvq\" (UID: \"23b061c3-2bd5-4b7c-bdf6-76da2791cc8e\") " pod="openstack/barbican-api-5b9d58d97d-h7pvq" Mar 09 09:25:41 crc kubenswrapper[4861]: I0309 09:25:41.580130 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/23b061c3-2bd5-4b7c-bdf6-76da2791cc8e-internal-tls-certs\") pod \"barbican-api-5b9d58d97d-h7pvq\" (UID: \"23b061c3-2bd5-4b7c-bdf6-76da2791cc8e\") " pod="openstack/barbican-api-5b9d58d97d-h7pvq" Mar 09 09:25:41 crc kubenswrapper[4861]: I0309 09:25:41.580201 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/23b061c3-2bd5-4b7c-bdf6-76da2791cc8e-logs\") pod \"barbican-api-5b9d58d97d-h7pvq\" (UID: \"23b061c3-2bd5-4b7c-bdf6-76da2791cc8e\") " pod="openstack/barbican-api-5b9d58d97d-h7pvq" Mar 09 09:25:41 crc kubenswrapper[4861]: I0309 09:25:41.580236 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23b061c3-2bd5-4b7c-bdf6-76da2791cc8e-combined-ca-bundle\") pod \"barbican-api-5b9d58d97d-h7pvq\" (UID: \"23b061c3-2bd5-4b7c-bdf6-76da2791cc8e\") " pod="openstack/barbican-api-5b9d58d97d-h7pvq" Mar 09 09:25:41 crc kubenswrapper[4861]: I0309 09:25:41.580252 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5qx7\" (UniqueName: \"kubernetes.io/projected/23b061c3-2bd5-4b7c-bdf6-76da2791cc8e-kube-api-access-r5qx7\") pod \"barbican-api-5b9d58d97d-h7pvq\" (UID: \"23b061c3-2bd5-4b7c-bdf6-76da2791cc8e\") " pod="openstack/barbican-api-5b9d58d97d-h7pvq" Mar 09 09:25:41 crc kubenswrapper[4861]: I0309 09:25:41.580295 4861 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/23b061c3-2bd5-4b7c-bdf6-76da2791cc8e-public-tls-certs\") pod \"barbican-api-5b9d58d97d-h7pvq\" (UID: \"23b061c3-2bd5-4b7c-bdf6-76da2791cc8e\") " pod="openstack/barbican-api-5b9d58d97d-h7pvq" Mar 09 09:25:41 crc kubenswrapper[4861]: I0309 09:25:41.586779 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/23b061c3-2bd5-4b7c-bdf6-76da2791cc8e-logs\") pod \"barbican-api-5b9d58d97d-h7pvq\" (UID: \"23b061c3-2bd5-4b7c-bdf6-76da2791cc8e\") " pod="openstack/barbican-api-5b9d58d97d-h7pvq" Mar 09 09:25:41 crc kubenswrapper[4861]: I0309 09:25:41.587396 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/23b061c3-2bd5-4b7c-bdf6-76da2791cc8e-config-data-custom\") pod \"barbican-api-5b9d58d97d-h7pvq\" (UID: \"23b061c3-2bd5-4b7c-bdf6-76da2791cc8e\") " pod="openstack/barbican-api-5b9d58d97d-h7pvq" Mar 09 09:25:41 crc kubenswrapper[4861]: I0309 09:25:41.591924 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/23b061c3-2bd5-4b7c-bdf6-76da2791cc8e-internal-tls-certs\") pod \"barbican-api-5b9d58d97d-h7pvq\" (UID: \"23b061c3-2bd5-4b7c-bdf6-76da2791cc8e\") " pod="openstack/barbican-api-5b9d58d97d-h7pvq" Mar 09 09:25:41 crc kubenswrapper[4861]: I0309 09:25:41.592943 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23b061c3-2bd5-4b7c-bdf6-76da2791cc8e-config-data\") pod \"barbican-api-5b9d58d97d-h7pvq\" (UID: \"23b061c3-2bd5-4b7c-bdf6-76da2791cc8e\") " pod="openstack/barbican-api-5b9d58d97d-h7pvq" Mar 09 09:25:41 crc kubenswrapper[4861]: I0309 09:25:41.595522 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/23b061c3-2bd5-4b7c-bdf6-76da2791cc8e-combined-ca-bundle\") pod \"barbican-api-5b9d58d97d-h7pvq\" (UID: \"23b061c3-2bd5-4b7c-bdf6-76da2791cc8e\") " pod="openstack/barbican-api-5b9d58d97d-h7pvq" Mar 09 09:25:41 crc kubenswrapper[4861]: I0309 09:25:41.596235 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/23b061c3-2bd5-4b7c-bdf6-76da2791cc8e-public-tls-certs\") pod \"barbican-api-5b9d58d97d-h7pvq\" (UID: \"23b061c3-2bd5-4b7c-bdf6-76da2791cc8e\") " pod="openstack/barbican-api-5b9d58d97d-h7pvq" Mar 09 09:25:41 crc kubenswrapper[4861]: I0309 09:25:41.615224 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 09 09:25:41 crc kubenswrapper[4861]: I0309 09:25:41.630135 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 09 09:25:41 crc kubenswrapper[4861]: I0309 09:25:41.630263 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 09 09:25:41 crc kubenswrapper[4861]: I0309 09:25:41.638559 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 09 09:25:41 crc kubenswrapper[4861]: I0309 09:25:41.638663 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Mar 09 09:25:41 crc kubenswrapper[4861]: I0309 09:25:41.638770 4861 scope.go:117] "RemoveContainer" containerID="7411e0031187d25c4f45f41d34ceadea9cb54445acc1c97f27bac54fcb423e4d" Mar 09 09:25:41 crc kubenswrapper[4861]: I0309 09:25:41.639815 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Mar 09 09:25:41 crc kubenswrapper[4861]: I0309 09:25:41.678304 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5qx7\" (UniqueName: \"kubernetes.io/projected/23b061c3-2bd5-4b7c-bdf6-76da2791cc8e-kube-api-access-r5qx7\") pod \"barbican-api-5b9d58d97d-h7pvq\" (UID: \"23b061c3-2bd5-4b7c-bdf6-76da2791cc8e\") " pod="openstack/barbican-api-5b9d58d97d-h7pvq" Mar 09 09:25:41 crc kubenswrapper[4861]: I0309 09:25:41.710095 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d0b7d4f-14a7-4c8a-b6e6-b98cce229f7a" path="/var/lib/kubelet/pods/3d0b7d4f-14a7-4c8a-b6e6-b98cce229f7a/volumes" Mar 09 09:25:41 crc kubenswrapper[4861]: I0309 09:25:41.718569 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="906444a4-92dd-48ac-931d-0799f1256e9b" path="/var/lib/kubelet/pods/906444a4-92dd-48ac-931d-0799f1256e9b/volumes" Mar 09 09:25:41 crc kubenswrapper[4861]: I0309 09:25:41.720909 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-7bb4db8c4-sxjc7" Mar 09 09:25:41 crc kubenswrapper[4861]: I0309 09:25:41.726723 4861 scope.go:117] "RemoveContainer" containerID="63fca8073164eb35d747996d9451264d49b4dbd0ca09db4d4dbcbecc2c70924d" Mar 09 
09:25:41 crc kubenswrapper[4861]: E0309 09:25:41.731416 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63fca8073164eb35d747996d9451264d49b4dbd0ca09db4d4dbcbecc2c70924d\": container with ID starting with 63fca8073164eb35d747996d9451264d49b4dbd0ca09db4d4dbcbecc2c70924d not found: ID does not exist" containerID="63fca8073164eb35d747996d9451264d49b4dbd0ca09db4d4dbcbecc2c70924d" Mar 09 09:25:41 crc kubenswrapper[4861]: I0309 09:25:41.731455 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63fca8073164eb35d747996d9451264d49b4dbd0ca09db4d4dbcbecc2c70924d"} err="failed to get container status \"63fca8073164eb35d747996d9451264d49b4dbd0ca09db4d4dbcbecc2c70924d\": rpc error: code = NotFound desc = could not find container \"63fca8073164eb35d747996d9451264d49b4dbd0ca09db4d4dbcbecc2c70924d\": container with ID starting with 63fca8073164eb35d747996d9451264d49b4dbd0ca09db4d4dbcbecc2c70924d not found: ID does not exist" Mar 09 09:25:41 crc kubenswrapper[4861]: I0309 09:25:41.731478 4861 scope.go:117] "RemoveContainer" containerID="7411e0031187d25c4f45f41d34ceadea9cb54445acc1c97f27bac54fcb423e4d" Mar 09 09:25:41 crc kubenswrapper[4861]: E0309 09:25:41.733201 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7411e0031187d25c4f45f41d34ceadea9cb54445acc1c97f27bac54fcb423e4d\": container with ID starting with 7411e0031187d25c4f45f41d34ceadea9cb54445acc1c97f27bac54fcb423e4d not found: ID does not exist" containerID="7411e0031187d25c4f45f41d34ceadea9cb54445acc1c97f27bac54fcb423e4d" Mar 09 09:25:41 crc kubenswrapper[4861]: I0309 09:25:41.733245 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7411e0031187d25c4f45f41d34ceadea9cb54445acc1c97f27bac54fcb423e4d"} err="failed to get container status 
\"7411e0031187d25c4f45f41d34ceadea9cb54445acc1c97f27bac54fcb423e4d\": rpc error: code = NotFound desc = could not find container \"7411e0031187d25c4f45f41d34ceadea9cb54445acc1c97f27bac54fcb423e4d\": container with ID starting with 7411e0031187d25c4f45f41d34ceadea9cb54445acc1c97f27bac54fcb423e4d not found: ID does not exist" Mar 09 09:25:41 crc kubenswrapper[4861]: I0309 09:25:41.733302 4861 scope.go:117] "RemoveContainer" containerID="63fca8073164eb35d747996d9451264d49b4dbd0ca09db4d4dbcbecc2c70924d" Mar 09 09:25:41 crc kubenswrapper[4861]: I0309 09:25:41.738584 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63fca8073164eb35d747996d9451264d49b4dbd0ca09db4d4dbcbecc2c70924d"} err="failed to get container status \"63fca8073164eb35d747996d9451264d49b4dbd0ca09db4d4dbcbecc2c70924d\": rpc error: code = NotFound desc = could not find container \"63fca8073164eb35d747996d9451264d49b4dbd0ca09db4d4dbcbecc2c70924d\": container with ID starting with 63fca8073164eb35d747996d9451264d49b4dbd0ca09db4d4dbcbecc2c70924d not found: ID does not exist" Mar 09 09:25:41 crc kubenswrapper[4861]: I0309 09:25:41.738654 4861 scope.go:117] "RemoveContainer" containerID="7411e0031187d25c4f45f41d34ceadea9cb54445acc1c97f27bac54fcb423e4d" Mar 09 09:25:41 crc kubenswrapper[4861]: I0309 09:25:41.739098 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7411e0031187d25c4f45f41d34ceadea9cb54445acc1c97f27bac54fcb423e4d"} err="failed to get container status \"7411e0031187d25c4f45f41d34ceadea9cb54445acc1c97f27bac54fcb423e4d\": rpc error: code = NotFound desc = could not find container \"7411e0031187d25c4f45f41d34ceadea9cb54445acc1c97f27bac54fcb423e4d\": container with ID starting with 7411e0031187d25c4f45f41d34ceadea9cb54445acc1c97f27bac54fcb423e4d not found: ID does not exist" Mar 09 09:25:41 crc kubenswrapper[4861]: I0309 09:25:41.784448 4861 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c9fa67cc-6a0f-485d-b064-cd14971058db-config-data-custom\") pod \"cinder-api-0\" (UID: \"c9fa67cc-6a0f-485d-b064-cd14971058db\") " pod="openstack/cinder-api-0" Mar 09 09:25:41 crc kubenswrapper[4861]: I0309 09:25:41.784599 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9fa67cc-6a0f-485d-b064-cd14971058db-public-tls-certs\") pod \"cinder-api-0\" (UID: \"c9fa67cc-6a0f-485d-b064-cd14971058db\") " pod="openstack/cinder-api-0" Mar 09 09:25:41 crc kubenswrapper[4861]: I0309 09:25:41.784629 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9fa67cc-6a0f-485d-b064-cd14971058db-config-data\") pod \"cinder-api-0\" (UID: \"c9fa67cc-6a0f-485d-b064-cd14971058db\") " pod="openstack/cinder-api-0" Mar 09 09:25:41 crc kubenswrapper[4861]: I0309 09:25:41.784653 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9fa67cc-6a0f-485d-b064-cd14971058db-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"c9fa67cc-6a0f-485d-b064-cd14971058db\") " pod="openstack/cinder-api-0" Mar 09 09:25:41 crc kubenswrapper[4861]: I0309 09:25:41.784673 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9fa67cc-6a0f-485d-b064-cd14971058db-scripts\") pod \"cinder-api-0\" (UID: \"c9fa67cc-6a0f-485d-b064-cd14971058db\") " pod="openstack/cinder-api-0" Mar 09 09:25:41 crc kubenswrapper[4861]: I0309 09:25:41.784752 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/c9fa67cc-6a0f-485d-b064-cd14971058db-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"c9fa67cc-6a0f-485d-b064-cd14971058db\") " pod="openstack/cinder-api-0" Mar 09 09:25:41 crc kubenswrapper[4861]: I0309 09:25:41.784815 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9fa67cc-6a0f-485d-b064-cd14971058db-logs\") pod \"cinder-api-0\" (UID: \"c9fa67cc-6a0f-485d-b064-cd14971058db\") " pod="openstack/cinder-api-0" Mar 09 09:25:41 crc kubenswrapper[4861]: I0309 09:25:41.784839 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c9fa67cc-6a0f-485d-b064-cd14971058db-etc-machine-id\") pod \"cinder-api-0\" (UID: \"c9fa67cc-6a0f-485d-b064-cd14971058db\") " pod="openstack/cinder-api-0" Mar 09 09:25:41 crc kubenswrapper[4861]: I0309 09:25:41.784853 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7j9x\" (UniqueName: \"kubernetes.io/projected/c9fa67cc-6a0f-485d-b064-cd14971058db-kube-api-access-g7j9x\") pod \"cinder-api-0\" (UID: \"c9fa67cc-6a0f-485d-b064-cd14971058db\") " pod="openstack/cinder-api-0" Mar 09 09:25:41 crc kubenswrapper[4861]: I0309 09:25:41.833855 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5b9d58d97d-h7pvq" Mar 09 09:25:41 crc kubenswrapper[4861]: I0309 09:25:41.852690 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-5487f4d458-lnthc" Mar 09 09:25:41 crc kubenswrapper[4861]: I0309 09:25:41.886853 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c9fa67cc-6a0f-485d-b064-cd14971058db-config-data-custom\") pod \"cinder-api-0\" (UID: \"c9fa67cc-6a0f-485d-b064-cd14971058db\") " pod="openstack/cinder-api-0" Mar 09 09:25:41 crc kubenswrapper[4861]: I0309 09:25:41.886973 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9fa67cc-6a0f-485d-b064-cd14971058db-public-tls-certs\") pod \"cinder-api-0\" (UID: \"c9fa67cc-6a0f-485d-b064-cd14971058db\") " pod="openstack/cinder-api-0" Mar 09 09:25:41 crc kubenswrapper[4861]: I0309 09:25:41.887002 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9fa67cc-6a0f-485d-b064-cd14971058db-config-data\") pod \"cinder-api-0\" (UID: \"c9fa67cc-6a0f-485d-b064-cd14971058db\") " pod="openstack/cinder-api-0" Mar 09 09:25:41 crc kubenswrapper[4861]: I0309 09:25:41.887028 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9fa67cc-6a0f-485d-b064-cd14971058db-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"c9fa67cc-6a0f-485d-b064-cd14971058db\") " pod="openstack/cinder-api-0" Mar 09 09:25:41 crc kubenswrapper[4861]: I0309 09:25:41.887050 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9fa67cc-6a0f-485d-b064-cd14971058db-scripts\") pod \"cinder-api-0\" (UID: \"c9fa67cc-6a0f-485d-b064-cd14971058db\") " 
pod="openstack/cinder-api-0" Mar 09 09:25:41 crc kubenswrapper[4861]: I0309 09:25:41.887121 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9fa67cc-6a0f-485d-b064-cd14971058db-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"c9fa67cc-6a0f-485d-b064-cd14971058db\") " pod="openstack/cinder-api-0" Mar 09 09:25:41 crc kubenswrapper[4861]: I0309 09:25:41.887167 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9fa67cc-6a0f-485d-b064-cd14971058db-logs\") pod \"cinder-api-0\" (UID: \"c9fa67cc-6a0f-485d-b064-cd14971058db\") " pod="openstack/cinder-api-0" Mar 09 09:25:41 crc kubenswrapper[4861]: I0309 09:25:41.887202 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c9fa67cc-6a0f-485d-b064-cd14971058db-etc-machine-id\") pod \"cinder-api-0\" (UID: \"c9fa67cc-6a0f-485d-b064-cd14971058db\") " pod="openstack/cinder-api-0" Mar 09 09:25:41 crc kubenswrapper[4861]: I0309 09:25:41.887262 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7j9x\" (UniqueName: \"kubernetes.io/projected/c9fa67cc-6a0f-485d-b064-cd14971058db-kube-api-access-g7j9x\") pod \"cinder-api-0\" (UID: \"c9fa67cc-6a0f-485d-b064-cd14971058db\") " pod="openstack/cinder-api-0" Mar 09 09:25:41 crc kubenswrapper[4861]: I0309 09:25:41.893863 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9fa67cc-6a0f-485d-b064-cd14971058db-scripts\") pod \"cinder-api-0\" (UID: \"c9fa67cc-6a0f-485d-b064-cd14971058db\") " pod="openstack/cinder-api-0" Mar 09 09:25:41 crc kubenswrapper[4861]: I0309 09:25:41.895085 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/c9fa67cc-6a0f-485d-b064-cd14971058db-etc-machine-id\") pod \"cinder-api-0\" (UID: \"c9fa67cc-6a0f-485d-b064-cd14971058db\") " pod="openstack/cinder-api-0" Mar 09 09:25:41 crc kubenswrapper[4861]: I0309 09:25:41.895326 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9fa67cc-6a0f-485d-b064-cd14971058db-logs\") pod \"cinder-api-0\" (UID: \"c9fa67cc-6a0f-485d-b064-cd14971058db\") " pod="openstack/cinder-api-0" Mar 09 09:25:41 crc kubenswrapper[4861]: I0309 09:25:41.896892 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9fa67cc-6a0f-485d-b064-cd14971058db-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"c9fa67cc-6a0f-485d-b064-cd14971058db\") " pod="openstack/cinder-api-0" Mar 09 09:25:41 crc kubenswrapper[4861]: I0309 09:25:41.900200 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9fa67cc-6a0f-485d-b064-cd14971058db-config-data\") pod \"cinder-api-0\" (UID: \"c9fa67cc-6a0f-485d-b064-cd14971058db\") " pod="openstack/cinder-api-0" Mar 09 09:25:41 crc kubenswrapper[4861]: I0309 09:25:41.900266 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c9fa67cc-6a0f-485d-b064-cd14971058db-config-data-custom\") pod \"cinder-api-0\" (UID: \"c9fa67cc-6a0f-485d-b064-cd14971058db\") " pod="openstack/cinder-api-0" Mar 09 09:25:41 crc kubenswrapper[4861]: I0309 09:25:41.906060 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7j9x\" (UniqueName: \"kubernetes.io/projected/c9fa67cc-6a0f-485d-b064-cd14971058db-kube-api-access-g7j9x\") pod \"cinder-api-0\" (UID: \"c9fa67cc-6a0f-485d-b064-cd14971058db\") " pod="openstack/cinder-api-0" Mar 09 09:25:41 crc kubenswrapper[4861]: I0309 09:25:41.906908 4861 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9fa67cc-6a0f-485d-b064-cd14971058db-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"c9fa67cc-6a0f-485d-b064-cd14971058db\") " pod="openstack/cinder-api-0" Mar 09 09:25:41 crc kubenswrapper[4861]: I0309 09:25:41.910828 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9fa67cc-6a0f-485d-b064-cd14971058db-public-tls-certs\") pod \"cinder-api-0\" (UID: \"c9fa67cc-6a0f-485d-b064-cd14971058db\") " pod="openstack/cinder-api-0" Mar 09 09:25:42 crc kubenswrapper[4861]: I0309 09:25:42.041032 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 09 09:25:42 crc kubenswrapper[4861]: I0309 09:25:42.109342 4861 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-8dfc7bfd5-qgpjh" podUID="bbbb2b98-1a7f-45f2-999d-8c4d1d879c01" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.159:9696/\": dial tcp 10.217.0.159:9696: connect: connection refused" Mar 09 09:25:42 crc kubenswrapper[4861]: I0309 09:25:42.456774 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8af4e6e3-c7cc-4e2f-b3bf-48461f86a1f1","Type":"ContainerStarted","Data":"c2e8d3629585fa584403ade19c51f099ae4d48c3cab7991240d1d3ddd0f69c99"} Mar 09 09:25:42 crc kubenswrapper[4861]: I0309 09:25:42.457182 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5b9d58d97d-h7pvq"] Mar 09 09:25:42 crc kubenswrapper[4861]: I0309 09:25:42.484519 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-665f4b6689-tfdk9" event={"ID":"db5464b8-011f-4569-a47e-36766fa6c72e","Type":"ContainerStarted","Data":"963f6683ce44d6e8e087457f73941eb36177a01f38a9b238ec13673a041aef45"} Mar 09 09:25:42 crc kubenswrapper[4861]: I0309 
09:25:42.484586 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-665f4b6689-tfdk9" event={"ID":"db5464b8-011f-4569-a47e-36766fa6c72e","Type":"ContainerStarted","Data":"53794d2f509d12783d64e4d4b378edf69b7deb6b30897976408dd166d6bf32ff"} Mar 09 09:25:42 crc kubenswrapper[4861]: I0309 09:25:42.486822 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-665f4b6689-tfdk9" Mar 09 09:25:42 crc kubenswrapper[4861]: I0309 09:25:42.521059 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-665f4b6689-tfdk9" podStartSLOduration=2.521040243 podStartE2EDuration="2.521040243s" podCreationTimestamp="2026-03-09 09:25:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:25:42.508989721 +0000 UTC m=+1185.594029122" watchObservedRunningTime="2026-03-09 09:25:42.521040243 +0000 UTC m=+1185.606079644" Mar 09 09:25:42 crc kubenswrapper[4861]: I0309 09:25:42.718530 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 09 09:25:42 crc kubenswrapper[4861]: W0309 09:25:42.760238 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc9fa67cc_6a0f_485d_b064_cd14971058db.slice/crio-887400f5f7bafb1ae60763c1087a1aedbe6ec3202f941248d5d6e673a0578d8c WatchSource:0}: Error finding container 887400f5f7bafb1ae60763c1087a1aedbe6ec3202f941248d5d6e673a0578d8c: Status 404 returned error can't find the container with id 887400f5f7bafb1ae60763c1087a1aedbe6ec3202f941248d5d6e673a0578d8c Mar 09 09:25:43 crc kubenswrapper[4861]: I0309 09:25:43.494577 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"c9fa67cc-6a0f-485d-b064-cd14971058db","Type":"ContainerStarted","Data":"887400f5f7bafb1ae60763c1087a1aedbe6ec3202f941248d5d6e673a0578d8c"} Mar 09 
09:25:43 crc kubenswrapper[4861]: I0309 09:25:43.496864 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5b9d58d97d-h7pvq" event={"ID":"23b061c3-2bd5-4b7c-bdf6-76da2791cc8e","Type":"ContainerStarted","Data":"7158c09686173060cc67756a84257d5853c8cd6a0122f0330e91e0f4d674767b"} Mar 09 09:25:43 crc kubenswrapper[4861]: I0309 09:25:43.945789 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-7bb4db8c4-sxjc7" Mar 09 09:25:44 crc kubenswrapper[4861]: I0309 09:25:44.032534 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-5487f4d458-lnthc" Mar 09 09:25:44 crc kubenswrapper[4861]: I0309 09:25:44.112135 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7bb4db8c4-sxjc7"] Mar 09 09:25:44 crc kubenswrapper[4861]: I0309 09:25:44.512877 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8af4e6e3-c7cc-4e2f-b3bf-48461f86a1f1","Type":"ContainerStarted","Data":"5cdeb6e5500f4678560109b3d7fbc725a3f70c195e678f4d8a4357e48903dd6f"} Mar 09 09:25:44 crc kubenswrapper[4861]: I0309 09:25:44.516567 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5b9d58d97d-h7pvq" event={"ID":"23b061c3-2bd5-4b7c-bdf6-76da2791cc8e","Type":"ContainerStarted","Data":"edb7459f7f1dfe610b6cca336514009bc082db9519fbb9deef28bf2af6fc3d51"} Mar 09 09:25:44 crc kubenswrapper[4861]: I0309 09:25:44.516604 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5b9d58d97d-h7pvq" event={"ID":"23b061c3-2bd5-4b7c-bdf6-76da2791cc8e","Type":"ContainerStarted","Data":"ea5ad220a0725a3942e72ee9b1c3bfe9be82be8850b2fe9742ae8aa6929ecd07"} Mar 09 09:25:44 crc kubenswrapper[4861]: I0309 09:25:44.516726 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7bb4db8c4-sxjc7" podUID="71492031-e589-409e-b8c8-c0a1194b97ed" 
containerName="horizon-log" containerID="cri-o://f789d6c7cb2c90f537f7edf8f51e1f3aa6d4ed13c1a60ecf2646f0e1dd6894ae" gracePeriod=30 Mar 09 09:25:44 crc kubenswrapper[4861]: I0309 09:25:44.516826 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7bb4db8c4-sxjc7" podUID="71492031-e589-409e-b8c8-c0a1194b97ed" containerName="horizon" containerID="cri-o://c579881c05bf1dff8d11a8b2bc2f24f89a926610877fc04c23a9f98aa2682890" gracePeriod=30 Mar 09 09:25:44 crc kubenswrapper[4861]: I0309 09:25:44.545731 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-5b9d58d97d-h7pvq" podStartSLOduration=3.54570958 podStartE2EDuration="3.54570958s" podCreationTimestamp="2026-03-09 09:25:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:25:44.540063005 +0000 UTC m=+1187.625102416" watchObservedRunningTime="2026-03-09 09:25:44.54570958 +0000 UTC m=+1187.630748981" Mar 09 09:25:45 crc kubenswrapper[4861]: I0309 09:25:45.179641 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 09 09:25:45 crc kubenswrapper[4861]: I0309 09:25:45.242513 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 09 09:25:45 crc kubenswrapper[4861]: I0309 09:25:45.332550 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7b8fcc65cc-466c6" Mar 09 09:25:45 crc kubenswrapper[4861]: I0309 09:25:45.399178 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7859c7799c-cztvv"] Mar 09 09:25:45 crc kubenswrapper[4861]: I0309 09:25:45.399589 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7859c7799c-cztvv" podUID="2d07e6c0-377a-44f4-a1d8-f984474200f6" containerName="dnsmasq-dns" 
containerID="cri-o://5c0fdd5e3bf5d45ccd03d9486f5f6cd479ed4ab1c3641e8d2275e0f89a4d8465" gracePeriod=10 Mar 09 09:25:45 crc kubenswrapper[4861]: I0309 09:25:45.535843 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"c9fa67cc-6a0f-485d-b064-cd14971058db","Type":"ContainerStarted","Data":"b698bfcc3f803a5283aa94f9e7860dabff9eacaaf86ef3dddac34fd89cd6bce7"} Mar 09 09:25:45 crc kubenswrapper[4861]: I0309 09:25:45.535892 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"c9fa67cc-6a0f-485d-b064-cd14971058db","Type":"ContainerStarted","Data":"6dfab1d509934d8019565d3693da1344149eced70471f6911dc2b7e881618fb5"} Mar 09 09:25:45 crc kubenswrapper[4861]: I0309 09:25:45.535945 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5b9d58d97d-h7pvq" Mar 09 09:25:45 crc kubenswrapper[4861]: I0309 09:25:45.536090 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="31e86fa6-cfa9-488f-b87f-ea95b05f8531" containerName="cinder-scheduler" containerID="cri-o://89ee5ee2a106daae472411239bf503ecb869ad61b83c5951d1bc49b5fedc1312" gracePeriod=30 Mar 09 09:25:45 crc kubenswrapper[4861]: I0309 09:25:45.536207 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="31e86fa6-cfa9-488f-b87f-ea95b05f8531" containerName="probe" containerID="cri-o://60e7294b8c9dd84642ac16dc7db39a6608018b69c3bca361fc06584d76e3e486" gracePeriod=30 Mar 09 09:25:45 crc kubenswrapper[4861]: I0309 09:25:45.536617 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5b9d58d97d-h7pvq" Mar 09 09:25:45 crc kubenswrapper[4861]: I0309 09:25:45.536647 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 09 09:25:45 crc kubenswrapper[4861]: I0309 09:25:45.578044 4861 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.578019576 podStartE2EDuration="4.578019576s" podCreationTimestamp="2026-03-09 09:25:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:25:45.576645306 +0000 UTC m=+1188.661684707" watchObservedRunningTime="2026-03-09 09:25:45.578019576 +0000 UTC m=+1188.663058977" Mar 09 09:25:46 crc kubenswrapper[4861]: I0309 09:25:46.478939 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7859c7799c-cztvv" Mar 09 09:25:46 crc kubenswrapper[4861]: I0309 09:25:46.521729 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2d07e6c0-377a-44f4-a1d8-f984474200f6-ovsdbserver-nb\") pod \"2d07e6c0-377a-44f4-a1d8-f984474200f6\" (UID: \"2d07e6c0-377a-44f4-a1d8-f984474200f6\") " Mar 09 09:25:46 crc kubenswrapper[4861]: I0309 09:25:46.522327 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xxfl7\" (UniqueName: \"kubernetes.io/projected/2d07e6c0-377a-44f4-a1d8-f984474200f6-kube-api-access-xxfl7\") pod \"2d07e6c0-377a-44f4-a1d8-f984474200f6\" (UID: \"2d07e6c0-377a-44f4-a1d8-f984474200f6\") " Mar 09 09:25:46 crc kubenswrapper[4861]: I0309 09:25:46.522531 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2d07e6c0-377a-44f4-a1d8-f984474200f6-dns-swift-storage-0\") pod \"2d07e6c0-377a-44f4-a1d8-f984474200f6\" (UID: \"2d07e6c0-377a-44f4-a1d8-f984474200f6\") " Mar 09 09:25:46 crc kubenswrapper[4861]: I0309 09:25:46.522680 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/2d07e6c0-377a-44f4-a1d8-f984474200f6-config\") pod \"2d07e6c0-377a-44f4-a1d8-f984474200f6\" (UID: \"2d07e6c0-377a-44f4-a1d8-f984474200f6\") " Mar 09 09:25:46 crc kubenswrapper[4861]: I0309 09:25:46.522781 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2d07e6c0-377a-44f4-a1d8-f984474200f6-ovsdbserver-sb\") pod \"2d07e6c0-377a-44f4-a1d8-f984474200f6\" (UID: \"2d07e6c0-377a-44f4-a1d8-f984474200f6\") " Mar 09 09:25:46 crc kubenswrapper[4861]: I0309 09:25:46.522897 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2d07e6c0-377a-44f4-a1d8-f984474200f6-dns-svc\") pod \"2d07e6c0-377a-44f4-a1d8-f984474200f6\" (UID: \"2d07e6c0-377a-44f4-a1d8-f984474200f6\") " Mar 09 09:25:46 crc kubenswrapper[4861]: I0309 09:25:46.528564 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d07e6c0-377a-44f4-a1d8-f984474200f6-kube-api-access-xxfl7" (OuterVolumeSpecName: "kube-api-access-xxfl7") pod "2d07e6c0-377a-44f4-a1d8-f984474200f6" (UID: "2d07e6c0-377a-44f4-a1d8-f984474200f6"). InnerVolumeSpecName "kube-api-access-xxfl7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:25:46 crc kubenswrapper[4861]: I0309 09:25:46.574677 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8af4e6e3-c7cc-4e2f-b3bf-48461f86a1f1","Type":"ContainerStarted","Data":"f59f0c4abfb6f63caeaa0f687c6369bee86ab126739c55cbc0490c2d536d44ef"} Mar 09 09:25:46 crc kubenswrapper[4861]: I0309 09:25:46.576180 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 09 09:25:46 crc kubenswrapper[4861]: I0309 09:25:46.580968 4861 generic.go:334] "Generic (PLEG): container finished" podID="2d07e6c0-377a-44f4-a1d8-f984474200f6" containerID="5c0fdd5e3bf5d45ccd03d9486f5f6cd479ed4ab1c3641e8d2275e0f89a4d8465" exitCode=0 Mar 09 09:25:46 crc kubenswrapper[4861]: I0309 09:25:46.581036 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7859c7799c-cztvv" event={"ID":"2d07e6c0-377a-44f4-a1d8-f984474200f6","Type":"ContainerDied","Data":"5c0fdd5e3bf5d45ccd03d9486f5f6cd479ed4ab1c3641e8d2275e0f89a4d8465"} Mar 09 09:25:46 crc kubenswrapper[4861]: I0309 09:25:46.581071 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7859c7799c-cztvv" event={"ID":"2d07e6c0-377a-44f4-a1d8-f984474200f6","Type":"ContainerDied","Data":"39f09b76e20369d596b9a0075258f17a2d7eecbf7179ae95d31a8c4a7717605e"} Mar 09 09:25:46 crc kubenswrapper[4861]: I0309 09:25:46.581093 4861 scope.go:117] "RemoveContainer" containerID="5c0fdd5e3bf5d45ccd03d9486f5f6cd479ed4ab1c3641e8d2275e0f89a4d8465" Mar 09 09:25:46 crc kubenswrapper[4861]: I0309 09:25:46.581236 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7859c7799c-cztvv" Mar 09 09:25:46 crc kubenswrapper[4861]: I0309 09:25:46.585115 4861 generic.go:334] "Generic (PLEG): container finished" podID="31e86fa6-cfa9-488f-b87f-ea95b05f8531" containerID="60e7294b8c9dd84642ac16dc7db39a6608018b69c3bca361fc06584d76e3e486" exitCode=0 Mar 09 09:25:46 crc kubenswrapper[4861]: I0309 09:25:46.586651 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"31e86fa6-cfa9-488f-b87f-ea95b05f8531","Type":"ContainerDied","Data":"60e7294b8c9dd84642ac16dc7db39a6608018b69c3bca361fc06584d76e3e486"} Mar 09 09:25:46 crc kubenswrapper[4861]: I0309 09:25:46.602395 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.010144322 podStartE2EDuration="8.602378791s" podCreationTimestamp="2026-03-09 09:25:38 +0000 UTC" firstStartedPulling="2026-03-09 09:25:39.512747431 +0000 UTC m=+1182.597786832" lastFinishedPulling="2026-03-09 09:25:46.1049819 +0000 UTC m=+1189.190021301" observedRunningTime="2026-03-09 09:25:46.596839719 +0000 UTC m=+1189.681879120" watchObservedRunningTime="2026-03-09 09:25:46.602378791 +0000 UTC m=+1189.687418192" Mar 09 09:25:46 crc kubenswrapper[4861]: I0309 09:25:46.608899 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d07e6c0-377a-44f4-a1d8-f984474200f6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2d07e6c0-377a-44f4-a1d8-f984474200f6" (UID: "2d07e6c0-377a-44f4-a1d8-f984474200f6"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:25:46 crc kubenswrapper[4861]: I0309 09:25:46.609879 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d07e6c0-377a-44f4-a1d8-f984474200f6-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "2d07e6c0-377a-44f4-a1d8-f984474200f6" (UID: "2d07e6c0-377a-44f4-a1d8-f984474200f6"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:25:46 crc kubenswrapper[4861]: I0309 09:25:46.618660 4861 scope.go:117] "RemoveContainer" containerID="fb25124f5deabeeac87d8a8c675a366d1d834083aef6ac71a805107146b18406" Mar 09 09:25:46 crc kubenswrapper[4861]: I0309 09:25:46.622647 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d07e6c0-377a-44f4-a1d8-f984474200f6-config" (OuterVolumeSpecName: "config") pod "2d07e6c0-377a-44f4-a1d8-f984474200f6" (UID: "2d07e6c0-377a-44f4-a1d8-f984474200f6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:25:46 crc kubenswrapper[4861]: I0309 09:25:46.627182 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d07e6c0-377a-44f4-a1d8-f984474200f6-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2d07e6c0-377a-44f4-a1d8-f984474200f6" (UID: "2d07e6c0-377a-44f4-a1d8-f984474200f6"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:25:46 crc kubenswrapper[4861]: I0309 09:25:46.629908 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d07e6c0-377a-44f4-a1d8-f984474200f6-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2d07e6c0-377a-44f4-a1d8-f984474200f6" (UID: "2d07e6c0-377a-44f4-a1d8-f984474200f6"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:25:46 crc kubenswrapper[4861]: I0309 09:25:46.632713 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xxfl7\" (UniqueName: \"kubernetes.io/projected/2d07e6c0-377a-44f4-a1d8-f984474200f6-kube-api-access-xxfl7\") on node \"crc\" DevicePath \"\"" Mar 09 09:25:46 crc kubenswrapper[4861]: I0309 09:25:46.632744 4861 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2d07e6c0-377a-44f4-a1d8-f984474200f6-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 09 09:25:46 crc kubenswrapper[4861]: I0309 09:25:46.632758 4861 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d07e6c0-377a-44f4-a1d8-f984474200f6-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:25:46 crc kubenswrapper[4861]: I0309 09:25:46.632770 4861 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2d07e6c0-377a-44f4-a1d8-f984474200f6-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 09 09:25:46 crc kubenswrapper[4861]: I0309 09:25:46.632782 4861 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2d07e6c0-377a-44f4-a1d8-f984474200f6-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 09 09:25:46 crc kubenswrapper[4861]: I0309 09:25:46.632792 4861 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2d07e6c0-377a-44f4-a1d8-f984474200f6-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 09 09:25:46 crc kubenswrapper[4861]: I0309 09:25:46.657309 4861 scope.go:117] "RemoveContainer" containerID="5c0fdd5e3bf5d45ccd03d9486f5f6cd479ed4ab1c3641e8d2275e0f89a4d8465" Mar 09 09:25:46 crc kubenswrapper[4861]: E0309 09:25:46.659728 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc 
= could not find container \"5c0fdd5e3bf5d45ccd03d9486f5f6cd479ed4ab1c3641e8d2275e0f89a4d8465\": container with ID starting with 5c0fdd5e3bf5d45ccd03d9486f5f6cd479ed4ab1c3641e8d2275e0f89a4d8465 not found: ID does not exist" containerID="5c0fdd5e3bf5d45ccd03d9486f5f6cd479ed4ab1c3641e8d2275e0f89a4d8465" Mar 09 09:25:46 crc kubenswrapper[4861]: I0309 09:25:46.659810 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c0fdd5e3bf5d45ccd03d9486f5f6cd479ed4ab1c3641e8d2275e0f89a4d8465"} err="failed to get container status \"5c0fdd5e3bf5d45ccd03d9486f5f6cd479ed4ab1c3641e8d2275e0f89a4d8465\": rpc error: code = NotFound desc = could not find container \"5c0fdd5e3bf5d45ccd03d9486f5f6cd479ed4ab1c3641e8d2275e0f89a4d8465\": container with ID starting with 5c0fdd5e3bf5d45ccd03d9486f5f6cd479ed4ab1c3641e8d2275e0f89a4d8465 not found: ID does not exist" Mar 09 09:25:46 crc kubenswrapper[4861]: I0309 09:25:46.659832 4861 scope.go:117] "RemoveContainer" containerID="fb25124f5deabeeac87d8a8c675a366d1d834083aef6ac71a805107146b18406" Mar 09 09:25:46 crc kubenswrapper[4861]: E0309 09:25:46.660261 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb25124f5deabeeac87d8a8c675a366d1d834083aef6ac71a805107146b18406\": container with ID starting with fb25124f5deabeeac87d8a8c675a366d1d834083aef6ac71a805107146b18406 not found: ID does not exist" containerID="fb25124f5deabeeac87d8a8c675a366d1d834083aef6ac71a805107146b18406" Mar 09 09:25:46 crc kubenswrapper[4861]: I0309 09:25:46.660312 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb25124f5deabeeac87d8a8c675a366d1d834083aef6ac71a805107146b18406"} err="failed to get container status \"fb25124f5deabeeac87d8a8c675a366d1d834083aef6ac71a805107146b18406\": rpc error: code = NotFound desc = could not find container 
\"fb25124f5deabeeac87d8a8c675a366d1d834083aef6ac71a805107146b18406\": container with ID starting with fb25124f5deabeeac87d8a8c675a366d1d834083aef6ac71a805107146b18406 not found: ID does not exist" Mar 09 09:25:46 crc kubenswrapper[4861]: I0309 09:25:46.920633 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7859c7799c-cztvv"] Mar 09 09:25:46 crc kubenswrapper[4861]: I0309 09:25:46.935122 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7859c7799c-cztvv"] Mar 09 09:25:47 crc kubenswrapper[4861]: I0309 09:25:47.319239 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-764b486b9b-tk6j5" Mar 09 09:25:47 crc kubenswrapper[4861]: I0309 09:25:47.390347 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-764b486b9b-tk6j5" Mar 09 09:25:47 crc kubenswrapper[4861]: I0309 09:25:47.668677 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d07e6c0-377a-44f4-a1d8-f984474200f6" path="/var/lib/kubelet/pods/2d07e6c0-377a-44f4-a1d8-f984474200f6/volumes" Mar 09 09:25:48 crc kubenswrapper[4861]: I0309 09:25:48.143098 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-8dfc7bfd5-qgpjh" Mar 09 09:25:48 crc kubenswrapper[4861]: I0309 09:25:48.162983 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bbbb2b98-1a7f-45f2-999d-8c4d1d879c01-internal-tls-certs\") pod \"bbbb2b98-1a7f-45f2-999d-8c4d1d879c01\" (UID: \"bbbb2b98-1a7f-45f2-999d-8c4d1d879c01\") " Mar 09 09:25:48 crc kubenswrapper[4861]: I0309 09:25:48.163034 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbbb2b98-1a7f-45f2-999d-8c4d1d879c01-combined-ca-bundle\") pod \"bbbb2b98-1a7f-45f2-999d-8c4d1d879c01\" (UID: \"bbbb2b98-1a7f-45f2-999d-8c4d1d879c01\") " Mar 09 09:25:48 crc kubenswrapper[4861]: I0309 09:25:48.163064 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/bbbb2b98-1a7f-45f2-999d-8c4d1d879c01-config\") pod \"bbbb2b98-1a7f-45f2-999d-8c4d1d879c01\" (UID: \"bbbb2b98-1a7f-45f2-999d-8c4d1d879c01\") " Mar 09 09:25:48 crc kubenswrapper[4861]: I0309 09:25:48.163104 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bbbb2b98-1a7f-45f2-999d-8c4d1d879c01-public-tls-certs\") pod \"bbbb2b98-1a7f-45f2-999d-8c4d1d879c01\" (UID: \"bbbb2b98-1a7f-45f2-999d-8c4d1d879c01\") " Mar 09 09:25:48 crc kubenswrapper[4861]: I0309 09:25:48.163130 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c6pl2\" (UniqueName: \"kubernetes.io/projected/bbbb2b98-1a7f-45f2-999d-8c4d1d879c01-kube-api-access-c6pl2\") pod \"bbbb2b98-1a7f-45f2-999d-8c4d1d879c01\" (UID: \"bbbb2b98-1a7f-45f2-999d-8c4d1d879c01\") " Mar 09 09:25:48 crc kubenswrapper[4861]: I0309 09:25:48.163173 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bbbb2b98-1a7f-45f2-999d-8c4d1d879c01-ovndb-tls-certs\") pod \"bbbb2b98-1a7f-45f2-999d-8c4d1d879c01\" (UID: \"bbbb2b98-1a7f-45f2-999d-8c4d1d879c01\") " Mar 09 09:25:48 crc kubenswrapper[4861]: I0309 09:25:48.163216 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/bbbb2b98-1a7f-45f2-999d-8c4d1d879c01-httpd-config\") pod \"bbbb2b98-1a7f-45f2-999d-8c4d1d879c01\" (UID: \"bbbb2b98-1a7f-45f2-999d-8c4d1d879c01\") " Mar 09 09:25:48 crc kubenswrapper[4861]: I0309 09:25:48.187590 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bbbb2b98-1a7f-45f2-999d-8c4d1d879c01-kube-api-access-c6pl2" (OuterVolumeSpecName: "kube-api-access-c6pl2") pod "bbbb2b98-1a7f-45f2-999d-8c4d1d879c01" (UID: "bbbb2b98-1a7f-45f2-999d-8c4d1d879c01"). InnerVolumeSpecName "kube-api-access-c6pl2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:25:48 crc kubenswrapper[4861]: I0309 09:25:48.197419 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbbb2b98-1a7f-45f2-999d-8c4d1d879c01-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "bbbb2b98-1a7f-45f2-999d-8c4d1d879c01" (UID: "bbbb2b98-1a7f-45f2-999d-8c4d1d879c01"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:25:48 crc kubenswrapper[4861]: I0309 09:25:48.232083 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbbb2b98-1a7f-45f2-999d-8c4d1d879c01-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bbbb2b98-1a7f-45f2-999d-8c4d1d879c01" (UID: "bbbb2b98-1a7f-45f2-999d-8c4d1d879c01"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:25:48 crc kubenswrapper[4861]: I0309 09:25:48.250763 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbbb2b98-1a7f-45f2-999d-8c4d1d879c01-config" (OuterVolumeSpecName: "config") pod "bbbb2b98-1a7f-45f2-999d-8c4d1d879c01" (UID: "bbbb2b98-1a7f-45f2-999d-8c4d1d879c01"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:25:48 crc kubenswrapper[4861]: I0309 09:25:48.273652 4861 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbbb2b98-1a7f-45f2-999d-8c4d1d879c01-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 09:25:48 crc kubenswrapper[4861]: I0309 09:25:48.273677 4861 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/bbbb2b98-1a7f-45f2-999d-8c4d1d879c01-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:25:48 crc kubenswrapper[4861]: I0309 09:25:48.273688 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c6pl2\" (UniqueName: \"kubernetes.io/projected/bbbb2b98-1a7f-45f2-999d-8c4d1d879c01-kube-api-access-c6pl2\") on node \"crc\" DevicePath \"\"" Mar 09 09:25:48 crc kubenswrapper[4861]: I0309 09:25:48.273699 4861 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/bbbb2b98-1a7f-45f2-999d-8c4d1d879c01-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:25:48 crc kubenswrapper[4861]: I0309 09:25:48.275603 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbbb2b98-1a7f-45f2-999d-8c4d1d879c01-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "bbbb2b98-1a7f-45f2-999d-8c4d1d879c01" (UID: "bbbb2b98-1a7f-45f2-999d-8c4d1d879c01"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:25:48 crc kubenswrapper[4861]: I0309 09:25:48.289816 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbbb2b98-1a7f-45f2-999d-8c4d1d879c01-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "bbbb2b98-1a7f-45f2-999d-8c4d1d879c01" (UID: "bbbb2b98-1a7f-45f2-999d-8c4d1d879c01"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:25:48 crc kubenswrapper[4861]: I0309 09:25:48.293599 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbbb2b98-1a7f-45f2-999d-8c4d1d879c01-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "bbbb2b98-1a7f-45f2-999d-8c4d1d879c01" (UID: "bbbb2b98-1a7f-45f2-999d-8c4d1d879c01"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:25:48 crc kubenswrapper[4861]: I0309 09:25:48.375537 4861 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bbbb2b98-1a7f-45f2-999d-8c4d1d879c01-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 09:25:48 crc kubenswrapper[4861]: I0309 09:25:48.375587 4861 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bbbb2b98-1a7f-45f2-999d-8c4d1d879c01-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 09:25:48 crc kubenswrapper[4861]: I0309 09:25:48.375595 4861 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bbbb2b98-1a7f-45f2-999d-8c4d1d879c01-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 09:25:48 crc kubenswrapper[4861]: I0309 09:25:48.614714 4861 generic.go:334] "Generic (PLEG): container finished" podID="71492031-e589-409e-b8c8-c0a1194b97ed" containerID="c579881c05bf1dff8d11a8b2bc2f24f89a926610877fc04c23a9f98aa2682890" 
exitCode=0
Mar 09 09:25:48 crc kubenswrapper[4861]: I0309 09:25:48.614782 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7bb4db8c4-sxjc7" event={"ID":"71492031-e589-409e-b8c8-c0a1194b97ed","Type":"ContainerDied","Data":"c579881c05bf1dff8d11a8b2bc2f24f89a926610877fc04c23a9f98aa2682890"}
Mar 09 09:25:48 crc kubenswrapper[4861]: I0309 09:25:48.616426 4861 generic.go:334] "Generic (PLEG): container finished" podID="bbbb2b98-1a7f-45f2-999d-8c4d1d879c01" containerID="62d9f61865c0aba71cc143cf38566dfd023bf5ccc20494aec11ffb3b58f48655" exitCode=0
Mar 09 09:25:48 crc kubenswrapper[4861]: I0309 09:25:48.616458 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-8dfc7bfd5-qgpjh" event={"ID":"bbbb2b98-1a7f-45f2-999d-8c4d1d879c01","Type":"ContainerDied","Data":"62d9f61865c0aba71cc143cf38566dfd023bf5ccc20494aec11ffb3b58f48655"}
Mar 09 09:25:48 crc kubenswrapper[4861]: I0309 09:25:48.616492 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-8dfc7bfd5-qgpjh" event={"ID":"bbbb2b98-1a7f-45f2-999d-8c4d1d879c01","Type":"ContainerDied","Data":"178eefe1348bf47a8a4ff069f73aee67d2a9d41c9f6f1b2a8fc901ce1b9cb77c"}
Mar 09 09:25:48 crc kubenswrapper[4861]: I0309 09:25:48.616516 4861 scope.go:117] "RemoveContainer" containerID="2acb7c830dfedd1fba55a973d93130f09024212c84ce99b6a2152b544d5cbd8a"
Mar 09 09:25:48 crc kubenswrapper[4861]: I0309 09:25:48.616533 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-8dfc7bfd5-qgpjh"
Mar 09 09:25:48 crc kubenswrapper[4861]: I0309 09:25:48.657564 4861 scope.go:117] "RemoveContainer" containerID="62d9f61865c0aba71cc143cf38566dfd023bf5ccc20494aec11ffb3b58f48655"
Mar 09 09:25:48 crc kubenswrapper[4861]: I0309 09:25:48.756395 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-8dfc7bfd5-qgpjh"]
Mar 09 09:25:48 crc kubenswrapper[4861]: I0309 09:25:48.760271 4861 scope.go:117] "RemoveContainer" containerID="2acb7c830dfedd1fba55a973d93130f09024212c84ce99b6a2152b544d5cbd8a"
Mar 09 09:25:48 crc kubenswrapper[4861]: E0309 09:25:48.764701 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2acb7c830dfedd1fba55a973d93130f09024212c84ce99b6a2152b544d5cbd8a\": container with ID starting with 2acb7c830dfedd1fba55a973d93130f09024212c84ce99b6a2152b544d5cbd8a not found: ID does not exist" containerID="2acb7c830dfedd1fba55a973d93130f09024212c84ce99b6a2152b544d5cbd8a"
Mar 09 09:25:48 crc kubenswrapper[4861]: I0309 09:25:48.764738 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2acb7c830dfedd1fba55a973d93130f09024212c84ce99b6a2152b544d5cbd8a"} err="failed to get container status \"2acb7c830dfedd1fba55a973d93130f09024212c84ce99b6a2152b544d5cbd8a\": rpc error: code = NotFound desc = could not find container \"2acb7c830dfedd1fba55a973d93130f09024212c84ce99b6a2152b544d5cbd8a\": container with ID starting with 2acb7c830dfedd1fba55a973d93130f09024212c84ce99b6a2152b544d5cbd8a not found: ID does not exist"
Mar 09 09:25:48 crc kubenswrapper[4861]: I0309 09:25:48.764759 4861 scope.go:117] "RemoveContainer" containerID="62d9f61865c0aba71cc143cf38566dfd023bf5ccc20494aec11ffb3b58f48655"
Mar 09 09:25:48 crc kubenswrapper[4861]: E0309 09:25:48.765077 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62d9f61865c0aba71cc143cf38566dfd023bf5ccc20494aec11ffb3b58f48655\": container with ID starting with 62d9f61865c0aba71cc143cf38566dfd023bf5ccc20494aec11ffb3b58f48655 not found: ID does not exist" containerID="62d9f61865c0aba71cc143cf38566dfd023bf5ccc20494aec11ffb3b58f48655"
Mar 09 09:25:48 crc kubenswrapper[4861]: I0309 09:25:48.765100 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62d9f61865c0aba71cc143cf38566dfd023bf5ccc20494aec11ffb3b58f48655"} err="failed to get container status \"62d9f61865c0aba71cc143cf38566dfd023bf5ccc20494aec11ffb3b58f48655\": rpc error: code = NotFound desc = could not find container \"62d9f61865c0aba71cc143cf38566dfd023bf5ccc20494aec11ffb3b58f48655\": container with ID starting with 62d9f61865c0aba71cc143cf38566dfd023bf5ccc20494aec11ffb3b58f48655 not found: ID does not exist"
Mar 09 09:25:48 crc kubenswrapper[4861]: I0309 09:25:48.771535 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-8dfc7bfd5-qgpjh"]
Mar 09 09:25:48 crc kubenswrapper[4861]: I0309 09:25:48.854727 4861 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7bb4db8c4-sxjc7" podUID="71492031-e589-409e-b8c8-c0a1194b97ed" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.153:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.153:8443: connect: connection refused"
Mar 09 09:25:49 crc kubenswrapper[4861]: I0309 09:25:49.022092 4861 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/cinder-scheduler-0"
Mar 09 09:25:49 crc kubenswrapper[4861]: I0309 09:25:49.089047 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/31e86fa6-cfa9-488f-b87f-ea95b05f8531-etc-machine-id\") pod \"31e86fa6-cfa9-488f-b87f-ea95b05f8531\" (UID: \"31e86fa6-cfa9-488f-b87f-ea95b05f8531\") "
Mar 09 09:25:49 crc kubenswrapper[4861]: I0309 09:25:49.089224 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-42q9t\" (UniqueName: \"kubernetes.io/projected/31e86fa6-cfa9-488f-b87f-ea95b05f8531-kube-api-access-42q9t\") pod \"31e86fa6-cfa9-488f-b87f-ea95b05f8531\" (UID: \"31e86fa6-cfa9-488f-b87f-ea95b05f8531\") "
Mar 09 09:25:49 crc kubenswrapper[4861]: I0309 09:25:49.089291 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/31e86fa6-cfa9-488f-b87f-ea95b05f8531-scripts\") pod \"31e86fa6-cfa9-488f-b87f-ea95b05f8531\" (UID: \"31e86fa6-cfa9-488f-b87f-ea95b05f8531\") "
Mar 09 09:25:49 crc kubenswrapper[4861]: I0309 09:25:49.089357 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31e86fa6-cfa9-488f-b87f-ea95b05f8531-config-data\") pod \"31e86fa6-cfa9-488f-b87f-ea95b05f8531\" (UID: \"31e86fa6-cfa9-488f-b87f-ea95b05f8531\") "
Mar 09 09:25:49 crc kubenswrapper[4861]: I0309 09:25:49.089401 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/31e86fa6-cfa9-488f-b87f-ea95b05f8531-config-data-custom\") pod \"31e86fa6-cfa9-488f-b87f-ea95b05f8531\" (UID: \"31e86fa6-cfa9-488f-b87f-ea95b05f8531\") "
Mar 09 09:25:49 crc kubenswrapper[4861]: I0309 09:25:49.089426 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31e86fa6-cfa9-488f-b87f-ea95b05f8531-combined-ca-bundle\") pod \"31e86fa6-cfa9-488f-b87f-ea95b05f8531\" (UID: \"31e86fa6-cfa9-488f-b87f-ea95b05f8531\") "
Mar 09 09:25:49 crc kubenswrapper[4861]: I0309 09:25:49.089153 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/31e86fa6-cfa9-488f-b87f-ea95b05f8531-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "31e86fa6-cfa9-488f-b87f-ea95b05f8531" (UID: "31e86fa6-cfa9-488f-b87f-ea95b05f8531"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 09 09:25:49 crc kubenswrapper[4861]: I0309 09:25:49.090635 4861 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/31e86fa6-cfa9-488f-b87f-ea95b05f8531-etc-machine-id\") on node \"crc\" DevicePath \"\""
Mar 09 09:25:49 crc kubenswrapper[4861]: I0309 09:25:49.095062 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31e86fa6-cfa9-488f-b87f-ea95b05f8531-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "31e86fa6-cfa9-488f-b87f-ea95b05f8531" (UID: "31e86fa6-cfa9-488f-b87f-ea95b05f8531"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:25:49 crc kubenswrapper[4861]: I0309 09:25:49.098684 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31e86fa6-cfa9-488f-b87f-ea95b05f8531-scripts" (OuterVolumeSpecName: "scripts") pod "31e86fa6-cfa9-488f-b87f-ea95b05f8531" (UID: "31e86fa6-cfa9-488f-b87f-ea95b05f8531"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:25:49 crc kubenswrapper[4861]: I0309 09:25:49.109261 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31e86fa6-cfa9-488f-b87f-ea95b05f8531-kube-api-access-42q9t" (OuterVolumeSpecName: "kube-api-access-42q9t") pod "31e86fa6-cfa9-488f-b87f-ea95b05f8531" (UID: "31e86fa6-cfa9-488f-b87f-ea95b05f8531"). InnerVolumeSpecName "kube-api-access-42q9t". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:25:49 crc kubenswrapper[4861]: I0309 09:25:49.191533 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31e86fa6-cfa9-488f-b87f-ea95b05f8531-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "31e86fa6-cfa9-488f-b87f-ea95b05f8531" (UID: "31e86fa6-cfa9-488f-b87f-ea95b05f8531"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:25:49 crc kubenswrapper[4861]: I0309 09:25:49.192668 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-42q9t\" (UniqueName: \"kubernetes.io/projected/31e86fa6-cfa9-488f-b87f-ea95b05f8531-kube-api-access-42q9t\") on node \"crc\" DevicePath \"\""
Mar 09 09:25:49 crc kubenswrapper[4861]: I0309 09:25:49.192687 4861 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/31e86fa6-cfa9-488f-b87f-ea95b05f8531-scripts\") on node \"crc\" DevicePath \"\""
Mar 09 09:25:49 crc kubenswrapper[4861]: I0309 09:25:49.192696 4861 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/31e86fa6-cfa9-488f-b87f-ea95b05f8531-config-data-custom\") on node \"crc\" DevicePath \"\""
Mar 09 09:25:49 crc kubenswrapper[4861]: I0309 09:25:49.192704 4861 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31e86fa6-cfa9-488f-b87f-ea95b05f8531-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 09 09:25:49 crc kubenswrapper[4861]: I0309 09:25:49.278619 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31e86fa6-cfa9-488f-b87f-ea95b05f8531-config-data" (OuterVolumeSpecName: "config-data") pod "31e86fa6-cfa9-488f-b87f-ea95b05f8531" (UID: "31e86fa6-cfa9-488f-b87f-ea95b05f8531"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:25:49 crc kubenswrapper[4861]: I0309 09:25:49.296969 4861 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31e86fa6-cfa9-488f-b87f-ea95b05f8531-config-data\") on node \"crc\" DevicePath \"\""
Mar 09 09:25:49 crc kubenswrapper[4861]: I0309 09:25:49.628262 4861 generic.go:334] "Generic (PLEG): container finished" podID="31e86fa6-cfa9-488f-b87f-ea95b05f8531" containerID="89ee5ee2a106daae472411239bf503ecb869ad61b83c5951d1bc49b5fedc1312" exitCode=0
Mar 09 09:25:49 crc kubenswrapper[4861]: I0309 09:25:49.628322 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"31e86fa6-cfa9-488f-b87f-ea95b05f8531","Type":"ContainerDied","Data":"89ee5ee2a106daae472411239bf503ecb869ad61b83c5951d1bc49b5fedc1312"}
Mar 09 09:25:49 crc kubenswrapper[4861]: I0309 09:25:49.628332 4861 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/cinder-scheduler-0"
Mar 09 09:25:49 crc kubenswrapper[4861]: I0309 09:25:49.628415 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"31e86fa6-cfa9-488f-b87f-ea95b05f8531","Type":"ContainerDied","Data":"d304f13f1ce386d3891eec10d13001ba01107ff8c2ea4da0ef9b8320624c256f"}
Mar 09 09:25:49 crc kubenswrapper[4861]: I0309 09:25:49.628452 4861 scope.go:117] "RemoveContainer" containerID="60e7294b8c9dd84642ac16dc7db39a6608018b69c3bca361fc06584d76e3e486"
Mar 09 09:25:49 crc kubenswrapper[4861]: I0309 09:25:49.657341 4861 scope.go:117] "RemoveContainer" containerID="89ee5ee2a106daae472411239bf503ecb869ad61b83c5951d1bc49b5fedc1312"
Mar 09 09:25:49 crc kubenswrapper[4861]: I0309 09:25:49.672308 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bbbb2b98-1a7f-45f2-999d-8c4d1d879c01" path="/var/lib/kubelet/pods/bbbb2b98-1a7f-45f2-999d-8c4d1d879c01/volumes"
Mar 09 09:25:49 crc kubenswrapper[4861]: I0309 09:25:49.686310 4861 scope.go:117] "RemoveContainer" containerID="60e7294b8c9dd84642ac16dc7db39a6608018b69c3bca361fc06584d76e3e486"
Mar 09 09:25:49 crc kubenswrapper[4861]: E0309 09:25:49.688554 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60e7294b8c9dd84642ac16dc7db39a6608018b69c3bca361fc06584d76e3e486\": container with ID starting with 60e7294b8c9dd84642ac16dc7db39a6608018b69c3bca361fc06584d76e3e486 not found: ID does not exist" containerID="60e7294b8c9dd84642ac16dc7db39a6608018b69c3bca361fc06584d76e3e486"
Mar 09 09:25:49 crc kubenswrapper[4861]: I0309 09:25:49.688607 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60e7294b8c9dd84642ac16dc7db39a6608018b69c3bca361fc06584d76e3e486"} err="failed to get container status \"60e7294b8c9dd84642ac16dc7db39a6608018b69c3bca361fc06584d76e3e486\": rpc error: code = NotFound desc = could not find container \"60e7294b8c9dd84642ac16dc7db39a6608018b69c3bca361fc06584d76e3e486\": container with ID starting with 60e7294b8c9dd84642ac16dc7db39a6608018b69c3bca361fc06584d76e3e486 not found: ID does not exist"
Mar 09 09:25:49 crc kubenswrapper[4861]: I0309 09:25:49.688636 4861 scope.go:117] "RemoveContainer" containerID="89ee5ee2a106daae472411239bf503ecb869ad61b83c5951d1bc49b5fedc1312"
Mar 09 09:25:49 crc kubenswrapper[4861]: I0309 09:25:49.689483 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"]
Mar 09 09:25:49 crc kubenswrapper[4861]: E0309 09:25:49.690107 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89ee5ee2a106daae472411239bf503ecb869ad61b83c5951d1bc49b5fedc1312\": container with ID starting with 89ee5ee2a106daae472411239bf503ecb869ad61b83c5951d1bc49b5fedc1312 not found: ID does not exist" containerID="89ee5ee2a106daae472411239bf503ecb869ad61b83c5951d1bc49b5fedc1312"
Mar 09 09:25:49 crc kubenswrapper[4861]: I0309 09:25:49.690139 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89ee5ee2a106daae472411239bf503ecb869ad61b83c5951d1bc49b5fedc1312"} err="failed to get container status \"89ee5ee2a106daae472411239bf503ecb869ad61b83c5951d1bc49b5fedc1312\": rpc error: code = NotFound desc = could not find container \"89ee5ee2a106daae472411239bf503ecb869ad61b83c5951d1bc49b5fedc1312\": container with ID starting with 89ee5ee2a106daae472411239bf503ecb869ad61b83c5951d1bc49b5fedc1312 not found: ID does not exist"
Mar 09 09:25:49 crc kubenswrapper[4861]: I0309 09:25:49.703752 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"]
Mar 09 09:25:49 crc kubenswrapper[4861]: I0309 09:25:49.728349 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"]
Mar 09 09:25:49 crc kubenswrapper[4861]: E0309 09:25:49.728779 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbbb2b98-1a7f-45f2-999d-8c4d1d879c01" containerName="neutron-api"
Mar 09 09:25:49 crc kubenswrapper[4861]: I0309 09:25:49.728800 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbbb2b98-1a7f-45f2-999d-8c4d1d879c01" containerName="neutron-api"
Mar 09 09:25:49 crc kubenswrapper[4861]: E0309 09:25:49.728811 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d07e6c0-377a-44f4-a1d8-f984474200f6" containerName="dnsmasq-dns"
Mar 09 09:25:49 crc kubenswrapper[4861]: I0309 09:25:49.728818 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d07e6c0-377a-44f4-a1d8-f984474200f6" containerName="dnsmasq-dns"
Mar 09 09:25:49 crc kubenswrapper[4861]: E0309 09:25:49.728833 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31e86fa6-cfa9-488f-b87f-ea95b05f8531" containerName="probe"
Mar 09 09:25:49 crc kubenswrapper[4861]: I0309 09:25:49.728840 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="31e86fa6-cfa9-488f-b87f-ea95b05f8531" containerName="probe"
Mar 09 09:25:49 crc kubenswrapper[4861]: E0309 09:25:49.728853 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d07e6c0-377a-44f4-a1d8-f984474200f6" containerName="init"
Mar 09 09:25:49 crc kubenswrapper[4861]: I0309 09:25:49.728881 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d07e6c0-377a-44f4-a1d8-f984474200f6" containerName="init"
Mar 09 09:25:49 crc kubenswrapper[4861]: E0309 09:25:49.728899 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbbb2b98-1a7f-45f2-999d-8c4d1d879c01" containerName="neutron-httpd"
Mar 09 09:25:49 crc kubenswrapper[4861]: I0309 09:25:49.728905 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbbb2b98-1a7f-45f2-999d-8c4d1d879c01" containerName="neutron-httpd"
Mar 09 09:25:49 crc kubenswrapper[4861]: E0309 09:25:49.728923 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31e86fa6-cfa9-488f-b87f-ea95b05f8531" containerName="cinder-scheduler"
Mar 09 09:25:49 crc kubenswrapper[4861]: I0309 09:25:49.728929 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="31e86fa6-cfa9-488f-b87f-ea95b05f8531" containerName="cinder-scheduler"
Mar 09 09:25:49 crc kubenswrapper[4861]: I0309 09:25:49.729124 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="bbbb2b98-1a7f-45f2-999d-8c4d1d879c01" containerName="neutron-api"
Mar 09 09:25:49 crc kubenswrapper[4861]: I0309 09:25:49.729146 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="31e86fa6-cfa9-488f-b87f-ea95b05f8531" containerName="cinder-scheduler"
Mar 09 09:25:49 crc kubenswrapper[4861]: I0309 09:25:49.729160 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="31e86fa6-cfa9-488f-b87f-ea95b05f8531" containerName="probe"
Mar 09 09:25:49 crc kubenswrapper[4861]: I0309 09:25:49.729170 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d07e6c0-377a-44f4-a1d8-f984474200f6" containerName="dnsmasq-dns"
Mar 09 09:25:49 crc kubenswrapper[4861]: I0309 09:25:49.729177 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="bbbb2b98-1a7f-45f2-999d-8c4d1d879c01" containerName="neutron-httpd"
Mar 09 09:25:49 crc kubenswrapper[4861]: I0309 09:25:49.730206 4861 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/cinder-scheduler-0"
Mar 09 09:25:49 crc kubenswrapper[4861]: I0309 09:25:49.735665 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Mar 09 09:25:49 crc kubenswrapper[4861]: I0309 09:25:49.740621 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data"
Mar 09 09:25:49 crc kubenswrapper[4861]: I0309 09:25:49.805275 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/072cabf9-18cb-4562-a6a2-7f2b46a4f9ec-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"072cabf9-18cb-4562-a6a2-7f2b46a4f9ec\") " pod="openstack/cinder-scheduler-0"
Mar 09 09:25:49 crc kubenswrapper[4861]: I0309 09:25:49.805403 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/072cabf9-18cb-4562-a6a2-7f2b46a4f9ec-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"072cabf9-18cb-4562-a6a2-7f2b46a4f9ec\") " pod="openstack/cinder-scheduler-0"
Mar 09 09:25:49 crc kubenswrapper[4861]: I0309 09:25:49.805457 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/072cabf9-18cb-4562-a6a2-7f2b46a4f9ec-config-data\") pod \"cinder-scheduler-0\" (UID: \"072cabf9-18cb-4562-a6a2-7f2b46a4f9ec\") " pod="openstack/cinder-scheduler-0"
Mar 09 09:25:49 crc kubenswrapper[4861]: I0309 09:25:49.805520 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kd72g\" (UniqueName: \"kubernetes.io/projected/072cabf9-18cb-4562-a6a2-7f2b46a4f9ec-kube-api-access-kd72g\") pod \"cinder-scheduler-0\" (UID: \"072cabf9-18cb-4562-a6a2-7f2b46a4f9ec\") " pod="openstack/cinder-scheduler-0"
Mar 09 09:25:49 crc kubenswrapper[4861]: I0309 09:25:49.805551 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/072cabf9-18cb-4562-a6a2-7f2b46a4f9ec-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"072cabf9-18cb-4562-a6a2-7f2b46a4f9ec\") " pod="openstack/cinder-scheduler-0"
Mar 09 09:25:49 crc kubenswrapper[4861]: I0309 09:25:49.805644 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/072cabf9-18cb-4562-a6a2-7f2b46a4f9ec-scripts\") pod \"cinder-scheduler-0\" (UID: \"072cabf9-18cb-4562-a6a2-7f2b46a4f9ec\") " pod="openstack/cinder-scheduler-0"
Mar 09 09:25:49 crc kubenswrapper[4861]: I0309 09:25:49.907354 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/072cabf9-18cb-4562-a6a2-7f2b46a4f9ec-config-data\") pod \"cinder-scheduler-0\" (UID: \"072cabf9-18cb-4562-a6a2-7f2b46a4f9ec\") " pod="openstack/cinder-scheduler-0"
Mar 09 09:25:49 crc kubenswrapper[4861]: I0309 09:25:49.907450 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kd72g\" (UniqueName: \"kubernetes.io/projected/072cabf9-18cb-4562-a6a2-7f2b46a4f9ec-kube-api-access-kd72g\") pod \"cinder-scheduler-0\" (UID: \"072cabf9-18cb-4562-a6a2-7f2b46a4f9ec\") " pod="openstack/cinder-scheduler-0"
Mar 09 09:25:49 crc kubenswrapper[4861]: I0309 09:25:49.907482 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/072cabf9-18cb-4562-a6a2-7f2b46a4f9ec-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"072cabf9-18cb-4562-a6a2-7f2b46a4f9ec\") " pod="openstack/cinder-scheduler-0"
Mar 09 09:25:49 crc kubenswrapper[4861]: I0309 09:25:49.907535 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/072cabf9-18cb-4562-a6a2-7f2b46a4f9ec-scripts\") pod \"cinder-scheduler-0\" (UID: \"072cabf9-18cb-4562-a6a2-7f2b46a4f9ec\") " pod="openstack/cinder-scheduler-0"
Mar 09 09:25:49 crc kubenswrapper[4861]: I0309 09:25:49.907577 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/072cabf9-18cb-4562-a6a2-7f2b46a4f9ec-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"072cabf9-18cb-4562-a6a2-7f2b46a4f9ec\") " pod="openstack/cinder-scheduler-0"
Mar 09 09:25:49 crc kubenswrapper[4861]: I0309 09:25:49.907640 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/072cabf9-18cb-4562-a6a2-7f2b46a4f9ec-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"072cabf9-18cb-4562-a6a2-7f2b46a4f9ec\") " pod="openstack/cinder-scheduler-0"
Mar 09 09:25:49 crc kubenswrapper[4861]: I0309 09:25:49.907741 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/072cabf9-18cb-4562-a6a2-7f2b46a4f9ec-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"072cabf9-18cb-4562-a6a2-7f2b46a4f9ec\") " pod="openstack/cinder-scheduler-0"
Mar 09 09:25:49 crc kubenswrapper[4861]: I0309 09:25:49.911051 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/072cabf9-18cb-4562-a6a2-7f2b46a4f9ec-config-data\") pod \"cinder-scheduler-0\" (UID: \"072cabf9-18cb-4562-a6a2-7f2b46a4f9ec\") " pod="openstack/cinder-scheduler-0"
Mar 09 09:25:49 crc kubenswrapper[4861]: I0309 09:25:49.911105 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/072cabf9-18cb-4562-a6a2-7f2b46a4f9ec-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"072cabf9-18cb-4562-a6a2-7f2b46a4f9ec\") " pod="openstack/cinder-scheduler-0"
Mar 09 09:25:49 crc kubenswrapper[4861]: I0309 09:25:49.912337 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/072cabf9-18cb-4562-a6a2-7f2b46a4f9ec-scripts\") pod \"cinder-scheduler-0\" (UID: \"072cabf9-18cb-4562-a6a2-7f2b46a4f9ec\") " pod="openstack/cinder-scheduler-0"
Mar 09 09:25:49 crc kubenswrapper[4861]: I0309 09:25:49.916766 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/072cabf9-18cb-4562-a6a2-7f2b46a4f9ec-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"072cabf9-18cb-4562-a6a2-7f2b46a4f9ec\") " pod="openstack/cinder-scheduler-0"
Mar 09 09:25:49 crc kubenswrapper[4861]: I0309 09:25:49.930326 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kd72g\" (UniqueName: \"kubernetes.io/projected/072cabf9-18cb-4562-a6a2-7f2b46a4f9ec-kube-api-access-kd72g\") pod \"cinder-scheduler-0\" (UID: \"072cabf9-18cb-4562-a6a2-7f2b46a4f9ec\") " pod="openstack/cinder-scheduler-0"
Mar 09 09:25:50 crc kubenswrapper[4861]: I0309 09:25:50.052918 4861 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/cinder-scheduler-0"
Mar 09 09:25:50 crc kubenswrapper[4861]: I0309 09:25:50.529111 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Mar 09 09:25:50 crc kubenswrapper[4861]: I0309 09:25:50.642785 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"072cabf9-18cb-4562-a6a2-7f2b46a4f9ec","Type":"ContainerStarted","Data":"0e536e638987c71dd98487230e68a05a5db088ec5c62f48484b444fa481a7f01"}
Mar 09 09:25:51 crc kubenswrapper[4861]: I0309 09:25:51.656635 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"072cabf9-18cb-4562-a6a2-7f2b46a4f9ec","Type":"ContainerStarted","Data":"d7668ba90c2e7cb0b2ad1516c02ccd8b169c1a579a7e6648d3ff3676ff44c7e9"}
Mar 09 09:25:51 crc kubenswrapper[4861]: I0309 09:25:51.670908 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31e86fa6-cfa9-488f-b87f-ea95b05f8531" path="/var/lib/kubelet/pods/31e86fa6-cfa9-488f-b87f-ea95b05f8531/volumes"
Mar 09 09:25:52 crc kubenswrapper[4861]: I0309 09:25:52.666465 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"072cabf9-18cb-4562-a6a2-7f2b46a4f9ec","Type":"ContainerStarted","Data":"3f50a553a8605c8854ae94ad49945db74e3d3cb553daf92d99d40941bd933d59"}
Mar 09 09:25:52 crc kubenswrapper[4861]: I0309 09:25:52.691978 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.691959206 podStartE2EDuration="3.691959206s" podCreationTimestamp="2026-03-09 09:25:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:25:52.684281802 +0000 UTC m=+1195.769321213" watchObservedRunningTime="2026-03-09 09:25:52.691959206 +0000 UTC m=+1195.776998607"
Mar 09 09:25:53 crc kubenswrapper[4861]: I0309 09:25:53.353315 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5b9d58d97d-h7pvq"
Mar 09 09:25:53 crc kubenswrapper[4861]: I0309 09:25:53.389278 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5b9d58d97d-h7pvq"
Mar 09 09:25:53 crc kubenswrapper[4861]: I0309 09:25:53.457155 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-764b486b9b-tk6j5"]
Mar 09 09:25:53 crc kubenswrapper[4861]: I0309 09:25:53.457625 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-764b486b9b-tk6j5" podUID="48fbe4a1-81ab-4a46-8150-821bc8afa220" containerName="barbican-api-log" containerID="cri-o://c72c0b34b04fcf6e7685f1ef59cd6abe37a5f56a9418f9ff457e6d47203e3c14" gracePeriod=30
Mar 09 09:25:53 crc kubenswrapper[4861]: I0309 09:25:53.458118 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-764b486b9b-tk6j5" podUID="48fbe4a1-81ab-4a46-8150-821bc8afa220" containerName="barbican-api" containerID="cri-o://9eadb20d6f8b86d83be9007e709be1785f3dbbf280c5db1b3a3ee672323d43c0" gracePeriod=30
Mar 09 09:25:53 crc kubenswrapper[4861]: I0309 09:25:53.686928 4861 generic.go:334] "Generic (PLEG): container finished" podID="48fbe4a1-81ab-4a46-8150-821bc8afa220" containerID="c72c0b34b04fcf6e7685f1ef59cd6abe37a5f56a9418f9ff457e6d47203e3c14" exitCode=143
Mar 09 09:25:53 crc kubenswrapper[4861]: I0309 09:25:53.687168 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-764b486b9b-tk6j5" event={"ID":"48fbe4a1-81ab-4a46-8150-821bc8afa220","Type":"ContainerDied","Data":"c72c0b34b04fcf6e7685f1ef59cd6abe37a5f56a9418f9ff457e6d47203e3c14"}
Mar 09 09:25:54 crc kubenswrapper[4861]: I0309 09:25:54.122556 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0"
Mar 09 09:25:54 crc kubenswrapper[4861]: I0309 09:25:54.155846 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-5cc5bc567f-7k86v"
Mar 09 09:25:54 crc kubenswrapper[4861]: I0309 09:25:54.606421 4861 patch_prober.go:28] interesting pod/machine-config-daemon-5g7gc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 09 09:25:54 crc kubenswrapper[4861]: I0309 09:25:54.606488 4861 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 09 09:25:54 crc kubenswrapper[4861]: I0309 09:25:54.606534 4861 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc"
Mar 09 09:25:54 crc kubenswrapper[4861]: I0309 09:25:54.607309 4861 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"925e707d587470ad152a1a9ef2490c9fccb36de6da22acc63f3054b647081cf1"} pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 09 09:25:54 crc kubenswrapper[4861]: I0309 09:25:54.607423 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" containerName="machine-config-daemon" containerID="cri-o://925e707d587470ad152a1a9ef2490c9fccb36de6da22acc63f3054b647081cf1" gracePeriod=600
Mar 09 09:25:54 crc kubenswrapper[4861]: I0309 09:25:54.987949 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"]
Mar 09 09:25:54 crc kubenswrapper[4861]: I0309 09:25:54.989347 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Mar 09 09:25:55 crc kubenswrapper[4861]: I0309 09:25:55.006974 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Mar 09 09:25:55 crc kubenswrapper[4861]: I0309 09:25:55.012787 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config"
Mar 09 09:25:55 crc kubenswrapper[4861]: I0309 09:25:55.012983 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-czzt6"
Mar 09 09:25:55 crc kubenswrapper[4861]: I0309 09:25:55.013109 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret"
Mar 09 09:25:55 crc kubenswrapper[4861]: I0309 09:25:55.054585 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0"
Mar 09 09:25:55 crc kubenswrapper[4861]: I0309 09:25:55.129455 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpvpp\" (UniqueName: \"kubernetes.io/projected/62f873db-0b4f-4a99-bc1d-7cdff56989a2-kube-api-access-lpvpp\") pod \"openstackclient\" (UID: \"62f873db-0b4f-4a99-bc1d-7cdff56989a2\") " pod="openstack/openstackclient"
Mar 09 09:25:55 crc kubenswrapper[4861]: I0309 09:25:55.129533 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62f873db-0b4f-4a99-bc1d-7cdff56989a2-combined-ca-bundle\") pod \"openstackclient\" (UID: \"62f873db-0b4f-4a99-bc1d-7cdff56989a2\") " pod="openstack/openstackclient"
Mar 09 09:25:55 crc kubenswrapper[4861]: I0309 09:25:55.129606 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/62f873db-0b4f-4a99-bc1d-7cdff56989a2-openstack-config-secret\") pod \"openstackclient\" (UID: \"62f873db-0b4f-4a99-bc1d-7cdff56989a2\") " pod="openstack/openstackclient"
Mar 09 09:25:55 crc kubenswrapper[4861]: I0309 09:25:55.129651 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/62f873db-0b4f-4a99-bc1d-7cdff56989a2-openstack-config\") pod \"openstackclient\" (UID: \"62f873db-0b4f-4a99-bc1d-7cdff56989a2\") " pod="openstack/openstackclient"
Mar 09 09:25:55 crc kubenswrapper[4861]: I0309 09:25:55.232872 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62f873db-0b4f-4a99-bc1d-7cdff56989a2-combined-ca-bundle\") pod \"openstackclient\" (UID: \"62f873db-0b4f-4a99-bc1d-7cdff56989a2\") " pod="openstack/openstackclient"
Mar 09 09:25:55 crc kubenswrapper[4861]: I0309 09:25:55.233036 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/62f873db-0b4f-4a99-bc1d-7cdff56989a2-openstack-config-secret\") pod \"openstackclient\" (UID: \"62f873db-0b4f-4a99-bc1d-7cdff56989a2\") " pod="openstack/openstackclient"
Mar 09 09:25:55 crc kubenswrapper[4861]: I0309 09:25:55.233086 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/62f873db-0b4f-4a99-bc1d-7cdff56989a2-openstack-config\") pod \"openstackclient\" (UID: \"62f873db-0b4f-4a99-bc1d-7cdff56989a2\") " pod="openstack/openstackclient"
Mar 09 09:25:55 crc kubenswrapper[4861]: I0309 09:25:55.233192 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpvpp\" (UniqueName:
\"kubernetes.io/projected/62f873db-0b4f-4a99-bc1d-7cdff56989a2-kube-api-access-lpvpp\") pod \"openstackclient\" (UID: \"62f873db-0b4f-4a99-bc1d-7cdff56989a2\") " pod="openstack/openstackclient" Mar 09 09:25:55 crc kubenswrapper[4861]: I0309 09:25:55.236329 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/62f873db-0b4f-4a99-bc1d-7cdff56989a2-openstack-config\") pod \"openstackclient\" (UID: \"62f873db-0b4f-4a99-bc1d-7cdff56989a2\") " pod="openstack/openstackclient" Mar 09 09:25:55 crc kubenswrapper[4861]: I0309 09:25:55.245902 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/62f873db-0b4f-4a99-bc1d-7cdff56989a2-openstack-config-secret\") pod \"openstackclient\" (UID: \"62f873db-0b4f-4a99-bc1d-7cdff56989a2\") " pod="openstack/openstackclient" Mar 09 09:25:55 crc kubenswrapper[4861]: I0309 09:25:55.255361 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62f873db-0b4f-4a99-bc1d-7cdff56989a2-combined-ca-bundle\") pod \"openstackclient\" (UID: \"62f873db-0b4f-4a99-bc1d-7cdff56989a2\") " pod="openstack/openstackclient" Mar 09 09:25:55 crc kubenswrapper[4861]: I0309 09:25:55.261097 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpvpp\" (UniqueName: \"kubernetes.io/projected/62f873db-0b4f-4a99-bc1d-7cdff56989a2-kube-api-access-lpvpp\") pod \"openstackclient\" (UID: \"62f873db-0b4f-4a99-bc1d-7cdff56989a2\") " pod="openstack/openstackclient" Mar 09 09:25:55 crc kubenswrapper[4861]: I0309 09:25:55.332687 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 09 09:25:55 crc kubenswrapper[4861]: I0309 09:25:55.709574 4861 generic.go:334] "Generic (PLEG): container finished" podID="6f7875e3-174f-4c67-8675-d878de74aa4f" containerID="925e707d587470ad152a1a9ef2490c9fccb36de6da22acc63f3054b647081cf1" exitCode=0 Mar 09 09:25:55 crc kubenswrapper[4861]: I0309 09:25:55.709662 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" event={"ID":"6f7875e3-174f-4c67-8675-d878de74aa4f","Type":"ContainerDied","Data":"925e707d587470ad152a1a9ef2490c9fccb36de6da22acc63f3054b647081cf1"} Mar 09 09:25:55 crc kubenswrapper[4861]: I0309 09:25:55.710363 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" event={"ID":"6f7875e3-174f-4c67-8675-d878de74aa4f","Type":"ContainerStarted","Data":"7fdd7f3e15e67ed60e9bdb64538958917f435e5fb6449fedcf993ae2c627b46e"} Mar 09 09:25:55 crc kubenswrapper[4861]: I0309 09:25:55.710467 4861 scope.go:117] "RemoveContainer" containerID="496f42623ddb6e7fafcb7e74986e05b309e695c4366816fec0ffd01b5c0a1be9" Mar 09 09:25:55 crc kubenswrapper[4861]: I0309 09:25:55.812287 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 09 09:25:55 crc kubenswrapper[4861]: W0309 09:25:55.817195 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod62f873db_0b4f_4a99_bc1d_7cdff56989a2.slice/crio-6e8d6947222b4770d6e24d16f4d512d91ea2709ea59a80933dcb53e811d7e566 WatchSource:0}: Error finding container 6e8d6947222b4770d6e24d16f4d512d91ea2709ea59a80933dcb53e811d7e566: Status 404 returned error can't find the container with id 6e8d6947222b4770d6e24d16f4d512d91ea2709ea59a80933dcb53e811d7e566 Mar 09 09:25:56 crc kubenswrapper[4861]: I0309 09:25:56.171036 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/placement-678fb94c4b-9x5d2" Mar 09 09:25:56 crc kubenswrapper[4861]: I0309 09:25:56.349656 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-678fb94c4b-9x5d2" Mar 09 09:25:56 crc kubenswrapper[4861]: I0309 09:25:56.625872 4861 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-764b486b9b-tk6j5" podUID="48fbe4a1-81ab-4a46-8150-821bc8afa220" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.169:9311/healthcheck\": read tcp 10.217.0.2:58206->10.217.0.169:9311: read: connection reset by peer" Mar 09 09:25:56 crc kubenswrapper[4861]: I0309 09:25:56.625946 4861 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-764b486b9b-tk6j5" podUID="48fbe4a1-81ab-4a46-8150-821bc8afa220" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.169:9311/healthcheck\": read tcp 10.217.0.2:58200->10.217.0.169:9311: read: connection reset by peer" Mar 09 09:25:56 crc kubenswrapper[4861]: I0309 09:25:56.743748 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"62f873db-0b4f-4a99-bc1d-7cdff56989a2","Type":"ContainerStarted","Data":"6e8d6947222b4770d6e24d16f4d512d91ea2709ea59a80933dcb53e811d7e566"} Mar 09 09:25:56 crc kubenswrapper[4861]: I0309 09:25:56.745096 4861 generic.go:334] "Generic (PLEG): container finished" podID="48fbe4a1-81ab-4a46-8150-821bc8afa220" containerID="9eadb20d6f8b86d83be9007e709be1785f3dbbf280c5db1b3a3ee672323d43c0" exitCode=0 Mar 09 09:25:56 crc kubenswrapper[4861]: I0309 09:25:56.745866 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-764b486b9b-tk6j5" event={"ID":"48fbe4a1-81ab-4a46-8150-821bc8afa220","Type":"ContainerDied","Data":"9eadb20d6f8b86d83be9007e709be1785f3dbbf280c5db1b3a3ee672323d43c0"} Mar 09 09:25:57 crc kubenswrapper[4861]: I0309 09:25:57.103935 4861 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openstack/barbican-api-764b486b9b-tk6j5" Mar 09 09:25:57 crc kubenswrapper[4861]: I0309 09:25:57.272278 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48fbe4a1-81ab-4a46-8150-821bc8afa220-config-data\") pod \"48fbe4a1-81ab-4a46-8150-821bc8afa220\" (UID: \"48fbe4a1-81ab-4a46-8150-821bc8afa220\") " Mar 09 09:25:57 crc kubenswrapper[4861]: I0309 09:25:57.272363 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/48fbe4a1-81ab-4a46-8150-821bc8afa220-config-data-custom\") pod \"48fbe4a1-81ab-4a46-8150-821bc8afa220\" (UID: \"48fbe4a1-81ab-4a46-8150-821bc8afa220\") " Mar 09 09:25:57 crc kubenswrapper[4861]: I0309 09:25:57.272445 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48fbe4a1-81ab-4a46-8150-821bc8afa220-logs\") pod \"48fbe4a1-81ab-4a46-8150-821bc8afa220\" (UID: \"48fbe4a1-81ab-4a46-8150-821bc8afa220\") " Mar 09 09:25:57 crc kubenswrapper[4861]: I0309 09:25:57.272485 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2m78c\" (UniqueName: \"kubernetes.io/projected/48fbe4a1-81ab-4a46-8150-821bc8afa220-kube-api-access-2m78c\") pod \"48fbe4a1-81ab-4a46-8150-821bc8afa220\" (UID: \"48fbe4a1-81ab-4a46-8150-821bc8afa220\") " Mar 09 09:25:57 crc kubenswrapper[4861]: I0309 09:25:57.272541 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48fbe4a1-81ab-4a46-8150-821bc8afa220-combined-ca-bundle\") pod \"48fbe4a1-81ab-4a46-8150-821bc8afa220\" (UID: \"48fbe4a1-81ab-4a46-8150-821bc8afa220\") " Mar 09 09:25:57 crc kubenswrapper[4861]: I0309 09:25:57.273681 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/48fbe4a1-81ab-4a46-8150-821bc8afa220-logs" (OuterVolumeSpecName: "logs") pod "48fbe4a1-81ab-4a46-8150-821bc8afa220" (UID: "48fbe4a1-81ab-4a46-8150-821bc8afa220"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:25:57 crc kubenswrapper[4861]: I0309 09:25:57.299862 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48fbe4a1-81ab-4a46-8150-821bc8afa220-kube-api-access-2m78c" (OuterVolumeSpecName: "kube-api-access-2m78c") pod "48fbe4a1-81ab-4a46-8150-821bc8afa220" (UID: "48fbe4a1-81ab-4a46-8150-821bc8afa220"). InnerVolumeSpecName "kube-api-access-2m78c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:25:57 crc kubenswrapper[4861]: I0309 09:25:57.301320 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48fbe4a1-81ab-4a46-8150-821bc8afa220-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "48fbe4a1-81ab-4a46-8150-821bc8afa220" (UID: "48fbe4a1-81ab-4a46-8150-821bc8afa220"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:25:57 crc kubenswrapper[4861]: I0309 09:25:57.316486 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48fbe4a1-81ab-4a46-8150-821bc8afa220-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "48fbe4a1-81ab-4a46-8150-821bc8afa220" (UID: "48fbe4a1-81ab-4a46-8150-821bc8afa220"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:25:57 crc kubenswrapper[4861]: I0309 09:25:57.333934 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48fbe4a1-81ab-4a46-8150-821bc8afa220-config-data" (OuterVolumeSpecName: "config-data") pod "48fbe4a1-81ab-4a46-8150-821bc8afa220" (UID: "48fbe4a1-81ab-4a46-8150-821bc8afa220"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:25:57 crc kubenswrapper[4861]: I0309 09:25:57.374721 4861 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48fbe4a1-81ab-4a46-8150-821bc8afa220-logs\") on node \"crc\" DevicePath \"\"" Mar 09 09:25:57 crc kubenswrapper[4861]: I0309 09:25:57.374753 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2m78c\" (UniqueName: \"kubernetes.io/projected/48fbe4a1-81ab-4a46-8150-821bc8afa220-kube-api-access-2m78c\") on node \"crc\" DevicePath \"\"" Mar 09 09:25:57 crc kubenswrapper[4861]: I0309 09:25:57.374764 4861 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48fbe4a1-81ab-4a46-8150-821bc8afa220-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 09:25:57 crc kubenswrapper[4861]: I0309 09:25:57.374772 4861 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48fbe4a1-81ab-4a46-8150-821bc8afa220-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 09:25:57 crc kubenswrapper[4861]: I0309 09:25:57.374780 4861 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/48fbe4a1-81ab-4a46-8150-821bc8afa220-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 09 09:25:57 crc kubenswrapper[4861]: I0309 09:25:57.768831 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-764b486b9b-tk6j5" event={"ID":"48fbe4a1-81ab-4a46-8150-821bc8afa220","Type":"ContainerDied","Data":"3cace9b3826297e13b51f776c4a63465d7f30becfc9c4dce3918a85756fc792e"} Mar 09 09:25:57 crc kubenswrapper[4861]: I0309 09:25:57.769052 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-764b486b9b-tk6j5" Mar 09 09:25:57 crc kubenswrapper[4861]: I0309 09:25:57.769426 4861 scope.go:117] "RemoveContainer" containerID="9eadb20d6f8b86d83be9007e709be1785f3dbbf280c5db1b3a3ee672323d43c0" Mar 09 09:25:57 crc kubenswrapper[4861]: I0309 09:25:57.799559 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-764b486b9b-tk6j5"] Mar 09 09:25:57 crc kubenswrapper[4861]: I0309 09:25:57.809472 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-764b486b9b-tk6j5"] Mar 09 09:25:57 crc kubenswrapper[4861]: I0309 09:25:57.811322 4861 scope.go:117] "RemoveContainer" containerID="c72c0b34b04fcf6e7685f1ef59cd6abe37a5f56a9418f9ff457e6d47203e3c14" Mar 09 09:25:58 crc kubenswrapper[4861]: I0309 09:25:58.854111 4861 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7bb4db8c4-sxjc7" podUID="71492031-e589-409e-b8c8-c0a1194b97ed" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.153:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.153:8443: connect: connection refused" Mar 09 09:25:59 crc kubenswrapper[4861]: I0309 09:25:59.671555 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48fbe4a1-81ab-4a46-8150-821bc8afa220" path="/var/lib/kubelet/pods/48fbe4a1-81ab-4a46-8150-821bc8afa220/volumes" Mar 09 09:26:00 crc kubenswrapper[4861]: I0309 09:26:00.132682 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550806-dlnz6"] Mar 09 09:26:00 crc kubenswrapper[4861]: E0309 09:26:00.133138 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48fbe4a1-81ab-4a46-8150-821bc8afa220" containerName="barbican-api" Mar 09 09:26:00 crc kubenswrapper[4861]: I0309 09:26:00.133152 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="48fbe4a1-81ab-4a46-8150-821bc8afa220" containerName="barbican-api" Mar 09 09:26:00 crc 
kubenswrapper[4861]: E0309 09:26:00.133178 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48fbe4a1-81ab-4a46-8150-821bc8afa220" containerName="barbican-api-log" Mar 09 09:26:00 crc kubenswrapper[4861]: I0309 09:26:00.133186 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="48fbe4a1-81ab-4a46-8150-821bc8afa220" containerName="barbican-api-log" Mar 09 09:26:00 crc kubenswrapper[4861]: I0309 09:26:00.133425 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="48fbe4a1-81ab-4a46-8150-821bc8afa220" containerName="barbican-api-log" Mar 09 09:26:00 crc kubenswrapper[4861]: I0309 09:26:00.133445 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="48fbe4a1-81ab-4a46-8150-821bc8afa220" containerName="barbican-api" Mar 09 09:26:00 crc kubenswrapper[4861]: I0309 09:26:00.134184 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550806-dlnz6" Mar 09 09:26:00 crc kubenswrapper[4861]: I0309 09:26:00.138006 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 09:26:00 crc kubenswrapper[4861]: I0309 09:26:00.138226 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 09:26:00 crc kubenswrapper[4861]: I0309 09:26:00.140260 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550806-dlnz6"] Mar 09 09:26:00 crc kubenswrapper[4861]: I0309 09:26:00.143673 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k86l8" Mar 09 09:26:00 crc kubenswrapper[4861]: I0309 09:26:00.234849 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvnqq\" (UniqueName: \"kubernetes.io/projected/5330b450-5c51-43e8-b0c0-c9b875dc49b5-kube-api-access-jvnqq\") pod 
\"auto-csr-approver-29550806-dlnz6\" (UID: \"5330b450-5c51-43e8-b0c0-c9b875dc49b5\") " pod="openshift-infra/auto-csr-approver-29550806-dlnz6" Mar 09 09:26:00 crc kubenswrapper[4861]: I0309 09:26:00.295342 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 09 09:26:00 crc kubenswrapper[4861]: I0309 09:26:00.336359 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvnqq\" (UniqueName: \"kubernetes.io/projected/5330b450-5c51-43e8-b0c0-c9b875dc49b5-kube-api-access-jvnqq\") pod \"auto-csr-approver-29550806-dlnz6\" (UID: \"5330b450-5c51-43e8-b0c0-c9b875dc49b5\") " pod="openshift-infra/auto-csr-approver-29550806-dlnz6" Mar 09 09:26:00 crc kubenswrapper[4861]: I0309 09:26:00.356179 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvnqq\" (UniqueName: \"kubernetes.io/projected/5330b450-5c51-43e8-b0c0-c9b875dc49b5-kube-api-access-jvnqq\") pod \"auto-csr-approver-29550806-dlnz6\" (UID: \"5330b450-5c51-43e8-b0c0-c9b875dc49b5\") " pod="openshift-infra/auto-csr-approver-29550806-dlnz6" Mar 09 09:26:00 crc kubenswrapper[4861]: I0309 09:26:00.456893 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550806-dlnz6" Mar 09 09:26:00 crc kubenswrapper[4861]: I0309 09:26:00.965054 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550806-dlnz6"] Mar 09 09:26:00 crc kubenswrapper[4861]: W0309 09:26:00.973272 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5330b450_5c51_43e8_b0c0_c9b875dc49b5.slice/crio-3c8f27d502b2b1dc67fa5d91348209bc5a3f5bcdc59a119334d8a7a0afe2a23c WatchSource:0}: Error finding container 3c8f27d502b2b1dc67fa5d91348209bc5a3f5bcdc59a119334d8a7a0afe2a23c: Status 404 returned error can't find the container with id 3c8f27d502b2b1dc67fa5d91348209bc5a3f5bcdc59a119334d8a7a0afe2a23c Mar 09 09:26:01 crc kubenswrapper[4861]: I0309 09:26:01.277031 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-6f4f458d55-lxkls"] Mar 09 09:26:01 crc kubenswrapper[4861]: I0309 09:26:01.278834 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-6f4f458d55-lxkls" Mar 09 09:26:01 crc kubenswrapper[4861]: I0309 09:26:01.281514 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Mar 09 09:26:01 crc kubenswrapper[4861]: I0309 09:26:01.285744 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Mar 09 09:26:01 crc kubenswrapper[4861]: I0309 09:26:01.292239 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 09 09:26:01 crc kubenswrapper[4861]: I0309 09:26:01.324448 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-6f4f458d55-lxkls"] Mar 09 09:26:01 crc kubenswrapper[4861]: I0309 09:26:01.456336 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ed378c6-5773-4dd7-9889-52bcf62216e5-internal-tls-certs\") pod \"swift-proxy-6f4f458d55-lxkls\" (UID: \"1ed378c6-5773-4dd7-9889-52bcf62216e5\") " pod="openstack/swift-proxy-6f4f458d55-lxkls" Mar 09 09:26:01 crc kubenswrapper[4861]: I0309 09:26:01.456663 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ed378c6-5773-4dd7-9889-52bcf62216e5-run-httpd\") pod \"swift-proxy-6f4f458d55-lxkls\" (UID: \"1ed378c6-5773-4dd7-9889-52bcf62216e5\") " pod="openstack/swift-proxy-6f4f458d55-lxkls" Mar 09 09:26:01 crc kubenswrapper[4861]: I0309 09:26:01.456686 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ed378c6-5773-4dd7-9889-52bcf62216e5-log-httpd\") pod \"swift-proxy-6f4f458d55-lxkls\" (UID: \"1ed378c6-5773-4dd7-9889-52bcf62216e5\") " pod="openstack/swift-proxy-6f4f458d55-lxkls" Mar 09 09:26:01 crc kubenswrapper[4861]: I0309 
09:26:01.456719 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1ed378c6-5773-4dd7-9889-52bcf62216e5-etc-swift\") pod \"swift-proxy-6f4f458d55-lxkls\" (UID: \"1ed378c6-5773-4dd7-9889-52bcf62216e5\") " pod="openstack/swift-proxy-6f4f458d55-lxkls" Mar 09 09:26:01 crc kubenswrapper[4861]: I0309 09:26:01.456742 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ed378c6-5773-4dd7-9889-52bcf62216e5-config-data\") pod \"swift-proxy-6f4f458d55-lxkls\" (UID: \"1ed378c6-5773-4dd7-9889-52bcf62216e5\") " pod="openstack/swift-proxy-6f4f458d55-lxkls" Mar 09 09:26:01 crc kubenswrapper[4861]: I0309 09:26:01.456944 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ed378c6-5773-4dd7-9889-52bcf62216e5-combined-ca-bundle\") pod \"swift-proxy-6f4f458d55-lxkls\" (UID: \"1ed378c6-5773-4dd7-9889-52bcf62216e5\") " pod="openstack/swift-proxy-6f4f458d55-lxkls" Mar 09 09:26:01 crc kubenswrapper[4861]: I0309 09:26:01.456987 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwxxk\" (UniqueName: \"kubernetes.io/projected/1ed378c6-5773-4dd7-9889-52bcf62216e5-kube-api-access-kwxxk\") pod \"swift-proxy-6f4f458d55-lxkls\" (UID: \"1ed378c6-5773-4dd7-9889-52bcf62216e5\") " pod="openstack/swift-proxy-6f4f458d55-lxkls" Mar 09 09:26:01 crc kubenswrapper[4861]: I0309 09:26:01.457013 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ed378c6-5773-4dd7-9889-52bcf62216e5-public-tls-certs\") pod \"swift-proxy-6f4f458d55-lxkls\" (UID: \"1ed378c6-5773-4dd7-9889-52bcf62216e5\") " pod="openstack/swift-proxy-6f4f458d55-lxkls" Mar 
09 09:26:01 crc kubenswrapper[4861]: I0309 09:26:01.559094 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ed378c6-5773-4dd7-9889-52bcf62216e5-internal-tls-certs\") pod \"swift-proxy-6f4f458d55-lxkls\" (UID: \"1ed378c6-5773-4dd7-9889-52bcf62216e5\") " pod="openstack/swift-proxy-6f4f458d55-lxkls" Mar 09 09:26:01 crc kubenswrapper[4861]: I0309 09:26:01.559146 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ed378c6-5773-4dd7-9889-52bcf62216e5-run-httpd\") pod \"swift-proxy-6f4f458d55-lxkls\" (UID: \"1ed378c6-5773-4dd7-9889-52bcf62216e5\") " pod="openstack/swift-proxy-6f4f458d55-lxkls" Mar 09 09:26:01 crc kubenswrapper[4861]: I0309 09:26:01.559170 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ed378c6-5773-4dd7-9889-52bcf62216e5-log-httpd\") pod \"swift-proxy-6f4f458d55-lxkls\" (UID: \"1ed378c6-5773-4dd7-9889-52bcf62216e5\") " pod="openstack/swift-proxy-6f4f458d55-lxkls" Mar 09 09:26:01 crc kubenswrapper[4861]: I0309 09:26:01.559214 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1ed378c6-5773-4dd7-9889-52bcf62216e5-etc-swift\") pod \"swift-proxy-6f4f458d55-lxkls\" (UID: \"1ed378c6-5773-4dd7-9889-52bcf62216e5\") " pod="openstack/swift-proxy-6f4f458d55-lxkls" Mar 09 09:26:01 crc kubenswrapper[4861]: I0309 09:26:01.559244 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ed378c6-5773-4dd7-9889-52bcf62216e5-config-data\") pod \"swift-proxy-6f4f458d55-lxkls\" (UID: \"1ed378c6-5773-4dd7-9889-52bcf62216e5\") " pod="openstack/swift-proxy-6f4f458d55-lxkls" Mar 09 09:26:01 crc kubenswrapper[4861]: I0309 09:26:01.559292 4861 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ed378c6-5773-4dd7-9889-52bcf62216e5-combined-ca-bundle\") pod \"swift-proxy-6f4f458d55-lxkls\" (UID: \"1ed378c6-5773-4dd7-9889-52bcf62216e5\") " pod="openstack/swift-proxy-6f4f458d55-lxkls" Mar 09 09:26:01 crc kubenswrapper[4861]: I0309 09:26:01.559317 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwxxk\" (UniqueName: \"kubernetes.io/projected/1ed378c6-5773-4dd7-9889-52bcf62216e5-kube-api-access-kwxxk\") pod \"swift-proxy-6f4f458d55-lxkls\" (UID: \"1ed378c6-5773-4dd7-9889-52bcf62216e5\") " pod="openstack/swift-proxy-6f4f458d55-lxkls" Mar 09 09:26:01 crc kubenswrapper[4861]: I0309 09:26:01.559341 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ed378c6-5773-4dd7-9889-52bcf62216e5-public-tls-certs\") pod \"swift-proxy-6f4f458d55-lxkls\" (UID: \"1ed378c6-5773-4dd7-9889-52bcf62216e5\") " pod="openstack/swift-proxy-6f4f458d55-lxkls" Mar 09 09:26:01 crc kubenswrapper[4861]: I0309 09:26:01.559751 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ed378c6-5773-4dd7-9889-52bcf62216e5-run-httpd\") pod \"swift-proxy-6f4f458d55-lxkls\" (UID: \"1ed378c6-5773-4dd7-9889-52bcf62216e5\") " pod="openstack/swift-proxy-6f4f458d55-lxkls" Mar 09 09:26:01 crc kubenswrapper[4861]: I0309 09:26:01.559892 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1ed378c6-5773-4dd7-9889-52bcf62216e5-log-httpd\") pod \"swift-proxy-6f4f458d55-lxkls\" (UID: \"1ed378c6-5773-4dd7-9889-52bcf62216e5\") " pod="openstack/swift-proxy-6f4f458d55-lxkls" Mar 09 09:26:01 crc kubenswrapper[4861]: I0309 09:26:01.565239 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/1ed378c6-5773-4dd7-9889-52bcf62216e5-config-data\") pod \"swift-proxy-6f4f458d55-lxkls\" (UID: \"1ed378c6-5773-4dd7-9889-52bcf62216e5\") " pod="openstack/swift-proxy-6f4f458d55-lxkls" Mar 09 09:26:01 crc kubenswrapper[4861]: I0309 09:26:01.565902 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ed378c6-5773-4dd7-9889-52bcf62216e5-combined-ca-bundle\") pod \"swift-proxy-6f4f458d55-lxkls\" (UID: \"1ed378c6-5773-4dd7-9889-52bcf62216e5\") " pod="openstack/swift-proxy-6f4f458d55-lxkls" Mar 09 09:26:01 crc kubenswrapper[4861]: I0309 09:26:01.566310 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ed378c6-5773-4dd7-9889-52bcf62216e5-internal-tls-certs\") pod \"swift-proxy-6f4f458d55-lxkls\" (UID: \"1ed378c6-5773-4dd7-9889-52bcf62216e5\") " pod="openstack/swift-proxy-6f4f458d55-lxkls" Mar 09 09:26:01 crc kubenswrapper[4861]: I0309 09:26:01.572077 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1ed378c6-5773-4dd7-9889-52bcf62216e5-public-tls-certs\") pod \"swift-proxy-6f4f458d55-lxkls\" (UID: \"1ed378c6-5773-4dd7-9889-52bcf62216e5\") " pod="openstack/swift-proxy-6f4f458d55-lxkls" Mar 09 09:26:01 crc kubenswrapper[4861]: I0309 09:26:01.577979 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1ed378c6-5773-4dd7-9889-52bcf62216e5-etc-swift\") pod \"swift-proxy-6f4f458d55-lxkls\" (UID: \"1ed378c6-5773-4dd7-9889-52bcf62216e5\") " pod="openstack/swift-proxy-6f4f458d55-lxkls" Mar 09 09:26:01 crc kubenswrapper[4861]: I0309 09:26:01.585447 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwxxk\" (UniqueName: 
\"kubernetes.io/projected/1ed378c6-5773-4dd7-9889-52bcf62216e5-kube-api-access-kwxxk\") pod \"swift-proxy-6f4f458d55-lxkls\" (UID: \"1ed378c6-5773-4dd7-9889-52bcf62216e5\") " pod="openstack/swift-proxy-6f4f458d55-lxkls" Mar 09 09:26:01 crc kubenswrapper[4861]: I0309 09:26:01.611349 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-6f4f458d55-lxkls" Mar 09 09:26:01 crc kubenswrapper[4861]: I0309 09:26:01.805422 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550806-dlnz6" event={"ID":"5330b450-5c51-43e8-b0c0-c9b875dc49b5","Type":"ContainerStarted","Data":"3c8f27d502b2b1dc67fa5d91348209bc5a3f5bcdc59a119334d8a7a0afe2a23c"} Mar 09 09:26:02 crc kubenswrapper[4861]: I0309 09:26:02.673678 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 09 09:26:02 crc kubenswrapper[4861]: I0309 09:26:02.674092 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8af4e6e3-c7cc-4e2f-b3bf-48461f86a1f1" containerName="ceilometer-central-agent" containerID="cri-o://75a923c67628e7a6170db32559a1aab94be16069f2bfa80eb9fe6cbde9ed6208" gracePeriod=30 Mar 09 09:26:02 crc kubenswrapper[4861]: I0309 09:26:02.674656 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8af4e6e3-c7cc-4e2f-b3bf-48461f86a1f1" containerName="proxy-httpd" containerID="cri-o://f59f0c4abfb6f63caeaa0f687c6369bee86ab126739c55cbc0490c2d536d44ef" gracePeriod=30 Mar 09 09:26:02 crc kubenswrapper[4861]: I0309 09:26:02.674675 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8af4e6e3-c7cc-4e2f-b3bf-48461f86a1f1" containerName="sg-core" containerID="cri-o://5cdeb6e5500f4678560109b3d7fbc725a3f70c195e678f4d8a4357e48903dd6f" gracePeriod=30 Mar 09 09:26:02 crc kubenswrapper[4861]: I0309 09:26:02.674690 4861 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8af4e6e3-c7cc-4e2f-b3bf-48461f86a1f1" containerName="ceilometer-notification-agent" containerID="cri-o://c2e8d3629585fa584403ade19c51f099ae4d48c3cab7991240d1d3ddd0f69c99" gracePeriod=30 Mar 09 09:26:02 crc kubenswrapper[4861]: I0309 09:26:02.684975 4861 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="8af4e6e3-c7cc-4e2f-b3bf-48461f86a1f1" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.170:3000/\": EOF" Mar 09 09:26:02 crc kubenswrapper[4861]: I0309 09:26:02.816929 4861 generic.go:334] "Generic (PLEG): container finished" podID="8af4e6e3-c7cc-4e2f-b3bf-48461f86a1f1" containerID="5cdeb6e5500f4678560109b3d7fbc725a3f70c195e678f4d8a4357e48903dd6f" exitCode=2 Mar 09 09:26:02 crc kubenswrapper[4861]: I0309 09:26:02.817002 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8af4e6e3-c7cc-4e2f-b3bf-48461f86a1f1","Type":"ContainerDied","Data":"5cdeb6e5500f4678560109b3d7fbc725a3f70c195e678f4d8a4357e48903dd6f"} Mar 09 09:26:03 crc kubenswrapper[4861]: I0309 09:26:03.828396 4861 generic.go:334] "Generic (PLEG): container finished" podID="8af4e6e3-c7cc-4e2f-b3bf-48461f86a1f1" containerID="f59f0c4abfb6f63caeaa0f687c6369bee86ab126739c55cbc0490c2d536d44ef" exitCode=0 Mar 09 09:26:03 crc kubenswrapper[4861]: I0309 09:26:03.828430 4861 generic.go:334] "Generic (PLEG): container finished" podID="8af4e6e3-c7cc-4e2f-b3bf-48461f86a1f1" containerID="c2e8d3629585fa584403ade19c51f099ae4d48c3cab7991240d1d3ddd0f69c99" exitCode=0 Mar 09 09:26:03 crc kubenswrapper[4861]: I0309 09:26:03.828439 4861 generic.go:334] "Generic (PLEG): container finished" podID="8af4e6e3-c7cc-4e2f-b3bf-48461f86a1f1" containerID="75a923c67628e7a6170db32559a1aab94be16069f2bfa80eb9fe6cbde9ed6208" exitCode=0 Mar 09 09:26:03 crc kubenswrapper[4861]: I0309 09:26:03.828460 4861 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8af4e6e3-c7cc-4e2f-b3bf-48461f86a1f1","Type":"ContainerDied","Data":"f59f0c4abfb6f63caeaa0f687c6369bee86ab126739c55cbc0490c2d536d44ef"} Mar 09 09:26:03 crc kubenswrapper[4861]: I0309 09:26:03.828483 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8af4e6e3-c7cc-4e2f-b3bf-48461f86a1f1","Type":"ContainerDied","Data":"c2e8d3629585fa584403ade19c51f099ae4d48c3cab7991240d1d3ddd0f69c99"} Mar 09 09:26:03 crc kubenswrapper[4861]: I0309 09:26:03.828494 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8af4e6e3-c7cc-4e2f-b3bf-48461f86a1f1","Type":"ContainerDied","Data":"75a923c67628e7a6170db32559a1aab94be16069f2bfa80eb9fe6cbde9ed6208"} Mar 09 09:26:06 crc kubenswrapper[4861]: E0309 09:26:06.398586 4861 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod48fbe4a1_81ab_4a46_8150_821bc8afa220.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod48fbe4a1_81ab_4a46_8150_821bc8afa220.slice/crio-3cace9b3826297e13b51f776c4a63465d7f30becfc9c4dce3918a85756fc792e\": RecentStats: unable to find data in memory cache]" Mar 09 09:26:07 crc kubenswrapper[4861]: I0309 09:26:07.621491 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 09 09:26:07 crc kubenswrapper[4861]: I0309 09:26:07.652029 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8af4e6e3-c7cc-4e2f-b3bf-48461f86a1f1-log-httpd\") pod \"8af4e6e3-c7cc-4e2f-b3bf-48461f86a1f1\" (UID: \"8af4e6e3-c7cc-4e2f-b3bf-48461f86a1f1\") " Mar 09 09:26:07 crc kubenswrapper[4861]: I0309 09:26:07.652072 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8af4e6e3-c7cc-4e2f-b3bf-48461f86a1f1-config-data\") pod \"8af4e6e3-c7cc-4e2f-b3bf-48461f86a1f1\" (UID: \"8af4e6e3-c7cc-4e2f-b3bf-48461f86a1f1\") " Mar 09 09:26:07 crc kubenswrapper[4861]: I0309 09:26:07.652122 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jsnsk\" (UniqueName: \"kubernetes.io/projected/8af4e6e3-c7cc-4e2f-b3bf-48461f86a1f1-kube-api-access-jsnsk\") pod \"8af4e6e3-c7cc-4e2f-b3bf-48461f86a1f1\" (UID: \"8af4e6e3-c7cc-4e2f-b3bf-48461f86a1f1\") " Mar 09 09:26:07 crc kubenswrapper[4861]: I0309 09:26:07.652223 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8af4e6e3-c7cc-4e2f-b3bf-48461f86a1f1-sg-core-conf-yaml\") pod \"8af4e6e3-c7cc-4e2f-b3bf-48461f86a1f1\" (UID: \"8af4e6e3-c7cc-4e2f-b3bf-48461f86a1f1\") " Mar 09 09:26:07 crc kubenswrapper[4861]: I0309 09:26:07.652252 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8af4e6e3-c7cc-4e2f-b3bf-48461f86a1f1-scripts\") pod \"8af4e6e3-c7cc-4e2f-b3bf-48461f86a1f1\" (UID: \"8af4e6e3-c7cc-4e2f-b3bf-48461f86a1f1\") " Mar 09 09:26:07 crc kubenswrapper[4861]: I0309 09:26:07.652272 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8af4e6e3-c7cc-4e2f-b3bf-48461f86a1f1-combined-ca-bundle\") pod \"8af4e6e3-c7cc-4e2f-b3bf-48461f86a1f1\" (UID: \"8af4e6e3-c7cc-4e2f-b3bf-48461f86a1f1\") " Mar 09 09:26:07 crc kubenswrapper[4861]: I0309 09:26:07.652326 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8af4e6e3-c7cc-4e2f-b3bf-48461f86a1f1-run-httpd\") pod \"8af4e6e3-c7cc-4e2f-b3bf-48461f86a1f1\" (UID: \"8af4e6e3-c7cc-4e2f-b3bf-48461f86a1f1\") " Mar 09 09:26:07 crc kubenswrapper[4861]: I0309 09:26:07.654010 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8af4e6e3-c7cc-4e2f-b3bf-48461f86a1f1-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "8af4e6e3-c7cc-4e2f-b3bf-48461f86a1f1" (UID: "8af4e6e3-c7cc-4e2f-b3bf-48461f86a1f1"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:26:07 crc kubenswrapper[4861]: I0309 09:26:07.655143 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8af4e6e3-c7cc-4e2f-b3bf-48461f86a1f1-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "8af4e6e3-c7cc-4e2f-b3bf-48461f86a1f1" (UID: "8af4e6e3-c7cc-4e2f-b3bf-48461f86a1f1"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:26:07 crc kubenswrapper[4861]: I0309 09:26:07.656065 4861 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8af4e6e3-c7cc-4e2f-b3bf-48461f86a1f1-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 09 09:26:07 crc kubenswrapper[4861]: I0309 09:26:07.656101 4861 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8af4e6e3-c7cc-4e2f-b3bf-48461f86a1f1-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 09 09:26:07 crc kubenswrapper[4861]: I0309 09:26:07.659043 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8af4e6e3-c7cc-4e2f-b3bf-48461f86a1f1-kube-api-access-jsnsk" (OuterVolumeSpecName: "kube-api-access-jsnsk") pod "8af4e6e3-c7cc-4e2f-b3bf-48461f86a1f1" (UID: "8af4e6e3-c7cc-4e2f-b3bf-48461f86a1f1"). InnerVolumeSpecName "kube-api-access-jsnsk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:26:07 crc kubenswrapper[4861]: I0309 09:26:07.668173 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8af4e6e3-c7cc-4e2f-b3bf-48461f86a1f1-scripts" (OuterVolumeSpecName: "scripts") pod "8af4e6e3-c7cc-4e2f-b3bf-48461f86a1f1" (UID: "8af4e6e3-c7cc-4e2f-b3bf-48461f86a1f1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:26:07 crc kubenswrapper[4861]: I0309 09:26:07.740274 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8af4e6e3-c7cc-4e2f-b3bf-48461f86a1f1-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "8af4e6e3-c7cc-4e2f-b3bf-48461f86a1f1" (UID: "8af4e6e3-c7cc-4e2f-b3bf-48461f86a1f1"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:26:07 crc kubenswrapper[4861]: I0309 09:26:07.758360 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jsnsk\" (UniqueName: \"kubernetes.io/projected/8af4e6e3-c7cc-4e2f-b3bf-48461f86a1f1-kube-api-access-jsnsk\") on node \"crc\" DevicePath \"\"" Mar 09 09:26:07 crc kubenswrapper[4861]: I0309 09:26:07.758626 4861 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8af4e6e3-c7cc-4e2f-b3bf-48461f86a1f1-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 09 09:26:07 crc kubenswrapper[4861]: I0309 09:26:07.758713 4861 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8af4e6e3-c7cc-4e2f-b3bf-48461f86a1f1-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:26:07 crc kubenswrapper[4861]: I0309 09:26:07.793694 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8af4e6e3-c7cc-4e2f-b3bf-48461f86a1f1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8af4e6e3-c7cc-4e2f-b3bf-48461f86a1f1" (UID: "8af4e6e3-c7cc-4e2f-b3bf-48461f86a1f1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:26:07 crc kubenswrapper[4861]: I0309 09:26:07.810379 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8af4e6e3-c7cc-4e2f-b3bf-48461f86a1f1-config-data" (OuterVolumeSpecName: "config-data") pod "8af4e6e3-c7cc-4e2f-b3bf-48461f86a1f1" (UID: "8af4e6e3-c7cc-4e2f-b3bf-48461f86a1f1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:26:07 crc kubenswrapper[4861]: W0309 09:26:07.810546 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1ed378c6_5773_4dd7_9889_52bcf62216e5.slice/crio-21cff548434134ec19998806d385facab783b12ab26b7b0c079969629b02e10e WatchSource:0}: Error finding container 21cff548434134ec19998806d385facab783b12ab26b7b0c079969629b02e10e: Status 404 returned error can't find the container with id 21cff548434134ec19998806d385facab783b12ab26b7b0c079969629b02e10e Mar 09 09:26:07 crc kubenswrapper[4861]: I0309 09:26:07.813007 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-6f4f458d55-lxkls"] Mar 09 09:26:07 crc kubenswrapper[4861]: I0309 09:26:07.859922 4861 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8af4e6e3-c7cc-4e2f-b3bf-48461f86a1f1-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 09:26:07 crc kubenswrapper[4861]: I0309 09:26:07.860093 4861 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8af4e6e3-c7cc-4e2f-b3bf-48461f86a1f1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 09:26:07 crc kubenswrapper[4861]: I0309 09:26:07.867146 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"62f873db-0b4f-4a99-bc1d-7cdff56989a2","Type":"ContainerStarted","Data":"42167c1e8a0806575f9c490813ac25f3038697e9d63f108ffacc940fe6fbcd3f"} Mar 09 09:26:07 crc kubenswrapper[4861]: I0309 09:26:07.870647 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550806-dlnz6" event={"ID":"5330b450-5c51-43e8-b0c0-c9b875dc49b5","Type":"ContainerStarted","Data":"5fa6ba578ffae467793f3902a8751c10074b77e91f06af157cb008ec9397b5dc"} Mar 09 09:26:07 crc kubenswrapper[4861]: I0309 09:26:07.874591 4861 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8af4e6e3-c7cc-4e2f-b3bf-48461f86a1f1","Type":"ContainerDied","Data":"7876f246bbc8d15cd13ada8a14e25713502882d6b23c20b00c4b0ddde4ebdc34"} Mar 09 09:26:07 crc kubenswrapper[4861]: I0309 09:26:07.874665 4861 scope.go:117] "RemoveContainer" containerID="f59f0c4abfb6f63caeaa0f687c6369bee86ab126739c55cbc0490c2d536d44ef" Mar 09 09:26:07 crc kubenswrapper[4861]: I0309 09:26:07.874975 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 09 09:26:07 crc kubenswrapper[4861]: I0309 09:26:07.890808 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6f4f458d55-lxkls" event={"ID":"1ed378c6-5773-4dd7-9889-52bcf62216e5","Type":"ContainerStarted","Data":"21cff548434134ec19998806d385facab783b12ab26b7b0c079969629b02e10e"} Mar 09 09:26:07 crc kubenswrapper[4861]: I0309 09:26:07.903097 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.369916407 podStartE2EDuration="13.903075869s" podCreationTimestamp="2026-03-09 09:25:54 +0000 UTC" firstStartedPulling="2026-03-09 09:25:55.820493118 +0000 UTC m=+1198.905532519" lastFinishedPulling="2026-03-09 09:26:07.35365258 +0000 UTC m=+1210.438691981" observedRunningTime="2026-03-09 09:26:07.884015412 +0000 UTC m=+1210.969054823" watchObservedRunningTime="2026-03-09 09:26:07.903075869 +0000 UTC m=+1210.988115270" Mar 09 09:26:07 crc kubenswrapper[4861]: I0309 09:26:07.919515 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29550806-dlnz6" podStartSLOduration=1.538561058 podStartE2EDuration="7.919491278s" podCreationTimestamp="2026-03-09 09:26:00 +0000 UTC" firstStartedPulling="2026-03-09 09:26:00.97480053 +0000 UTC m=+1204.059839921" lastFinishedPulling="2026-03-09 09:26:07.35573073 +0000 UTC m=+1210.440770141" 
observedRunningTime="2026-03-09 09:26:07.907766065 +0000 UTC m=+1210.992805486" watchObservedRunningTime="2026-03-09 09:26:07.919491278 +0000 UTC m=+1211.004530679" Mar 09 09:26:07 crc kubenswrapper[4861]: I0309 09:26:07.929188 4861 scope.go:117] "RemoveContainer" containerID="5cdeb6e5500f4678560109b3d7fbc725a3f70c195e678f4d8a4357e48903dd6f" Mar 09 09:26:07 crc kubenswrapper[4861]: I0309 09:26:07.946016 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 09 09:26:07 crc kubenswrapper[4861]: I0309 09:26:07.968410 4861 scope.go:117] "RemoveContainer" containerID="c2e8d3629585fa584403ade19c51f099ae4d48c3cab7991240d1d3ddd0f69c99" Mar 09 09:26:07 crc kubenswrapper[4861]: I0309 09:26:07.971264 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 09 09:26:07 crc kubenswrapper[4861]: I0309 09:26:07.983608 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 09 09:26:07 crc kubenswrapper[4861]: E0309 09:26:07.984068 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8af4e6e3-c7cc-4e2f-b3bf-48461f86a1f1" containerName="ceilometer-notification-agent" Mar 09 09:26:07 crc kubenswrapper[4861]: I0309 09:26:07.984090 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="8af4e6e3-c7cc-4e2f-b3bf-48461f86a1f1" containerName="ceilometer-notification-agent" Mar 09 09:26:07 crc kubenswrapper[4861]: E0309 09:26:07.984133 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8af4e6e3-c7cc-4e2f-b3bf-48461f86a1f1" containerName="proxy-httpd" Mar 09 09:26:07 crc kubenswrapper[4861]: I0309 09:26:07.984140 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="8af4e6e3-c7cc-4e2f-b3bf-48461f86a1f1" containerName="proxy-httpd" Mar 09 09:26:07 crc kubenswrapper[4861]: E0309 09:26:07.984159 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8af4e6e3-c7cc-4e2f-b3bf-48461f86a1f1" containerName="sg-core" Mar 09 09:26:07 crc 
kubenswrapper[4861]: I0309 09:26:07.984166 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="8af4e6e3-c7cc-4e2f-b3bf-48461f86a1f1" containerName="sg-core" Mar 09 09:26:07 crc kubenswrapper[4861]: E0309 09:26:07.984182 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8af4e6e3-c7cc-4e2f-b3bf-48461f86a1f1" containerName="ceilometer-central-agent" Mar 09 09:26:07 crc kubenswrapper[4861]: I0309 09:26:07.989123 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="8af4e6e3-c7cc-4e2f-b3bf-48461f86a1f1" containerName="ceilometer-central-agent" Mar 09 09:26:07 crc kubenswrapper[4861]: I0309 09:26:07.989691 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="8af4e6e3-c7cc-4e2f-b3bf-48461f86a1f1" containerName="sg-core" Mar 09 09:26:07 crc kubenswrapper[4861]: I0309 09:26:07.989711 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="8af4e6e3-c7cc-4e2f-b3bf-48461f86a1f1" containerName="proxy-httpd" Mar 09 09:26:07 crc kubenswrapper[4861]: I0309 09:26:07.989726 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="8af4e6e3-c7cc-4e2f-b3bf-48461f86a1f1" containerName="ceilometer-notification-agent" Mar 09 09:26:07 crc kubenswrapper[4861]: I0309 09:26:07.989736 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="8af4e6e3-c7cc-4e2f-b3bf-48461f86a1f1" containerName="ceilometer-central-agent" Mar 09 09:26:08 crc kubenswrapper[4861]: I0309 09:26:08.003247 4861 scope.go:117] "RemoveContainer" containerID="75a923c67628e7a6170db32559a1aab94be16069f2bfa80eb9fe6cbde9ed6208" Mar 09 09:26:08 crc kubenswrapper[4861]: I0309 09:26:08.003532 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 09 09:26:08 crc kubenswrapper[4861]: I0309 09:26:08.003651 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 09 09:26:08 crc kubenswrapper[4861]: I0309 09:26:08.009917 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 09 09:26:08 crc kubenswrapper[4861]: I0309 09:26:08.010204 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 09 09:26:08 crc kubenswrapper[4861]: I0309 09:26:08.063734 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6c295c66-4e4c-429a-9769-6119f1c5f087-run-httpd\") pod \"ceilometer-0\" (UID: \"6c295c66-4e4c-429a-9769-6119f1c5f087\") " pod="openstack/ceilometer-0" Mar 09 09:26:08 crc kubenswrapper[4861]: I0309 09:26:08.063821 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6c295c66-4e4c-429a-9769-6119f1c5f087-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6c295c66-4e4c-429a-9769-6119f1c5f087\") " pod="openstack/ceilometer-0" Mar 09 09:26:08 crc kubenswrapper[4861]: I0309 09:26:08.063863 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c295c66-4e4c-429a-9769-6119f1c5f087-scripts\") pod \"ceilometer-0\" (UID: \"6c295c66-4e4c-429a-9769-6119f1c5f087\") " pod="openstack/ceilometer-0" Mar 09 09:26:08 crc kubenswrapper[4861]: I0309 09:26:08.063888 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6c295c66-4e4c-429a-9769-6119f1c5f087-log-httpd\") pod \"ceilometer-0\" (UID: \"6c295c66-4e4c-429a-9769-6119f1c5f087\") " pod="openstack/ceilometer-0" Mar 09 09:26:08 crc kubenswrapper[4861]: I0309 09:26:08.063927 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-hb7tx\" (UniqueName: \"kubernetes.io/projected/6c295c66-4e4c-429a-9769-6119f1c5f087-kube-api-access-hb7tx\") pod \"ceilometer-0\" (UID: \"6c295c66-4e4c-429a-9769-6119f1c5f087\") " pod="openstack/ceilometer-0" Mar 09 09:26:08 crc kubenswrapper[4861]: I0309 09:26:08.063958 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c295c66-4e4c-429a-9769-6119f1c5f087-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6c295c66-4e4c-429a-9769-6119f1c5f087\") " pod="openstack/ceilometer-0" Mar 09 09:26:08 crc kubenswrapper[4861]: I0309 09:26:08.063975 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c295c66-4e4c-429a-9769-6119f1c5f087-config-data\") pod \"ceilometer-0\" (UID: \"6c295c66-4e4c-429a-9769-6119f1c5f087\") " pod="openstack/ceilometer-0" Mar 09 09:26:08 crc kubenswrapper[4861]: I0309 09:26:08.165172 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c295c66-4e4c-429a-9769-6119f1c5f087-scripts\") pod \"ceilometer-0\" (UID: \"6c295c66-4e4c-429a-9769-6119f1c5f087\") " pod="openstack/ceilometer-0" Mar 09 09:26:08 crc kubenswrapper[4861]: I0309 09:26:08.165226 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6c295c66-4e4c-429a-9769-6119f1c5f087-log-httpd\") pod \"ceilometer-0\" (UID: \"6c295c66-4e4c-429a-9769-6119f1c5f087\") " pod="openstack/ceilometer-0" Mar 09 09:26:08 crc kubenswrapper[4861]: I0309 09:26:08.165268 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hb7tx\" (UniqueName: \"kubernetes.io/projected/6c295c66-4e4c-429a-9769-6119f1c5f087-kube-api-access-hb7tx\") pod \"ceilometer-0\" (UID: 
\"6c295c66-4e4c-429a-9769-6119f1c5f087\") " pod="openstack/ceilometer-0" Mar 09 09:26:08 crc kubenswrapper[4861]: I0309 09:26:08.165298 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c295c66-4e4c-429a-9769-6119f1c5f087-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6c295c66-4e4c-429a-9769-6119f1c5f087\") " pod="openstack/ceilometer-0" Mar 09 09:26:08 crc kubenswrapper[4861]: I0309 09:26:08.165313 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c295c66-4e4c-429a-9769-6119f1c5f087-config-data\") pod \"ceilometer-0\" (UID: \"6c295c66-4e4c-429a-9769-6119f1c5f087\") " pod="openstack/ceilometer-0" Mar 09 09:26:08 crc kubenswrapper[4861]: I0309 09:26:08.165353 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6c295c66-4e4c-429a-9769-6119f1c5f087-run-httpd\") pod \"ceilometer-0\" (UID: \"6c295c66-4e4c-429a-9769-6119f1c5f087\") " pod="openstack/ceilometer-0" Mar 09 09:26:08 crc kubenswrapper[4861]: I0309 09:26:08.165411 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6c295c66-4e4c-429a-9769-6119f1c5f087-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6c295c66-4e4c-429a-9769-6119f1c5f087\") " pod="openstack/ceilometer-0" Mar 09 09:26:08 crc kubenswrapper[4861]: I0309 09:26:08.166669 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6c295c66-4e4c-429a-9769-6119f1c5f087-log-httpd\") pod \"ceilometer-0\" (UID: \"6c295c66-4e4c-429a-9769-6119f1c5f087\") " pod="openstack/ceilometer-0" Mar 09 09:26:08 crc kubenswrapper[4861]: I0309 09:26:08.168210 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/6c295c66-4e4c-429a-9769-6119f1c5f087-run-httpd\") pod \"ceilometer-0\" (UID: \"6c295c66-4e4c-429a-9769-6119f1c5f087\") " pod="openstack/ceilometer-0" Mar 09 09:26:08 crc kubenswrapper[4861]: I0309 09:26:08.169352 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6c295c66-4e4c-429a-9769-6119f1c5f087-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6c295c66-4e4c-429a-9769-6119f1c5f087\") " pod="openstack/ceilometer-0" Mar 09 09:26:08 crc kubenswrapper[4861]: I0309 09:26:08.170930 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c295c66-4e4c-429a-9769-6119f1c5f087-config-data\") pod \"ceilometer-0\" (UID: \"6c295c66-4e4c-429a-9769-6119f1c5f087\") " pod="openstack/ceilometer-0" Mar 09 09:26:08 crc kubenswrapper[4861]: I0309 09:26:08.171475 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c295c66-4e4c-429a-9769-6119f1c5f087-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6c295c66-4e4c-429a-9769-6119f1c5f087\") " pod="openstack/ceilometer-0" Mar 09 09:26:08 crc kubenswrapper[4861]: I0309 09:26:08.171663 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c295c66-4e4c-429a-9769-6119f1c5f087-scripts\") pod \"ceilometer-0\" (UID: \"6c295c66-4e4c-429a-9769-6119f1c5f087\") " pod="openstack/ceilometer-0" Mar 09 09:26:08 crc kubenswrapper[4861]: I0309 09:26:08.188312 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hb7tx\" (UniqueName: \"kubernetes.io/projected/6c295c66-4e4c-429a-9769-6119f1c5f087-kube-api-access-hb7tx\") pod \"ceilometer-0\" (UID: \"6c295c66-4e4c-429a-9769-6119f1c5f087\") " pod="openstack/ceilometer-0" Mar 09 09:26:08 crc kubenswrapper[4861]: I0309 09:26:08.326707 4861 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 09 09:26:08 crc kubenswrapper[4861]: I0309 09:26:08.845608 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 09 09:26:08 crc kubenswrapper[4861]: I0309 09:26:08.861002 4861 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-7bb4db8c4-sxjc7" podUID="71492031-e589-409e-b8c8-c0a1194b97ed" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.153:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.153:8443: connect: connection refused" Mar 09 09:26:08 crc kubenswrapper[4861]: I0309 09:26:08.861167 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7bb4db8c4-sxjc7" Mar 09 09:26:08 crc kubenswrapper[4861]: W0309 09:26:08.863247 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6c295c66_4e4c_429a_9769_6119f1c5f087.slice/crio-dcf4fb82bd572aa9ad02b9c8f51123d84cd8d3ecad7c1c4804310b69c36c1458 WatchSource:0}: Error finding container dcf4fb82bd572aa9ad02b9c8f51123d84cd8d3ecad7c1c4804310b69c36c1458: Status 404 returned error can't find the container with id dcf4fb82bd572aa9ad02b9c8f51123d84cd8d3ecad7c1c4804310b69c36c1458 Mar 09 09:26:08 crc kubenswrapper[4861]: I0309 09:26:08.902600 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6c295c66-4e4c-429a-9769-6119f1c5f087","Type":"ContainerStarted","Data":"dcf4fb82bd572aa9ad02b9c8f51123d84cd8d3ecad7c1c4804310b69c36c1458"} Mar 09 09:26:08 crc kubenswrapper[4861]: I0309 09:26:08.904389 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6f4f458d55-lxkls" event={"ID":"1ed378c6-5773-4dd7-9889-52bcf62216e5","Type":"ContainerStarted","Data":"8314beee0a7d2cd5d4cc4432f751b11605c5a28b9f9747c4b86dbf28e3f11e04"} Mar 09 09:26:08 crc kubenswrapper[4861]: 
I0309 09:26:08.904422 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6f4f458d55-lxkls" event={"ID":"1ed378c6-5773-4dd7-9889-52bcf62216e5","Type":"ContainerStarted","Data":"891a09d7da3ad1b5f1a7a213a59add85a920b42462ddf9be06ea85976d80399b"} Mar 09 09:26:08 crc kubenswrapper[4861]: I0309 09:26:08.905094 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-6f4f458d55-lxkls" Mar 09 09:26:08 crc kubenswrapper[4861]: I0309 09:26:08.905162 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-6f4f458d55-lxkls" Mar 09 09:26:08 crc kubenswrapper[4861]: I0309 09:26:08.907823 4861 generic.go:334] "Generic (PLEG): container finished" podID="5330b450-5c51-43e8-b0c0-c9b875dc49b5" containerID="5fa6ba578ffae467793f3902a8751c10074b77e91f06af157cb008ec9397b5dc" exitCode=0 Mar 09 09:26:08 crc kubenswrapper[4861]: I0309 09:26:08.908798 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550806-dlnz6" event={"ID":"5330b450-5c51-43e8-b0c0-c9b875dc49b5","Type":"ContainerDied","Data":"5fa6ba578ffae467793f3902a8751c10074b77e91f06af157cb008ec9397b5dc"} Mar 09 09:26:08 crc kubenswrapper[4861]: I0309 09:26:08.934311 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-6f4f458d55-lxkls" podStartSLOduration=7.934286124 podStartE2EDuration="7.934286124s" podCreationTimestamp="2026-03-09 09:26:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:26:08.931705688 +0000 UTC m=+1212.016745079" watchObservedRunningTime="2026-03-09 09:26:08.934286124 +0000 UTC m=+1212.019325525" Mar 09 09:26:08 crc kubenswrapper[4861]: I0309 09:26:08.969249 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 09 09:26:09 crc kubenswrapper[4861]: I0309 09:26:09.707108 4861 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8af4e6e3-c7cc-4e2f-b3bf-48461f86a1f1" path="/var/lib/kubelet/pods/8af4e6e3-c7cc-4e2f-b3bf-48461f86a1f1/volumes" Mar 09 09:26:09 crc kubenswrapper[4861]: I0309 09:26:09.934653 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6c295c66-4e4c-429a-9769-6119f1c5f087","Type":"ContainerStarted","Data":"e4f92f447639a3202f3431b4008c094cc8222eb15f4dcdefb507cbe1370ec04b"} Mar 09 09:26:10 crc kubenswrapper[4861]: I0309 09:26:10.360111 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550806-dlnz6" Mar 09 09:26:10 crc kubenswrapper[4861]: I0309 09:26:10.476852 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-665f4b6689-tfdk9" Mar 09 09:26:10 crc kubenswrapper[4861]: I0309 09:26:10.533129 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jvnqq\" (UniqueName: \"kubernetes.io/projected/5330b450-5c51-43e8-b0c0-c9b875dc49b5-kube-api-access-jvnqq\") pod \"5330b450-5c51-43e8-b0c0-c9b875dc49b5\" (UID: \"5330b450-5c51-43e8-b0c0-c9b875dc49b5\") " Mar 09 09:26:10 crc kubenswrapper[4861]: I0309 09:26:10.539355 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7fb7d5546d-n665d"] Mar 09 09:26:10 crc kubenswrapper[4861]: I0309 09:26:10.550572 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7fb7d5546d-n665d" podUID="30e45abc-d44d-4e9a-8478-b562905ee7c2" containerName="neutron-api" containerID="cri-o://b5be7ddaf2cc488012e82daab81d04dc4edccf05a70400a457f9705e7d7229b6" gracePeriod=30 Mar 09 09:26:10 crc kubenswrapper[4861]: I0309 09:26:10.551115 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7fb7d5546d-n665d" podUID="30e45abc-d44d-4e9a-8478-b562905ee7c2" containerName="neutron-httpd" 
containerID="cri-o://3fd703ecac672d85228b81f88612761d619524529973bcd35d35b8dbfa04b92c" gracePeriod=30 Mar 09 09:26:10 crc kubenswrapper[4861]: I0309 09:26:10.563991 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5330b450-5c51-43e8-b0c0-c9b875dc49b5-kube-api-access-jvnqq" (OuterVolumeSpecName: "kube-api-access-jvnqq") pod "5330b450-5c51-43e8-b0c0-c9b875dc49b5" (UID: "5330b450-5c51-43e8-b0c0-c9b875dc49b5"). InnerVolumeSpecName "kube-api-access-jvnqq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:26:10 crc kubenswrapper[4861]: I0309 09:26:10.637056 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jvnqq\" (UniqueName: \"kubernetes.io/projected/5330b450-5c51-43e8-b0c0-c9b875dc49b5-kube-api-access-jvnqq\") on node \"crc\" DevicePath \"\"" Mar 09 09:26:10 crc kubenswrapper[4861]: I0309 09:26:10.803832 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550800-twnhd"] Mar 09 09:26:10 crc kubenswrapper[4861]: I0309 09:26:10.811301 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550800-twnhd"] Mar 09 09:26:10 crc kubenswrapper[4861]: I0309 09:26:10.946993 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550806-dlnz6" event={"ID":"5330b450-5c51-43e8-b0c0-c9b875dc49b5","Type":"ContainerDied","Data":"3c8f27d502b2b1dc67fa5d91348209bc5a3f5bcdc59a119334d8a7a0afe2a23c"} Mar 09 09:26:10 crc kubenswrapper[4861]: I0309 09:26:10.947329 4861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3c8f27d502b2b1dc67fa5d91348209bc5a3f5bcdc59a119334d8a7a0afe2a23c" Mar 09 09:26:10 crc kubenswrapper[4861]: I0309 09:26:10.947016 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550806-dlnz6" Mar 09 09:26:10 crc kubenswrapper[4861]: I0309 09:26:10.949776 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6c295c66-4e4c-429a-9769-6119f1c5f087","Type":"ContainerStarted","Data":"0edeacb7534e99d5e30b62e7192d7570d6d2e0651bcc92cd8192715235d0d0d4"} Mar 09 09:26:10 crc kubenswrapper[4861]: I0309 09:26:10.952280 4861 generic.go:334] "Generic (PLEG): container finished" podID="30e45abc-d44d-4e9a-8478-b562905ee7c2" containerID="3fd703ecac672d85228b81f88612761d619524529973bcd35d35b8dbfa04b92c" exitCode=0 Mar 09 09:26:10 crc kubenswrapper[4861]: I0309 09:26:10.952337 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7fb7d5546d-n665d" event={"ID":"30e45abc-d44d-4e9a-8478-b562905ee7c2","Type":"ContainerDied","Data":"3fd703ecac672d85228b81f88612761d619524529973bcd35d35b8dbfa04b92c"} Mar 09 09:26:11 crc kubenswrapper[4861]: I0309 09:26:11.669345 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="677b13c8-497a-4807-a2ef-13f1bfe09db3" path="/var/lib/kubelet/pods/677b13c8-497a-4807-a2ef-13f1bfe09db3/volumes" Mar 09 09:26:11 crc kubenswrapper[4861]: I0309 09:26:11.966019 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6c295c66-4e4c-429a-9769-6119f1c5f087","Type":"ContainerStarted","Data":"28df8178150fb9bfb6de04e8846625e93866e85a8da803f9349b41f5aa4bcfaa"} Mar 09 09:26:13 crc kubenswrapper[4861]: I0309 09:26:13.990314 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6c295c66-4e4c-429a-9769-6119f1c5f087","Type":"ContainerStarted","Data":"d2e662b7674c66e8e6404aaaa20571740ba1d8fa5a409f820c6d26b3e4739c0d"} Mar 09 09:26:13 crc kubenswrapper[4861]: I0309 09:26:13.990933 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 09 09:26:13 crc kubenswrapper[4861]: I0309 
09:26:13.990762 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6c295c66-4e4c-429a-9769-6119f1c5f087" containerName="proxy-httpd" containerID="cri-o://d2e662b7674c66e8e6404aaaa20571740ba1d8fa5a409f820c6d26b3e4739c0d" gracePeriod=30 Mar 09 09:26:13 crc kubenswrapper[4861]: I0309 09:26:13.990472 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6c295c66-4e4c-429a-9769-6119f1c5f087" containerName="ceilometer-central-agent" containerID="cri-o://e4f92f447639a3202f3431b4008c094cc8222eb15f4dcdefb507cbe1370ec04b" gracePeriod=30 Mar 09 09:26:13 crc kubenswrapper[4861]: I0309 09:26:13.990778 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6c295c66-4e4c-429a-9769-6119f1c5f087" containerName="sg-core" containerID="cri-o://28df8178150fb9bfb6de04e8846625e93866e85a8da803f9349b41f5aa4bcfaa" gracePeriod=30 Mar 09 09:26:13 crc kubenswrapper[4861]: I0309 09:26:13.990788 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6c295c66-4e4c-429a-9769-6119f1c5f087" containerName="ceilometer-notification-agent" containerID="cri-o://0edeacb7534e99d5e30b62e7192d7570d6d2e0651bcc92cd8192715235d0d0d4" gracePeriod=30 Mar 09 09:26:14 crc kubenswrapper[4861]: I0309 09:26:14.029051 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.9983677159999997 podStartE2EDuration="7.029028725s" podCreationTimestamp="2026-03-09 09:26:07 +0000 UTC" firstStartedPulling="2026-03-09 09:26:08.866118263 +0000 UTC m=+1211.951157664" lastFinishedPulling="2026-03-09 09:26:12.896779272 +0000 UTC m=+1215.981818673" observedRunningTime="2026-03-09 09:26:14.023888896 +0000 UTC m=+1217.108928297" watchObservedRunningTime="2026-03-09 09:26:14.029028725 +0000 UTC m=+1217.114068126" Mar 09 09:26:14 crc 
kubenswrapper[4861]: I0309 09:26:14.832568 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7bb4db8c4-sxjc7" Mar 09 09:26:15 crc kubenswrapper[4861]: I0309 09:26:15.003293 4861 generic.go:334] "Generic (PLEG): container finished" podID="71492031-e589-409e-b8c8-c0a1194b97ed" containerID="f789d6c7cb2c90f537f7edf8f51e1f3aa6d4ed13c1a60ecf2646f0e1dd6894ae" exitCode=137 Mar 09 09:26:15 crc kubenswrapper[4861]: I0309 09:26:15.003354 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7bb4db8c4-sxjc7" event={"ID":"71492031-e589-409e-b8c8-c0a1194b97ed","Type":"ContainerDied","Data":"f789d6c7cb2c90f537f7edf8f51e1f3aa6d4ed13c1a60ecf2646f0e1dd6894ae"} Mar 09 09:26:15 crc kubenswrapper[4861]: I0309 09:26:15.003396 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7bb4db8c4-sxjc7" event={"ID":"71492031-e589-409e-b8c8-c0a1194b97ed","Type":"ContainerDied","Data":"fdb5e36ba3482853e01391e95b3d25523a66bad351b0b4394ede23ef70f0a36c"} Mar 09 09:26:15 crc kubenswrapper[4861]: I0309 09:26:15.003436 4861 scope.go:117] "RemoveContainer" containerID="c579881c05bf1dff8d11a8b2bc2f24f89a926610877fc04c23a9f98aa2682890" Mar 09 09:26:15 crc kubenswrapper[4861]: I0309 09:26:15.003565 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7bb4db8c4-sxjc7" Mar 09 09:26:15 crc kubenswrapper[4861]: I0309 09:26:15.009097 4861 generic.go:334] "Generic (PLEG): container finished" podID="6c295c66-4e4c-429a-9769-6119f1c5f087" containerID="d2e662b7674c66e8e6404aaaa20571740ba1d8fa5a409f820c6d26b3e4739c0d" exitCode=0 Mar 09 09:26:15 crc kubenswrapper[4861]: I0309 09:26:15.009128 4861 generic.go:334] "Generic (PLEG): container finished" podID="6c295c66-4e4c-429a-9769-6119f1c5f087" containerID="28df8178150fb9bfb6de04e8846625e93866e85a8da803f9349b41f5aa4bcfaa" exitCode=2 Mar 09 09:26:15 crc kubenswrapper[4861]: I0309 09:26:15.009136 4861 generic.go:334] "Generic (PLEG): container finished" podID="6c295c66-4e4c-429a-9769-6119f1c5f087" containerID="0edeacb7534e99d5e30b62e7192d7570d6d2e0651bcc92cd8192715235d0d0d4" exitCode=0 Mar 09 09:26:15 crc kubenswrapper[4861]: I0309 09:26:15.009154 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6c295c66-4e4c-429a-9769-6119f1c5f087","Type":"ContainerDied","Data":"d2e662b7674c66e8e6404aaaa20571740ba1d8fa5a409f820c6d26b3e4739c0d"} Mar 09 09:26:15 crc kubenswrapper[4861]: I0309 09:26:15.009176 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6c295c66-4e4c-429a-9769-6119f1c5f087","Type":"ContainerDied","Data":"28df8178150fb9bfb6de04e8846625e93866e85a8da803f9349b41f5aa4bcfaa"} Mar 09 09:26:15 crc kubenswrapper[4861]: I0309 09:26:15.009186 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6c295c66-4e4c-429a-9769-6119f1c5f087","Type":"ContainerDied","Data":"0edeacb7534e99d5e30b62e7192d7570d6d2e0651bcc92cd8192715235d0d0d4"} Mar 09 09:26:15 crc kubenswrapper[4861]: I0309 09:26:15.011563 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/71492031-e589-409e-b8c8-c0a1194b97ed-logs\") pod 
\"71492031-e589-409e-b8c8-c0a1194b97ed\" (UID: \"71492031-e589-409e-b8c8-c0a1194b97ed\") " Mar 09 09:26:15 crc kubenswrapper[4861]: I0309 09:26:15.011628 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/71492031-e589-409e-b8c8-c0a1194b97ed-horizon-secret-key\") pod \"71492031-e589-409e-b8c8-c0a1194b97ed\" (UID: \"71492031-e589-409e-b8c8-c0a1194b97ed\") " Mar 09 09:26:15 crc kubenswrapper[4861]: I0309 09:26:15.011673 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71492031-e589-409e-b8c8-c0a1194b97ed-combined-ca-bundle\") pod \"71492031-e589-409e-b8c8-c0a1194b97ed\" (UID: \"71492031-e589-409e-b8c8-c0a1194b97ed\") " Mar 09 09:26:15 crc kubenswrapper[4861]: I0309 09:26:15.011719 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/71492031-e589-409e-b8c8-c0a1194b97ed-horizon-tls-certs\") pod \"71492031-e589-409e-b8c8-c0a1194b97ed\" (UID: \"71492031-e589-409e-b8c8-c0a1194b97ed\") " Mar 09 09:26:15 crc kubenswrapper[4861]: I0309 09:26:15.011757 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/71492031-e589-409e-b8c8-c0a1194b97ed-scripts\") pod \"71492031-e589-409e-b8c8-c0a1194b97ed\" (UID: \"71492031-e589-409e-b8c8-c0a1194b97ed\") " Mar 09 09:26:15 crc kubenswrapper[4861]: I0309 09:26:15.011936 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9jbdt\" (UniqueName: \"kubernetes.io/projected/71492031-e589-409e-b8c8-c0a1194b97ed-kube-api-access-9jbdt\") pod \"71492031-e589-409e-b8c8-c0a1194b97ed\" (UID: \"71492031-e589-409e-b8c8-c0a1194b97ed\") " Mar 09 09:26:15 crc kubenswrapper[4861]: I0309 09:26:15.011978 4861 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/71492031-e589-409e-b8c8-c0a1194b97ed-config-data\") pod \"71492031-e589-409e-b8c8-c0a1194b97ed\" (UID: \"71492031-e589-409e-b8c8-c0a1194b97ed\") " Mar 09 09:26:15 crc kubenswrapper[4861]: I0309 09:26:15.012977 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71492031-e589-409e-b8c8-c0a1194b97ed-logs" (OuterVolumeSpecName: "logs") pod "71492031-e589-409e-b8c8-c0a1194b97ed" (UID: "71492031-e589-409e-b8c8-c0a1194b97ed"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:26:15 crc kubenswrapper[4861]: I0309 09:26:15.019356 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71492031-e589-409e-b8c8-c0a1194b97ed-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "71492031-e589-409e-b8c8-c0a1194b97ed" (UID: "71492031-e589-409e-b8c8-c0a1194b97ed"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:26:15 crc kubenswrapper[4861]: I0309 09:26:15.021041 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71492031-e589-409e-b8c8-c0a1194b97ed-kube-api-access-9jbdt" (OuterVolumeSpecName: "kube-api-access-9jbdt") pod "71492031-e589-409e-b8c8-c0a1194b97ed" (UID: "71492031-e589-409e-b8c8-c0a1194b97ed"). InnerVolumeSpecName "kube-api-access-9jbdt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:26:15 crc kubenswrapper[4861]: I0309 09:26:15.052526 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71492031-e589-409e-b8c8-c0a1194b97ed-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "71492031-e589-409e-b8c8-c0a1194b97ed" (UID: "71492031-e589-409e-b8c8-c0a1194b97ed"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:26:15 crc kubenswrapper[4861]: I0309 09:26:15.059220 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71492031-e589-409e-b8c8-c0a1194b97ed-scripts" (OuterVolumeSpecName: "scripts") pod "71492031-e589-409e-b8c8-c0a1194b97ed" (UID: "71492031-e589-409e-b8c8-c0a1194b97ed"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:26:15 crc kubenswrapper[4861]: I0309 09:26:15.061120 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71492031-e589-409e-b8c8-c0a1194b97ed-config-data" (OuterVolumeSpecName: "config-data") pod "71492031-e589-409e-b8c8-c0a1194b97ed" (UID: "71492031-e589-409e-b8c8-c0a1194b97ed"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:26:15 crc kubenswrapper[4861]: I0309 09:26:15.074259 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71492031-e589-409e-b8c8-c0a1194b97ed-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "71492031-e589-409e-b8c8-c0a1194b97ed" (UID: "71492031-e589-409e-b8c8-c0a1194b97ed"). InnerVolumeSpecName "horizon-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:26:15 crc kubenswrapper[4861]: I0309 09:26:15.114137 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9jbdt\" (UniqueName: \"kubernetes.io/projected/71492031-e589-409e-b8c8-c0a1194b97ed-kube-api-access-9jbdt\") on node \"crc\" DevicePath \"\"" Mar 09 09:26:15 crc kubenswrapper[4861]: I0309 09:26:15.114184 4861 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/71492031-e589-409e-b8c8-c0a1194b97ed-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 09:26:15 crc kubenswrapper[4861]: I0309 09:26:15.114194 4861 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/71492031-e589-409e-b8c8-c0a1194b97ed-logs\") on node \"crc\" DevicePath \"\"" Mar 09 09:26:15 crc kubenswrapper[4861]: I0309 09:26:15.114208 4861 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/71492031-e589-409e-b8c8-c0a1194b97ed-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 09 09:26:15 crc kubenswrapper[4861]: I0309 09:26:15.114222 4861 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71492031-e589-409e-b8c8-c0a1194b97ed-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 09:26:15 crc kubenswrapper[4861]: I0309 09:26:15.114234 4861 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/71492031-e589-409e-b8c8-c0a1194b97ed-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 09:26:15 crc kubenswrapper[4861]: I0309 09:26:15.114247 4861 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/71492031-e589-409e-b8c8-c0a1194b97ed-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:26:15 crc kubenswrapper[4861]: I0309 09:26:15.188274 4861 scope.go:117] 
"RemoveContainer" containerID="f789d6c7cb2c90f537f7edf8f51e1f3aa6d4ed13c1a60ecf2646f0e1dd6894ae" Mar 09 09:26:15 crc kubenswrapper[4861]: I0309 09:26:15.205850 4861 scope.go:117] "RemoveContainer" containerID="c579881c05bf1dff8d11a8b2bc2f24f89a926610877fc04c23a9f98aa2682890" Mar 09 09:26:15 crc kubenswrapper[4861]: E0309 09:26:15.206249 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c579881c05bf1dff8d11a8b2bc2f24f89a926610877fc04c23a9f98aa2682890\": container with ID starting with c579881c05bf1dff8d11a8b2bc2f24f89a926610877fc04c23a9f98aa2682890 not found: ID does not exist" containerID="c579881c05bf1dff8d11a8b2bc2f24f89a926610877fc04c23a9f98aa2682890" Mar 09 09:26:15 crc kubenswrapper[4861]: I0309 09:26:15.206295 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c579881c05bf1dff8d11a8b2bc2f24f89a926610877fc04c23a9f98aa2682890"} err="failed to get container status \"c579881c05bf1dff8d11a8b2bc2f24f89a926610877fc04c23a9f98aa2682890\": rpc error: code = NotFound desc = could not find container \"c579881c05bf1dff8d11a8b2bc2f24f89a926610877fc04c23a9f98aa2682890\": container with ID starting with c579881c05bf1dff8d11a8b2bc2f24f89a926610877fc04c23a9f98aa2682890 not found: ID does not exist" Mar 09 09:26:15 crc kubenswrapper[4861]: I0309 09:26:15.206323 4861 scope.go:117] "RemoveContainer" containerID="f789d6c7cb2c90f537f7edf8f51e1f3aa6d4ed13c1a60ecf2646f0e1dd6894ae" Mar 09 09:26:15 crc kubenswrapper[4861]: E0309 09:26:15.206741 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f789d6c7cb2c90f537f7edf8f51e1f3aa6d4ed13c1a60ecf2646f0e1dd6894ae\": container with ID starting with f789d6c7cb2c90f537f7edf8f51e1f3aa6d4ed13c1a60ecf2646f0e1dd6894ae not found: ID does not exist" containerID="f789d6c7cb2c90f537f7edf8f51e1f3aa6d4ed13c1a60ecf2646f0e1dd6894ae" Mar 09 09:26:15 crc 
kubenswrapper[4861]: I0309 09:26:15.206765 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f789d6c7cb2c90f537f7edf8f51e1f3aa6d4ed13c1a60ecf2646f0e1dd6894ae"} err="failed to get container status \"f789d6c7cb2c90f537f7edf8f51e1f3aa6d4ed13c1a60ecf2646f0e1dd6894ae\": rpc error: code = NotFound desc = could not find container \"f789d6c7cb2c90f537f7edf8f51e1f3aa6d4ed13c1a60ecf2646f0e1dd6894ae\": container with ID starting with f789d6c7cb2c90f537f7edf8f51e1f3aa6d4ed13c1a60ecf2646f0e1dd6894ae not found: ID does not exist" Mar 09 09:26:15 crc kubenswrapper[4861]: I0309 09:26:15.338503 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7bb4db8c4-sxjc7"] Mar 09 09:26:15 crc kubenswrapper[4861]: I0309 09:26:15.347941 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-7bb4db8c4-sxjc7"] Mar 09 09:26:15 crc kubenswrapper[4861]: I0309 09:26:15.669356 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71492031-e589-409e-b8c8-c0a1194b97ed" path="/var/lib/kubelet/pods/71492031-e589-409e-b8c8-c0a1194b97ed/volumes" Mar 09 09:26:16 crc kubenswrapper[4861]: I0309 09:26:16.626122 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-6f4f458d55-lxkls" Mar 09 09:26:16 crc kubenswrapper[4861]: E0309 09:26:16.631127 4861 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod48fbe4a1_81ab_4a46_8150_821bc8afa220.slice/crio-3cace9b3826297e13b51f776c4a63465d7f30becfc9c4dce3918a85756fc792e\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod48fbe4a1_81ab_4a46_8150_821bc8afa220.slice\": RecentStats: unable to find data in memory cache]" Mar 09 09:26:16 crc kubenswrapper[4861]: I0309 09:26:16.631195 4861 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-6f4f458d55-lxkls" Mar 09 09:26:17 crc kubenswrapper[4861]: I0309 09:26:17.042183 4861 generic.go:334] "Generic (PLEG): container finished" podID="6c295c66-4e4c-429a-9769-6119f1c5f087" containerID="e4f92f447639a3202f3431b4008c094cc8222eb15f4dcdefb507cbe1370ec04b" exitCode=0 Mar 09 09:26:17 crc kubenswrapper[4861]: I0309 09:26:17.042268 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6c295c66-4e4c-429a-9769-6119f1c5f087","Type":"ContainerDied","Data":"e4f92f447639a3202f3431b4008c094cc8222eb15f4dcdefb507cbe1370ec04b"} Mar 09 09:26:17 crc kubenswrapper[4861]: I0309 09:26:17.044861 4861 generic.go:334] "Generic (PLEG): container finished" podID="30e45abc-d44d-4e9a-8478-b562905ee7c2" containerID="b5be7ddaf2cc488012e82daab81d04dc4edccf05a70400a457f9705e7d7229b6" exitCode=0 Mar 09 09:26:17 crc kubenswrapper[4861]: I0309 09:26:17.044912 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7fb7d5546d-n665d" event={"ID":"30e45abc-d44d-4e9a-8478-b562905ee7c2","Type":"ContainerDied","Data":"b5be7ddaf2cc488012e82daab81d04dc4edccf05a70400a457f9705e7d7229b6"} Mar 09 09:26:17 crc kubenswrapper[4861]: I0309 09:26:17.605354 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7fb7d5546d-n665d" Mar 09 09:26:17 crc kubenswrapper[4861]: I0309 09:26:17.612795 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 09 09:26:17 crc kubenswrapper[4861]: I0309 09:26:17.770722 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6c295c66-4e4c-429a-9769-6119f1c5f087-run-httpd\") pod \"6c295c66-4e4c-429a-9769-6119f1c5f087\" (UID: \"6c295c66-4e4c-429a-9769-6119f1c5f087\") " Mar 09 09:26:17 crc kubenswrapper[4861]: I0309 09:26:17.770779 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cbj8m\" (UniqueName: \"kubernetes.io/projected/30e45abc-d44d-4e9a-8478-b562905ee7c2-kube-api-access-cbj8m\") pod \"30e45abc-d44d-4e9a-8478-b562905ee7c2\" (UID: \"30e45abc-d44d-4e9a-8478-b562905ee7c2\") " Mar 09 09:26:17 crc kubenswrapper[4861]: I0309 09:26:17.770820 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c295c66-4e4c-429a-9769-6119f1c5f087-scripts\") pod \"6c295c66-4e4c-429a-9769-6119f1c5f087\" (UID: \"6c295c66-4e4c-429a-9769-6119f1c5f087\") " Mar 09 09:26:17 crc kubenswrapper[4861]: I0309 09:26:17.770901 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c295c66-4e4c-429a-9769-6119f1c5f087-combined-ca-bundle\") pod \"6c295c66-4e4c-429a-9769-6119f1c5f087\" (UID: \"6c295c66-4e4c-429a-9769-6119f1c5f087\") " Mar 09 09:26:17 crc kubenswrapper[4861]: I0309 09:26:17.770937 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/30e45abc-d44d-4e9a-8478-b562905ee7c2-ovndb-tls-certs\") pod \"30e45abc-d44d-4e9a-8478-b562905ee7c2\" (UID: \"30e45abc-d44d-4e9a-8478-b562905ee7c2\") " Mar 09 09:26:17 crc kubenswrapper[4861]: I0309 09:26:17.770978 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/6c295c66-4e4c-429a-9769-6119f1c5f087-config-data\") pod \"6c295c66-4e4c-429a-9769-6119f1c5f087\" (UID: \"6c295c66-4e4c-429a-9769-6119f1c5f087\") " Mar 09 09:26:17 crc kubenswrapper[4861]: I0309 09:26:17.771016 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/30e45abc-d44d-4e9a-8478-b562905ee7c2-config\") pod \"30e45abc-d44d-4e9a-8478-b562905ee7c2\" (UID: \"30e45abc-d44d-4e9a-8478-b562905ee7c2\") " Mar 09 09:26:17 crc kubenswrapper[4861]: I0309 09:26:17.771037 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30e45abc-d44d-4e9a-8478-b562905ee7c2-combined-ca-bundle\") pod \"30e45abc-d44d-4e9a-8478-b562905ee7c2\" (UID: \"30e45abc-d44d-4e9a-8478-b562905ee7c2\") " Mar 09 09:26:17 crc kubenswrapper[4861]: I0309 09:26:17.771085 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6c295c66-4e4c-429a-9769-6119f1c5f087-log-httpd\") pod \"6c295c66-4e4c-429a-9769-6119f1c5f087\" (UID: \"6c295c66-4e4c-429a-9769-6119f1c5f087\") " Mar 09 09:26:17 crc kubenswrapper[4861]: I0309 09:26:17.771159 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/30e45abc-d44d-4e9a-8478-b562905ee7c2-httpd-config\") pod \"30e45abc-d44d-4e9a-8478-b562905ee7c2\" (UID: \"30e45abc-d44d-4e9a-8478-b562905ee7c2\") " Mar 09 09:26:17 crc kubenswrapper[4861]: I0309 09:26:17.771209 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hb7tx\" (UniqueName: \"kubernetes.io/projected/6c295c66-4e4c-429a-9769-6119f1c5f087-kube-api-access-hb7tx\") pod \"6c295c66-4e4c-429a-9769-6119f1c5f087\" (UID: \"6c295c66-4e4c-429a-9769-6119f1c5f087\") " Mar 09 09:26:17 crc kubenswrapper[4861]: I0309 09:26:17.771256 4861 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6c295c66-4e4c-429a-9769-6119f1c5f087-sg-core-conf-yaml\") pod \"6c295c66-4e4c-429a-9769-6119f1c5f087\" (UID: \"6c295c66-4e4c-429a-9769-6119f1c5f087\") " Mar 09 09:26:17 crc kubenswrapper[4861]: I0309 09:26:17.771388 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c295c66-4e4c-429a-9769-6119f1c5f087-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "6c295c66-4e4c-429a-9769-6119f1c5f087" (UID: "6c295c66-4e4c-429a-9769-6119f1c5f087"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:26:17 crc kubenswrapper[4861]: I0309 09:26:17.771665 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c295c66-4e4c-429a-9769-6119f1c5f087-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "6c295c66-4e4c-429a-9769-6119f1c5f087" (UID: "6c295c66-4e4c-429a-9769-6119f1c5f087"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:26:17 crc kubenswrapper[4861]: I0309 09:26:17.771729 4861 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6c295c66-4e4c-429a-9769-6119f1c5f087-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 09 09:26:17 crc kubenswrapper[4861]: I0309 09:26:17.779806 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30e45abc-d44d-4e9a-8478-b562905ee7c2-kube-api-access-cbj8m" (OuterVolumeSpecName: "kube-api-access-cbj8m") pod "30e45abc-d44d-4e9a-8478-b562905ee7c2" (UID: "30e45abc-d44d-4e9a-8478-b562905ee7c2"). InnerVolumeSpecName "kube-api-access-cbj8m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:26:17 crc kubenswrapper[4861]: I0309 09:26:17.779862 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c295c66-4e4c-429a-9769-6119f1c5f087-scripts" (OuterVolumeSpecName: "scripts") pod "6c295c66-4e4c-429a-9769-6119f1c5f087" (UID: "6c295c66-4e4c-429a-9769-6119f1c5f087"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:26:17 crc kubenswrapper[4861]: I0309 09:26:17.780876 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c295c66-4e4c-429a-9769-6119f1c5f087-kube-api-access-hb7tx" (OuterVolumeSpecName: "kube-api-access-hb7tx") pod "6c295c66-4e4c-429a-9769-6119f1c5f087" (UID: "6c295c66-4e4c-429a-9769-6119f1c5f087"). InnerVolumeSpecName "kube-api-access-hb7tx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:26:17 crc kubenswrapper[4861]: I0309 09:26:17.782254 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30e45abc-d44d-4e9a-8478-b562905ee7c2-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "30e45abc-d44d-4e9a-8478-b562905ee7c2" (UID: "30e45abc-d44d-4e9a-8478-b562905ee7c2"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:26:17 crc kubenswrapper[4861]: I0309 09:26:17.808105 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c295c66-4e4c-429a-9769-6119f1c5f087-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "6c295c66-4e4c-429a-9769-6119f1c5f087" (UID: "6c295c66-4e4c-429a-9769-6119f1c5f087"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:26:17 crc kubenswrapper[4861]: I0309 09:26:17.841493 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30e45abc-d44d-4e9a-8478-b562905ee7c2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "30e45abc-d44d-4e9a-8478-b562905ee7c2" (UID: "30e45abc-d44d-4e9a-8478-b562905ee7c2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:26:17 crc kubenswrapper[4861]: I0309 09:26:17.841519 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30e45abc-d44d-4e9a-8478-b562905ee7c2-config" (OuterVolumeSpecName: "config") pod "30e45abc-d44d-4e9a-8478-b562905ee7c2" (UID: "30e45abc-d44d-4e9a-8478-b562905ee7c2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:26:17 crc kubenswrapper[4861]: I0309 09:26:17.873082 4861 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/30e45abc-d44d-4e9a-8478-b562905ee7c2-config\") on node \"crc\" DevicePath \"\""
Mar 09 09:26:17 crc kubenswrapper[4861]: I0309 09:26:17.873119 4861 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30e45abc-d44d-4e9a-8478-b562905ee7c2-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 09 09:26:17 crc kubenswrapper[4861]: I0309 09:26:17.873133 4861 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6c295c66-4e4c-429a-9769-6119f1c5f087-log-httpd\") on node \"crc\" DevicePath \"\""
Mar 09 09:26:17 crc kubenswrapper[4861]: I0309 09:26:17.873144 4861 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/30e45abc-d44d-4e9a-8478-b562905ee7c2-httpd-config\") on node \"crc\" DevicePath \"\""
Mar 09 09:26:17 crc kubenswrapper[4861]: I0309 09:26:17.873155 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hb7tx\" (UniqueName: \"kubernetes.io/projected/6c295c66-4e4c-429a-9769-6119f1c5f087-kube-api-access-hb7tx\") on node \"crc\" DevicePath \"\""
Mar 09 09:26:17 crc kubenswrapper[4861]: I0309 09:26:17.873164 4861 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6c295c66-4e4c-429a-9769-6119f1c5f087-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Mar 09 09:26:17 crc kubenswrapper[4861]: I0309 09:26:17.873172 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cbj8m\" (UniqueName: \"kubernetes.io/projected/30e45abc-d44d-4e9a-8478-b562905ee7c2-kube-api-access-cbj8m\") on node \"crc\" DevicePath \"\""
Mar 09 09:26:17 crc kubenswrapper[4861]: I0309 09:26:17.873183 4861 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c295c66-4e4c-429a-9769-6119f1c5f087-scripts\") on node \"crc\" DevicePath \"\""
Mar 09 09:26:17 crc kubenswrapper[4861]: I0309 09:26:17.875599 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30e45abc-d44d-4e9a-8478-b562905ee7c2-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "30e45abc-d44d-4e9a-8478-b562905ee7c2" (UID: "30e45abc-d44d-4e9a-8478-b562905ee7c2"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:26:17 crc kubenswrapper[4861]: I0309 09:26:17.886886 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c295c66-4e4c-429a-9769-6119f1c5f087-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6c295c66-4e4c-429a-9769-6119f1c5f087" (UID: "6c295c66-4e4c-429a-9769-6119f1c5f087"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:26:17 crc kubenswrapper[4861]: I0309 09:26:17.908725 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c295c66-4e4c-429a-9769-6119f1c5f087-config-data" (OuterVolumeSpecName: "config-data") pod "6c295c66-4e4c-429a-9769-6119f1c5f087" (UID: "6c295c66-4e4c-429a-9769-6119f1c5f087"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:26:17 crc kubenswrapper[4861]: I0309 09:26:17.974680 4861 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c295c66-4e4c-429a-9769-6119f1c5f087-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 09 09:26:17 crc kubenswrapper[4861]: I0309 09:26:17.974714 4861 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/30e45abc-d44d-4e9a-8478-b562905ee7c2-ovndb-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 09 09:26:17 crc kubenswrapper[4861]: I0309 09:26:17.974726 4861 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c295c66-4e4c-429a-9769-6119f1c5f087-config-data\") on node \"crc\" DevicePath \"\""
Mar 09 09:26:18 crc kubenswrapper[4861]: I0309 09:26:18.055599 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6c295c66-4e4c-429a-9769-6119f1c5f087","Type":"ContainerDied","Data":"dcf4fb82bd572aa9ad02b9c8f51123d84cd8d3ecad7c1c4804310b69c36c1458"}
Mar 09 09:26:18 crc kubenswrapper[4861]: I0309 09:26:18.055694 4861 scope.go:117] "RemoveContainer" containerID="d2e662b7674c66e8e6404aaaa20571740ba1d8fa5a409f820c6d26b3e4739c0d"
Mar 09 09:26:18 crc kubenswrapper[4861]: I0309 09:26:18.055702 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 09 09:26:18 crc kubenswrapper[4861]: I0309 09:26:18.058093 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7fb7d5546d-n665d" event={"ID":"30e45abc-d44d-4e9a-8478-b562905ee7c2","Type":"ContainerDied","Data":"be97584258dd26010d40b9cabf3562f147ae461b745ab9360373a4d97795f376"}
Mar 09 09:26:18 crc kubenswrapper[4861]: I0309 09:26:18.058203 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7fb7d5546d-n665d"
Mar 09 09:26:18 crc kubenswrapper[4861]: I0309 09:26:18.093948 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 09 09:26:18 crc kubenswrapper[4861]: I0309 09:26:18.095579 4861 scope.go:117] "RemoveContainer" containerID="28df8178150fb9bfb6de04e8846625e93866e85a8da803f9349b41f5aa4bcfaa"
Mar 09 09:26:18 crc kubenswrapper[4861]: I0309 09:26:18.115847 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Mar 09 09:26:18 crc kubenswrapper[4861]: I0309 09:26:18.121513 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7fb7d5546d-n665d"]
Mar 09 09:26:18 crc kubenswrapper[4861]: I0309 09:26:18.134418 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Mar 09 09:26:18 crc kubenswrapper[4861]: E0309 09:26:18.134786 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30e45abc-d44d-4e9a-8478-b562905ee7c2" containerName="neutron-api"
Mar 09 09:26:18 crc kubenswrapper[4861]: I0309 09:26:18.134804 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="30e45abc-d44d-4e9a-8478-b562905ee7c2" containerName="neutron-api"
Mar 09 09:26:18 crc kubenswrapper[4861]: E0309 09:26:18.134822 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71492031-e589-409e-b8c8-c0a1194b97ed" containerName="horizon"
Mar 09 09:26:18 crc kubenswrapper[4861]: I0309 09:26:18.134830 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="71492031-e589-409e-b8c8-c0a1194b97ed" containerName="horizon"
Mar 09 09:26:18 crc kubenswrapper[4861]: E0309 09:26:18.134839 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c295c66-4e4c-429a-9769-6119f1c5f087" containerName="ceilometer-central-agent"
Mar 09 09:26:18 crc kubenswrapper[4861]: I0309 09:26:18.134845 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c295c66-4e4c-429a-9769-6119f1c5f087" containerName="ceilometer-central-agent"
Mar 09 09:26:18 crc kubenswrapper[4861]: E0309 09:26:18.134859 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c295c66-4e4c-429a-9769-6119f1c5f087" containerName="proxy-httpd"
Mar 09 09:26:18 crc kubenswrapper[4861]: I0309 09:26:18.134864 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c295c66-4e4c-429a-9769-6119f1c5f087" containerName="proxy-httpd"
Mar 09 09:26:18 crc kubenswrapper[4861]: E0309 09:26:18.134873 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5330b450-5c51-43e8-b0c0-c9b875dc49b5" containerName="oc"
Mar 09 09:26:18 crc kubenswrapper[4861]: I0309 09:26:18.134879 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="5330b450-5c51-43e8-b0c0-c9b875dc49b5" containerName="oc"
Mar 09 09:26:18 crc kubenswrapper[4861]: E0309 09:26:18.134889 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c295c66-4e4c-429a-9769-6119f1c5f087" containerName="ceilometer-notification-agent"
Mar 09 09:26:18 crc kubenswrapper[4861]: I0309 09:26:18.134895 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c295c66-4e4c-429a-9769-6119f1c5f087" containerName="ceilometer-notification-agent"
Mar 09 09:26:18 crc kubenswrapper[4861]: E0309 09:26:18.134905 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71492031-e589-409e-b8c8-c0a1194b97ed" containerName="horizon-log"
Mar 09 09:26:18 crc kubenswrapper[4861]: I0309 09:26:18.134911 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="71492031-e589-409e-b8c8-c0a1194b97ed" containerName="horizon-log"
Mar 09 09:26:18 crc kubenswrapper[4861]: E0309 09:26:18.134927 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30e45abc-d44d-4e9a-8478-b562905ee7c2" containerName="neutron-httpd"
Mar 09 09:26:18 crc kubenswrapper[4861]: I0309 09:26:18.134932 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="30e45abc-d44d-4e9a-8478-b562905ee7c2" containerName="neutron-httpd"
Mar 09 09:26:18 crc kubenswrapper[4861]: E0309 09:26:18.134944 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c295c66-4e4c-429a-9769-6119f1c5f087" containerName="sg-core"
Mar 09 09:26:18 crc kubenswrapper[4861]: I0309 09:26:18.134950 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c295c66-4e4c-429a-9769-6119f1c5f087" containerName="sg-core"
Mar 09 09:26:18 crc kubenswrapper[4861]: I0309 09:26:18.135105 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="71492031-e589-409e-b8c8-c0a1194b97ed" containerName="horizon"
Mar 09 09:26:18 crc kubenswrapper[4861]: I0309 09:26:18.135115 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="30e45abc-d44d-4e9a-8478-b562905ee7c2" containerName="neutron-api"
Mar 09 09:26:18 crc kubenswrapper[4861]: I0309 09:26:18.135133 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c295c66-4e4c-429a-9769-6119f1c5f087" containerName="sg-core"
Mar 09 09:26:18 crc kubenswrapper[4861]: I0309 09:26:18.135141 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="30e45abc-d44d-4e9a-8478-b562905ee7c2" containerName="neutron-httpd"
Mar 09 09:26:18 crc kubenswrapper[4861]: I0309 09:26:18.135148 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="5330b450-5c51-43e8-b0c0-c9b875dc49b5" containerName="oc"
Mar 09 09:26:18 crc kubenswrapper[4861]: I0309 09:26:18.135159 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c295c66-4e4c-429a-9769-6119f1c5f087" containerName="ceilometer-notification-agent"
Mar 09 09:26:18 crc kubenswrapper[4861]: I0309 09:26:18.135169 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c295c66-4e4c-429a-9769-6119f1c5f087" containerName="proxy-httpd"
Mar 09 09:26:18 crc kubenswrapper[4861]: I0309 09:26:18.135180 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c295c66-4e4c-429a-9769-6119f1c5f087" containerName="ceilometer-central-agent"
Mar 09 09:26:18 crc kubenswrapper[4861]: I0309 09:26:18.135190 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="71492031-e589-409e-b8c8-c0a1194b97ed" containerName="horizon-log"
Mar 09 09:26:18 crc kubenswrapper[4861]: I0309 09:26:18.136901 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 09 09:26:18 crc kubenswrapper[4861]: I0309 09:26:18.142883 4861 scope.go:117] "RemoveContainer" containerID="0edeacb7534e99d5e30b62e7192d7570d6d2e0651bcc92cd8192715235d0d0d4"
Mar 09 09:26:18 crc kubenswrapper[4861]: I0309 09:26:18.143112 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-7fb7d5546d-n665d"]
Mar 09 09:26:18 crc kubenswrapper[4861]: I0309 09:26:18.144321 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Mar 09 09:26:18 crc kubenswrapper[4861]: I0309 09:26:18.145159 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Mar 09 09:26:18 crc kubenswrapper[4861]: I0309 09:26:18.149998 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 09 09:26:18 crc kubenswrapper[4861]: I0309 09:26:18.171649 4861 scope.go:117] "RemoveContainer" containerID="e4f92f447639a3202f3431b4008c094cc8222eb15f4dcdefb507cbe1370ec04b"
Mar 09 09:26:18 crc kubenswrapper[4861]: I0309 09:26:18.187681 4861 scope.go:117] "RemoveContainer" containerID="3fd703ecac672d85228b81f88612761d619524529973bcd35d35b8dbfa04b92c"
Mar 09 09:26:18 crc kubenswrapper[4861]: I0309 09:26:18.209969 4861 scope.go:117] "RemoveContainer" containerID="b5be7ddaf2cc488012e82daab81d04dc4edccf05a70400a457f9705e7d7229b6"
Mar 09 09:26:18 crc kubenswrapper[4861]: I0309 09:26:18.280081 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/36eb13a3-0fad-4a47-bc5f-088f0aab8e02-scripts\") pod \"ceilometer-0\" (UID: \"36eb13a3-0fad-4a47-bc5f-088f0aab8e02\") " pod="openstack/ceilometer-0"
Mar 09 09:26:18 crc kubenswrapper[4861]: I0309 09:26:18.280158 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36eb13a3-0fad-4a47-bc5f-088f0aab8e02-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"36eb13a3-0fad-4a47-bc5f-088f0aab8e02\") " pod="openstack/ceilometer-0"
Mar 09 09:26:18 crc kubenswrapper[4861]: I0309 09:26:18.280184 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36eb13a3-0fad-4a47-bc5f-088f0aab8e02-config-data\") pod \"ceilometer-0\" (UID: \"36eb13a3-0fad-4a47-bc5f-088f0aab8e02\") " pod="openstack/ceilometer-0"
Mar 09 09:26:18 crc kubenswrapper[4861]: I0309 09:26:18.280224 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/36eb13a3-0fad-4a47-bc5f-088f0aab8e02-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"36eb13a3-0fad-4a47-bc5f-088f0aab8e02\") " pod="openstack/ceilometer-0"
Mar 09 09:26:18 crc kubenswrapper[4861]: I0309 09:26:18.280252 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/36eb13a3-0fad-4a47-bc5f-088f0aab8e02-run-httpd\") pod \"ceilometer-0\" (UID: \"36eb13a3-0fad-4a47-bc5f-088f0aab8e02\") " pod="openstack/ceilometer-0"
Mar 09 09:26:18 crc kubenswrapper[4861]: I0309 09:26:18.280840 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bq8bg\" (UniqueName: \"kubernetes.io/projected/36eb13a3-0fad-4a47-bc5f-088f0aab8e02-kube-api-access-bq8bg\") pod \"ceilometer-0\" (UID: \"36eb13a3-0fad-4a47-bc5f-088f0aab8e02\") " pod="openstack/ceilometer-0"
Mar 09 09:26:18 crc kubenswrapper[4861]: I0309 09:26:18.281011 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/36eb13a3-0fad-4a47-bc5f-088f0aab8e02-log-httpd\") pod \"ceilometer-0\" (UID: \"36eb13a3-0fad-4a47-bc5f-088f0aab8e02\") " pod="openstack/ceilometer-0"
Mar 09 09:26:18 crc kubenswrapper[4861]: I0309 09:26:18.383445 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/36eb13a3-0fad-4a47-bc5f-088f0aab8e02-log-httpd\") pod \"ceilometer-0\" (UID: \"36eb13a3-0fad-4a47-bc5f-088f0aab8e02\") " pod="openstack/ceilometer-0"
Mar 09 09:26:18 crc kubenswrapper[4861]: I0309 09:26:18.383567 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/36eb13a3-0fad-4a47-bc5f-088f0aab8e02-scripts\") pod \"ceilometer-0\" (UID: \"36eb13a3-0fad-4a47-bc5f-088f0aab8e02\") " pod="openstack/ceilometer-0"
Mar 09 09:26:18 crc kubenswrapper[4861]: I0309 09:26:18.383631 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36eb13a3-0fad-4a47-bc5f-088f0aab8e02-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"36eb13a3-0fad-4a47-bc5f-088f0aab8e02\") " pod="openstack/ceilometer-0"
Mar 09 09:26:18 crc kubenswrapper[4861]: I0309 09:26:18.383682 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36eb13a3-0fad-4a47-bc5f-088f0aab8e02-config-data\") pod \"ceilometer-0\" (UID: \"36eb13a3-0fad-4a47-bc5f-088f0aab8e02\") " pod="openstack/ceilometer-0"
Mar 09 09:26:18 crc kubenswrapper[4861]: I0309 09:26:18.383712 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/36eb13a3-0fad-4a47-bc5f-088f0aab8e02-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"36eb13a3-0fad-4a47-bc5f-088f0aab8e02\") " pod="openstack/ceilometer-0"
Mar 09 09:26:18 crc kubenswrapper[4861]: I0309 09:26:18.383739 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/36eb13a3-0fad-4a47-bc5f-088f0aab8e02-run-httpd\") pod \"ceilometer-0\" (UID: \"36eb13a3-0fad-4a47-bc5f-088f0aab8e02\") " pod="openstack/ceilometer-0"
Mar 09 09:26:18 crc kubenswrapper[4861]: I0309 09:26:18.383800 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bq8bg\" (UniqueName: \"kubernetes.io/projected/36eb13a3-0fad-4a47-bc5f-088f0aab8e02-kube-api-access-bq8bg\") pod \"ceilometer-0\" (UID: \"36eb13a3-0fad-4a47-bc5f-088f0aab8e02\") " pod="openstack/ceilometer-0"
Mar 09 09:26:18 crc kubenswrapper[4861]: I0309 09:26:18.384114 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/36eb13a3-0fad-4a47-bc5f-088f0aab8e02-log-httpd\") pod \"ceilometer-0\" (UID: \"36eb13a3-0fad-4a47-bc5f-088f0aab8e02\") " pod="openstack/ceilometer-0"
Mar 09 09:26:18 crc kubenswrapper[4861]: I0309 09:26:18.384476 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/36eb13a3-0fad-4a47-bc5f-088f0aab8e02-run-httpd\") pod \"ceilometer-0\" (UID: \"36eb13a3-0fad-4a47-bc5f-088f0aab8e02\") " pod="openstack/ceilometer-0"
Mar 09 09:26:18 crc kubenswrapper[4861]: I0309 09:26:18.389056 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/36eb13a3-0fad-4a47-bc5f-088f0aab8e02-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"36eb13a3-0fad-4a47-bc5f-088f0aab8e02\") " pod="openstack/ceilometer-0"
Mar 09 09:26:18 crc kubenswrapper[4861]: I0309 09:26:18.389190 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36eb13a3-0fad-4a47-bc5f-088f0aab8e02-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"36eb13a3-0fad-4a47-bc5f-088f0aab8e02\") " pod="openstack/ceilometer-0"
Mar 09 09:26:18 crc kubenswrapper[4861]: I0309 09:26:18.389220 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/36eb13a3-0fad-4a47-bc5f-088f0aab8e02-scripts\") pod \"ceilometer-0\" (UID: \"36eb13a3-0fad-4a47-bc5f-088f0aab8e02\") " pod="openstack/ceilometer-0"
Mar 09 09:26:18 crc kubenswrapper[4861]: I0309 09:26:18.389515 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36eb13a3-0fad-4a47-bc5f-088f0aab8e02-config-data\") pod \"ceilometer-0\" (UID: \"36eb13a3-0fad-4a47-bc5f-088f0aab8e02\") " pod="openstack/ceilometer-0"
Mar 09 09:26:18 crc kubenswrapper[4861]: I0309 09:26:18.405015 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bq8bg\" (UniqueName: \"kubernetes.io/projected/36eb13a3-0fad-4a47-bc5f-088f0aab8e02-kube-api-access-bq8bg\") pod \"ceilometer-0\" (UID: \"36eb13a3-0fad-4a47-bc5f-088f0aab8e02\") " pod="openstack/ceilometer-0"
Mar 09 09:26:18 crc kubenswrapper[4861]: I0309 09:26:18.454393 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 09 09:26:18 crc kubenswrapper[4861]: W0309 09:26:18.990540 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod36eb13a3_0fad_4a47_bc5f_088f0aab8e02.slice/crio-2cd294238a3031870557707ce52c0138a9359046fbe330c529a13590c8e6c1a0 WatchSource:0}: Error finding container 2cd294238a3031870557707ce52c0138a9359046fbe330c529a13590c8e6c1a0: Status 404 returned error can't find the container with id 2cd294238a3031870557707ce52c0138a9359046fbe330c529a13590c8e6c1a0
Mar 09 09:26:18 crc kubenswrapper[4861]: I0309 09:26:18.999622 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 09 09:26:19 crc kubenswrapper[4861]: I0309 09:26:19.073240 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"36eb13a3-0fad-4a47-bc5f-088f0aab8e02","Type":"ContainerStarted","Data":"2cd294238a3031870557707ce52c0138a9359046fbe330c529a13590c8e6c1a0"}
Mar 09 09:26:19 crc kubenswrapper[4861]: I0309 09:26:19.484893 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 09 09:26:19 crc kubenswrapper[4861]: I0309 09:26:19.669751 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30e45abc-d44d-4e9a-8478-b562905ee7c2" path="/var/lib/kubelet/pods/30e45abc-d44d-4e9a-8478-b562905ee7c2/volumes"
Mar 09 09:26:19 crc kubenswrapper[4861]: I0309 09:26:19.670857 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c295c66-4e4c-429a-9769-6119f1c5f087" path="/var/lib/kubelet/pods/6c295c66-4e4c-429a-9769-6119f1c5f087/volumes"
Mar 09 09:26:20 crc kubenswrapper[4861]: I0309 09:26:20.084332 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"36eb13a3-0fad-4a47-bc5f-088f0aab8e02","Type":"ContainerStarted","Data":"378bd39844083889540a1130823a9e5095b16051c3ff3eb3a9eb38dc5f314bb9"}
Mar 09 09:26:21 crc kubenswrapper[4861]: I0309 09:26:21.095337 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"36eb13a3-0fad-4a47-bc5f-088f0aab8e02","Type":"ContainerStarted","Data":"dc011b0d3097c5d8fcb166b831a5c5191de0b1babfe195d3f66462d7260c158e"}
Mar 09 09:26:21 crc kubenswrapper[4861]: I0309 09:26:21.095672 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"36eb13a3-0fad-4a47-bc5f-088f0aab8e02","Type":"ContainerStarted","Data":"a9b2451098b200e3e8fadd7b8cde9256b31d3db6fe2e8c25fd0d292c75468f84"}
Mar 09 09:26:22 crc kubenswrapper[4861]: I0309 09:26:22.020870 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-ct722"]
Mar 09 09:26:22 crc kubenswrapper[4861]: I0309 09:26:22.022038 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-ct722"
Mar 09 09:26:22 crc kubenswrapper[4861]: I0309 09:26:22.028703 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-c9be-account-create-update-jzpgr"]
Mar 09 09:26:22 crc kubenswrapper[4861]: I0309 09:26:22.030021 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-c9be-account-create-update-jzpgr"
Mar 09 09:26:22 crc kubenswrapper[4861]: I0309 09:26:22.035707 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret"
Mar 09 09:26:22 crc kubenswrapper[4861]: I0309 09:26:22.077815 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-c9be-account-create-update-jzpgr"]
Mar 09 09:26:22 crc kubenswrapper[4861]: I0309 09:26:22.090787 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-ct722"]
Mar 09 09:26:22 crc kubenswrapper[4861]: I0309 09:26:22.155631 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-z4xzx"]
Mar 09 09:26:22 crc kubenswrapper[4861]: I0309 09:26:22.156744 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-z4xzx"
Mar 09 09:26:22 crc kubenswrapper[4861]: I0309 09:26:22.159064 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htqcc\" (UniqueName: \"kubernetes.io/projected/bde3ec7b-bd8f-4936-91b4-c8a7063628c8-kube-api-access-htqcc\") pod \"nova-api-c9be-account-create-update-jzpgr\" (UID: \"bde3ec7b-bd8f-4936-91b4-c8a7063628c8\") " pod="openstack/nova-api-c9be-account-create-update-jzpgr"
Mar 09 09:26:22 crc kubenswrapper[4861]: I0309 09:26:22.159110 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bde3ec7b-bd8f-4936-91b4-c8a7063628c8-operator-scripts\") pod \"nova-api-c9be-account-create-update-jzpgr\" (UID: \"bde3ec7b-bd8f-4936-91b4-c8a7063628c8\") " pod="openstack/nova-api-c9be-account-create-update-jzpgr"
Mar 09 09:26:22 crc kubenswrapper[4861]: I0309 09:26:22.159189 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9rq8\" (UniqueName: \"kubernetes.io/projected/c1271b64-efb8-425e-8718-1e28003a5722-kube-api-access-c9rq8\") pod \"nova-api-db-create-ct722\" (UID: \"c1271b64-efb8-425e-8718-1e28003a5722\") " pod="openstack/nova-api-db-create-ct722"
Mar 09 09:26:22 crc kubenswrapper[4861]: I0309 09:26:22.159263 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c1271b64-efb8-425e-8718-1e28003a5722-operator-scripts\") pod \"nova-api-db-create-ct722\" (UID: \"c1271b64-efb8-425e-8718-1e28003a5722\") " pod="openstack/nova-api-db-create-ct722"
Mar 09 09:26:22 crc kubenswrapper[4861]: I0309 09:26:22.195221 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-z4xzx"]
Mar 09 09:26:22 crc kubenswrapper[4861]: I0309 09:26:22.244807 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-4f2ds"]
Mar 09 09:26:22 crc kubenswrapper[4861]: I0309 09:26:22.247539 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-4f2ds"
Mar 09 09:26:22 crc kubenswrapper[4861]: I0309 09:26:22.261531 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-4f2ds"]
Mar 09 09:26:22 crc kubenswrapper[4861]: I0309 09:26:22.262125 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2fd4\" (UniqueName: \"kubernetes.io/projected/e727b52d-d582-4d07-a852-edd0a15e1ba7-kube-api-access-h2fd4\") pod \"nova-cell0-db-create-z4xzx\" (UID: \"e727b52d-d582-4d07-a852-edd0a15e1ba7\") " pod="openstack/nova-cell0-db-create-z4xzx"
Mar 09 09:26:22 crc kubenswrapper[4861]: I0309 09:26:22.262180 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-htqcc\" (UniqueName: \"kubernetes.io/projected/bde3ec7b-bd8f-4936-91b4-c8a7063628c8-kube-api-access-htqcc\") pod \"nova-api-c9be-account-create-update-jzpgr\" (UID: \"bde3ec7b-bd8f-4936-91b4-c8a7063628c8\") " pod="openstack/nova-api-c9be-account-create-update-jzpgr"
Mar 09 09:26:22 crc kubenswrapper[4861]: I0309 09:26:22.262216 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bde3ec7b-bd8f-4936-91b4-c8a7063628c8-operator-scripts\") pod \"nova-api-c9be-account-create-update-jzpgr\" (UID: \"bde3ec7b-bd8f-4936-91b4-c8a7063628c8\") " pod="openstack/nova-api-c9be-account-create-update-jzpgr"
Mar 09 09:26:22 crc kubenswrapper[4861]: I0309 09:26:22.262292 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9rq8\" (UniqueName: \"kubernetes.io/projected/c1271b64-efb8-425e-8718-1e28003a5722-kube-api-access-c9rq8\") pod \"nova-api-db-create-ct722\" (UID: \"c1271b64-efb8-425e-8718-1e28003a5722\") " pod="openstack/nova-api-db-create-ct722"
Mar 09 09:26:22 crc kubenswrapper[4861]: I0309 09:26:22.262323 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e727b52d-d582-4d07-a852-edd0a15e1ba7-operator-scripts\") pod \"nova-cell0-db-create-z4xzx\" (UID: \"e727b52d-d582-4d07-a852-edd0a15e1ba7\") " pod="openstack/nova-cell0-db-create-z4xzx"
Mar 09 09:26:22 crc kubenswrapper[4861]: I0309 09:26:22.262404 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c1271b64-efb8-425e-8718-1e28003a5722-operator-scripts\") pod \"nova-api-db-create-ct722\" (UID: \"c1271b64-efb8-425e-8718-1e28003a5722\") " pod="openstack/nova-api-db-create-ct722"
Mar 09 09:26:22 crc kubenswrapper[4861]: I0309 09:26:22.263472 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bde3ec7b-bd8f-4936-91b4-c8a7063628c8-operator-scripts\") pod \"nova-api-c9be-account-create-update-jzpgr\" (UID: \"bde3ec7b-bd8f-4936-91b4-c8a7063628c8\") " pod="openstack/nova-api-c9be-account-create-update-jzpgr"
Mar 09 09:26:22 crc kubenswrapper[4861]: I0309 09:26:22.263493 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c1271b64-efb8-425e-8718-1e28003a5722-operator-scripts\") pod \"nova-api-db-create-ct722\" (UID: \"c1271b64-efb8-425e-8718-1e28003a5722\") " pod="openstack/nova-api-db-create-ct722"
Mar 09 09:26:22 crc kubenswrapper[4861]: I0309 09:26:22.291922 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-4ee7-account-create-update-fhsvd"]
Mar 09 09:26:22 crc kubenswrapper[4861]: I0309 09:26:22.295450 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-4ee7-account-create-update-fhsvd"
Mar 09 09:26:22 crc kubenswrapper[4861]: I0309 09:26:22.299487 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret"
Mar 09 09:26:22 crc kubenswrapper[4861]: I0309 09:26:22.303062 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-htqcc\" (UniqueName: \"kubernetes.io/projected/bde3ec7b-bd8f-4936-91b4-c8a7063628c8-kube-api-access-htqcc\") pod \"nova-api-c9be-account-create-update-jzpgr\" (UID: \"bde3ec7b-bd8f-4936-91b4-c8a7063628c8\") " pod="openstack/nova-api-c9be-account-create-update-jzpgr"
Mar 09 09:26:22 crc kubenswrapper[4861]: I0309 09:26:22.325146 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-4ee7-account-create-update-fhsvd"]
Mar 09 09:26:22 crc kubenswrapper[4861]: I0309 09:26:22.331451 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9rq8\" (UniqueName: \"kubernetes.io/projected/c1271b64-efb8-425e-8718-1e28003a5722-kube-api-access-c9rq8\") pod \"nova-api-db-create-ct722\" (UID: \"c1271b64-efb8-425e-8718-1e28003a5722\") " pod="openstack/nova-api-db-create-ct722"
Mar 09 09:26:22 crc kubenswrapper[4861]: I0309 09:26:22.341924 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-ct722"
Mar 09 09:26:22 crc kubenswrapper[4861]: I0309 09:26:22.349418 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-c9be-account-create-update-jzpgr"
Mar 09 09:26:22 crc kubenswrapper[4861]: I0309 09:26:22.367131 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e727b52d-d582-4d07-a852-edd0a15e1ba7-operator-scripts\") pod \"nova-cell0-db-create-z4xzx\" (UID: \"e727b52d-d582-4d07-a852-edd0a15e1ba7\") " pod="openstack/nova-cell0-db-create-z4xzx"
Mar 09 09:26:22 crc kubenswrapper[4861]: I0309 09:26:22.367220 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf5b6714-95ce-4470-9536-10e1d281b52e-operator-scripts\") pod \"nova-cell1-db-create-4f2ds\" (UID: \"bf5b6714-95ce-4470-9536-10e1d281b52e\") " pod="openstack/nova-cell1-db-create-4f2ds"
Mar 09 09:26:22 crc kubenswrapper[4861]: I0309 09:26:22.367251 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gb447\" (UniqueName: \"kubernetes.io/projected/bf5b6714-95ce-4470-9536-10e1d281b52e-kube-api-access-gb447\") pod \"nova-cell1-db-create-4f2ds\" (UID: \"bf5b6714-95ce-4470-9536-10e1d281b52e\") " pod="openstack/nova-cell1-db-create-4f2ds"
Mar 09 09:26:22 crc kubenswrapper[4861]: I0309 09:26:22.367294 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2fd4\" (UniqueName: \"kubernetes.io/projected/e727b52d-d582-4d07-a852-edd0a15e1ba7-kube-api-access-h2fd4\") pod \"nova-cell0-db-create-z4xzx\" (UID: \"e727b52d-d582-4d07-a852-edd0a15e1ba7\") " pod="openstack/nova-cell0-db-create-z4xzx"
Mar 09 09:26:22 crc kubenswrapper[4861]: I0309 09:26:22.368351 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e727b52d-d582-4d07-a852-edd0a15e1ba7-operator-scripts\") pod \"nova-cell0-db-create-z4xzx\" (UID: \"e727b52d-d582-4d07-a852-edd0a15e1ba7\") " pod="openstack/nova-cell0-db-create-z4xzx"
Mar 09 09:26:22 crc kubenswrapper[4861]: I0309 09:26:22.387991 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2fd4\" (UniqueName: \"kubernetes.io/projected/e727b52d-d582-4d07-a852-edd0a15e1ba7-kube-api-access-h2fd4\") pod \"nova-cell0-db-create-z4xzx\" (UID: \"e727b52d-d582-4d07-a852-edd0a15e1ba7\") " pod="openstack/nova-cell0-db-create-z4xzx"
Mar 09 09:26:22 crc kubenswrapper[4861]: I0309 09:26:22.441988 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-ddfc-account-create-update-s2nh2"]
Mar 09 09:26:22 crc kubenswrapper[4861]: I0309 09:26:22.443137 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-ddfc-account-create-update-s2nh2"
Mar 09 09:26:22 crc kubenswrapper[4861]: I0309 09:26:22.451528 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret"
Mar 09 09:26:22 crc kubenswrapper[4861]: I0309 09:26:22.466153 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-ddfc-account-create-update-s2nh2"]
Mar 09 09:26:22 crc kubenswrapper[4861]: I0309 09:26:22.473690 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf5b6714-95ce-4470-9536-10e1d281b52e-operator-scripts\") pod \"nova-cell1-db-create-4f2ds\" (UID: \"bf5b6714-95ce-4470-9536-10e1d281b52e\") " pod="openstack/nova-cell1-db-create-4f2ds"
Mar 09 09:26:22 crc kubenswrapper[4861]: I0309 09:26:22.473743 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gb447\" (UniqueName: \"kubernetes.io/projected/bf5b6714-95ce-4470-9536-10e1d281b52e-kube-api-access-gb447\") pod \"nova-cell1-db-create-4f2ds\" (UID: \"bf5b6714-95ce-4470-9536-10e1d281b52e\") " pod="openstack/nova-cell1-db-create-4f2ds"
Mar 09 09:26:22 crc kubenswrapper[4861]: I0309 09:26:22.473777 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/53c84cc0-8dc4-44c5-89d7-bb118f6b4fe2-operator-scripts\") pod \"nova-cell0-4ee7-account-create-update-fhsvd\" (UID: \"53c84cc0-8dc4-44c5-89d7-bb118f6b4fe2\") " pod="openstack/nova-cell0-4ee7-account-create-update-fhsvd"
Mar 09 09:26:22 crc kubenswrapper[4861]: I0309 09:26:22.473805 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hcqpc\" (UniqueName: \"kubernetes.io/projected/53c84cc0-8dc4-44c5-89d7-bb118f6b4fe2-kube-api-access-hcqpc\") pod \"nova-cell0-4ee7-account-create-update-fhsvd\" (UID: \"53c84cc0-8dc4-44c5-89d7-bb118f6b4fe2\") " pod="openstack/nova-cell0-4ee7-account-create-update-fhsvd"
Mar 09 09:26:22 crc kubenswrapper[4861]: I0309 09:26:22.481904 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf5b6714-95ce-4470-9536-10e1d281b52e-operator-scripts\") pod \"nova-cell1-db-create-4f2ds\" (UID: \"bf5b6714-95ce-4470-9536-10e1d281b52e\") " pod="openstack/nova-cell1-db-create-4f2ds"
Mar 09 09:26:22 crc kubenswrapper[4861]: I0309 09:26:22.485796 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-z4xzx"
Mar 09 09:26:22 crc kubenswrapper[4861]: I0309 09:26:22.494486 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gb447\" (UniqueName: \"kubernetes.io/projected/bf5b6714-95ce-4470-9536-10e1d281b52e-kube-api-access-gb447\") pod \"nova-cell1-db-create-4f2ds\" (UID: \"bf5b6714-95ce-4470-9536-10e1d281b52e\") " pod="openstack/nova-cell1-db-create-4f2ds"
Mar 09 09:26:22 crc kubenswrapper[4861]: I0309 09:26:22.574124 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-4f2ds"
Mar 09 09:26:22 crc kubenswrapper[4861]: I0309 09:26:22.575594 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/53c84cc0-8dc4-44c5-89d7-bb118f6b4fe2-operator-scripts\") pod \"nova-cell0-4ee7-account-create-update-fhsvd\" (UID: \"53c84cc0-8dc4-44c5-89d7-bb118f6b4fe2\") " pod="openstack/nova-cell0-4ee7-account-create-update-fhsvd"
Mar 09 09:26:22 crc kubenswrapper[4861]: I0309 09:26:22.575681 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hcqpc\" (UniqueName: \"kubernetes.io/projected/53c84cc0-8dc4-44c5-89d7-bb118f6b4fe2-kube-api-access-hcqpc\") pod \"nova-cell0-4ee7-account-create-update-fhsvd\" (UID: \"53c84cc0-8dc4-44c5-89d7-bb118f6b4fe2\") " pod="openstack/nova-cell0-4ee7-account-create-update-fhsvd"
Mar 09 09:26:22 crc kubenswrapper[4861]: I0309 09:26:22.575746 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/19523fad-3ad4-4c4a-b329-90f8136ce34c-operator-scripts\") pod \"nova-cell1-ddfc-account-create-update-s2nh2\" (UID: \"19523fad-3ad4-4c4a-b329-90f8136ce34c\") " pod="openstack/nova-cell1-ddfc-account-create-update-s2nh2"
Mar 09 09:26:22 crc kubenswrapper[4861]: I0309 09:26:22.575765 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-456wq\" (UniqueName: \"kubernetes.io/projected/19523fad-3ad4-4c4a-b329-90f8136ce34c-kube-api-access-456wq\") pod \"nova-cell1-ddfc-account-create-update-s2nh2\" (UID: \"19523fad-3ad4-4c4a-b329-90f8136ce34c\") " pod="openstack/nova-cell1-ddfc-account-create-update-s2nh2"
Mar 09 09:26:22 crc kubenswrapper[4861]: I0309 09:26:22.576702 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/53c84cc0-8dc4-44c5-89d7-bb118f6b4fe2-operator-scripts\") pod \"nova-cell0-4ee7-account-create-update-fhsvd\" (UID: \"53c84cc0-8dc4-44c5-89d7-bb118f6b4fe2\") " pod="openstack/nova-cell0-4ee7-account-create-update-fhsvd" Mar 09 09:26:22 crc kubenswrapper[4861]: I0309 09:26:22.607949 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hcqpc\" (UniqueName: \"kubernetes.io/projected/53c84cc0-8dc4-44c5-89d7-bb118f6b4fe2-kube-api-access-hcqpc\") pod \"nova-cell0-4ee7-account-create-update-fhsvd\" (UID: \"53c84cc0-8dc4-44c5-89d7-bb118f6b4fe2\") " pod="openstack/nova-cell0-4ee7-account-create-update-fhsvd" Mar 09 09:26:22 crc kubenswrapper[4861]: I0309 09:26:22.608517 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-4ee7-account-create-update-fhsvd" Mar 09 09:26:22 crc kubenswrapper[4861]: I0309 09:26:22.678795 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/19523fad-3ad4-4c4a-b329-90f8136ce34c-operator-scripts\") pod \"nova-cell1-ddfc-account-create-update-s2nh2\" (UID: \"19523fad-3ad4-4c4a-b329-90f8136ce34c\") " pod="openstack/nova-cell1-ddfc-account-create-update-s2nh2" Mar 09 09:26:22 crc kubenswrapper[4861]: I0309 09:26:22.678852 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-456wq\" (UniqueName: \"kubernetes.io/projected/19523fad-3ad4-4c4a-b329-90f8136ce34c-kube-api-access-456wq\") pod \"nova-cell1-ddfc-account-create-update-s2nh2\" (UID: \"19523fad-3ad4-4c4a-b329-90f8136ce34c\") " pod="openstack/nova-cell1-ddfc-account-create-update-s2nh2" Mar 09 09:26:22 crc kubenswrapper[4861]: I0309 09:26:22.679932 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/19523fad-3ad4-4c4a-b329-90f8136ce34c-operator-scripts\") pod 
\"nova-cell1-ddfc-account-create-update-s2nh2\" (UID: \"19523fad-3ad4-4c4a-b329-90f8136ce34c\") " pod="openstack/nova-cell1-ddfc-account-create-update-s2nh2" Mar 09 09:26:22 crc kubenswrapper[4861]: I0309 09:26:22.701932 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-456wq\" (UniqueName: \"kubernetes.io/projected/19523fad-3ad4-4c4a-b329-90f8136ce34c-kube-api-access-456wq\") pod \"nova-cell1-ddfc-account-create-update-s2nh2\" (UID: \"19523fad-3ad4-4c4a-b329-90f8136ce34c\") " pod="openstack/nova-cell1-ddfc-account-create-update-s2nh2" Mar 09 09:26:22 crc kubenswrapper[4861]: I0309 09:26:22.931295 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-c9be-account-create-update-jzpgr"] Mar 09 09:26:22 crc kubenswrapper[4861]: I0309 09:26:22.937910 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-ddfc-account-create-update-s2nh2" Mar 09 09:26:23 crc kubenswrapper[4861]: I0309 09:26:23.106395 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-ct722"] Mar 09 09:26:23 crc kubenswrapper[4861]: I0309 09:26:23.152317 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-ct722" event={"ID":"c1271b64-efb8-425e-8718-1e28003a5722","Type":"ContainerStarted","Data":"7b9a6c68944b06c13cedbd2b2f060267f7895dcf1b853ee6b822e3cc6a9f3dde"} Mar 09 09:26:23 crc kubenswrapper[4861]: I0309 09:26:23.157981 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-c9be-account-create-update-jzpgr" event={"ID":"bde3ec7b-bd8f-4936-91b4-c8a7063628c8","Type":"ContainerStarted","Data":"b909fa2f8bea7c5cf70d927e7d3fbc9ea8cc83ce75b0edab6fc76b0fe0cd99d1"} Mar 09 09:26:23 crc kubenswrapper[4861]: I0309 09:26:23.200449 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"36eb13a3-0fad-4a47-bc5f-088f0aab8e02","Type":"ContainerStarted","Data":"1cbca8a8f9f11e3541a4fffe54f11b0b1d7d9ed5f6be82ecb2870f924fb647bf"} Mar 09 09:26:23 crc kubenswrapper[4861]: I0309 09:26:23.200574 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="36eb13a3-0fad-4a47-bc5f-088f0aab8e02" containerName="ceilometer-central-agent" containerID="cri-o://378bd39844083889540a1130823a9e5095b16051c3ff3eb3a9eb38dc5f314bb9" gracePeriod=30 Mar 09 09:26:23 crc kubenswrapper[4861]: I0309 09:26:23.200592 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 09 09:26:23 crc kubenswrapper[4861]: I0309 09:26:23.200711 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="36eb13a3-0fad-4a47-bc5f-088f0aab8e02" containerName="proxy-httpd" containerID="cri-o://1cbca8a8f9f11e3541a4fffe54f11b0b1d7d9ed5f6be82ecb2870f924fb647bf" gracePeriod=30 Mar 09 09:26:23 crc kubenswrapper[4861]: I0309 09:26:23.200752 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="36eb13a3-0fad-4a47-bc5f-088f0aab8e02" containerName="sg-core" containerID="cri-o://dc011b0d3097c5d8fcb166b831a5c5191de0b1babfe195d3f66462d7260c158e" gracePeriod=30 Mar 09 09:26:23 crc kubenswrapper[4861]: I0309 09:26:23.200784 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="36eb13a3-0fad-4a47-bc5f-088f0aab8e02" containerName="ceilometer-notification-agent" containerID="cri-o://a9b2451098b200e3e8fadd7b8cde9256b31d3db6fe2e8c25fd0d292c75468f84" gracePeriod=30 Mar 09 09:26:23 crc kubenswrapper[4861]: I0309 09:26:23.211338 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-z4xzx"] Mar 09 09:26:23 crc kubenswrapper[4861]: I0309 09:26:23.234162 4861 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/ceilometer-0" podStartSLOduration=1.435569141 podStartE2EDuration="5.234145884s" podCreationTimestamp="2026-03-09 09:26:18 +0000 UTC" firstStartedPulling="2026-03-09 09:26:18.992551647 +0000 UTC m=+1222.077591048" lastFinishedPulling="2026-03-09 09:26:22.79112839 +0000 UTC m=+1225.876167791" observedRunningTime="2026-03-09 09:26:23.232910397 +0000 UTC m=+1226.317949808" watchObservedRunningTime="2026-03-09 09:26:23.234145884 +0000 UTC m=+1226.319185285" Mar 09 09:26:23 crc kubenswrapper[4861]: I0309 09:26:23.315958 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-4f2ds"] Mar 09 09:26:23 crc kubenswrapper[4861]: I0309 09:26:23.322213 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-4ee7-account-create-update-fhsvd"] Mar 09 09:26:23 crc kubenswrapper[4861]: I0309 09:26:23.517827 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-ddfc-account-create-update-s2nh2"] Mar 09 09:26:24 crc kubenswrapper[4861]: I0309 09:26:24.210354 4861 generic.go:334] "Generic (PLEG): container finished" podID="e727b52d-d582-4d07-a852-edd0a15e1ba7" containerID="152f30f73b4b29dceb1ed0035e331f3a29771538542812988eeffc9e3474a5f9" exitCode=0 Mar 09 09:26:24 crc kubenswrapper[4861]: I0309 09:26:24.210705 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-z4xzx" event={"ID":"e727b52d-d582-4d07-a852-edd0a15e1ba7","Type":"ContainerDied","Data":"152f30f73b4b29dceb1ed0035e331f3a29771538542812988eeffc9e3474a5f9"} Mar 09 09:26:24 crc kubenswrapper[4861]: I0309 09:26:24.210732 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-z4xzx" event={"ID":"e727b52d-d582-4d07-a852-edd0a15e1ba7","Type":"ContainerStarted","Data":"6ffec9b8b88e2da2f23eabb284310fbc393735ba9a282d09915bf41fbe93f5ed"} Mar 09 09:26:24 crc kubenswrapper[4861]: I0309 09:26:24.212599 4861 generic.go:334] "Generic (PLEG): container 
finished" podID="bf5b6714-95ce-4470-9536-10e1d281b52e" containerID="8dba9ea5cbb350757abc3c6c107260c1845bbb08db4d254372d925dd47300f02" exitCode=0 Mar 09 09:26:24 crc kubenswrapper[4861]: I0309 09:26:24.212673 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-4f2ds" event={"ID":"bf5b6714-95ce-4470-9536-10e1d281b52e","Type":"ContainerDied","Data":"8dba9ea5cbb350757abc3c6c107260c1845bbb08db4d254372d925dd47300f02"} Mar 09 09:26:24 crc kubenswrapper[4861]: I0309 09:26:24.212714 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-4f2ds" event={"ID":"bf5b6714-95ce-4470-9536-10e1d281b52e","Type":"ContainerStarted","Data":"d6c6c3e321c22ff8ecfb2f170e6cbe1d8bf02332b96c5404d237c5700b376bd5"} Mar 09 09:26:24 crc kubenswrapper[4861]: I0309 09:26:24.215347 4861 generic.go:334] "Generic (PLEG): container finished" podID="19523fad-3ad4-4c4a-b329-90f8136ce34c" containerID="dcb93a23b30b98b25417e3ee5e031f7a57d13ca01b3b417824757cfd480d919b" exitCode=0 Mar 09 09:26:24 crc kubenswrapper[4861]: I0309 09:26:24.215407 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-ddfc-account-create-update-s2nh2" event={"ID":"19523fad-3ad4-4c4a-b329-90f8136ce34c","Type":"ContainerDied","Data":"dcb93a23b30b98b25417e3ee5e031f7a57d13ca01b3b417824757cfd480d919b"} Mar 09 09:26:24 crc kubenswrapper[4861]: I0309 09:26:24.215423 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-ddfc-account-create-update-s2nh2" event={"ID":"19523fad-3ad4-4c4a-b329-90f8136ce34c","Type":"ContainerStarted","Data":"5015cf83045548ad8e6db6d7a8b3157741b534fccaae188670aa29358cfe7ca6"} Mar 09 09:26:24 crc kubenswrapper[4861]: I0309 09:26:24.216704 4861 generic.go:334] "Generic (PLEG): container finished" podID="c1271b64-efb8-425e-8718-1e28003a5722" containerID="f93ea925aa9afcb72e0a19f87a27a62a1977b3537b78c1b0133b9e09d8c7df4b" exitCode=0 Mar 09 09:26:24 crc kubenswrapper[4861]: I0309 09:26:24.216768 
4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-ct722" event={"ID":"c1271b64-efb8-425e-8718-1e28003a5722","Type":"ContainerDied","Data":"f93ea925aa9afcb72e0a19f87a27a62a1977b3537b78c1b0133b9e09d8c7df4b"} Mar 09 09:26:24 crc kubenswrapper[4861]: I0309 09:26:24.218012 4861 generic.go:334] "Generic (PLEG): container finished" podID="bde3ec7b-bd8f-4936-91b4-c8a7063628c8" containerID="7d9b3cf1b6dbf0b577036f782d8c13f846201986b4f1aaba042eaed9a0fec4c8" exitCode=0 Mar 09 09:26:24 crc kubenswrapper[4861]: I0309 09:26:24.218050 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-c9be-account-create-update-jzpgr" event={"ID":"bde3ec7b-bd8f-4936-91b4-c8a7063628c8","Type":"ContainerDied","Data":"7d9b3cf1b6dbf0b577036f782d8c13f846201986b4f1aaba042eaed9a0fec4c8"} Mar 09 09:26:24 crc kubenswrapper[4861]: I0309 09:26:24.219327 4861 generic.go:334] "Generic (PLEG): container finished" podID="53c84cc0-8dc4-44c5-89d7-bb118f6b4fe2" containerID="bb043b3f327d1cfc30bfb89247439e71ac9e61ef07686e06bb68810faf01e8c7" exitCode=0 Mar 09 09:26:24 crc kubenswrapper[4861]: I0309 09:26:24.219364 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-4ee7-account-create-update-fhsvd" event={"ID":"53c84cc0-8dc4-44c5-89d7-bb118f6b4fe2","Type":"ContainerDied","Data":"bb043b3f327d1cfc30bfb89247439e71ac9e61ef07686e06bb68810faf01e8c7"} Mar 09 09:26:24 crc kubenswrapper[4861]: I0309 09:26:24.219391 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-4ee7-account-create-update-fhsvd" event={"ID":"53c84cc0-8dc4-44c5-89d7-bb118f6b4fe2","Type":"ContainerStarted","Data":"9203e8599c38bc61898c04606ca7dc84ce46b39bb3ef7bc469cace5cbf50b3e7"} Mar 09 09:26:24 crc kubenswrapper[4861]: I0309 09:26:24.222872 4861 generic.go:334] "Generic (PLEG): container finished" podID="36eb13a3-0fad-4a47-bc5f-088f0aab8e02" containerID="1cbca8a8f9f11e3541a4fffe54f11b0b1d7d9ed5f6be82ecb2870f924fb647bf" exitCode=0 Mar 
09 09:26:24 crc kubenswrapper[4861]: I0309 09:26:24.222901 4861 generic.go:334] "Generic (PLEG): container finished" podID="36eb13a3-0fad-4a47-bc5f-088f0aab8e02" containerID="dc011b0d3097c5d8fcb166b831a5c5191de0b1babfe195d3f66462d7260c158e" exitCode=2 Mar 09 09:26:24 crc kubenswrapper[4861]: I0309 09:26:24.222912 4861 generic.go:334] "Generic (PLEG): container finished" podID="36eb13a3-0fad-4a47-bc5f-088f0aab8e02" containerID="a9b2451098b200e3e8fadd7b8cde9256b31d3db6fe2e8c25fd0d292c75468f84" exitCode=0 Mar 09 09:26:24 crc kubenswrapper[4861]: I0309 09:26:24.222934 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"36eb13a3-0fad-4a47-bc5f-088f0aab8e02","Type":"ContainerDied","Data":"1cbca8a8f9f11e3541a4fffe54f11b0b1d7d9ed5f6be82ecb2870f924fb647bf"} Mar 09 09:26:24 crc kubenswrapper[4861]: I0309 09:26:24.222956 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"36eb13a3-0fad-4a47-bc5f-088f0aab8e02","Type":"ContainerDied","Data":"dc011b0d3097c5d8fcb166b831a5c5191de0b1babfe195d3f66462d7260c158e"} Mar 09 09:26:24 crc kubenswrapper[4861]: I0309 09:26:24.222971 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"36eb13a3-0fad-4a47-bc5f-088f0aab8e02","Type":"ContainerDied","Data":"a9b2451098b200e3e8fadd7b8cde9256b31d3db6fe2e8c25fd0d292c75468f84"} Mar 09 09:26:25 crc kubenswrapper[4861]: I0309 09:26:25.658918 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-4f2ds" Mar 09 09:26:25 crc kubenswrapper[4861]: I0309 09:26:25.742256 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gb447\" (UniqueName: \"kubernetes.io/projected/bf5b6714-95ce-4470-9536-10e1d281b52e-kube-api-access-gb447\") pod \"bf5b6714-95ce-4470-9536-10e1d281b52e\" (UID: \"bf5b6714-95ce-4470-9536-10e1d281b52e\") " Mar 09 09:26:25 crc kubenswrapper[4861]: I0309 09:26:25.742322 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf5b6714-95ce-4470-9536-10e1d281b52e-operator-scripts\") pod \"bf5b6714-95ce-4470-9536-10e1d281b52e\" (UID: \"bf5b6714-95ce-4470-9536-10e1d281b52e\") " Mar 09 09:26:25 crc kubenswrapper[4861]: I0309 09:26:25.744671 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf5b6714-95ce-4470-9536-10e1d281b52e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bf5b6714-95ce-4470-9536-10e1d281b52e" (UID: "bf5b6714-95ce-4470-9536-10e1d281b52e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:26:25 crc kubenswrapper[4861]: I0309 09:26:25.765637 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf5b6714-95ce-4470-9536-10e1d281b52e-kube-api-access-gb447" (OuterVolumeSpecName: "kube-api-access-gb447") pod "bf5b6714-95ce-4470-9536-10e1d281b52e" (UID: "bf5b6714-95ce-4470-9536-10e1d281b52e"). InnerVolumeSpecName "kube-api-access-gb447". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:26:25 crc kubenswrapper[4861]: I0309 09:26:25.846080 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gb447\" (UniqueName: \"kubernetes.io/projected/bf5b6714-95ce-4470-9536-10e1d281b52e-kube-api-access-gb447\") on node \"crc\" DevicePath \"\"" Mar 09 09:26:25 crc kubenswrapper[4861]: I0309 09:26:25.846119 4861 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf5b6714-95ce-4470-9536-10e1d281b52e-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:26:25 crc kubenswrapper[4861]: I0309 09:26:25.972873 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-ddfc-account-create-update-s2nh2" Mar 09 09:26:25 crc kubenswrapper[4861]: I0309 09:26:25.978796 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-c9be-account-create-update-jzpgr" Mar 09 09:26:25 crc kubenswrapper[4861]: I0309 09:26:25.982852 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-z4xzx" Mar 09 09:26:25 crc kubenswrapper[4861]: I0309 09:26:25.990707 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-4ee7-account-create-update-fhsvd" Mar 09 09:26:26 crc kubenswrapper[4861]: I0309 09:26:26.007982 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-ct722" Mar 09 09:26:26 crc kubenswrapper[4861]: I0309 09:26:26.050048 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h2fd4\" (UniqueName: \"kubernetes.io/projected/e727b52d-d582-4d07-a852-edd0a15e1ba7-kube-api-access-h2fd4\") pod \"e727b52d-d582-4d07-a852-edd0a15e1ba7\" (UID: \"e727b52d-d582-4d07-a852-edd0a15e1ba7\") " Mar 09 09:26:26 crc kubenswrapper[4861]: I0309 09:26:26.050106 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e727b52d-d582-4d07-a852-edd0a15e1ba7-operator-scripts\") pod \"e727b52d-d582-4d07-a852-edd0a15e1ba7\" (UID: \"e727b52d-d582-4d07-a852-edd0a15e1ba7\") " Mar 09 09:26:26 crc kubenswrapper[4861]: I0309 09:26:26.050162 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bde3ec7b-bd8f-4936-91b4-c8a7063628c8-operator-scripts\") pod \"bde3ec7b-bd8f-4936-91b4-c8a7063628c8\" (UID: \"bde3ec7b-bd8f-4936-91b4-c8a7063628c8\") " Mar 09 09:26:26 crc kubenswrapper[4861]: I0309 09:26:26.050249 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/19523fad-3ad4-4c4a-b329-90f8136ce34c-operator-scripts\") pod \"19523fad-3ad4-4c4a-b329-90f8136ce34c\" (UID: \"19523fad-3ad4-4c4a-b329-90f8136ce34c\") " Mar 09 09:26:26 crc kubenswrapper[4861]: I0309 09:26:26.050296 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-456wq\" (UniqueName: \"kubernetes.io/projected/19523fad-3ad4-4c4a-b329-90f8136ce34c-kube-api-access-456wq\") pod \"19523fad-3ad4-4c4a-b329-90f8136ce34c\" (UID: \"19523fad-3ad4-4c4a-b329-90f8136ce34c\") " Mar 09 09:26:26 crc kubenswrapper[4861]: I0309 09:26:26.050336 4861 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-htqcc\" (UniqueName: \"kubernetes.io/projected/bde3ec7b-bd8f-4936-91b4-c8a7063628c8-kube-api-access-htqcc\") pod \"bde3ec7b-bd8f-4936-91b4-c8a7063628c8\" (UID: \"bde3ec7b-bd8f-4936-91b4-c8a7063628c8\") " Mar 09 09:26:26 crc kubenswrapper[4861]: I0309 09:26:26.050820 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19523fad-3ad4-4c4a-b329-90f8136ce34c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "19523fad-3ad4-4c4a-b329-90f8136ce34c" (UID: "19523fad-3ad4-4c4a-b329-90f8136ce34c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:26:26 crc kubenswrapper[4861]: I0309 09:26:26.050961 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e727b52d-d582-4d07-a852-edd0a15e1ba7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e727b52d-d582-4d07-a852-edd0a15e1ba7" (UID: "e727b52d-d582-4d07-a852-edd0a15e1ba7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:26:26 crc kubenswrapper[4861]: I0309 09:26:26.051114 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bde3ec7b-bd8f-4936-91b4-c8a7063628c8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bde3ec7b-bd8f-4936-91b4-c8a7063628c8" (UID: "bde3ec7b-bd8f-4936-91b4-c8a7063628c8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:26:26 crc kubenswrapper[4861]: I0309 09:26:26.056893 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bde3ec7b-bd8f-4936-91b4-c8a7063628c8-kube-api-access-htqcc" (OuterVolumeSpecName: "kube-api-access-htqcc") pod "bde3ec7b-bd8f-4936-91b4-c8a7063628c8" (UID: "bde3ec7b-bd8f-4936-91b4-c8a7063628c8"). 
InnerVolumeSpecName "kube-api-access-htqcc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:26:26 crc kubenswrapper[4861]: I0309 09:26:26.056923 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19523fad-3ad4-4c4a-b329-90f8136ce34c-kube-api-access-456wq" (OuterVolumeSpecName: "kube-api-access-456wq") pod "19523fad-3ad4-4c4a-b329-90f8136ce34c" (UID: "19523fad-3ad4-4c4a-b329-90f8136ce34c"). InnerVolumeSpecName "kube-api-access-456wq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:26:26 crc kubenswrapper[4861]: I0309 09:26:26.057514 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e727b52d-d582-4d07-a852-edd0a15e1ba7-kube-api-access-h2fd4" (OuterVolumeSpecName: "kube-api-access-h2fd4") pod "e727b52d-d582-4d07-a852-edd0a15e1ba7" (UID: "e727b52d-d582-4d07-a852-edd0a15e1ba7"). InnerVolumeSpecName "kube-api-access-h2fd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:26:26 crc kubenswrapper[4861]: I0309 09:26:26.152004 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c9rq8\" (UniqueName: \"kubernetes.io/projected/c1271b64-efb8-425e-8718-1e28003a5722-kube-api-access-c9rq8\") pod \"c1271b64-efb8-425e-8718-1e28003a5722\" (UID: \"c1271b64-efb8-425e-8718-1e28003a5722\") " Mar 09 09:26:26 crc kubenswrapper[4861]: I0309 09:26:26.152137 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c1271b64-efb8-425e-8718-1e28003a5722-operator-scripts\") pod \"c1271b64-efb8-425e-8718-1e28003a5722\" (UID: \"c1271b64-efb8-425e-8718-1e28003a5722\") " Mar 09 09:26:26 crc kubenswrapper[4861]: I0309 09:26:26.152245 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/53c84cc0-8dc4-44c5-89d7-bb118f6b4fe2-operator-scripts\") pod \"53c84cc0-8dc4-44c5-89d7-bb118f6b4fe2\" (UID: \"53c84cc0-8dc4-44c5-89d7-bb118f6b4fe2\") " Mar 09 09:26:26 crc kubenswrapper[4861]: I0309 09:26:26.152281 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hcqpc\" (UniqueName: \"kubernetes.io/projected/53c84cc0-8dc4-44c5-89d7-bb118f6b4fe2-kube-api-access-hcqpc\") pod \"53c84cc0-8dc4-44c5-89d7-bb118f6b4fe2\" (UID: \"53c84cc0-8dc4-44c5-89d7-bb118f6b4fe2\") " Mar 09 09:26:26 crc kubenswrapper[4861]: I0309 09:26:26.152652 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1271b64-efb8-425e-8718-1e28003a5722-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c1271b64-efb8-425e-8718-1e28003a5722" (UID: "c1271b64-efb8-425e-8718-1e28003a5722"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:26:26 crc kubenswrapper[4861]: I0309 09:26:26.152714 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53c84cc0-8dc4-44c5-89d7-bb118f6b4fe2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "53c84cc0-8dc4-44c5-89d7-bb118f6b4fe2" (UID: "53c84cc0-8dc4-44c5-89d7-bb118f6b4fe2"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:26:26 crc kubenswrapper[4861]: I0309 09:26:26.153140 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-456wq\" (UniqueName: \"kubernetes.io/projected/19523fad-3ad4-4c4a-b329-90f8136ce34c-kube-api-access-456wq\") on node \"crc\" DevicePath \"\"" Mar 09 09:26:26 crc kubenswrapper[4861]: I0309 09:26:26.153164 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htqcc\" (UniqueName: \"kubernetes.io/projected/bde3ec7b-bd8f-4936-91b4-c8a7063628c8-kube-api-access-htqcc\") on node \"crc\" DevicePath \"\"" Mar 09 09:26:26 crc kubenswrapper[4861]: I0309 09:26:26.153176 4861 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c1271b64-efb8-425e-8718-1e28003a5722-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:26:26 crc kubenswrapper[4861]: I0309 09:26:26.153187 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h2fd4\" (UniqueName: \"kubernetes.io/projected/e727b52d-d582-4d07-a852-edd0a15e1ba7-kube-api-access-h2fd4\") on node \"crc\" DevicePath \"\"" Mar 09 09:26:26 crc kubenswrapper[4861]: I0309 09:26:26.153199 4861 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e727b52d-d582-4d07-a852-edd0a15e1ba7-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:26:26 crc kubenswrapper[4861]: I0309 09:26:26.153207 4861 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/53c84cc0-8dc4-44c5-89d7-bb118f6b4fe2-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:26:26 crc kubenswrapper[4861]: I0309 09:26:26.153216 4861 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bde3ec7b-bd8f-4936-91b4-c8a7063628c8-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 
09:26:26 crc kubenswrapper[4861]: I0309 09:26:26.153224 4861 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/19523fad-3ad4-4c4a-b329-90f8136ce34c-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 09 09:26:26 crc kubenswrapper[4861]: I0309 09:26:26.155020 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1271b64-efb8-425e-8718-1e28003a5722-kube-api-access-c9rq8" (OuterVolumeSpecName: "kube-api-access-c9rq8") pod "c1271b64-efb8-425e-8718-1e28003a5722" (UID: "c1271b64-efb8-425e-8718-1e28003a5722"). InnerVolumeSpecName "kube-api-access-c9rq8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:26:26 crc kubenswrapper[4861]: I0309 09:26:26.155463 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53c84cc0-8dc4-44c5-89d7-bb118f6b4fe2-kube-api-access-hcqpc" (OuterVolumeSpecName: "kube-api-access-hcqpc") pod "53c84cc0-8dc4-44c5-89d7-bb118f6b4fe2" (UID: "53c84cc0-8dc4-44c5-89d7-bb118f6b4fe2"). InnerVolumeSpecName "kube-api-access-hcqpc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:26:26 crc kubenswrapper[4861]: I0309 09:26:26.243559 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-c9be-account-create-update-jzpgr" event={"ID":"bde3ec7b-bd8f-4936-91b4-c8a7063628c8","Type":"ContainerDied","Data":"b909fa2f8bea7c5cf70d927e7d3fbc9ea8cc83ce75b0edab6fc76b0fe0cd99d1"}
Mar 09 09:26:26 crc kubenswrapper[4861]: I0309 09:26:26.243644 4861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b909fa2f8bea7c5cf70d927e7d3fbc9ea8cc83ce75b0edab6fc76b0fe0cd99d1"
Mar 09 09:26:26 crc kubenswrapper[4861]: I0309 09:26:26.243627 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-c9be-account-create-update-jzpgr"
Mar 09 09:26:26 crc kubenswrapper[4861]: I0309 09:26:26.245092 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-4ee7-account-create-update-fhsvd" event={"ID":"53c84cc0-8dc4-44c5-89d7-bb118f6b4fe2","Type":"ContainerDied","Data":"9203e8599c38bc61898c04606ca7dc84ce46b39bb3ef7bc469cace5cbf50b3e7"}
Mar 09 09:26:26 crc kubenswrapper[4861]: I0309 09:26:26.245164 4861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9203e8599c38bc61898c04606ca7dc84ce46b39bb3ef7bc469cace5cbf50b3e7"
Mar 09 09:26:26 crc kubenswrapper[4861]: I0309 09:26:26.245137 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-4ee7-account-create-update-fhsvd"
Mar 09 09:26:26 crc kubenswrapper[4861]: I0309 09:26:26.246485 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-z4xzx" event={"ID":"e727b52d-d582-4d07-a852-edd0a15e1ba7","Type":"ContainerDied","Data":"6ffec9b8b88e2da2f23eabb284310fbc393735ba9a282d09915bf41fbe93f5ed"}
Mar 09 09:26:26 crc kubenswrapper[4861]: I0309 09:26:26.246513 4861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6ffec9b8b88e2da2f23eabb284310fbc393735ba9a282d09915bf41fbe93f5ed"
Mar 09 09:26:26 crc kubenswrapper[4861]: I0309 09:26:26.246552 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-z4xzx"
Mar 09 09:26:26 crc kubenswrapper[4861]: I0309 09:26:26.251518 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-4f2ds" event={"ID":"bf5b6714-95ce-4470-9536-10e1d281b52e","Type":"ContainerDied","Data":"d6c6c3e321c22ff8ecfb2f170e6cbe1d8bf02332b96c5404d237c5700b376bd5"}
Mar 09 09:26:26 crc kubenswrapper[4861]: I0309 09:26:26.251552 4861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d6c6c3e321c22ff8ecfb2f170e6cbe1d8bf02332b96c5404d237c5700b376bd5"
Mar 09 09:26:26 crc kubenswrapper[4861]: I0309 09:26:26.251527 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-4f2ds"
Mar 09 09:26:26 crc kubenswrapper[4861]: I0309 09:26:26.253988 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-ddfc-account-create-update-s2nh2"
Mar 09 09:26:26 crc kubenswrapper[4861]: I0309 09:26:26.253971 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-ddfc-account-create-update-s2nh2" event={"ID":"19523fad-3ad4-4c4a-b329-90f8136ce34c","Type":"ContainerDied","Data":"5015cf83045548ad8e6db6d7a8b3157741b534fccaae188670aa29358cfe7ca6"}
Mar 09 09:26:26 crc kubenswrapper[4861]: I0309 09:26:26.254133 4861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5015cf83045548ad8e6db6d7a8b3157741b534fccaae188670aa29358cfe7ca6"
Mar 09 09:26:26 crc kubenswrapper[4861]: I0309 09:26:26.254993 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hcqpc\" (UniqueName: \"kubernetes.io/projected/53c84cc0-8dc4-44c5-89d7-bb118f6b4fe2-kube-api-access-hcqpc\") on node \"crc\" DevicePath \"\""
Mar 09 09:26:26 crc kubenswrapper[4861]: I0309 09:26:26.255013 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c9rq8\" (UniqueName: \"kubernetes.io/projected/c1271b64-efb8-425e-8718-1e28003a5722-kube-api-access-c9rq8\") on node \"crc\" DevicePath \"\""
Mar 09 09:26:26 crc kubenswrapper[4861]: I0309 09:26:26.255635 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-ct722" event={"ID":"c1271b64-efb8-425e-8718-1e28003a5722","Type":"ContainerDied","Data":"7b9a6c68944b06c13cedbd2b2f060267f7895dcf1b853ee6b822e3cc6a9f3dde"}
Mar 09 09:26:26 crc kubenswrapper[4861]: I0309 09:26:26.255670 4861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7b9a6c68944b06c13cedbd2b2f060267f7895dcf1b853ee6b822e3cc6a9f3dde"
Mar 09 09:26:26 crc kubenswrapper[4861]: I0309 09:26:26.255675 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-ct722"
Mar 09 09:26:26 crc kubenswrapper[4861]: E0309 09:26:26.880815 4861 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod48fbe4a1_81ab_4a46_8150_821bc8afa220.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod48fbe4a1_81ab_4a46_8150_821bc8afa220.slice/crio-3cace9b3826297e13b51f776c4a63465d7f30becfc9c4dce3918a85756fc792e\": RecentStats: unable to find data in memory cache]"
Mar 09 09:26:27 crc kubenswrapper[4861]: I0309 09:26:27.616451 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-6zv4b"]
Mar 09 09:26:27 crc kubenswrapper[4861]: E0309 09:26:27.616853 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1271b64-efb8-425e-8718-1e28003a5722" containerName="mariadb-database-create"
Mar 09 09:26:27 crc kubenswrapper[4861]: I0309 09:26:27.616870 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1271b64-efb8-425e-8718-1e28003a5722" containerName="mariadb-database-create"
Mar 09 09:26:27 crc kubenswrapper[4861]: E0309 09:26:27.616884 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19523fad-3ad4-4c4a-b329-90f8136ce34c" containerName="mariadb-account-create-update"
Mar 09 09:26:27 crc kubenswrapper[4861]: I0309 09:26:27.616890 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="19523fad-3ad4-4c4a-b329-90f8136ce34c" containerName="mariadb-account-create-update"
Mar 09 09:26:27 crc kubenswrapper[4861]: E0309 09:26:27.616903 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e727b52d-d582-4d07-a852-edd0a15e1ba7" containerName="mariadb-database-create"
Mar 09 09:26:27 crc kubenswrapper[4861]: I0309 09:26:27.616909 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="e727b52d-d582-4d07-a852-edd0a15e1ba7" containerName="mariadb-database-create"
Mar 09 09:26:27 crc kubenswrapper[4861]: E0309 09:26:27.616919 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53c84cc0-8dc4-44c5-89d7-bb118f6b4fe2" containerName="mariadb-account-create-update"
Mar 09 09:26:27 crc kubenswrapper[4861]: I0309 09:26:27.616924 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="53c84cc0-8dc4-44c5-89d7-bb118f6b4fe2" containerName="mariadb-account-create-update"
Mar 09 09:26:27 crc kubenswrapper[4861]: E0309 09:26:27.616938 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bde3ec7b-bd8f-4936-91b4-c8a7063628c8" containerName="mariadb-account-create-update"
Mar 09 09:26:27 crc kubenswrapper[4861]: I0309 09:26:27.616943 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="bde3ec7b-bd8f-4936-91b4-c8a7063628c8" containerName="mariadb-account-create-update"
Mar 09 09:26:27 crc kubenswrapper[4861]: E0309 09:26:27.616954 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf5b6714-95ce-4470-9536-10e1d281b52e" containerName="mariadb-database-create"
Mar 09 09:26:27 crc kubenswrapper[4861]: I0309 09:26:27.616960 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf5b6714-95ce-4470-9536-10e1d281b52e" containerName="mariadb-database-create"
Mar 09 09:26:27 crc kubenswrapper[4861]: I0309 09:26:27.617116 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="19523fad-3ad4-4c4a-b329-90f8136ce34c" containerName="mariadb-account-create-update"
Mar 09 09:26:27 crc kubenswrapper[4861]: I0309 09:26:27.617133 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf5b6714-95ce-4470-9536-10e1d281b52e" containerName="mariadb-database-create"
Mar 09 09:26:27 crc kubenswrapper[4861]: I0309 09:26:27.617146 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="e727b52d-d582-4d07-a852-edd0a15e1ba7" containerName="mariadb-database-create"
Mar 09 09:26:27 crc kubenswrapper[4861]: I0309 09:26:27.617156 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="bde3ec7b-bd8f-4936-91b4-c8a7063628c8" containerName="mariadb-account-create-update"
Mar 09 09:26:27 crc kubenswrapper[4861]: I0309 09:26:27.617164 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1271b64-efb8-425e-8718-1e28003a5722" containerName="mariadb-database-create"
Mar 09 09:26:27 crc kubenswrapper[4861]: I0309 09:26:27.617177 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="53c84cc0-8dc4-44c5-89d7-bb118f6b4fe2" containerName="mariadb-account-create-update"
Mar 09 09:26:27 crc kubenswrapper[4861]: I0309 09:26:27.617860 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-6zv4b"
Mar 09 09:26:27 crc kubenswrapper[4861]: I0309 09:26:27.624486 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts"
Mar 09 09:26:27 crc kubenswrapper[4861]: I0309 09:26:27.624531 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data"
Mar 09 09:26:27 crc kubenswrapper[4861]: I0309 09:26:27.624658 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-g64g6"
Mar 09 09:26:27 crc kubenswrapper[4861]: I0309 09:26:27.640558 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-6zv4b"]
Mar 09 09:26:27 crc kubenswrapper[4861]: I0309 09:26:27.779427 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4599154b-2118-461d-9999-d07931415f9c-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-6zv4b\" (UID: \"4599154b-2118-461d-9999-d07931415f9c\") " pod="openstack/nova-cell0-conductor-db-sync-6zv4b"
Mar 09 09:26:27 crc kubenswrapper[4861]: I0309 09:26:27.779498 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4599154b-2118-461d-9999-d07931415f9c-config-data\") pod \"nova-cell0-conductor-db-sync-6zv4b\" (UID: \"4599154b-2118-461d-9999-d07931415f9c\") " pod="openstack/nova-cell0-conductor-db-sync-6zv4b"
Mar 09 09:26:27 crc kubenswrapper[4861]: I0309 09:26:27.779602 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cf5jx\" (UniqueName: \"kubernetes.io/projected/4599154b-2118-461d-9999-d07931415f9c-kube-api-access-cf5jx\") pod \"nova-cell0-conductor-db-sync-6zv4b\" (UID: \"4599154b-2118-461d-9999-d07931415f9c\") " pod="openstack/nova-cell0-conductor-db-sync-6zv4b"
Mar 09 09:26:27 crc kubenswrapper[4861]: I0309 09:26:27.779700 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4599154b-2118-461d-9999-d07931415f9c-scripts\") pod \"nova-cell0-conductor-db-sync-6zv4b\" (UID: \"4599154b-2118-461d-9999-d07931415f9c\") " pod="openstack/nova-cell0-conductor-db-sync-6zv4b"
Mar 09 09:26:27 crc kubenswrapper[4861]: I0309 09:26:27.881075 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cf5jx\" (UniqueName: \"kubernetes.io/projected/4599154b-2118-461d-9999-d07931415f9c-kube-api-access-cf5jx\") pod \"nova-cell0-conductor-db-sync-6zv4b\" (UID: \"4599154b-2118-461d-9999-d07931415f9c\") " pod="openstack/nova-cell0-conductor-db-sync-6zv4b"
Mar 09 09:26:27 crc kubenswrapper[4861]: I0309 09:26:27.881155 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4599154b-2118-461d-9999-d07931415f9c-scripts\") pod \"nova-cell0-conductor-db-sync-6zv4b\" (UID: \"4599154b-2118-461d-9999-d07931415f9c\") " pod="openstack/nova-cell0-conductor-db-sync-6zv4b"
Mar 09 09:26:27 crc kubenswrapper[4861]: I0309 09:26:27.881232 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4599154b-2118-461d-9999-d07931415f9c-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-6zv4b\" (UID: \"4599154b-2118-461d-9999-d07931415f9c\") " pod="openstack/nova-cell0-conductor-db-sync-6zv4b"
Mar 09 09:26:27 crc kubenswrapper[4861]: I0309 09:26:27.881260 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4599154b-2118-461d-9999-d07931415f9c-config-data\") pod \"nova-cell0-conductor-db-sync-6zv4b\" (UID: \"4599154b-2118-461d-9999-d07931415f9c\") " pod="openstack/nova-cell0-conductor-db-sync-6zv4b"
Mar 09 09:26:27 crc kubenswrapper[4861]: I0309 09:26:27.889683 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4599154b-2118-461d-9999-d07931415f9c-scripts\") pod \"nova-cell0-conductor-db-sync-6zv4b\" (UID: \"4599154b-2118-461d-9999-d07931415f9c\") " pod="openstack/nova-cell0-conductor-db-sync-6zv4b"
Mar 09 09:26:27 crc kubenswrapper[4861]: I0309 09:26:27.889787 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4599154b-2118-461d-9999-d07931415f9c-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-6zv4b\" (UID: \"4599154b-2118-461d-9999-d07931415f9c\") " pod="openstack/nova-cell0-conductor-db-sync-6zv4b"
Mar 09 09:26:27 crc kubenswrapper[4861]: I0309 09:26:27.890885 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4599154b-2118-461d-9999-d07931415f9c-config-data\") pod \"nova-cell0-conductor-db-sync-6zv4b\" (UID: \"4599154b-2118-461d-9999-d07931415f9c\") " pod="openstack/nova-cell0-conductor-db-sync-6zv4b"
Mar 09 09:26:27 crc kubenswrapper[4861]: I0309 09:26:27.917501 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cf5jx\" (UniqueName: \"kubernetes.io/projected/4599154b-2118-461d-9999-d07931415f9c-kube-api-access-cf5jx\") pod \"nova-cell0-conductor-db-sync-6zv4b\" (UID: \"4599154b-2118-461d-9999-d07931415f9c\") " pod="openstack/nova-cell0-conductor-db-sync-6zv4b"
Mar 09 09:26:27 crc kubenswrapper[4861]: I0309 09:26:27.941279 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-6zv4b"
Mar 09 09:26:28 crc kubenswrapper[4861]: I0309 09:26:28.209854 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-6zv4b"]
Mar 09 09:26:28 crc kubenswrapper[4861]: W0309 09:26:28.220601 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4599154b_2118_461d_9999_d07931415f9c.slice/crio-8506cffe2da3cf31dfc1994ad585554c3e0566cb785d3007e1ef35e9b324a0cd WatchSource:0}: Error finding container 8506cffe2da3cf31dfc1994ad585554c3e0566cb785d3007e1ef35e9b324a0cd: Status 404 returned error can't find the container with id 8506cffe2da3cf31dfc1994ad585554c3e0566cb785d3007e1ef35e9b324a0cd
Mar 09 09:26:28 crc kubenswrapper[4861]: I0309 09:26:28.275264 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-6zv4b" event={"ID":"4599154b-2118-461d-9999-d07931415f9c","Type":"ContainerStarted","Data":"8506cffe2da3cf31dfc1994ad585554c3e0566cb785d3007e1ef35e9b324a0cd"}
Mar 09 09:26:29 crc kubenswrapper[4861]: I0309 09:26:29.628117 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 09 09:26:29 crc kubenswrapper[4861]: I0309 09:26:29.628622 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="628d8d0c-948a-4878-ac3f-d1c35befe1d0" containerName="glance-log" containerID="cri-o://66cc06ff1175b455f0e6e180f5bb9076e45beb7156969699f7fcd6f894ab0395" gracePeriod=30
Mar 09 09:26:29 crc kubenswrapper[4861]: I0309 09:26:29.628766 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="628d8d0c-948a-4878-ac3f-d1c35befe1d0" containerName="glance-httpd" containerID="cri-o://69366dc012006a67b0c8071919516dc593bdd15df6f0b37444ccb5fd57bd1e59" gracePeriod=30
Mar 09 09:26:30 crc kubenswrapper[4861]: I0309 09:26:30.301129 4861 generic.go:334] "Generic (PLEG): container finished" podID="628d8d0c-948a-4878-ac3f-d1c35befe1d0" containerID="66cc06ff1175b455f0e6e180f5bb9076e45beb7156969699f7fcd6f894ab0395" exitCode=143
Mar 09 09:26:30 crc kubenswrapper[4861]: I0309 09:26:30.301210 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"628d8d0c-948a-4878-ac3f-d1c35befe1d0","Type":"ContainerDied","Data":"66cc06ff1175b455f0e6e180f5bb9076e45beb7156969699f7fcd6f894ab0395"}
Mar 09 09:26:30 crc kubenswrapper[4861]: I0309 09:26:30.441915 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 09 09:26:30 crc kubenswrapper[4861]: I0309 09:26:30.442189 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="ace45eda-d816-4b28-9be4-88ba845234d5" containerName="glance-log" containerID="cri-o://7f31414b1c79420e6a1f936e6325a3fd7e3796d6f4c036328d8a8a3e0db5bc87" gracePeriod=30
Mar 09 09:26:30 crc kubenswrapper[4861]: I0309 09:26:30.442311 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="ace45eda-d816-4b28-9be4-88ba845234d5" containerName="glance-httpd" containerID="cri-o://2c32fdf4263718449b53d53ce307c950e72cf7461ab5c95d7dad1d0a312e2f13" gracePeriod=30
Mar 09 09:26:31 crc kubenswrapper[4861]: I0309 09:26:31.311882 4861 generic.go:334] "Generic (PLEG): container finished" podID="36eb13a3-0fad-4a47-bc5f-088f0aab8e02" containerID="378bd39844083889540a1130823a9e5095b16051c3ff3eb3a9eb38dc5f314bb9" exitCode=0
Mar 09 09:26:31 crc kubenswrapper[4861]: I0309 09:26:31.311969 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"36eb13a3-0fad-4a47-bc5f-088f0aab8e02","Type":"ContainerDied","Data":"378bd39844083889540a1130823a9e5095b16051c3ff3eb3a9eb38dc5f314bb9"}
Mar 09 09:26:31 crc kubenswrapper[4861]: I0309 09:26:31.316641 4861 generic.go:334] "Generic (PLEG): container finished" podID="ace45eda-d816-4b28-9be4-88ba845234d5" containerID="7f31414b1c79420e6a1f936e6325a3fd7e3796d6f4c036328d8a8a3e0db5bc87" exitCode=143
Mar 09 09:26:31 crc kubenswrapper[4861]: I0309 09:26:31.316719 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ace45eda-d816-4b28-9be4-88ba845234d5","Type":"ContainerDied","Data":"7f31414b1c79420e6a1f936e6325a3fd7e3796d6f4c036328d8a8a3e0db5bc87"}
Mar 09 09:26:33 crc kubenswrapper[4861]: I0309 09:26:33.345552 4861 generic.go:334] "Generic (PLEG): container finished" podID="628d8d0c-948a-4878-ac3f-d1c35befe1d0" containerID="69366dc012006a67b0c8071919516dc593bdd15df6f0b37444ccb5fd57bd1e59" exitCode=0
Mar 09 09:26:33 crc kubenswrapper[4861]: I0309 09:26:33.345588 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"628d8d0c-948a-4878-ac3f-d1c35befe1d0","Type":"ContainerDied","Data":"69366dc012006a67b0c8071919516dc593bdd15df6f0b37444ccb5fd57bd1e59"}
Mar 09 09:26:34 crc kubenswrapper[4861]: I0309 09:26:34.363250 4861 generic.go:334] "Generic (PLEG): container finished" podID="ace45eda-d816-4b28-9be4-88ba845234d5" containerID="2c32fdf4263718449b53d53ce307c950e72cf7461ab5c95d7dad1d0a312e2f13" exitCode=0
Mar 09 09:26:34 crc kubenswrapper[4861]: I0309 09:26:34.363482 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ace45eda-d816-4b28-9be4-88ba845234d5","Type":"ContainerDied","Data":"2c32fdf4263718449b53d53ce307c950e72cf7461ab5c95d7dad1d0a312e2f13"}
Mar 09 09:26:36 crc kubenswrapper[4861]: I0309 09:26:36.249930 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 09 09:26:36 crc kubenswrapper[4861]: I0309 09:26:36.375143 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36eb13a3-0fad-4a47-bc5f-088f0aab8e02-combined-ca-bundle\") pod \"36eb13a3-0fad-4a47-bc5f-088f0aab8e02\" (UID: \"36eb13a3-0fad-4a47-bc5f-088f0aab8e02\") "
Mar 09 09:26:36 crc kubenswrapper[4861]: I0309 09:26:36.375201 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/36eb13a3-0fad-4a47-bc5f-088f0aab8e02-log-httpd\") pod \"36eb13a3-0fad-4a47-bc5f-088f0aab8e02\" (UID: \"36eb13a3-0fad-4a47-bc5f-088f0aab8e02\") "
Mar 09 09:26:36 crc kubenswrapper[4861]: I0309 09:26:36.375253 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/36eb13a3-0fad-4a47-bc5f-088f0aab8e02-scripts\") pod \"36eb13a3-0fad-4a47-bc5f-088f0aab8e02\" (UID: \"36eb13a3-0fad-4a47-bc5f-088f0aab8e02\") "
Mar 09 09:26:36 crc kubenswrapper[4861]: I0309 09:26:36.375352 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/36eb13a3-0fad-4a47-bc5f-088f0aab8e02-run-httpd\") pod \"36eb13a3-0fad-4a47-bc5f-088f0aab8e02\" (UID: \"36eb13a3-0fad-4a47-bc5f-088f0aab8e02\") "
Mar 09 09:26:36 crc kubenswrapper[4861]: I0309 09:26:36.375471 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/36eb13a3-0fad-4a47-bc5f-088f0aab8e02-sg-core-conf-yaml\") pod \"36eb13a3-0fad-4a47-bc5f-088f0aab8e02\" (UID: \"36eb13a3-0fad-4a47-bc5f-088f0aab8e02\") "
Mar 09 09:26:36 crc kubenswrapper[4861]: I0309 09:26:36.375534 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bq8bg\" (UniqueName: \"kubernetes.io/projected/36eb13a3-0fad-4a47-bc5f-088f0aab8e02-kube-api-access-bq8bg\") pod \"36eb13a3-0fad-4a47-bc5f-088f0aab8e02\" (UID: \"36eb13a3-0fad-4a47-bc5f-088f0aab8e02\") "
Mar 09 09:26:36 crc kubenswrapper[4861]: I0309 09:26:36.375582 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36eb13a3-0fad-4a47-bc5f-088f0aab8e02-config-data\") pod \"36eb13a3-0fad-4a47-bc5f-088f0aab8e02\" (UID: \"36eb13a3-0fad-4a47-bc5f-088f0aab8e02\") "
Mar 09 09:26:36 crc kubenswrapper[4861]: I0309 09:26:36.377336 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36eb13a3-0fad-4a47-bc5f-088f0aab8e02-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "36eb13a3-0fad-4a47-bc5f-088f0aab8e02" (UID: "36eb13a3-0fad-4a47-bc5f-088f0aab8e02"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 09:26:36 crc kubenswrapper[4861]: I0309 09:26:36.382477 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36eb13a3-0fad-4a47-bc5f-088f0aab8e02-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "36eb13a3-0fad-4a47-bc5f-088f0aab8e02" (UID: "36eb13a3-0fad-4a47-bc5f-088f0aab8e02"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 09:26:36 crc kubenswrapper[4861]: I0309 09:26:36.390887 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36eb13a3-0fad-4a47-bc5f-088f0aab8e02-scripts" (OuterVolumeSpecName: "scripts") pod "36eb13a3-0fad-4a47-bc5f-088f0aab8e02" (UID: "36eb13a3-0fad-4a47-bc5f-088f0aab8e02"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:26:36 crc kubenswrapper[4861]: I0309 09:26:36.407968 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 09 09:26:36 crc kubenswrapper[4861]: I0309 09:26:36.408043 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"36eb13a3-0fad-4a47-bc5f-088f0aab8e02","Type":"ContainerDied","Data":"2cd294238a3031870557707ce52c0138a9359046fbe330c529a13590c8e6c1a0"}
Mar 09 09:26:36 crc kubenswrapper[4861]: I0309 09:26:36.408512 4861 scope.go:117] "RemoveContainer" containerID="1cbca8a8f9f11e3541a4fffe54f11b0b1d7d9ed5f6be82ecb2870f924fb647bf"
Mar 09 09:26:36 crc kubenswrapper[4861]: I0309 09:26:36.409111 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36eb13a3-0fad-4a47-bc5f-088f0aab8e02-kube-api-access-bq8bg" (OuterVolumeSpecName: "kube-api-access-bq8bg") pod "36eb13a3-0fad-4a47-bc5f-088f0aab8e02" (UID: "36eb13a3-0fad-4a47-bc5f-088f0aab8e02"). InnerVolumeSpecName "kube-api-access-bq8bg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:26:36 crc kubenswrapper[4861]: I0309 09:26:36.426797 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36eb13a3-0fad-4a47-bc5f-088f0aab8e02-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "36eb13a3-0fad-4a47-bc5f-088f0aab8e02" (UID: "36eb13a3-0fad-4a47-bc5f-088f0aab8e02"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:26:36 crc kubenswrapper[4861]: I0309 09:26:36.427282 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-6zv4b" event={"ID":"4599154b-2118-461d-9999-d07931415f9c","Type":"ContainerStarted","Data":"b6a18808b1a1ab9736d1bfd5c639a8629172104f526890dce6521c53541fa906"}
Mar 09 09:26:36 crc kubenswrapper[4861]: I0309 09:26:36.454648 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-6zv4b" podStartSLOduration=1.626696712 podStartE2EDuration="9.454629997s" podCreationTimestamp="2026-03-09 09:26:27 +0000 UTC" firstStartedPulling="2026-03-09 09:26:28.224728885 +0000 UTC m=+1231.309768286" lastFinishedPulling="2026-03-09 09:26:36.05266217 +0000 UTC m=+1239.137701571" observedRunningTime="2026-03-09 09:26:36.447538913 +0000 UTC m=+1239.532578314" watchObservedRunningTime="2026-03-09 09:26:36.454629997 +0000 UTC m=+1239.539669398"
Mar 09 09:26:36 crc kubenswrapper[4861]: I0309 09:26:36.463626 4861 scope.go:117] "RemoveContainer" containerID="dc011b0d3097c5d8fcb166b831a5c5191de0b1babfe195d3f66462d7260c158e"
Mar 09 09:26:36 crc kubenswrapper[4861]: I0309 09:26:36.477972 4861 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/36eb13a3-0fad-4a47-bc5f-088f0aab8e02-run-httpd\") on node \"crc\" DevicePath \"\""
Mar 09 09:26:36 crc kubenswrapper[4861]: I0309 09:26:36.478015 4861 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/36eb13a3-0fad-4a47-bc5f-088f0aab8e02-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Mar 09 09:26:36 crc kubenswrapper[4861]: I0309 09:26:36.478027 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bq8bg\" (UniqueName: \"kubernetes.io/projected/36eb13a3-0fad-4a47-bc5f-088f0aab8e02-kube-api-access-bq8bg\") on node \"crc\" DevicePath \"\""
Mar 09 09:26:36 crc kubenswrapper[4861]: I0309 09:26:36.478038 4861 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/36eb13a3-0fad-4a47-bc5f-088f0aab8e02-log-httpd\") on node \"crc\" DevicePath \"\""
Mar 09 09:26:36 crc kubenswrapper[4861]: I0309 09:26:36.478048 4861 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/36eb13a3-0fad-4a47-bc5f-088f0aab8e02-scripts\") on node \"crc\" DevicePath \"\""
Mar 09 09:26:36 crc kubenswrapper[4861]: I0309 09:26:36.489854 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 09 09:26:36 crc kubenswrapper[4861]: I0309 09:26:36.493989 4861 scope.go:117] "RemoveContainer" containerID="a9b2451098b200e3e8fadd7b8cde9256b31d3db6fe2e8c25fd0d292c75468f84"
Mar 09 09:26:36 crc kubenswrapper[4861]: I0309 09:26:36.495054 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 09 09:26:36 crc kubenswrapper[4861]: I0309 09:26:36.501315 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36eb13a3-0fad-4a47-bc5f-088f0aab8e02-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "36eb13a3-0fad-4a47-bc5f-088f0aab8e02" (UID: "36eb13a3-0fad-4a47-bc5f-088f0aab8e02"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:26:36 crc kubenswrapper[4861]: I0309 09:26:36.524014 4861 scope.go:117] "RemoveContainer" containerID="378bd39844083889540a1130823a9e5095b16051c3ff3eb3a9eb38dc5f314bb9"
Mar 09 09:26:36 crc kubenswrapper[4861]: I0309 09:26:36.581239 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/628d8d0c-948a-4878-ac3f-d1c35befe1d0-scripts\") pod \"628d8d0c-948a-4878-ac3f-d1c35befe1d0\" (UID: \"628d8d0c-948a-4878-ac3f-d1c35befe1d0\") "
Mar 09 09:26:36 crc kubenswrapper[4861]: I0309 09:26:36.581487 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/628d8d0c-948a-4878-ac3f-d1c35befe1d0-combined-ca-bundle\") pod \"628d8d0c-948a-4878-ac3f-d1c35befe1d0\" (UID: \"628d8d0c-948a-4878-ac3f-d1c35befe1d0\") "
Mar 09 09:26:36 crc kubenswrapper[4861]: I0309 09:26:36.581573 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/628d8d0c-948a-4878-ac3f-d1c35befe1d0-logs\") pod \"628d8d0c-948a-4878-ac3f-d1c35befe1d0\" (UID: \"628d8d0c-948a-4878-ac3f-d1c35befe1d0\") "
Mar 09 09:26:36 crc kubenswrapper[4861]: I0309 09:26:36.581691 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/628d8d0c-948a-4878-ac3f-d1c35befe1d0-public-tls-certs\") pod \"628d8d0c-948a-4878-ac3f-d1c35befe1d0\" (UID: \"628d8d0c-948a-4878-ac3f-d1c35befe1d0\") "
Mar 09 09:26:36 crc kubenswrapper[4861]: I0309 09:26:36.581743 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/628d8d0c-948a-4878-ac3f-d1c35befe1d0-httpd-run\") pod \"628d8d0c-948a-4878-ac3f-d1c35befe1d0\" (UID: \"628d8d0c-948a-4878-ac3f-d1c35befe1d0\") "
Mar 09 09:26:36 crc kubenswrapper[4861]: I0309 09:26:36.581793 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"628d8d0c-948a-4878-ac3f-d1c35befe1d0\" (UID: \"628d8d0c-948a-4878-ac3f-d1c35befe1d0\") "
Mar 09 09:26:36 crc kubenswrapper[4861]: I0309 09:26:36.581875 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/628d8d0c-948a-4878-ac3f-d1c35befe1d0-config-data\") pod \"628d8d0c-948a-4878-ac3f-d1c35befe1d0\" (UID: \"628d8d0c-948a-4878-ac3f-d1c35befe1d0\") "
Mar 09 09:26:36 crc kubenswrapper[4861]: I0309 09:26:36.581930 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hv4jh\" (UniqueName: \"kubernetes.io/projected/628d8d0c-948a-4878-ac3f-d1c35befe1d0-kube-api-access-hv4jh\") pod \"628d8d0c-948a-4878-ac3f-d1c35befe1d0\" (UID: \"628d8d0c-948a-4878-ac3f-d1c35befe1d0\") "
Mar 09 09:26:36 crc kubenswrapper[4861]: I0309 09:26:36.582967 4861 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36eb13a3-0fad-4a47-bc5f-088f0aab8e02-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 09 09:26:36 crc kubenswrapper[4861]: I0309 09:26:36.583180 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/628d8d0c-948a-4878-ac3f-d1c35befe1d0-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "628d8d0c-948a-4878-ac3f-d1c35befe1d0" (UID: "628d8d0c-948a-4878-ac3f-d1c35befe1d0"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 09:26:36 crc kubenswrapper[4861]: I0309 09:26:36.583556 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/628d8d0c-948a-4878-ac3f-d1c35befe1d0-logs" (OuterVolumeSpecName: "logs") pod "628d8d0c-948a-4878-ac3f-d1c35befe1d0" (UID: "628d8d0c-948a-4878-ac3f-d1c35befe1d0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 09:26:36 crc kubenswrapper[4861]: I0309 09:26:36.584843 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36eb13a3-0fad-4a47-bc5f-088f0aab8e02-config-data" (OuterVolumeSpecName: "config-data") pod "36eb13a3-0fad-4a47-bc5f-088f0aab8e02" (UID: "36eb13a3-0fad-4a47-bc5f-088f0aab8e02"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:26:36 crc kubenswrapper[4861]: I0309 09:26:36.588043 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/628d8d0c-948a-4878-ac3f-d1c35befe1d0-kube-api-access-hv4jh" (OuterVolumeSpecName: "kube-api-access-hv4jh") pod "628d8d0c-948a-4878-ac3f-d1c35befe1d0" (UID: "628d8d0c-948a-4878-ac3f-d1c35befe1d0"). InnerVolumeSpecName "kube-api-access-hv4jh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:26:36 crc kubenswrapper[4861]: I0309 09:26:36.591540 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/628d8d0c-948a-4878-ac3f-d1c35befe1d0-scripts" (OuterVolumeSpecName: "scripts") pod "628d8d0c-948a-4878-ac3f-d1c35befe1d0" (UID: "628d8d0c-948a-4878-ac3f-d1c35befe1d0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:26:36 crc kubenswrapper[4861]: I0309 09:26:36.600318 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance") pod "628d8d0c-948a-4878-ac3f-d1c35befe1d0" (UID: "628d8d0c-948a-4878-ac3f-d1c35befe1d0"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Mar 09 09:26:36 crc kubenswrapper[4861]: I0309 09:26:36.667532 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/628d8d0c-948a-4878-ac3f-d1c35befe1d0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "628d8d0c-948a-4878-ac3f-d1c35befe1d0" (UID: "628d8d0c-948a-4878-ac3f-d1c35befe1d0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:26:36 crc kubenswrapper[4861]: I0309 09:26:36.683490 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/628d8d0c-948a-4878-ac3f-d1c35befe1d0-config-data" (OuterVolumeSpecName: "config-data") pod "628d8d0c-948a-4878-ac3f-d1c35befe1d0" (UID: "628d8d0c-948a-4878-ac3f-d1c35befe1d0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:26:36 crc kubenswrapper[4861]: I0309 09:26:36.684088 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ace45eda-d816-4b28-9be4-88ba845234d5-config-data\") pod \"ace45eda-d816-4b28-9be4-88ba845234d5\" (UID: \"ace45eda-d816-4b28-9be4-88ba845234d5\") "
Mar 09 09:26:36 crc kubenswrapper[4861]: I0309 09:26:36.684122 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ace45eda-d816-4b28-9be4-88ba845234d5-logs\") pod \"ace45eda-d816-4b28-9be4-88ba845234d5\" (UID: \"ace45eda-d816-4b28-9be4-88ba845234d5\") "
Mar 09 09:26:36 crc kubenswrapper[4861]: I0309 09:26:36.684144 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ace45eda-d816-4b28-9be4-88ba845234d5\" (UID: \"ace45eda-d816-4b28-9be4-88ba845234d5\") "
Mar 09 09:26:36 crc kubenswrapper[4861]: I0309 09:26:36.684206 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ace45eda-d816-4b28-9be4-88ba845234d5-httpd-run\") pod \"ace45eda-d816-4b28-9be4-88ba845234d5\" (UID: \"ace45eda-d816-4b28-9be4-88ba845234d5\") "
Mar 09 09:26:36 crc kubenswrapper[4861]: I0309 09:26:36.684234 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ace45eda-d816-4b28-9be4-88ba845234d5-combined-ca-bundle\") pod \"ace45eda-d816-4b28-9be4-88ba845234d5\" (UID: \"ace45eda-d816-4b28-9be4-88ba845234d5\") "
Mar 09 09:26:36 crc kubenswrapper[4861]: I0309 09:26:36.684282 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName:
\"kubernetes.io/secret/ace45eda-d816-4b28-9be4-88ba845234d5-internal-tls-certs\") pod \"ace45eda-d816-4b28-9be4-88ba845234d5\" (UID: \"ace45eda-d816-4b28-9be4-88ba845234d5\") " Mar 09 09:26:36 crc kubenswrapper[4861]: I0309 09:26:36.684324 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jt7ng\" (UniqueName: \"kubernetes.io/projected/ace45eda-d816-4b28-9be4-88ba845234d5-kube-api-access-jt7ng\") pod \"ace45eda-d816-4b28-9be4-88ba845234d5\" (UID: \"ace45eda-d816-4b28-9be4-88ba845234d5\") " Mar 09 09:26:36 crc kubenswrapper[4861]: I0309 09:26:36.684343 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ace45eda-d816-4b28-9be4-88ba845234d5-scripts\") pod \"ace45eda-d816-4b28-9be4-88ba845234d5\" (UID: \"ace45eda-d816-4b28-9be4-88ba845234d5\") " Mar 09 09:26:36 crc kubenswrapper[4861]: I0309 09:26:36.684699 4861 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/628d8d0c-948a-4878-ac3f-d1c35befe1d0-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:26:36 crc kubenswrapper[4861]: I0309 09:26:36.684714 4861 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/628d8d0c-948a-4878-ac3f-d1c35befe1d0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 09:26:36 crc kubenswrapper[4861]: I0309 09:26:36.684724 4861 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/628d8d0c-948a-4878-ac3f-d1c35befe1d0-logs\") on node \"crc\" DevicePath \"\"" Mar 09 09:26:36 crc kubenswrapper[4861]: I0309 09:26:36.684732 4861 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/628d8d0c-948a-4878-ac3f-d1c35befe1d0-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 09 09:26:36 crc kubenswrapper[4861]: I0309 09:26:36.684751 4861 
reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Mar 09 09:26:36 crc kubenswrapper[4861]: I0309 09:26:36.684760 4861 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36eb13a3-0fad-4a47-bc5f-088f0aab8e02-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 09:26:36 crc kubenswrapper[4861]: I0309 09:26:36.684767 4861 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/628d8d0c-948a-4878-ac3f-d1c35befe1d0-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 09:26:36 crc kubenswrapper[4861]: I0309 09:26:36.684776 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hv4jh\" (UniqueName: \"kubernetes.io/projected/628d8d0c-948a-4878-ac3f-d1c35befe1d0-kube-api-access-hv4jh\") on node \"crc\" DevicePath \"\"" Mar 09 09:26:36 crc kubenswrapper[4861]: I0309 09:26:36.688026 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ace45eda-d816-4b28-9be4-88ba845234d5-logs" (OuterVolumeSpecName: "logs") pod "ace45eda-d816-4b28-9be4-88ba845234d5" (UID: "ace45eda-d816-4b28-9be4-88ba845234d5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:26:36 crc kubenswrapper[4861]: I0309 09:26:36.688266 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ace45eda-d816-4b28-9be4-88ba845234d5-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "ace45eda-d816-4b28-9be4-88ba845234d5" (UID: "ace45eda-d816-4b28-9be4-88ba845234d5"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:26:36 crc kubenswrapper[4861]: I0309 09:26:36.692337 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ace45eda-d816-4b28-9be4-88ba845234d5-scripts" (OuterVolumeSpecName: "scripts") pod "ace45eda-d816-4b28-9be4-88ba845234d5" (UID: "ace45eda-d816-4b28-9be4-88ba845234d5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:26:36 crc kubenswrapper[4861]: I0309 09:26:36.693031 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "glance") pod "ace45eda-d816-4b28-9be4-88ba845234d5" (UID: "ace45eda-d816-4b28-9be4-88ba845234d5"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 09 09:26:36 crc kubenswrapper[4861]: I0309 09:26:36.695608 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ace45eda-d816-4b28-9be4-88ba845234d5-kube-api-access-jt7ng" (OuterVolumeSpecName: "kube-api-access-jt7ng") pod "ace45eda-d816-4b28-9be4-88ba845234d5" (UID: "ace45eda-d816-4b28-9be4-88ba845234d5"). InnerVolumeSpecName "kube-api-access-jt7ng". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:26:36 crc kubenswrapper[4861]: I0309 09:26:36.705972 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/628d8d0c-948a-4878-ac3f-d1c35befe1d0-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "628d8d0c-948a-4878-ac3f-d1c35befe1d0" (UID: "628d8d0c-948a-4878-ac3f-d1c35befe1d0"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:26:36 crc kubenswrapper[4861]: I0309 09:26:36.729274 4861 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Mar 09 09:26:36 crc kubenswrapper[4861]: I0309 09:26:36.749045 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ace45eda-d816-4b28-9be4-88ba845234d5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ace45eda-d816-4b28-9be4-88ba845234d5" (UID: "ace45eda-d816-4b28-9be4-88ba845234d5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:26:36 crc kubenswrapper[4861]: I0309 09:26:36.790582 4861 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ace45eda-d816-4b28-9be4-88ba845234d5-logs\") on node \"crc\" DevicePath \"\"" Mar 09 09:26:36 crc kubenswrapper[4861]: I0309 09:26:36.790623 4861 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Mar 09 09:26:36 crc kubenswrapper[4861]: I0309 09:26:36.790633 4861 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ace45eda-d816-4b28-9be4-88ba845234d5-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 09 09:26:36 crc kubenswrapper[4861]: I0309 09:26:36.790641 4861 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ace45eda-d816-4b28-9be4-88ba845234d5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 09:26:36 crc kubenswrapper[4861]: I0309 09:26:36.790651 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jt7ng\" (UniqueName: 
\"kubernetes.io/projected/ace45eda-d816-4b28-9be4-88ba845234d5-kube-api-access-jt7ng\") on node \"crc\" DevicePath \"\"" Mar 09 09:26:36 crc kubenswrapper[4861]: I0309 09:26:36.790660 4861 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ace45eda-d816-4b28-9be4-88ba845234d5-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:26:36 crc kubenswrapper[4861]: I0309 09:26:36.790669 4861 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/628d8d0c-948a-4878-ac3f-d1c35befe1d0-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 09:26:36 crc kubenswrapper[4861]: I0309 09:26:36.790677 4861 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Mar 09 09:26:36 crc kubenswrapper[4861]: I0309 09:26:36.817538 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 09 09:26:36 crc kubenswrapper[4861]: I0309 09:26:36.840017 4861 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Mar 09 09:26:36 crc kubenswrapper[4861]: I0309 09:26:36.851573 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ace45eda-d816-4b28-9be4-88ba845234d5-config-data" (OuterVolumeSpecName: "config-data") pod "ace45eda-d816-4b28-9be4-88ba845234d5" (UID: "ace45eda-d816-4b28-9be4-88ba845234d5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:26:36 crc kubenswrapper[4861]: I0309 09:26:36.865417 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 09 09:26:36 crc kubenswrapper[4861]: I0309 09:26:36.907718 4861 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ace45eda-d816-4b28-9be4-88ba845234d5-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 09:26:36 crc kubenswrapper[4861]: I0309 09:26:36.907749 4861 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Mar 09 09:26:36 crc kubenswrapper[4861]: I0309 09:26:36.942462 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 09 09:26:36 crc kubenswrapper[4861]: E0309 09:26:36.945451 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36eb13a3-0fad-4a47-bc5f-088f0aab8e02" containerName="ceilometer-central-agent" Mar 09 09:26:36 crc kubenswrapper[4861]: I0309 09:26:36.946614 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="36eb13a3-0fad-4a47-bc5f-088f0aab8e02" containerName="ceilometer-central-agent" Mar 09 09:26:36 crc kubenswrapper[4861]: E0309 09:26:36.946754 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="628d8d0c-948a-4878-ac3f-d1c35befe1d0" containerName="glance-httpd" Mar 09 09:26:36 crc kubenswrapper[4861]: I0309 09:26:36.947029 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="628d8d0c-948a-4878-ac3f-d1c35befe1d0" containerName="glance-httpd" Mar 09 09:26:36 crc kubenswrapper[4861]: E0309 09:26:36.947148 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36eb13a3-0fad-4a47-bc5f-088f0aab8e02" containerName="sg-core" Mar 09 09:26:36 crc kubenswrapper[4861]: I0309 09:26:36.947237 4861 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="36eb13a3-0fad-4a47-bc5f-088f0aab8e02" containerName="sg-core" Mar 09 09:26:36 crc kubenswrapper[4861]: E0309 09:26:36.947648 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="628d8d0c-948a-4878-ac3f-d1c35befe1d0" containerName="glance-log" Mar 09 09:26:36 crc kubenswrapper[4861]: I0309 09:26:36.948444 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="628d8d0c-948a-4878-ac3f-d1c35befe1d0" containerName="glance-log" Mar 09 09:26:36 crc kubenswrapper[4861]: E0309 09:26:36.948659 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36eb13a3-0fad-4a47-bc5f-088f0aab8e02" containerName="ceilometer-notification-agent" Mar 09 09:26:36 crc kubenswrapper[4861]: I0309 09:26:36.948775 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="36eb13a3-0fad-4a47-bc5f-088f0aab8e02" containerName="ceilometer-notification-agent" Mar 09 09:26:36 crc kubenswrapper[4861]: E0309 09:26:36.948877 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ace45eda-d816-4b28-9be4-88ba845234d5" containerName="glance-httpd" Mar 09 09:26:36 crc kubenswrapper[4861]: I0309 09:26:36.948957 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="ace45eda-d816-4b28-9be4-88ba845234d5" containerName="glance-httpd" Mar 09 09:26:36 crc kubenswrapper[4861]: E0309 09:26:36.949080 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36eb13a3-0fad-4a47-bc5f-088f0aab8e02" containerName="proxy-httpd" Mar 09 09:26:36 crc kubenswrapper[4861]: I0309 09:26:36.949159 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="36eb13a3-0fad-4a47-bc5f-088f0aab8e02" containerName="proxy-httpd" Mar 09 09:26:36 crc kubenswrapper[4861]: E0309 09:26:36.949358 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ace45eda-d816-4b28-9be4-88ba845234d5" containerName="glance-log" Mar 09 09:26:36 crc kubenswrapper[4861]: I0309 09:26:36.949497 4861 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="ace45eda-d816-4b28-9be4-88ba845234d5" containerName="glance-log" Mar 09 09:26:36 crc kubenswrapper[4861]: I0309 09:26:36.951514 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="36eb13a3-0fad-4a47-bc5f-088f0aab8e02" containerName="proxy-httpd" Mar 09 09:26:36 crc kubenswrapper[4861]: I0309 09:26:36.952904 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="36eb13a3-0fad-4a47-bc5f-088f0aab8e02" containerName="ceilometer-notification-agent" Mar 09 09:26:36 crc kubenswrapper[4861]: I0309 09:26:36.953134 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="628d8d0c-948a-4878-ac3f-d1c35befe1d0" containerName="glance-log" Mar 09 09:26:36 crc kubenswrapper[4861]: I0309 09:26:36.953220 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="36eb13a3-0fad-4a47-bc5f-088f0aab8e02" containerName="ceilometer-central-agent" Mar 09 09:26:36 crc kubenswrapper[4861]: I0309 09:26:36.953350 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="36eb13a3-0fad-4a47-bc5f-088f0aab8e02" containerName="sg-core" Mar 09 09:26:36 crc kubenswrapper[4861]: I0309 09:26:36.953519 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="628d8d0c-948a-4878-ac3f-d1c35befe1d0" containerName="glance-httpd" Mar 09 09:26:36 crc kubenswrapper[4861]: I0309 09:26:36.953766 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="ace45eda-d816-4b28-9be4-88ba845234d5" containerName="glance-log" Mar 09 09:26:36 crc kubenswrapper[4861]: I0309 09:26:36.953897 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="ace45eda-d816-4b28-9be4-88ba845234d5" containerName="glance-httpd" Mar 09 09:26:36 crc kubenswrapper[4861]: I0309 09:26:36.970483 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 09 09:26:36 crc kubenswrapper[4861]: I0309 09:26:36.971467 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ace45eda-d816-4b28-9be4-88ba845234d5-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "ace45eda-d816-4b28-9be4-88ba845234d5" (UID: "ace45eda-d816-4b28-9be4-88ba845234d5"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:26:37 crc kubenswrapper[4861]: I0309 09:26:37.003495 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 09 09:26:37 crc kubenswrapper[4861]: I0309 09:26:37.004605 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 09 09:26:37 crc kubenswrapper[4861]: I0309 09:26:37.005846 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 09 09:26:37 crc kubenswrapper[4861]: I0309 09:26:37.013208 4861 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ace45eda-d816-4b28-9be4-88ba845234d5-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 09:26:37 crc kubenswrapper[4861]: I0309 09:26:37.116069 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2943ec09-c4ed-4275-8528-9288665739f9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2943ec09-c4ed-4275-8528-9288665739f9\") " pod="openstack/ceilometer-0" Mar 09 09:26:37 crc kubenswrapper[4861]: I0309 09:26:37.116762 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2943ec09-c4ed-4275-8528-9288665739f9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2943ec09-c4ed-4275-8528-9288665739f9\") " 
pod="openstack/ceilometer-0" Mar 09 09:26:37 crc kubenswrapper[4861]: I0309 09:26:37.117043 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2943ec09-c4ed-4275-8528-9288665739f9-scripts\") pod \"ceilometer-0\" (UID: \"2943ec09-c4ed-4275-8528-9288665739f9\") " pod="openstack/ceilometer-0" Mar 09 09:26:37 crc kubenswrapper[4861]: I0309 09:26:37.117173 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2943ec09-c4ed-4275-8528-9288665739f9-config-data\") pod \"ceilometer-0\" (UID: \"2943ec09-c4ed-4275-8528-9288665739f9\") " pod="openstack/ceilometer-0" Mar 09 09:26:37 crc kubenswrapper[4861]: I0309 09:26:37.117440 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjgrp\" (UniqueName: \"kubernetes.io/projected/2943ec09-c4ed-4275-8528-9288665739f9-kube-api-access-qjgrp\") pod \"ceilometer-0\" (UID: \"2943ec09-c4ed-4275-8528-9288665739f9\") " pod="openstack/ceilometer-0" Mar 09 09:26:37 crc kubenswrapper[4861]: I0309 09:26:37.117661 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2943ec09-c4ed-4275-8528-9288665739f9-log-httpd\") pod \"ceilometer-0\" (UID: \"2943ec09-c4ed-4275-8528-9288665739f9\") " pod="openstack/ceilometer-0" Mar 09 09:26:37 crc kubenswrapper[4861]: I0309 09:26:37.117855 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2943ec09-c4ed-4275-8528-9288665739f9-run-httpd\") pod \"ceilometer-0\" (UID: \"2943ec09-c4ed-4275-8528-9288665739f9\") " pod="openstack/ceilometer-0" Mar 09 09:26:37 crc kubenswrapper[4861]: I0309 09:26:37.219074 4861 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2943ec09-c4ed-4275-8528-9288665739f9-config-data\") pod \"ceilometer-0\" (UID: \"2943ec09-c4ed-4275-8528-9288665739f9\") " pod="openstack/ceilometer-0" Mar 09 09:26:37 crc kubenswrapper[4861]: I0309 09:26:37.219134 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjgrp\" (UniqueName: \"kubernetes.io/projected/2943ec09-c4ed-4275-8528-9288665739f9-kube-api-access-qjgrp\") pod \"ceilometer-0\" (UID: \"2943ec09-c4ed-4275-8528-9288665739f9\") " pod="openstack/ceilometer-0" Mar 09 09:26:37 crc kubenswrapper[4861]: I0309 09:26:37.219171 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2943ec09-c4ed-4275-8528-9288665739f9-log-httpd\") pod \"ceilometer-0\" (UID: \"2943ec09-c4ed-4275-8528-9288665739f9\") " pod="openstack/ceilometer-0" Mar 09 09:26:37 crc kubenswrapper[4861]: I0309 09:26:37.219219 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2943ec09-c4ed-4275-8528-9288665739f9-run-httpd\") pod \"ceilometer-0\" (UID: \"2943ec09-c4ed-4275-8528-9288665739f9\") " pod="openstack/ceilometer-0" Mar 09 09:26:37 crc kubenswrapper[4861]: I0309 09:26:37.219256 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2943ec09-c4ed-4275-8528-9288665739f9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2943ec09-c4ed-4275-8528-9288665739f9\") " pod="openstack/ceilometer-0" Mar 09 09:26:37 crc kubenswrapper[4861]: I0309 09:26:37.219301 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2943ec09-c4ed-4275-8528-9288665739f9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"2943ec09-c4ed-4275-8528-9288665739f9\") " pod="openstack/ceilometer-0" Mar 09 09:26:37 crc kubenswrapper[4861]: I0309 09:26:37.219335 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2943ec09-c4ed-4275-8528-9288665739f9-scripts\") pod \"ceilometer-0\" (UID: \"2943ec09-c4ed-4275-8528-9288665739f9\") " pod="openstack/ceilometer-0" Mar 09 09:26:37 crc kubenswrapper[4861]: I0309 09:26:37.220144 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2943ec09-c4ed-4275-8528-9288665739f9-run-httpd\") pod \"ceilometer-0\" (UID: \"2943ec09-c4ed-4275-8528-9288665739f9\") " pod="openstack/ceilometer-0" Mar 09 09:26:37 crc kubenswrapper[4861]: I0309 09:26:37.220457 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2943ec09-c4ed-4275-8528-9288665739f9-log-httpd\") pod \"ceilometer-0\" (UID: \"2943ec09-c4ed-4275-8528-9288665739f9\") " pod="openstack/ceilometer-0" Mar 09 09:26:37 crc kubenswrapper[4861]: I0309 09:26:37.223338 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2943ec09-c4ed-4275-8528-9288665739f9-scripts\") pod \"ceilometer-0\" (UID: \"2943ec09-c4ed-4275-8528-9288665739f9\") " pod="openstack/ceilometer-0" Mar 09 09:26:37 crc kubenswrapper[4861]: I0309 09:26:37.224458 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2943ec09-c4ed-4275-8528-9288665739f9-config-data\") pod \"ceilometer-0\" (UID: \"2943ec09-c4ed-4275-8528-9288665739f9\") " pod="openstack/ceilometer-0" Mar 09 09:26:37 crc kubenswrapper[4861]: I0309 09:26:37.224723 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/2943ec09-c4ed-4275-8528-9288665739f9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2943ec09-c4ed-4275-8528-9288665739f9\") " pod="openstack/ceilometer-0" Mar 09 09:26:37 crc kubenswrapper[4861]: I0309 09:26:37.226108 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2943ec09-c4ed-4275-8528-9288665739f9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2943ec09-c4ed-4275-8528-9288665739f9\") " pod="openstack/ceilometer-0" Mar 09 09:26:37 crc kubenswrapper[4861]: I0309 09:26:37.235497 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjgrp\" (UniqueName: \"kubernetes.io/projected/2943ec09-c4ed-4275-8528-9288665739f9-kube-api-access-qjgrp\") pod \"ceilometer-0\" (UID: \"2943ec09-c4ed-4275-8528-9288665739f9\") " pod="openstack/ceilometer-0" Mar 09 09:26:37 crc kubenswrapper[4861]: E0309 09:26:37.236182 4861 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod48fbe4a1_81ab_4a46_8150_821bc8afa220.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod48fbe4a1_81ab_4a46_8150_821bc8afa220.slice/crio-3cace9b3826297e13b51f776c4a63465d7f30becfc9c4dce3918a85756fc792e\": RecentStats: unable to find data in memory cache]" Mar 09 09:26:37 crc kubenswrapper[4861]: I0309 09:26:37.351115 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 09 09:26:37 crc kubenswrapper[4861]: I0309 09:26:37.437560 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"628d8d0c-948a-4878-ac3f-d1c35befe1d0","Type":"ContainerDied","Data":"29982d9b7b3a7bb1491811d7268ce1c3239582d260e5cf3190922ddb419fa90c"} Mar 09 09:26:37 crc kubenswrapper[4861]: I0309 09:26:37.437641 4861 scope.go:117] "RemoveContainer" containerID="69366dc012006a67b0c8071919516dc593bdd15df6f0b37444ccb5fd57bd1e59" Mar 09 09:26:37 crc kubenswrapper[4861]: I0309 09:26:37.437638 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 09 09:26:37 crc kubenswrapper[4861]: I0309 09:26:37.441561 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ace45eda-d816-4b28-9be4-88ba845234d5","Type":"ContainerDied","Data":"7ebabdce2450c29dc4efb5ee8ae6dff72e56adf264714a1a9088428f0d2ca244"} Mar 09 09:26:37 crc kubenswrapper[4861]: I0309 09:26:37.441601 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 09 09:26:37 crc kubenswrapper[4861]: I0309 09:26:37.467827 4861 scope.go:117] "RemoveContainer" containerID="66cc06ff1175b455f0e6e180f5bb9076e45beb7156969699f7fcd6f894ab0395" Mar 09 09:26:37 crc kubenswrapper[4861]: I0309 09:26:37.506420 4861 scope.go:117] "RemoveContainer" containerID="2c32fdf4263718449b53d53ce307c950e72cf7461ab5c95d7dad1d0a312e2f13" Mar 09 09:26:37 crc kubenswrapper[4861]: I0309 09:26:37.507107 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 09 09:26:37 crc kubenswrapper[4861]: I0309 09:26:37.516859 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 09 09:26:37 crc kubenswrapper[4861]: I0309 09:26:37.537855 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 09 09:26:37 crc kubenswrapper[4861]: I0309 09:26:37.539344 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 09 09:26:37 crc kubenswrapper[4861]: I0309 09:26:37.543040 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 09 09:26:37 crc kubenswrapper[4861]: I0309 09:26:37.543212 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 09 09:26:37 crc kubenswrapper[4861]: I0309 09:26:37.543576 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-vwtrw" Mar 09 09:26:37 crc kubenswrapper[4861]: I0309 09:26:37.543765 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 09 09:26:37 crc kubenswrapper[4861]: I0309 09:26:37.568277 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 09 09:26:37 crc kubenswrapper[4861]: I0309 09:26:37.568606 4861 scope.go:117] "RemoveContainer" containerID="7f31414b1c79420e6a1f936e6325a3fd7e3796d6f4c036328d8a8a3e0db5bc87" Mar 09 09:26:37 crc kubenswrapper[4861]: I0309 09:26:37.581482 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 09 09:26:37 crc kubenswrapper[4861]: I0309 09:26:37.592209 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 09 09:26:37 crc kubenswrapper[4861]: I0309 09:26:37.609004 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 09 09:26:37 crc kubenswrapper[4861]: I0309 09:26:37.610602 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 09 09:26:37 crc kubenswrapper[4861]: I0309 09:26:37.628332 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 09 09:26:37 crc kubenswrapper[4861]: I0309 09:26:37.628599 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 09 09:26:37 crc kubenswrapper[4861]: I0309 09:26:37.628817 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 09 09:26:37 crc kubenswrapper[4861]: I0309 09:26:37.670682 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36eb13a3-0fad-4a47-bc5f-088f0aab8e02" path="/var/lib/kubelet/pods/36eb13a3-0fad-4a47-bc5f-088f0aab8e02/volumes" Mar 09 09:26:37 crc kubenswrapper[4861]: I0309 09:26:37.671531 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="628d8d0c-948a-4878-ac3f-d1c35befe1d0" path="/var/lib/kubelet/pods/628d8d0c-948a-4878-ac3f-d1c35befe1d0/volumes" Mar 09 09:26:37 crc kubenswrapper[4861]: I0309 09:26:37.675314 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ace45eda-d816-4b28-9be4-88ba845234d5" path="/var/lib/kubelet/pods/ace45eda-d816-4b28-9be4-88ba845234d5/volumes" Mar 09 09:26:37 crc kubenswrapper[4861]: I0309 09:26:37.733027 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6611f0ac-3406-4da9-b81a-1515dddfafcd-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"6611f0ac-3406-4da9-b81a-1515dddfafcd\") " pod="openstack/glance-default-internal-api-0" Mar 09 09:26:37 crc kubenswrapper[4861]: I0309 09:26:37.733076 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/6611f0ac-3406-4da9-b81a-1515dddfafcd-config-data\") pod \"glance-default-internal-api-0\" (UID: \"6611f0ac-3406-4da9-b81a-1515dddfafcd\") " pod="openstack/glance-default-internal-api-0" Mar 09 09:26:37 crc kubenswrapper[4861]: I0309 09:26:37.733120 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d237bf3c-da06-48d8-aef3-91be47f05320-config-data\") pod \"glance-default-external-api-0\" (UID: \"d237bf3c-da06-48d8-aef3-91be47f05320\") " pod="openstack/glance-default-external-api-0" Mar 09 09:26:37 crc kubenswrapper[4861]: I0309 09:26:37.733158 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6611f0ac-3406-4da9-b81a-1515dddfafcd-logs\") pod \"glance-default-internal-api-0\" (UID: \"6611f0ac-3406-4da9-b81a-1515dddfafcd\") " pod="openstack/glance-default-internal-api-0" Mar 09 09:26:37 crc kubenswrapper[4861]: I0309 09:26:37.733205 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"6611f0ac-3406-4da9-b81a-1515dddfafcd\") " pod="openstack/glance-default-internal-api-0" Mar 09 09:26:37 crc kubenswrapper[4861]: I0309 09:26:37.733225 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d237bf3c-da06-48d8-aef3-91be47f05320-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d237bf3c-da06-48d8-aef3-91be47f05320\") " pod="openstack/glance-default-external-api-0" Mar 09 09:26:37 crc kubenswrapper[4861]: I0309 09:26:37.733263 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/d237bf3c-da06-48d8-aef3-91be47f05320-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"d237bf3c-da06-48d8-aef3-91be47f05320\") " pod="openstack/glance-default-external-api-0" Mar 09 09:26:37 crc kubenswrapper[4861]: I0309 09:26:37.733283 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d237bf3c-da06-48d8-aef3-91be47f05320-logs\") pod \"glance-default-external-api-0\" (UID: \"d237bf3c-da06-48d8-aef3-91be47f05320\") " pod="openstack/glance-default-external-api-0" Mar 09 09:26:37 crc kubenswrapper[4861]: I0309 09:26:37.733358 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d237bf3c-da06-48d8-aef3-91be47f05320-scripts\") pod \"glance-default-external-api-0\" (UID: \"d237bf3c-da06-48d8-aef3-91be47f05320\") " pod="openstack/glance-default-external-api-0" Mar 09 09:26:37 crc kubenswrapper[4861]: I0309 09:26:37.733438 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"d237bf3c-da06-48d8-aef3-91be47f05320\") " pod="openstack/glance-default-external-api-0" Mar 09 09:26:37 crc kubenswrapper[4861]: I0309 09:26:37.733564 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6611f0ac-3406-4da9-b81a-1515dddfafcd-scripts\") pod \"glance-default-internal-api-0\" (UID: \"6611f0ac-3406-4da9-b81a-1515dddfafcd\") " pod="openstack/glance-default-internal-api-0" Mar 09 09:26:37 crc kubenswrapper[4861]: I0309 09:26:37.733704 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d237bf3c-da06-48d8-aef3-91be47f05320-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"d237bf3c-da06-48d8-aef3-91be47f05320\") " pod="openstack/glance-default-external-api-0" Mar 09 09:26:37 crc kubenswrapper[4861]: I0309 09:26:37.733727 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6611f0ac-3406-4da9-b81a-1515dddfafcd-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"6611f0ac-3406-4da9-b81a-1515dddfafcd\") " pod="openstack/glance-default-internal-api-0" Mar 09 09:26:37 crc kubenswrapper[4861]: I0309 09:26:37.733775 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b857j\" (UniqueName: \"kubernetes.io/projected/6611f0ac-3406-4da9-b81a-1515dddfafcd-kube-api-access-b857j\") pod \"glance-default-internal-api-0\" (UID: \"6611f0ac-3406-4da9-b81a-1515dddfafcd\") " pod="openstack/glance-default-internal-api-0" Mar 09 09:26:37 crc kubenswrapper[4861]: I0309 09:26:37.733793 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6611f0ac-3406-4da9-b81a-1515dddfafcd-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"6611f0ac-3406-4da9-b81a-1515dddfafcd\") " pod="openstack/glance-default-internal-api-0" Mar 09 09:26:37 crc kubenswrapper[4861]: I0309 09:26:37.733811 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ps8sl\" (UniqueName: \"kubernetes.io/projected/d237bf3c-da06-48d8-aef3-91be47f05320-kube-api-access-ps8sl\") pod \"glance-default-external-api-0\" (UID: \"d237bf3c-da06-48d8-aef3-91be47f05320\") " pod="openstack/glance-default-external-api-0" Mar 09 09:26:37 crc kubenswrapper[4861]: I0309 09:26:37.836008 4861 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d237bf3c-da06-48d8-aef3-91be47f05320-scripts\") pod \"glance-default-external-api-0\" (UID: \"d237bf3c-da06-48d8-aef3-91be47f05320\") " pod="openstack/glance-default-external-api-0" Mar 09 09:26:37 crc kubenswrapper[4861]: I0309 09:26:37.836086 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"d237bf3c-da06-48d8-aef3-91be47f05320\") " pod="openstack/glance-default-external-api-0" Mar 09 09:26:37 crc kubenswrapper[4861]: I0309 09:26:37.836146 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6611f0ac-3406-4da9-b81a-1515dddfafcd-scripts\") pod \"glance-default-internal-api-0\" (UID: \"6611f0ac-3406-4da9-b81a-1515dddfafcd\") " pod="openstack/glance-default-internal-api-0" Mar 09 09:26:37 crc kubenswrapper[4861]: I0309 09:26:37.836230 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d237bf3c-da06-48d8-aef3-91be47f05320-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"d237bf3c-da06-48d8-aef3-91be47f05320\") " pod="openstack/glance-default-external-api-0" Mar 09 09:26:37 crc kubenswrapper[4861]: I0309 09:26:37.836259 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6611f0ac-3406-4da9-b81a-1515dddfafcd-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"6611f0ac-3406-4da9-b81a-1515dddfafcd\") " pod="openstack/glance-default-internal-api-0" Mar 09 09:26:37 crc kubenswrapper[4861]: I0309 09:26:37.836329 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b857j\" (UniqueName: 
\"kubernetes.io/projected/6611f0ac-3406-4da9-b81a-1515dddfafcd-kube-api-access-b857j\") pod \"glance-default-internal-api-0\" (UID: \"6611f0ac-3406-4da9-b81a-1515dddfafcd\") " pod="openstack/glance-default-internal-api-0" Mar 09 09:26:37 crc kubenswrapper[4861]: I0309 09:26:37.836398 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6611f0ac-3406-4da9-b81a-1515dddfafcd-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"6611f0ac-3406-4da9-b81a-1515dddfafcd\") " pod="openstack/glance-default-internal-api-0" Mar 09 09:26:37 crc kubenswrapper[4861]: I0309 09:26:37.836420 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ps8sl\" (UniqueName: \"kubernetes.io/projected/d237bf3c-da06-48d8-aef3-91be47f05320-kube-api-access-ps8sl\") pod \"glance-default-external-api-0\" (UID: \"d237bf3c-da06-48d8-aef3-91be47f05320\") " pod="openstack/glance-default-external-api-0" Mar 09 09:26:37 crc kubenswrapper[4861]: I0309 09:26:37.836496 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6611f0ac-3406-4da9-b81a-1515dddfafcd-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"6611f0ac-3406-4da9-b81a-1515dddfafcd\") " pod="openstack/glance-default-internal-api-0" Mar 09 09:26:37 crc kubenswrapper[4861]: I0309 09:26:37.836535 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6611f0ac-3406-4da9-b81a-1515dddfafcd-config-data\") pod \"glance-default-internal-api-0\" (UID: \"6611f0ac-3406-4da9-b81a-1515dddfafcd\") " pod="openstack/glance-default-internal-api-0" Mar 09 09:26:37 crc kubenswrapper[4861]: I0309 09:26:37.836557 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/d237bf3c-da06-48d8-aef3-91be47f05320-config-data\") pod \"glance-default-external-api-0\" (UID: \"d237bf3c-da06-48d8-aef3-91be47f05320\") " pod="openstack/glance-default-external-api-0" Mar 09 09:26:37 crc kubenswrapper[4861]: I0309 09:26:37.836590 4861 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"d237bf3c-da06-48d8-aef3-91be47f05320\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-external-api-0" Mar 09 09:26:37 crc kubenswrapper[4861]: I0309 09:26:37.836615 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6611f0ac-3406-4da9-b81a-1515dddfafcd-logs\") pod \"glance-default-internal-api-0\" (UID: \"6611f0ac-3406-4da9-b81a-1515dddfafcd\") " pod="openstack/glance-default-internal-api-0" Mar 09 09:26:37 crc kubenswrapper[4861]: I0309 09:26:37.836684 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"6611f0ac-3406-4da9-b81a-1515dddfafcd\") " pod="openstack/glance-default-internal-api-0" Mar 09 09:26:37 crc kubenswrapper[4861]: I0309 09:26:37.836722 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d237bf3c-da06-48d8-aef3-91be47f05320-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d237bf3c-da06-48d8-aef3-91be47f05320\") " pod="openstack/glance-default-external-api-0" Mar 09 09:26:37 crc kubenswrapper[4861]: I0309 09:26:37.836786 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d237bf3c-da06-48d8-aef3-91be47f05320-public-tls-certs\") pod 
\"glance-default-external-api-0\" (UID: \"d237bf3c-da06-48d8-aef3-91be47f05320\") " pod="openstack/glance-default-external-api-0" Mar 09 09:26:37 crc kubenswrapper[4861]: I0309 09:26:37.836849 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d237bf3c-da06-48d8-aef3-91be47f05320-logs\") pod \"glance-default-external-api-0\" (UID: \"d237bf3c-da06-48d8-aef3-91be47f05320\") " pod="openstack/glance-default-external-api-0" Mar 09 09:26:37 crc kubenswrapper[4861]: I0309 09:26:37.837484 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d237bf3c-da06-48d8-aef3-91be47f05320-logs\") pod \"glance-default-external-api-0\" (UID: \"d237bf3c-da06-48d8-aef3-91be47f05320\") " pod="openstack/glance-default-external-api-0" Mar 09 09:26:37 crc kubenswrapper[4861]: I0309 09:26:37.838296 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6611f0ac-3406-4da9-b81a-1515dddfafcd-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"6611f0ac-3406-4da9-b81a-1515dddfafcd\") " pod="openstack/glance-default-internal-api-0" Mar 09 09:26:37 crc kubenswrapper[4861]: I0309 09:26:37.838522 4861 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"6611f0ac-3406-4da9-b81a-1515dddfafcd\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/glance-default-internal-api-0" Mar 09 09:26:37 crc kubenswrapper[4861]: I0309 09:26:37.839275 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d237bf3c-da06-48d8-aef3-91be47f05320-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d237bf3c-da06-48d8-aef3-91be47f05320\") " 
pod="openstack/glance-default-external-api-0" Mar 09 09:26:37 crc kubenswrapper[4861]: I0309 09:26:37.841329 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6611f0ac-3406-4da9-b81a-1515dddfafcd-logs\") pod \"glance-default-internal-api-0\" (UID: \"6611f0ac-3406-4da9-b81a-1515dddfafcd\") " pod="openstack/glance-default-internal-api-0" Mar 09 09:26:37 crc kubenswrapper[4861]: I0309 09:26:37.844583 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d237bf3c-da06-48d8-aef3-91be47f05320-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"d237bf3c-da06-48d8-aef3-91be47f05320\") " pod="openstack/glance-default-external-api-0" Mar 09 09:26:37 crc kubenswrapper[4861]: I0309 09:26:37.844832 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d237bf3c-da06-48d8-aef3-91be47f05320-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"d237bf3c-da06-48d8-aef3-91be47f05320\") " pod="openstack/glance-default-external-api-0" Mar 09 09:26:37 crc kubenswrapper[4861]: I0309 09:26:37.845243 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d237bf3c-da06-48d8-aef3-91be47f05320-scripts\") pod \"glance-default-external-api-0\" (UID: \"d237bf3c-da06-48d8-aef3-91be47f05320\") " pod="openstack/glance-default-external-api-0" Mar 09 09:26:37 crc kubenswrapper[4861]: I0309 09:26:37.846126 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6611f0ac-3406-4da9-b81a-1515dddfafcd-scripts\") pod \"glance-default-internal-api-0\" (UID: \"6611f0ac-3406-4da9-b81a-1515dddfafcd\") " pod="openstack/glance-default-internal-api-0" Mar 09 09:26:37 crc kubenswrapper[4861]: I0309 09:26:37.846759 4861 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6611f0ac-3406-4da9-b81a-1515dddfafcd-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"6611f0ac-3406-4da9-b81a-1515dddfafcd\") " pod="openstack/glance-default-internal-api-0" Mar 09 09:26:37 crc kubenswrapper[4861]: I0309 09:26:37.849133 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6611f0ac-3406-4da9-b81a-1515dddfafcd-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"6611f0ac-3406-4da9-b81a-1515dddfafcd\") " pod="openstack/glance-default-internal-api-0" Mar 09 09:26:37 crc kubenswrapper[4861]: I0309 09:26:37.849987 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6611f0ac-3406-4da9-b81a-1515dddfafcd-config-data\") pod \"glance-default-internal-api-0\" (UID: \"6611f0ac-3406-4da9-b81a-1515dddfafcd\") " pod="openstack/glance-default-internal-api-0" Mar 09 09:26:37 crc kubenswrapper[4861]: I0309 09:26:37.857063 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d237bf3c-da06-48d8-aef3-91be47f05320-config-data\") pod \"glance-default-external-api-0\" (UID: \"d237bf3c-da06-48d8-aef3-91be47f05320\") " pod="openstack/glance-default-external-api-0" Mar 09 09:26:37 crc kubenswrapper[4861]: I0309 09:26:37.858779 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ps8sl\" (UniqueName: \"kubernetes.io/projected/d237bf3c-da06-48d8-aef3-91be47f05320-kube-api-access-ps8sl\") pod \"glance-default-external-api-0\" (UID: \"d237bf3c-da06-48d8-aef3-91be47f05320\") " pod="openstack/glance-default-external-api-0" Mar 09 09:26:37 crc kubenswrapper[4861]: I0309 09:26:37.861887 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-b857j\" (UniqueName: \"kubernetes.io/projected/6611f0ac-3406-4da9-b81a-1515dddfafcd-kube-api-access-b857j\") pod \"glance-default-internal-api-0\" (UID: \"6611f0ac-3406-4da9-b81a-1515dddfafcd\") " pod="openstack/glance-default-internal-api-0" Mar 09 09:26:37 crc kubenswrapper[4861]: I0309 09:26:37.893923 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"d237bf3c-da06-48d8-aef3-91be47f05320\") " pod="openstack/glance-default-external-api-0" Mar 09 09:26:37 crc kubenswrapper[4861]: I0309 09:26:37.896741 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"6611f0ac-3406-4da9-b81a-1515dddfafcd\") " pod="openstack/glance-default-internal-api-0" Mar 09 09:26:37 crc kubenswrapper[4861]: I0309 09:26:37.943044 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 09 09:26:37 crc kubenswrapper[4861]: W0309 09:26:37.945029 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2943ec09_c4ed_4275_8528_9288665739f9.slice/crio-a171d152407b808e44771978aafe838aeb63661386aac5388432e5a1b857b960 WatchSource:0}: Error finding container a171d152407b808e44771978aafe838aeb63661386aac5388432e5a1b857b960: Status 404 returned error can't find the container with id a171d152407b808e44771978aafe838aeb63661386aac5388432e5a1b857b960 Mar 09 09:26:37 crc kubenswrapper[4861]: I0309 09:26:37.962203 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 09 09:26:38 crc kubenswrapper[4861]: I0309 09:26:38.163921 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 09 09:26:38 crc kubenswrapper[4861]: I0309 09:26:38.393546 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 09 09:26:38 crc kubenswrapper[4861]: W0309 09:26:38.411863 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6611f0ac_3406_4da9_b81a_1515dddfafcd.slice/crio-c3c4af7843b12600b36a2046679cf4d71008af9cc441bee7cb42ecbfd3207bcb WatchSource:0}: Error finding container c3c4af7843b12600b36a2046679cf4d71008af9cc441bee7cb42ecbfd3207bcb: Status 404 returned error can't find the container with id c3c4af7843b12600b36a2046679cf4d71008af9cc441bee7cb42ecbfd3207bcb Mar 09 09:26:38 crc kubenswrapper[4861]: I0309 09:26:38.452248 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2943ec09-c4ed-4275-8528-9288665739f9","Type":"ContainerStarted","Data":"a171d152407b808e44771978aafe838aeb63661386aac5388432e5a1b857b960"} Mar 09 09:26:38 crc kubenswrapper[4861]: I0309 09:26:38.454739 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6611f0ac-3406-4da9-b81a-1515dddfafcd","Type":"ContainerStarted","Data":"c3c4af7843b12600b36a2046679cf4d71008af9cc441bee7cb42ecbfd3207bcb"} Mar 09 09:26:38 crc kubenswrapper[4861]: I0309 09:26:38.593220 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 09 09:26:38 crc kubenswrapper[4861]: I0309 09:26:38.737114 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 09 09:26:39 crc kubenswrapper[4861]: I0309 09:26:39.467167 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"6611f0ac-3406-4da9-b81a-1515dddfafcd","Type":"ContainerStarted","Data":"e83b296b8e7e1f2174063797cd72f2fbe48d193678459d6071297d1ca9ddc828"} Mar 09 09:26:39 crc kubenswrapper[4861]: I0309 09:26:39.470222 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2943ec09-c4ed-4275-8528-9288665739f9","Type":"ContainerStarted","Data":"cd69293d6dca22dbc6ce3df4470ace5f511b4e10c0248c73da1ae2f91677dc41"} Mar 09 09:26:39 crc kubenswrapper[4861]: I0309 09:26:39.472952 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d237bf3c-da06-48d8-aef3-91be47f05320","Type":"ContainerStarted","Data":"ddf97aa9816c45b0e9f09e1b5483c1836cba8e2ca104a41016cd971e748786f3"} Mar 09 09:26:40 crc kubenswrapper[4861]: I0309 09:26:40.484385 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6611f0ac-3406-4da9-b81a-1515dddfafcd","Type":"ContainerStarted","Data":"a0ccf0d47f03ca1ed9527ee682b484e992751e8db89a508a4dd34dcdbc8c45cb"} Mar 09 09:26:40 crc kubenswrapper[4861]: I0309 09:26:40.486574 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2943ec09-c4ed-4275-8528-9288665739f9","Type":"ContainerStarted","Data":"93f142a1d6038000ae0624f338e945bbe6d184d3588cb663733938f25ed69d98"} Mar 09 09:26:40 crc kubenswrapper[4861]: I0309 09:26:40.486609 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2943ec09-c4ed-4275-8528-9288665739f9","Type":"ContainerStarted","Data":"0c24eda7674e9f05cfe98f3be8cdbcc8d82a5ec425ccc02355b7561647067f13"} Mar 09 09:26:40 crc kubenswrapper[4861]: I0309 09:26:40.488240 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d237bf3c-da06-48d8-aef3-91be47f05320","Type":"ContainerStarted","Data":"6f82ceb6c678b28f1615c3056267afb5382aad50a4c4229d1cee94cad2e3961b"} 
Mar 09 09:26:40 crc kubenswrapper[4861]: I0309 09:26:40.488272 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d237bf3c-da06-48d8-aef3-91be47f05320","Type":"ContainerStarted","Data":"37e0fa70685c77c94456cd39385a576591a4a048d009e07f05a1b3816f3c9ac1"} Mar 09 09:26:40 crc kubenswrapper[4861]: I0309 09:26:40.508578 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.508555619 podStartE2EDuration="3.508555619s" podCreationTimestamp="2026-03-09 09:26:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:26:40.504652976 +0000 UTC m=+1243.589692387" watchObservedRunningTime="2026-03-09 09:26:40.508555619 +0000 UTC m=+1243.593595030" Mar 09 09:26:40 crc kubenswrapper[4861]: I0309 09:26:40.538206 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.53818689 podStartE2EDuration="3.53818689s" podCreationTimestamp="2026-03-09 09:26:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:26:40.530223542 +0000 UTC m=+1243.615262933" watchObservedRunningTime="2026-03-09 09:26:40.53818689 +0000 UTC m=+1243.623226281" Mar 09 09:26:42 crc kubenswrapper[4861]: I0309 09:26:42.508880 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2943ec09-c4ed-4275-8528-9288665739f9","Type":"ContainerStarted","Data":"01b28de24832249810f9ac9707338b80cebfa53caa277367e186ce1e7dd99a71"} Mar 09 09:26:42 crc kubenswrapper[4861]: I0309 09:26:42.509485 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 09 09:26:42 crc kubenswrapper[4861]: I0309 09:26:42.509069 4861 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2943ec09-c4ed-4275-8528-9288665739f9" containerName="proxy-httpd" containerID="cri-o://01b28de24832249810f9ac9707338b80cebfa53caa277367e186ce1e7dd99a71" gracePeriod=30 Mar 09 09:26:42 crc kubenswrapper[4861]: I0309 09:26:42.509009 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2943ec09-c4ed-4275-8528-9288665739f9" containerName="ceilometer-central-agent" containerID="cri-o://cd69293d6dca22dbc6ce3df4470ace5f511b4e10c0248c73da1ae2f91677dc41" gracePeriod=30 Mar 09 09:26:42 crc kubenswrapper[4861]: I0309 09:26:42.509171 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2943ec09-c4ed-4275-8528-9288665739f9" containerName="sg-core" containerID="cri-o://93f142a1d6038000ae0624f338e945bbe6d184d3588cb663733938f25ed69d98" gracePeriod=30 Mar 09 09:26:42 crc kubenswrapper[4861]: I0309 09:26:42.509150 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2943ec09-c4ed-4275-8528-9288665739f9" containerName="ceilometer-notification-agent" containerID="cri-o://0c24eda7674e9f05cfe98f3be8cdbcc8d82a5ec425ccc02355b7561647067f13" gracePeriod=30 Mar 09 09:26:42 crc kubenswrapper[4861]: I0309 09:26:42.541494 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.557843001 podStartE2EDuration="6.541471711s" podCreationTimestamp="2026-03-09 09:26:36 +0000 UTC" firstStartedPulling="2026-03-09 09:26:37.947165262 +0000 UTC m=+1241.032204663" lastFinishedPulling="2026-03-09 09:26:41.930793982 +0000 UTC m=+1245.015833373" observedRunningTime="2026-03-09 09:26:42.53691707 +0000 UTC m=+1245.621956501" watchObservedRunningTime="2026-03-09 09:26:42.541471711 +0000 UTC m=+1245.626511112" Mar 09 09:26:43 crc kubenswrapper[4861]: I0309 
09:26:43.521923 4861 generic.go:334] "Generic (PLEG): container finished" podID="2943ec09-c4ed-4275-8528-9288665739f9" containerID="01b28de24832249810f9ac9707338b80cebfa53caa277367e186ce1e7dd99a71" exitCode=0 Mar 09 09:26:43 crc kubenswrapper[4861]: I0309 09:26:43.523711 4861 generic.go:334] "Generic (PLEG): container finished" podID="2943ec09-c4ed-4275-8528-9288665739f9" containerID="93f142a1d6038000ae0624f338e945bbe6d184d3588cb663733938f25ed69d98" exitCode=2 Mar 09 09:26:43 crc kubenswrapper[4861]: I0309 09:26:43.523820 4861 generic.go:334] "Generic (PLEG): container finished" podID="2943ec09-c4ed-4275-8528-9288665739f9" containerID="0c24eda7674e9f05cfe98f3be8cdbcc8d82a5ec425ccc02355b7561647067f13" exitCode=0 Mar 09 09:26:43 crc kubenswrapper[4861]: I0309 09:26:43.522489 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2943ec09-c4ed-4275-8528-9288665739f9","Type":"ContainerDied","Data":"01b28de24832249810f9ac9707338b80cebfa53caa277367e186ce1e7dd99a71"} Mar 09 09:26:43 crc kubenswrapper[4861]: I0309 09:26:43.523968 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2943ec09-c4ed-4275-8528-9288665739f9","Type":"ContainerDied","Data":"93f142a1d6038000ae0624f338e945bbe6d184d3588cb663733938f25ed69d98"} Mar 09 09:26:43 crc kubenswrapper[4861]: I0309 09:26:43.523994 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2943ec09-c4ed-4275-8528-9288665739f9","Type":"ContainerDied","Data":"0c24eda7674e9f05cfe98f3be8cdbcc8d82a5ec425ccc02355b7561647067f13"} Mar 09 09:26:45 crc kubenswrapper[4861]: I0309 09:26:45.485404 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 09 09:26:45 crc kubenswrapper[4861]: I0309 09:26:45.543544 4861 generic.go:334] "Generic (PLEG): container finished" podID="2943ec09-c4ed-4275-8528-9288665739f9" containerID="cd69293d6dca22dbc6ce3df4470ace5f511b4e10c0248c73da1ae2f91677dc41" exitCode=0 Mar 09 09:26:45 crc kubenswrapper[4861]: I0309 09:26:45.543596 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2943ec09-c4ed-4275-8528-9288665739f9","Type":"ContainerDied","Data":"cd69293d6dca22dbc6ce3df4470ace5f511b4e10c0248c73da1ae2f91677dc41"} Mar 09 09:26:45 crc kubenswrapper[4861]: I0309 09:26:45.543640 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2943ec09-c4ed-4275-8528-9288665739f9","Type":"ContainerDied","Data":"a171d152407b808e44771978aafe838aeb63661386aac5388432e5a1b857b960"} Mar 09 09:26:45 crc kubenswrapper[4861]: I0309 09:26:45.543655 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 09 09:26:45 crc kubenswrapper[4861]: I0309 09:26:45.543665 4861 scope.go:117] "RemoveContainer" containerID="01b28de24832249810f9ac9707338b80cebfa53caa277367e186ce1e7dd99a71" Mar 09 09:26:45 crc kubenswrapper[4861]: I0309 09:26:45.564073 4861 scope.go:117] "RemoveContainer" containerID="93f142a1d6038000ae0624f338e945bbe6d184d3588cb663733938f25ed69d98" Mar 09 09:26:45 crc kubenswrapper[4861]: I0309 09:26:45.585845 4861 scope.go:117] "RemoveContainer" containerID="0c24eda7674e9f05cfe98f3be8cdbcc8d82a5ec425ccc02355b7561647067f13" Mar 09 09:26:45 crc kubenswrapper[4861]: I0309 09:26:45.607879 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qjgrp\" (UniqueName: \"kubernetes.io/projected/2943ec09-c4ed-4275-8528-9288665739f9-kube-api-access-qjgrp\") pod \"2943ec09-c4ed-4275-8528-9288665739f9\" (UID: \"2943ec09-c4ed-4275-8528-9288665739f9\") " Mar 09 09:26:45 crc kubenswrapper[4861]: I0309 09:26:45.607953 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2943ec09-c4ed-4275-8528-9288665739f9-log-httpd\") pod \"2943ec09-c4ed-4275-8528-9288665739f9\" (UID: \"2943ec09-c4ed-4275-8528-9288665739f9\") " Mar 09 09:26:45 crc kubenswrapper[4861]: I0309 09:26:45.608033 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2943ec09-c4ed-4275-8528-9288665739f9-run-httpd\") pod \"2943ec09-c4ed-4275-8528-9288665739f9\" (UID: \"2943ec09-c4ed-4275-8528-9288665739f9\") " Mar 09 09:26:45 crc kubenswrapper[4861]: I0309 09:26:45.608088 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2943ec09-c4ed-4275-8528-9288665739f9-combined-ca-bundle\") pod \"2943ec09-c4ed-4275-8528-9288665739f9\" (UID: 
\"2943ec09-c4ed-4275-8528-9288665739f9\") " Mar 09 09:26:45 crc kubenswrapper[4861]: I0309 09:26:45.608142 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2943ec09-c4ed-4275-8528-9288665739f9-scripts\") pod \"2943ec09-c4ed-4275-8528-9288665739f9\" (UID: \"2943ec09-c4ed-4275-8528-9288665739f9\") " Mar 09 09:26:45 crc kubenswrapper[4861]: I0309 09:26:45.608220 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2943ec09-c4ed-4275-8528-9288665739f9-config-data\") pod \"2943ec09-c4ed-4275-8528-9288665739f9\" (UID: \"2943ec09-c4ed-4275-8528-9288665739f9\") " Mar 09 09:26:45 crc kubenswrapper[4861]: I0309 09:26:45.608257 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2943ec09-c4ed-4275-8528-9288665739f9-sg-core-conf-yaml\") pod \"2943ec09-c4ed-4275-8528-9288665739f9\" (UID: \"2943ec09-c4ed-4275-8528-9288665739f9\") " Mar 09 09:26:45 crc kubenswrapper[4861]: I0309 09:26:45.608691 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2943ec09-c4ed-4275-8528-9288665739f9-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "2943ec09-c4ed-4275-8528-9288665739f9" (UID: "2943ec09-c4ed-4275-8528-9288665739f9"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:26:45 crc kubenswrapper[4861]: I0309 09:26:45.608852 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2943ec09-c4ed-4275-8528-9288665739f9-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "2943ec09-c4ed-4275-8528-9288665739f9" (UID: "2943ec09-c4ed-4275-8528-9288665739f9"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:26:45 crc kubenswrapper[4861]: I0309 09:26:45.614924 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2943ec09-c4ed-4275-8528-9288665739f9-scripts" (OuterVolumeSpecName: "scripts") pod "2943ec09-c4ed-4275-8528-9288665739f9" (UID: "2943ec09-c4ed-4275-8528-9288665739f9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:26:45 crc kubenswrapper[4861]: I0309 09:26:45.619041 4861 scope.go:117] "RemoveContainer" containerID="cd69293d6dca22dbc6ce3df4470ace5f511b4e10c0248c73da1ae2f91677dc41" Mar 09 09:26:45 crc kubenswrapper[4861]: I0309 09:26:45.626202 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2943ec09-c4ed-4275-8528-9288665739f9-kube-api-access-qjgrp" (OuterVolumeSpecName: "kube-api-access-qjgrp") pod "2943ec09-c4ed-4275-8528-9288665739f9" (UID: "2943ec09-c4ed-4275-8528-9288665739f9"). InnerVolumeSpecName "kube-api-access-qjgrp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:26:45 crc kubenswrapper[4861]: I0309 09:26:45.648193 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2943ec09-c4ed-4275-8528-9288665739f9-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "2943ec09-c4ed-4275-8528-9288665739f9" (UID: "2943ec09-c4ed-4275-8528-9288665739f9"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:26:45 crc kubenswrapper[4861]: I0309 09:26:45.689211 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2943ec09-c4ed-4275-8528-9288665739f9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2943ec09-c4ed-4275-8528-9288665739f9" (UID: "2943ec09-c4ed-4275-8528-9288665739f9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:26:45 crc kubenswrapper[4861]: I0309 09:26:45.703759 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2943ec09-c4ed-4275-8528-9288665739f9-config-data" (OuterVolumeSpecName: "config-data") pod "2943ec09-c4ed-4275-8528-9288665739f9" (UID: "2943ec09-c4ed-4275-8528-9288665739f9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:26:45 crc kubenswrapper[4861]: I0309 09:26:45.710079 4861 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2943ec09-c4ed-4275-8528-9288665739f9-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:26:45 crc kubenswrapper[4861]: I0309 09:26:45.710121 4861 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2943ec09-c4ed-4275-8528-9288665739f9-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 09:26:45 crc kubenswrapper[4861]: I0309 09:26:45.710131 4861 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2943ec09-c4ed-4275-8528-9288665739f9-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 09 09:26:45 crc kubenswrapper[4861]: I0309 09:26:45.710142 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qjgrp\" (UniqueName: \"kubernetes.io/projected/2943ec09-c4ed-4275-8528-9288665739f9-kube-api-access-qjgrp\") on node \"crc\" DevicePath \"\"" Mar 09 09:26:45 crc kubenswrapper[4861]: I0309 09:26:45.710151 4861 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2943ec09-c4ed-4275-8528-9288665739f9-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 09 09:26:45 crc kubenswrapper[4861]: I0309 09:26:45.710158 4861 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/2943ec09-c4ed-4275-8528-9288665739f9-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 09 09:26:45 crc kubenswrapper[4861]: I0309 09:26:45.710167 4861 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2943ec09-c4ed-4275-8528-9288665739f9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 09:26:45 crc kubenswrapper[4861]: I0309 09:26:45.715052 4861 scope.go:117] "RemoveContainer" containerID="01b28de24832249810f9ac9707338b80cebfa53caa277367e186ce1e7dd99a71" Mar 09 09:26:45 crc kubenswrapper[4861]: E0309 09:26:45.715639 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01b28de24832249810f9ac9707338b80cebfa53caa277367e186ce1e7dd99a71\": container with ID starting with 01b28de24832249810f9ac9707338b80cebfa53caa277367e186ce1e7dd99a71 not found: ID does not exist" containerID="01b28de24832249810f9ac9707338b80cebfa53caa277367e186ce1e7dd99a71" Mar 09 09:26:45 crc kubenswrapper[4861]: I0309 09:26:45.715691 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01b28de24832249810f9ac9707338b80cebfa53caa277367e186ce1e7dd99a71"} err="failed to get container status \"01b28de24832249810f9ac9707338b80cebfa53caa277367e186ce1e7dd99a71\": rpc error: code = NotFound desc = could not find container \"01b28de24832249810f9ac9707338b80cebfa53caa277367e186ce1e7dd99a71\": container with ID starting with 01b28de24832249810f9ac9707338b80cebfa53caa277367e186ce1e7dd99a71 not found: ID does not exist" Mar 09 09:26:45 crc kubenswrapper[4861]: I0309 09:26:45.715913 4861 scope.go:117] "RemoveContainer" containerID="93f142a1d6038000ae0624f338e945bbe6d184d3588cb663733938f25ed69d98" Mar 09 09:26:45 crc kubenswrapper[4861]: E0309 09:26:45.716358 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"93f142a1d6038000ae0624f338e945bbe6d184d3588cb663733938f25ed69d98\": container with ID starting with 93f142a1d6038000ae0624f338e945bbe6d184d3588cb663733938f25ed69d98 not found: ID does not exist" containerID="93f142a1d6038000ae0624f338e945bbe6d184d3588cb663733938f25ed69d98" Mar 09 09:26:45 crc kubenswrapper[4861]: I0309 09:26:45.716484 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93f142a1d6038000ae0624f338e945bbe6d184d3588cb663733938f25ed69d98"} err="failed to get container status \"93f142a1d6038000ae0624f338e945bbe6d184d3588cb663733938f25ed69d98\": rpc error: code = NotFound desc = could not find container \"93f142a1d6038000ae0624f338e945bbe6d184d3588cb663733938f25ed69d98\": container with ID starting with 93f142a1d6038000ae0624f338e945bbe6d184d3588cb663733938f25ed69d98 not found: ID does not exist" Mar 09 09:26:45 crc kubenswrapper[4861]: I0309 09:26:45.716558 4861 scope.go:117] "RemoveContainer" containerID="0c24eda7674e9f05cfe98f3be8cdbcc8d82a5ec425ccc02355b7561647067f13" Mar 09 09:26:45 crc kubenswrapper[4861]: E0309 09:26:45.716891 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c24eda7674e9f05cfe98f3be8cdbcc8d82a5ec425ccc02355b7561647067f13\": container with ID starting with 0c24eda7674e9f05cfe98f3be8cdbcc8d82a5ec425ccc02355b7561647067f13 not found: ID does not exist" containerID="0c24eda7674e9f05cfe98f3be8cdbcc8d82a5ec425ccc02355b7561647067f13" Mar 09 09:26:45 crc kubenswrapper[4861]: I0309 09:26:45.716919 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c24eda7674e9f05cfe98f3be8cdbcc8d82a5ec425ccc02355b7561647067f13"} err="failed to get container status \"0c24eda7674e9f05cfe98f3be8cdbcc8d82a5ec425ccc02355b7561647067f13\": rpc error: code = NotFound desc = could not find container \"0c24eda7674e9f05cfe98f3be8cdbcc8d82a5ec425ccc02355b7561647067f13\": container with ID 
starting with 0c24eda7674e9f05cfe98f3be8cdbcc8d82a5ec425ccc02355b7561647067f13 not found: ID does not exist" Mar 09 09:26:45 crc kubenswrapper[4861]: I0309 09:26:45.716936 4861 scope.go:117] "RemoveContainer" containerID="cd69293d6dca22dbc6ce3df4470ace5f511b4e10c0248c73da1ae2f91677dc41" Mar 09 09:26:45 crc kubenswrapper[4861]: E0309 09:26:45.717242 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd69293d6dca22dbc6ce3df4470ace5f511b4e10c0248c73da1ae2f91677dc41\": container with ID starting with cd69293d6dca22dbc6ce3df4470ace5f511b4e10c0248c73da1ae2f91677dc41 not found: ID does not exist" containerID="cd69293d6dca22dbc6ce3df4470ace5f511b4e10c0248c73da1ae2f91677dc41" Mar 09 09:26:45 crc kubenswrapper[4861]: I0309 09:26:45.717267 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd69293d6dca22dbc6ce3df4470ace5f511b4e10c0248c73da1ae2f91677dc41"} err="failed to get container status \"cd69293d6dca22dbc6ce3df4470ace5f511b4e10c0248c73da1ae2f91677dc41\": rpc error: code = NotFound desc = could not find container \"cd69293d6dca22dbc6ce3df4470ace5f511b4e10c0248c73da1ae2f91677dc41\": container with ID starting with cd69293d6dca22dbc6ce3df4470ace5f511b4e10c0248c73da1ae2f91677dc41 not found: ID does not exist" Mar 09 09:26:45 crc kubenswrapper[4861]: I0309 09:26:45.878430 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 09 09:26:45 crc kubenswrapper[4861]: I0309 09:26:45.886251 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 09 09:26:45 crc kubenswrapper[4861]: I0309 09:26:45.902974 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 09 09:26:45 crc kubenswrapper[4861]: E0309 09:26:45.903356 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2943ec09-c4ed-4275-8528-9288665739f9" 
containerName="ceilometer-notification-agent" Mar 09 09:26:45 crc kubenswrapper[4861]: I0309 09:26:45.903398 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="2943ec09-c4ed-4275-8528-9288665739f9" containerName="ceilometer-notification-agent" Mar 09 09:26:45 crc kubenswrapper[4861]: E0309 09:26:45.903413 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2943ec09-c4ed-4275-8528-9288665739f9" containerName="proxy-httpd" Mar 09 09:26:45 crc kubenswrapper[4861]: I0309 09:26:45.903418 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="2943ec09-c4ed-4275-8528-9288665739f9" containerName="proxy-httpd" Mar 09 09:26:45 crc kubenswrapper[4861]: E0309 09:26:45.903430 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2943ec09-c4ed-4275-8528-9288665739f9" containerName="ceilometer-central-agent" Mar 09 09:26:45 crc kubenswrapper[4861]: I0309 09:26:45.903436 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="2943ec09-c4ed-4275-8528-9288665739f9" containerName="ceilometer-central-agent" Mar 09 09:26:45 crc kubenswrapper[4861]: E0309 09:26:45.903449 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2943ec09-c4ed-4275-8528-9288665739f9" containerName="sg-core" Mar 09 09:26:45 crc kubenswrapper[4861]: I0309 09:26:45.903456 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="2943ec09-c4ed-4275-8528-9288665739f9" containerName="sg-core" Mar 09 09:26:45 crc kubenswrapper[4861]: I0309 09:26:45.903618 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="2943ec09-c4ed-4275-8528-9288665739f9" containerName="proxy-httpd" Mar 09 09:26:45 crc kubenswrapper[4861]: I0309 09:26:45.903629 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="2943ec09-c4ed-4275-8528-9288665739f9" containerName="sg-core" Mar 09 09:26:45 crc kubenswrapper[4861]: I0309 09:26:45.903643 4861 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="2943ec09-c4ed-4275-8528-9288665739f9" containerName="ceilometer-central-agent" Mar 09 09:26:45 crc kubenswrapper[4861]: I0309 09:26:45.903654 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="2943ec09-c4ed-4275-8528-9288665739f9" containerName="ceilometer-notification-agent" Mar 09 09:26:45 crc kubenswrapper[4861]: I0309 09:26:45.905193 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 09 09:26:45 crc kubenswrapper[4861]: I0309 09:26:45.909940 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 09 09:26:45 crc kubenswrapper[4861]: I0309 09:26:45.910608 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 09 09:26:45 crc kubenswrapper[4861]: I0309 09:26:45.916881 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 09 09:26:46 crc kubenswrapper[4861]: I0309 09:26:46.015307 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/edbe6993-7926-4a78-9227-b7c85af7ec66-config-data\") pod \"ceilometer-0\" (UID: \"edbe6993-7926-4a78-9227-b7c85af7ec66\") " pod="openstack/ceilometer-0" Mar 09 09:26:46 crc kubenswrapper[4861]: I0309 09:26:46.015383 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/edbe6993-7926-4a78-9227-b7c85af7ec66-scripts\") pod \"ceilometer-0\" (UID: \"edbe6993-7926-4a78-9227-b7c85af7ec66\") " pod="openstack/ceilometer-0" Mar 09 09:26:46 crc kubenswrapper[4861]: I0309 09:26:46.015435 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljwqp\" (UniqueName: \"kubernetes.io/projected/edbe6993-7926-4a78-9227-b7c85af7ec66-kube-api-access-ljwqp\") pod \"ceilometer-0\" (UID: 
\"edbe6993-7926-4a78-9227-b7c85af7ec66\") " pod="openstack/ceilometer-0" Mar 09 09:26:46 crc kubenswrapper[4861]: I0309 09:26:46.015457 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/edbe6993-7926-4a78-9227-b7c85af7ec66-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"edbe6993-7926-4a78-9227-b7c85af7ec66\") " pod="openstack/ceilometer-0" Mar 09 09:26:46 crc kubenswrapper[4861]: I0309 09:26:46.015476 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edbe6993-7926-4a78-9227-b7c85af7ec66-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"edbe6993-7926-4a78-9227-b7c85af7ec66\") " pod="openstack/ceilometer-0" Mar 09 09:26:46 crc kubenswrapper[4861]: I0309 09:26:46.015533 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/edbe6993-7926-4a78-9227-b7c85af7ec66-log-httpd\") pod \"ceilometer-0\" (UID: \"edbe6993-7926-4a78-9227-b7c85af7ec66\") " pod="openstack/ceilometer-0" Mar 09 09:26:46 crc kubenswrapper[4861]: I0309 09:26:46.015563 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/edbe6993-7926-4a78-9227-b7c85af7ec66-run-httpd\") pod \"ceilometer-0\" (UID: \"edbe6993-7926-4a78-9227-b7c85af7ec66\") " pod="openstack/ceilometer-0" Mar 09 09:26:46 crc kubenswrapper[4861]: I0309 09:26:46.117085 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljwqp\" (UniqueName: \"kubernetes.io/projected/edbe6993-7926-4a78-9227-b7c85af7ec66-kube-api-access-ljwqp\") pod \"ceilometer-0\" (UID: \"edbe6993-7926-4a78-9227-b7c85af7ec66\") " pod="openstack/ceilometer-0" Mar 09 09:26:46 crc kubenswrapper[4861]: I0309 09:26:46.117139 
4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/edbe6993-7926-4a78-9227-b7c85af7ec66-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"edbe6993-7926-4a78-9227-b7c85af7ec66\") " pod="openstack/ceilometer-0" Mar 09 09:26:46 crc kubenswrapper[4861]: I0309 09:26:46.117170 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edbe6993-7926-4a78-9227-b7c85af7ec66-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"edbe6993-7926-4a78-9227-b7c85af7ec66\") " pod="openstack/ceilometer-0" Mar 09 09:26:46 crc kubenswrapper[4861]: I0309 09:26:46.117205 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/edbe6993-7926-4a78-9227-b7c85af7ec66-log-httpd\") pod \"ceilometer-0\" (UID: \"edbe6993-7926-4a78-9227-b7c85af7ec66\") " pod="openstack/ceilometer-0" Mar 09 09:26:46 crc kubenswrapper[4861]: I0309 09:26:46.117240 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/edbe6993-7926-4a78-9227-b7c85af7ec66-run-httpd\") pod \"ceilometer-0\" (UID: \"edbe6993-7926-4a78-9227-b7c85af7ec66\") " pod="openstack/ceilometer-0" Mar 09 09:26:46 crc kubenswrapper[4861]: I0309 09:26:46.117306 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/edbe6993-7926-4a78-9227-b7c85af7ec66-config-data\") pod \"ceilometer-0\" (UID: \"edbe6993-7926-4a78-9227-b7c85af7ec66\") " pod="openstack/ceilometer-0" Mar 09 09:26:46 crc kubenswrapper[4861]: I0309 09:26:46.117339 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/edbe6993-7926-4a78-9227-b7c85af7ec66-scripts\") pod \"ceilometer-0\" (UID: 
\"edbe6993-7926-4a78-9227-b7c85af7ec66\") " pod="openstack/ceilometer-0" Mar 09 09:26:46 crc kubenswrapper[4861]: I0309 09:26:46.118402 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/edbe6993-7926-4a78-9227-b7c85af7ec66-log-httpd\") pod \"ceilometer-0\" (UID: \"edbe6993-7926-4a78-9227-b7c85af7ec66\") " pod="openstack/ceilometer-0" Mar 09 09:26:46 crc kubenswrapper[4861]: I0309 09:26:46.118603 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/edbe6993-7926-4a78-9227-b7c85af7ec66-run-httpd\") pod \"ceilometer-0\" (UID: \"edbe6993-7926-4a78-9227-b7c85af7ec66\") " pod="openstack/ceilometer-0" Mar 09 09:26:46 crc kubenswrapper[4861]: I0309 09:26:46.123660 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/edbe6993-7926-4a78-9227-b7c85af7ec66-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"edbe6993-7926-4a78-9227-b7c85af7ec66\") " pod="openstack/ceilometer-0" Mar 09 09:26:46 crc kubenswrapper[4861]: I0309 09:26:46.124419 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/edbe6993-7926-4a78-9227-b7c85af7ec66-scripts\") pod \"ceilometer-0\" (UID: \"edbe6993-7926-4a78-9227-b7c85af7ec66\") " pod="openstack/ceilometer-0" Mar 09 09:26:46 crc kubenswrapper[4861]: I0309 09:26:46.133341 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edbe6993-7926-4a78-9227-b7c85af7ec66-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"edbe6993-7926-4a78-9227-b7c85af7ec66\") " pod="openstack/ceilometer-0" Mar 09 09:26:46 crc kubenswrapper[4861]: I0309 09:26:46.133544 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/edbe6993-7926-4a78-9227-b7c85af7ec66-config-data\") pod \"ceilometer-0\" (UID: \"edbe6993-7926-4a78-9227-b7c85af7ec66\") " pod="openstack/ceilometer-0" Mar 09 09:26:46 crc kubenswrapper[4861]: I0309 09:26:46.134687 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljwqp\" (UniqueName: \"kubernetes.io/projected/edbe6993-7926-4a78-9227-b7c85af7ec66-kube-api-access-ljwqp\") pod \"ceilometer-0\" (UID: \"edbe6993-7926-4a78-9227-b7c85af7ec66\") " pod="openstack/ceilometer-0" Mar 09 09:26:46 crc kubenswrapper[4861]: I0309 09:26:46.223140 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 09 09:26:46 crc kubenswrapper[4861]: W0309 09:26:46.665015 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podedbe6993_7926_4a78_9227_b7c85af7ec66.slice/crio-98dd934ab934e64c0938dada01153f7f71df7b0fb60d9ac456eec54a7110323e WatchSource:0}: Error finding container 98dd934ab934e64c0938dada01153f7f71df7b0fb60d9ac456eec54a7110323e: Status 404 returned error can't find the container with id 98dd934ab934e64c0938dada01153f7f71df7b0fb60d9ac456eec54a7110323e Mar 09 09:26:46 crc kubenswrapper[4861]: I0309 09:26:46.669719 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 09 09:26:47 crc kubenswrapper[4861]: E0309 09:26:47.460995 4861 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod48fbe4a1_81ab_4a46_8150_821bc8afa220.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod48fbe4a1_81ab_4a46_8150_821bc8afa220.slice/crio-3cace9b3826297e13b51f776c4a63465d7f30becfc9c4dce3918a85756fc792e\": RecentStats: unable to find data in memory cache]" Mar 09 09:26:47 crc 
kubenswrapper[4861]: I0309 09:26:47.569751 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"edbe6993-7926-4a78-9227-b7c85af7ec66","Type":"ContainerStarted","Data":"98dd934ab934e64c0938dada01153f7f71df7b0fb60d9ac456eec54a7110323e"} Mar 09 09:26:47 crc kubenswrapper[4861]: I0309 09:26:47.670034 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2943ec09-c4ed-4275-8528-9288665739f9" path="/var/lib/kubelet/pods/2943ec09-c4ed-4275-8528-9288665739f9/volumes" Mar 09 09:26:47 crc kubenswrapper[4861]: I0309 09:26:47.963260 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 09 09:26:47 crc kubenswrapper[4861]: I0309 09:26:47.963339 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 09 09:26:47 crc kubenswrapper[4861]: I0309 09:26:47.991791 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 09 09:26:48 crc kubenswrapper[4861]: I0309 09:26:48.004112 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 09 09:26:48 crc kubenswrapper[4861]: I0309 09:26:48.165047 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 09 09:26:48 crc kubenswrapper[4861]: I0309 09:26:48.165117 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 09 09:26:48 crc kubenswrapper[4861]: I0309 09:26:48.236909 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 09 09:26:48 crc kubenswrapper[4861]: I0309 09:26:48.238115 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/glance-default-external-api-0" Mar 09 09:26:48 crc kubenswrapper[4861]: I0309 09:26:48.582991 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"edbe6993-7926-4a78-9227-b7c85af7ec66","Type":"ContainerStarted","Data":"9a6976fb3b15c1a33bf9f11c55266c90e65d8bc0b256d7961d1f06a0053a3d1f"} Mar 09 09:26:48 crc kubenswrapper[4861]: I0309 09:26:48.583237 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 09 09:26:48 crc kubenswrapper[4861]: I0309 09:26:48.583277 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 09 09:26:48 crc kubenswrapper[4861]: I0309 09:26:48.583288 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 09 09:26:48 crc kubenswrapper[4861]: I0309 09:26:48.583297 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 09 09:26:49 crc kubenswrapper[4861]: I0309 09:26:49.592886 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"edbe6993-7926-4a78-9227-b7c85af7ec66","Type":"ContainerStarted","Data":"6360fe2180cbf6227ebf7a75a027983d6faae1823563d9854f335c4cb2aa39bc"} Mar 09 09:26:50 crc kubenswrapper[4861]: I0309 09:26:50.606364 4861 generic.go:334] "Generic (PLEG): container finished" podID="4599154b-2118-461d-9999-d07931415f9c" containerID="b6a18808b1a1ab9736d1bfd5c639a8629172104f526890dce6521c53541fa906" exitCode=0 Mar 09 09:26:50 crc kubenswrapper[4861]: I0309 09:26:50.606409 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-6zv4b" event={"ID":"4599154b-2118-461d-9999-d07931415f9c","Type":"ContainerDied","Data":"b6a18808b1a1ab9736d1bfd5c639a8629172104f526890dce6521c53541fa906"} Mar 09 09:26:50 crc kubenswrapper[4861]: I0309 09:26:50.609102 4861 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"edbe6993-7926-4a78-9227-b7c85af7ec66","Type":"ContainerStarted","Data":"e87f7e0e851f4944b2b3370f531360c735836531f36248b68167ac059e5e4495"} Mar 09 09:26:50 crc kubenswrapper[4861]: I0309 09:26:50.609152 4861 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 09 09:26:50 crc kubenswrapper[4861]: I0309 09:26:50.609339 4861 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 09 09:26:50 crc kubenswrapper[4861]: I0309 09:26:50.792110 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 09 09:26:50 crc kubenswrapper[4861]: I0309 09:26:50.927483 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 09 09:26:51 crc kubenswrapper[4861]: I0309 09:26:51.005564 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 09 09:26:51 crc kubenswrapper[4861]: I0309 09:26:51.005681 4861 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 09 09:26:51 crc kubenswrapper[4861]: I0309 09:26:51.011213 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 09 09:26:52 crc kubenswrapper[4861]: I0309 09:26:52.026997 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-6zv4b"
Mar 09 09:26:52 crc kubenswrapper[4861]: I0309 09:26:52.152280 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4599154b-2118-461d-9999-d07931415f9c-combined-ca-bundle\") pod \"4599154b-2118-461d-9999-d07931415f9c\" (UID: \"4599154b-2118-461d-9999-d07931415f9c\") "
Mar 09 09:26:52 crc kubenswrapper[4861]: I0309 09:26:52.152888 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4599154b-2118-461d-9999-d07931415f9c-scripts\") pod \"4599154b-2118-461d-9999-d07931415f9c\" (UID: \"4599154b-2118-461d-9999-d07931415f9c\") "
Mar 09 09:26:52 crc kubenswrapper[4861]: I0309 09:26:52.152923 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cf5jx\" (UniqueName: \"kubernetes.io/projected/4599154b-2118-461d-9999-d07931415f9c-kube-api-access-cf5jx\") pod \"4599154b-2118-461d-9999-d07931415f9c\" (UID: \"4599154b-2118-461d-9999-d07931415f9c\") "
Mar 09 09:26:52 crc kubenswrapper[4861]: I0309 09:26:52.153470 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4599154b-2118-461d-9999-d07931415f9c-config-data\") pod \"4599154b-2118-461d-9999-d07931415f9c\" (UID: \"4599154b-2118-461d-9999-d07931415f9c\") "
Mar 09 09:26:52 crc kubenswrapper[4861]: I0309 09:26:52.161779 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4599154b-2118-461d-9999-d07931415f9c-scripts" (OuterVolumeSpecName: "scripts") pod "4599154b-2118-461d-9999-d07931415f9c" (UID: "4599154b-2118-461d-9999-d07931415f9c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:26:52 crc kubenswrapper[4861]: I0309 09:26:52.167493 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4599154b-2118-461d-9999-d07931415f9c-kube-api-access-cf5jx" (OuterVolumeSpecName: "kube-api-access-cf5jx") pod "4599154b-2118-461d-9999-d07931415f9c" (UID: "4599154b-2118-461d-9999-d07931415f9c"). InnerVolumeSpecName "kube-api-access-cf5jx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:26:52 crc kubenswrapper[4861]: I0309 09:26:52.199165 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4599154b-2118-461d-9999-d07931415f9c-config-data" (OuterVolumeSpecName: "config-data") pod "4599154b-2118-461d-9999-d07931415f9c" (UID: "4599154b-2118-461d-9999-d07931415f9c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:26:52 crc kubenswrapper[4861]: I0309 09:26:52.254591 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4599154b-2118-461d-9999-d07931415f9c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4599154b-2118-461d-9999-d07931415f9c" (UID: "4599154b-2118-461d-9999-d07931415f9c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:26:52 crc kubenswrapper[4861]: I0309 09:26:52.256356 4861 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4599154b-2118-461d-9999-d07931415f9c-scripts\") on node \"crc\" DevicePath \"\""
Mar 09 09:26:52 crc kubenswrapper[4861]: I0309 09:26:52.256406 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cf5jx\" (UniqueName: \"kubernetes.io/projected/4599154b-2118-461d-9999-d07931415f9c-kube-api-access-cf5jx\") on node \"crc\" DevicePath \"\""
Mar 09 09:26:52 crc kubenswrapper[4861]: I0309 09:26:52.256424 4861 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4599154b-2118-461d-9999-d07931415f9c-config-data\") on node \"crc\" DevicePath \"\""
Mar 09 09:26:52 crc kubenswrapper[4861]: I0309 09:26:52.256435 4861 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4599154b-2118-461d-9999-d07931415f9c-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 09 09:26:52 crc kubenswrapper[4861]: I0309 09:26:52.628887 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-6zv4b"
Mar 09 09:26:52 crc kubenswrapper[4861]: I0309 09:26:52.628885 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-6zv4b" event={"ID":"4599154b-2118-461d-9999-d07931415f9c","Type":"ContainerDied","Data":"8506cffe2da3cf31dfc1994ad585554c3e0566cb785d3007e1ef35e9b324a0cd"}
Mar 09 09:26:52 crc kubenswrapper[4861]: I0309 09:26:52.629421 4861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8506cffe2da3cf31dfc1994ad585554c3e0566cb785d3007e1ef35e9b324a0cd"
Mar 09 09:26:52 crc kubenswrapper[4861]: I0309 09:26:52.632474 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"edbe6993-7926-4a78-9227-b7c85af7ec66","Type":"ContainerStarted","Data":"7f801cb67a90ced05cbc62eaecaa06f1ab05d667be9d41dd2eb4d9888ebafdea"}
Mar 09 09:26:52 crc kubenswrapper[4861]: I0309 09:26:52.632533 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Mar 09 09:26:52 crc kubenswrapper[4861]: I0309 09:26:52.729230 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.278126302 podStartE2EDuration="7.729205778s" podCreationTimestamp="2026-03-09 09:26:45 +0000 UTC" firstStartedPulling="2026-03-09 09:26:46.668096453 +0000 UTC m=+1249.753135854" lastFinishedPulling="2026-03-09 09:26:52.119175929 +0000 UTC m=+1255.204215330" observedRunningTime="2026-03-09 09:26:52.664055285 +0000 UTC m=+1255.749094696" watchObservedRunningTime="2026-03-09 09:26:52.729205778 +0000 UTC m=+1255.814245179"
Mar 09 09:26:52 crc kubenswrapper[4861]: I0309 09:26:52.743274 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"]
Mar 09 09:26:52 crc kubenswrapper[4861]: E0309 09:26:52.743786 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4599154b-2118-461d-9999-d07931415f9c" containerName="nova-cell0-conductor-db-sync"
Mar 09 09:26:52 crc kubenswrapper[4861]: I0309 09:26:52.743812 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="4599154b-2118-461d-9999-d07931415f9c" containerName="nova-cell0-conductor-db-sync"
Mar 09 09:26:52 crc kubenswrapper[4861]: I0309 09:26:52.744084 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="4599154b-2118-461d-9999-d07931415f9c" containerName="nova-cell0-conductor-db-sync"
Mar 09 09:26:52 crc kubenswrapper[4861]: I0309 09:26:52.744895 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Mar 09 09:26:52 crc kubenswrapper[4861]: I0309 09:26:52.747575 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data"
Mar 09 09:26:52 crc kubenswrapper[4861]: I0309 09:26:52.747819 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-g64g6"
Mar 09 09:26:52 crc kubenswrapper[4861]: I0309 09:26:52.760994 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Mar 09 09:26:52 crc kubenswrapper[4861]: I0309 09:26:52.868833 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9hdg\" (UniqueName: \"kubernetes.io/projected/fc47e276-b337-4696-ac08-1aa31c4b6864-kube-api-access-z9hdg\") pod \"nova-cell0-conductor-0\" (UID: \"fc47e276-b337-4696-ac08-1aa31c4b6864\") " pod="openstack/nova-cell0-conductor-0"
Mar 09 09:26:52 crc kubenswrapper[4861]: I0309 09:26:52.868932 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc47e276-b337-4696-ac08-1aa31c4b6864-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"fc47e276-b337-4696-ac08-1aa31c4b6864\") " pod="openstack/nova-cell0-conductor-0"
Mar 09 09:26:52 crc kubenswrapper[4861]: I0309 09:26:52.869066 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc47e276-b337-4696-ac08-1aa31c4b6864-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"fc47e276-b337-4696-ac08-1aa31c4b6864\") " pod="openstack/nova-cell0-conductor-0"
Mar 09 09:26:52 crc kubenswrapper[4861]: I0309 09:26:52.970778 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9hdg\" (UniqueName: \"kubernetes.io/projected/fc47e276-b337-4696-ac08-1aa31c4b6864-kube-api-access-z9hdg\") pod \"nova-cell0-conductor-0\" (UID: \"fc47e276-b337-4696-ac08-1aa31c4b6864\") " pod="openstack/nova-cell0-conductor-0"
Mar 09 09:26:52 crc kubenswrapper[4861]: I0309 09:26:52.970899 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc47e276-b337-4696-ac08-1aa31c4b6864-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"fc47e276-b337-4696-ac08-1aa31c4b6864\") " pod="openstack/nova-cell0-conductor-0"
Mar 09 09:26:52 crc kubenswrapper[4861]: I0309 09:26:52.971009 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc47e276-b337-4696-ac08-1aa31c4b6864-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"fc47e276-b337-4696-ac08-1aa31c4b6864\") " pod="openstack/nova-cell0-conductor-0"
Mar 09 09:26:52 crc kubenswrapper[4861]: I0309 09:26:52.979208 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc47e276-b337-4696-ac08-1aa31c4b6864-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"fc47e276-b337-4696-ac08-1aa31c4b6864\") " pod="openstack/nova-cell0-conductor-0"
Mar 09 09:26:52 crc kubenswrapper[4861]: I0309 09:26:52.979848 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc47e276-b337-4696-ac08-1aa31c4b6864-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"fc47e276-b337-4696-ac08-1aa31c4b6864\") " pod="openstack/nova-cell0-conductor-0"
Mar 09 09:26:52 crc kubenswrapper[4861]: I0309 09:26:52.994980 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9hdg\" (UniqueName: \"kubernetes.io/projected/fc47e276-b337-4696-ac08-1aa31c4b6864-kube-api-access-z9hdg\") pod \"nova-cell0-conductor-0\" (UID: \"fc47e276-b337-4696-ac08-1aa31c4b6864\") " pod="openstack/nova-cell0-conductor-0"
Mar 09 09:26:53 crc kubenswrapper[4861]: I0309 09:26:53.074125 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Mar 09 09:26:53 crc kubenswrapper[4861]: I0309 09:26:53.601768 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Mar 09 09:26:53 crc kubenswrapper[4861]: I0309 09:26:53.674566 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"fc47e276-b337-4696-ac08-1aa31c4b6864","Type":"ContainerStarted","Data":"0066be87c75e200e8161274f81788e6af59e3d5b3a1e8b50e20cd6322369770e"}
Mar 09 09:26:54 crc kubenswrapper[4861]: I0309 09:26:54.670591 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"fc47e276-b337-4696-ac08-1aa31c4b6864","Type":"ContainerStarted","Data":"726ec7c3429158f9f5fde29de73d3b1092000684dab6f128a116676c59205d4e"}
Mar 09 09:26:54 crc kubenswrapper[4861]: I0309 09:26:54.672136 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0"
Mar 09 09:26:54 crc kubenswrapper[4861]: I0309 09:26:54.692170 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.692147448 podStartE2EDuration="2.692147448s" podCreationTimestamp="2026-03-09 09:26:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:26:54.68592879 +0000 UTC m=+1257.770968211" watchObservedRunningTime="2026-03-09 09:26:54.692147448 +0000 UTC m=+1257.777186849"
Mar 09 09:26:57 crc kubenswrapper[4861]: E0309 09:26:57.740742 4861 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod48fbe4a1_81ab_4a46_8150_821bc8afa220.slice/crio-3cace9b3826297e13b51f776c4a63465d7f30becfc9c4dce3918a85756fc792e\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod48fbe4a1_81ab_4a46_8150_821bc8afa220.slice\": RecentStats: unable to find data in memory cache]"
Mar 09 09:26:58 crc kubenswrapper[4861]: I0309 09:26:58.105754 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0"
Mar 09 09:26:58 crc kubenswrapper[4861]: I0309 09:26:58.595675 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-sbjdd"]
Mar 09 09:26:58 crc kubenswrapper[4861]: I0309 09:26:58.597365 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-sbjdd"
Mar 09 09:26:58 crc kubenswrapper[4861]: I0309 09:26:58.600501 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts"
Mar 09 09:26:58 crc kubenswrapper[4861]: I0309 09:26:58.601562 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data"
Mar 09 09:26:58 crc kubenswrapper[4861]: I0309 09:26:58.619662 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-sbjdd"]
Mar 09 09:26:58 crc kubenswrapper[4861]: I0309 09:26:58.704514 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/17803e89-5e7e-4d37-b96f-53e26da13fc2-scripts\") pod \"nova-cell0-cell-mapping-sbjdd\" (UID: \"17803e89-5e7e-4d37-b96f-53e26da13fc2\") " pod="openstack/nova-cell0-cell-mapping-sbjdd"
Mar 09 09:26:58 crc kubenswrapper[4861]: I0309 09:26:58.704575 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49jct\" (UniqueName: \"kubernetes.io/projected/17803e89-5e7e-4d37-b96f-53e26da13fc2-kube-api-access-49jct\") pod \"nova-cell0-cell-mapping-sbjdd\" (UID: \"17803e89-5e7e-4d37-b96f-53e26da13fc2\") " pod="openstack/nova-cell0-cell-mapping-sbjdd"
Mar 09 09:26:58 crc kubenswrapper[4861]: I0309 09:26:58.704604 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17803e89-5e7e-4d37-b96f-53e26da13fc2-config-data\") pod \"nova-cell0-cell-mapping-sbjdd\" (UID: \"17803e89-5e7e-4d37-b96f-53e26da13fc2\") " pod="openstack/nova-cell0-cell-mapping-sbjdd"
Mar 09 09:26:58 crc kubenswrapper[4861]: I0309 09:26:58.704661 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17803e89-5e7e-4d37-b96f-53e26da13fc2-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-sbjdd\" (UID: \"17803e89-5e7e-4d37-b96f-53e26da13fc2\") " pod="openstack/nova-cell0-cell-mapping-sbjdd"
Mar 09 09:26:58 crc kubenswrapper[4861]: I0309 09:26:58.806690 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/17803e89-5e7e-4d37-b96f-53e26da13fc2-scripts\") pod \"nova-cell0-cell-mapping-sbjdd\" (UID: \"17803e89-5e7e-4d37-b96f-53e26da13fc2\") " pod="openstack/nova-cell0-cell-mapping-sbjdd"
Mar 09 09:26:58 crc kubenswrapper[4861]: I0309 09:26:58.806741 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49jct\" (UniqueName: \"kubernetes.io/projected/17803e89-5e7e-4d37-b96f-53e26da13fc2-kube-api-access-49jct\") pod \"nova-cell0-cell-mapping-sbjdd\" (UID: \"17803e89-5e7e-4d37-b96f-53e26da13fc2\") " pod="openstack/nova-cell0-cell-mapping-sbjdd"
Mar 09 09:26:58 crc kubenswrapper[4861]: I0309 09:26:58.806770 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17803e89-5e7e-4d37-b96f-53e26da13fc2-config-data\") pod \"nova-cell0-cell-mapping-sbjdd\" (UID: \"17803e89-5e7e-4d37-b96f-53e26da13fc2\") " pod="openstack/nova-cell0-cell-mapping-sbjdd"
Mar 09 09:26:58 crc kubenswrapper[4861]: I0309 09:26:58.806841 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17803e89-5e7e-4d37-b96f-53e26da13fc2-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-sbjdd\" (UID: \"17803e89-5e7e-4d37-b96f-53e26da13fc2\") " pod="openstack/nova-cell0-cell-mapping-sbjdd"
Mar 09 09:26:58 crc kubenswrapper[4861]: I0309 09:26:58.818356 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/17803e89-5e7e-4d37-b96f-53e26da13fc2-scripts\") pod \"nova-cell0-cell-mapping-sbjdd\" (UID: \"17803e89-5e7e-4d37-b96f-53e26da13fc2\") " pod="openstack/nova-cell0-cell-mapping-sbjdd"
Mar 09 09:26:58 crc kubenswrapper[4861]: I0309 09:26:58.824025 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17803e89-5e7e-4d37-b96f-53e26da13fc2-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-sbjdd\" (UID: \"17803e89-5e7e-4d37-b96f-53e26da13fc2\") " pod="openstack/nova-cell0-cell-mapping-sbjdd"
Mar 09 09:26:58 crc kubenswrapper[4861]: I0309 09:26:58.830310 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17803e89-5e7e-4d37-b96f-53e26da13fc2-config-data\") pod \"nova-cell0-cell-mapping-sbjdd\" (UID: \"17803e89-5e7e-4d37-b96f-53e26da13fc2\") " pod="openstack/nova-cell0-cell-mapping-sbjdd"
Mar 09 09:26:58 crc kubenswrapper[4861]: I0309 09:26:58.861403 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49jct\" (UniqueName: \"kubernetes.io/projected/17803e89-5e7e-4d37-b96f-53e26da13fc2-kube-api-access-49jct\") pod \"nova-cell0-cell-mapping-sbjdd\" (UID: \"17803e89-5e7e-4d37-b96f-53e26da13fc2\") " pod="openstack/nova-cell0-cell-mapping-sbjdd"
Mar 09 09:26:58 crc kubenswrapper[4861]: I0309 09:26:58.923243 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-sbjdd"
Mar 09 09:26:58 crc kubenswrapper[4861]: I0309 09:26:58.932112 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Mar 09 09:26:58 crc kubenswrapper[4861]: I0309 09:26:58.933613 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 09 09:26:58 crc kubenswrapper[4861]: I0309 09:26:58.940585 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Mar 09 09:26:58 crc kubenswrapper[4861]: I0309 09:26:58.968318 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 09 09:26:58 crc kubenswrapper[4861]: I0309 09:26:58.991246 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Mar 09 09:26:58 crc kubenswrapper[4861]: I0309 09:26:58.993145 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 09 09:26:58 crc kubenswrapper[4861]: I0309 09:26:58.996309 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Mar 09 09:26:59 crc kubenswrapper[4861]: I0309 09:26:59.013666 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07b7fc52-e101-4d6d-bb35-c460aba044df-config-data\") pod \"nova-api-0\" (UID: \"07b7fc52-e101-4d6d-bb35-c460aba044df\") " pod="openstack/nova-api-0"
Mar 09 09:26:59 crc kubenswrapper[4861]: I0309 09:26:59.013742 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07b7fc52-e101-4d6d-bb35-c460aba044df-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"07b7fc52-e101-4d6d-bb35-c460aba044df\") " pod="openstack/nova-api-0"
Mar 09 09:26:59 crc kubenswrapper[4861]: I0309 09:26:59.013793 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n26zs\" (UniqueName: \"kubernetes.io/projected/07b7fc52-e101-4d6d-bb35-c460aba044df-kube-api-access-n26zs\") pod \"nova-api-0\" (UID: \"07b7fc52-e101-4d6d-bb35-c460aba044df\") " pod="openstack/nova-api-0"
Mar 09 09:26:59 crc kubenswrapper[4861]: I0309 09:26:59.013902 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/07b7fc52-e101-4d6d-bb35-c460aba044df-logs\") pod \"nova-api-0\" (UID: \"07b7fc52-e101-4d6d-bb35-c460aba044df\") " pod="openstack/nova-api-0"
Mar 09 09:26:59 crc kubenswrapper[4861]: I0309 09:26:59.031214 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 09 09:26:59 crc kubenswrapper[4861]: I0309 09:26:59.065997 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 09 09:26:59 crc kubenswrapper[4861]: I0309 09:26:59.079820 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Mar 09 09:26:59 crc kubenswrapper[4861]: I0309 09:26:59.086828 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data"
Mar 09 09:26:59 crc kubenswrapper[4861]: I0309 09:26:59.117660 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/07b7fc52-e101-4d6d-bb35-c460aba044df-logs\") pod \"nova-api-0\" (UID: \"07b7fc52-e101-4d6d-bb35-c460aba044df\") " pod="openstack/nova-api-0"
Mar 09 09:26:59 crc kubenswrapper[4861]: I0309 09:26:59.117750 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/486e3d0a-4fe7-4831-bded-2b16583f0498-config-data\") pod \"nova-metadata-0\" (UID: \"486e3d0a-4fe7-4831-bded-2b16583f0498\") " pod="openstack/nova-metadata-0"
Mar 09 09:26:59 crc kubenswrapper[4861]: I0309 09:26:59.117790 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6szbc\" (UniqueName: \"kubernetes.io/projected/486e3d0a-4fe7-4831-bded-2b16583f0498-kube-api-access-6szbc\") pod \"nova-metadata-0\" (UID: \"486e3d0a-4fe7-4831-bded-2b16583f0498\") " pod="openstack/nova-metadata-0"
Mar 09 09:26:59 crc kubenswrapper[4861]: I0309 09:26:59.117837 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/486e3d0a-4fe7-4831-bded-2b16583f0498-logs\") pod \"nova-metadata-0\" (UID: \"486e3d0a-4fe7-4831-bded-2b16583f0498\") " pod="openstack/nova-metadata-0"
Mar 09 09:26:59 crc kubenswrapper[4861]: I0309 09:26:59.117866 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07b7fc52-e101-4d6d-bb35-c460aba044df-config-data\") pod \"nova-api-0\" (UID: \"07b7fc52-e101-4d6d-bb35-c460aba044df\") " pod="openstack/nova-api-0"
Mar 09 09:26:59 crc kubenswrapper[4861]: I0309 09:26:59.117906 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/486e3d0a-4fe7-4831-bded-2b16583f0498-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"486e3d0a-4fe7-4831-bded-2b16583f0498\") " pod="openstack/nova-metadata-0"
Mar 09 09:26:59 crc kubenswrapper[4861]: I0309 09:26:59.117939 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07b7fc52-e101-4d6d-bb35-c460aba044df-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"07b7fc52-e101-4d6d-bb35-c460aba044df\") " pod="openstack/nova-api-0"
Mar 09 09:26:59 crc kubenswrapper[4861]: I0309 09:26:59.118006 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n26zs\" (UniqueName: \"kubernetes.io/projected/07b7fc52-e101-4d6d-bb35-c460aba044df-kube-api-access-n26zs\") pod \"nova-api-0\" (UID: \"07b7fc52-e101-4d6d-bb35-c460aba044df\") " pod="openstack/nova-api-0"
Mar 09 09:26:59 crc kubenswrapper[4861]: I0309 09:26:59.118833 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/07b7fc52-e101-4d6d-bb35-c460aba044df-logs\") pod \"nova-api-0\" (UID: \"07b7fc52-e101-4d6d-bb35-c460aba044df\") " pod="openstack/nova-api-0"
Mar 09 09:26:59 crc kubenswrapper[4861]: I0309 09:26:59.130062 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07b7fc52-e101-4d6d-bb35-c460aba044df-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"07b7fc52-e101-4d6d-bb35-c460aba044df\") " pod="openstack/nova-api-0"
Mar 09 09:26:59 crc kubenswrapper[4861]: I0309 09:26:59.132460 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Mar 09 09:26:59 crc kubenswrapper[4861]: I0309 09:26:59.160095 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n26zs\" (UniqueName: \"kubernetes.io/projected/07b7fc52-e101-4d6d-bb35-c460aba044df-kube-api-access-n26zs\") pod \"nova-api-0\" (UID: \"07b7fc52-e101-4d6d-bb35-c460aba044df\") " pod="openstack/nova-api-0"
Mar 09 09:26:59 crc kubenswrapper[4861]: I0309 09:26:59.162185 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07b7fc52-e101-4d6d-bb35-c460aba044df-config-data\") pod \"nova-api-0\" (UID: \"07b7fc52-e101-4d6d-bb35-c460aba044df\") " pod="openstack/nova-api-0"
Mar 09 09:26:59 crc kubenswrapper[4861]: I0309 09:26:59.162301 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 09 09:26:59 crc kubenswrapper[4861]: I0309 09:26:59.219712 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05ca7d41-820a-463e-8129-a58a6b79542b-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"05ca7d41-820a-463e-8129-a58a6b79542b\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 09 09:26:59 crc kubenswrapper[4861]: I0309 09:26:59.220071 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d24ts\" (UniqueName: \"kubernetes.io/projected/05ca7d41-820a-463e-8129-a58a6b79542b-kube-api-access-d24ts\") pod \"nova-cell1-novncproxy-0\" (UID: \"05ca7d41-820a-463e-8129-a58a6b79542b\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 09 09:26:59 crc kubenswrapper[4861]: I0309 09:26:59.220145 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05ca7d41-820a-463e-8129-a58a6b79542b-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"05ca7d41-820a-463e-8129-a58a6b79542b\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 09 09:26:59 crc kubenswrapper[4861]: I0309 09:26:59.220179 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/486e3d0a-4fe7-4831-bded-2b16583f0498-config-data\") pod \"nova-metadata-0\" (UID: \"486e3d0a-4fe7-4831-bded-2b16583f0498\") " pod="openstack/nova-metadata-0"
Mar 09 09:26:59 crc kubenswrapper[4861]: I0309 09:26:59.220206 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6szbc\" (UniqueName: \"kubernetes.io/projected/486e3d0a-4fe7-4831-bded-2b16583f0498-kube-api-access-6szbc\") pod \"nova-metadata-0\" (UID: \"486e3d0a-4fe7-4831-bded-2b16583f0498\") " pod="openstack/nova-metadata-0"
Mar 09 09:26:59 crc kubenswrapper[4861]: I0309 09:26:59.220245 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/486e3d0a-4fe7-4831-bded-2b16583f0498-logs\") pod \"nova-metadata-0\" (UID: \"486e3d0a-4fe7-4831-bded-2b16583f0498\") " pod="openstack/nova-metadata-0"
Mar 09 09:26:59 crc kubenswrapper[4861]: I0309 09:26:59.220295 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/486e3d0a-4fe7-4831-bded-2b16583f0498-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"486e3d0a-4fe7-4831-bded-2b16583f0498\") " pod="openstack/nova-metadata-0"
Mar 09 09:26:59 crc kubenswrapper[4861]: I0309 09:26:59.222514 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Mar 09 09:26:59 crc kubenswrapper[4861]: I0309 09:26:59.222721 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/486e3d0a-4fe7-4831-bded-2b16583f0498-logs\") pod \"nova-metadata-0\" (UID: \"486e3d0a-4fe7-4831-bded-2b16583f0498\") " pod="openstack/nova-metadata-0"
Mar 09 09:26:59 crc kubenswrapper[4861]: I0309 09:26:59.223572 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/486e3d0a-4fe7-4831-bded-2b16583f0498-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"486e3d0a-4fe7-4831-bded-2b16583f0498\") " pod="openstack/nova-metadata-0"
Mar 09 09:26:59 crc kubenswrapper[4861]: I0309 09:26:59.223835 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 09 09:26:59 crc kubenswrapper[4861]: I0309 09:26:59.225912 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Mar 09 09:26:59 crc kubenswrapper[4861]: I0309 09:26:59.279424 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 09 09:26:59 crc kubenswrapper[4861]: I0309 09:26:59.284560 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7bd5679c8c-2ns4s"]
Mar 09 09:26:59 crc kubenswrapper[4861]: I0309 09:26:59.285025 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Mar 09 09:26:59 crc kubenswrapper[4861]: I0309 09:26:59.306693 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7bd5679c8c-2ns4s"
Mar 09 09:26:59 crc kubenswrapper[4861]: I0309 09:26:59.342338 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6szbc\" (UniqueName: \"kubernetes.io/projected/486e3d0a-4fe7-4831-bded-2b16583f0498-kube-api-access-6szbc\") pod \"nova-metadata-0\" (UID: \"486e3d0a-4fe7-4831-bded-2b16583f0498\") " pod="openstack/nova-metadata-0"
Mar 09 09:26:59 crc kubenswrapper[4861]: I0309 09:26:59.344729 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/393b4059-d8bc-4bc2-ad01-3b609472c649-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"393b4059-d8bc-4bc2-ad01-3b609472c649\") " pod="openstack/nova-scheduler-0"
Mar 09 09:26:59 crc kubenswrapper[4861]: I0309 09:26:59.344919 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/393b4059-d8bc-4bc2-ad01-3b609472c649-config-data\") pod \"nova-scheduler-0\" (UID: \"393b4059-d8bc-4bc2-ad01-3b609472c649\") " pod="openstack/nova-scheduler-0"
Mar 09 09:26:59 crc kubenswrapper[4861]: I0309 09:26:59.345063 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05ca7d41-820a-463e-8129-a58a6b79542b-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"05ca7d41-820a-463e-8129-a58a6b79542b\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 09 09:26:59 crc kubenswrapper[4861]: I0309 09:26:59.345241 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d24ts\" (UniqueName: \"kubernetes.io/projected/05ca7d41-820a-463e-8129-a58a6b79542b-kube-api-access-d24ts\") pod \"nova-cell1-novncproxy-0\" (UID: \"05ca7d41-820a-463e-8129-a58a6b79542b\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 09 09:26:59 crc kubenswrapper[4861]: I0309 09:26:59.345420 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05ca7d41-820a-463e-8129-a58a6b79542b-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"05ca7d41-820a-463e-8129-a58a6b79542b\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 09 09:26:59 crc kubenswrapper[4861]: I0309 09:26:59.345967 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhmqv\" (UniqueName: \"kubernetes.io/projected/393b4059-d8bc-4bc2-ad01-3b609472c649-kube-api-access-hhmqv\") pod \"nova-scheduler-0\" (UID: \"393b4059-d8bc-4bc2-ad01-3b609472c649\") " pod="openstack/nova-scheduler-0"
Mar 09 09:26:59 crc kubenswrapper[4861]: I0309 09:26:59.349970 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/486e3d0a-4fe7-4831-bded-2b16583f0498-config-data\") pod \"nova-metadata-0\" (UID: \"486e3d0a-4fe7-4831-bded-2b16583f0498\") " pod="openstack/nova-metadata-0"
Mar 09 09:26:59 crc kubenswrapper[4861]: I0309 09:26:59.352840 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 09 09:26:59 crc kubenswrapper[4861]: I0309 09:26:59.368659 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7bd5679c8c-2ns4s"]
Mar 09 09:26:59 crc kubenswrapper[4861]: I0309 09:26:59.371882 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05ca7d41-820a-463e-8129-a58a6b79542b-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"05ca7d41-820a-463e-8129-a58a6b79542b\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 09 09:26:59 crc kubenswrapper[4861]: I0309 09:26:59.373128 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d24ts\" (UniqueName: \"kubernetes.io/projected/05ca7d41-820a-463e-8129-a58a6b79542b-kube-api-access-d24ts\") pod \"nova-cell1-novncproxy-0\" (UID: \"05ca7d41-820a-463e-8129-a58a6b79542b\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 09 09:26:59 crc kubenswrapper[4861]: I0309 09:26:59.383544 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05ca7d41-820a-463e-8129-a58a6b79542b-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"05ca7d41-820a-463e-8129-a58a6b79542b\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 09 09:26:59 crc kubenswrapper[4861]: I0309 09:26:59.398552 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Mar 09 09:26:59 crc kubenswrapper[4861]: I0309 09:26:59.453075 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/71107669-f5ae-4df6-a694-a643153ad6f4-dns-svc\") pod \"dnsmasq-dns-7bd5679c8c-2ns4s\" (UID: \"71107669-f5ae-4df6-a694-a643153ad6f4\") " pod="openstack/dnsmasq-dns-7bd5679c8c-2ns4s"
Mar 09 09:26:59 crc kubenswrapper[4861]: I0309 09:26:59.455050 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/393b4059-d8bc-4bc2-ad01-3b609472c649-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"393b4059-d8bc-4bc2-ad01-3b609472c649\") " pod="openstack/nova-scheduler-0"
Mar 09 09:26:59 crc kubenswrapper[4861]: I0309 09:26:59.455137 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/393b4059-d8bc-4bc2-ad01-3b609472c649-config-data\") pod \"nova-scheduler-0\" (UID: \"393b4059-d8bc-4bc2-ad01-3b609472c649\") " pod="openstack/nova-scheduler-0"
Mar 09 09:26:59 crc kubenswrapper[4861]: I0309 09:26:59.455207 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-668hf\" (UniqueName: \"kubernetes.io/projected/71107669-f5ae-4df6-a694-a643153ad6f4-kube-api-access-668hf\") pod \"dnsmasq-dns-7bd5679c8c-2ns4s\" (UID: \"71107669-f5ae-4df6-a694-a643153ad6f4\") " pod="openstack/dnsmasq-dns-7bd5679c8c-2ns4s"
Mar 09 09:26:59 crc kubenswrapper[4861]: I0309 09:26:59.455323 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/71107669-f5ae-4df6-a694-a643153ad6f4-ovsdbserver-sb\") pod \"dnsmasq-dns-7bd5679c8c-2ns4s\" (UID: \"71107669-f5ae-4df6-a694-a643153ad6f4\") " pod="openstack/dnsmasq-dns-7bd5679c8c-2ns4s"
Mar 09 09:26:59 crc kubenswrapper[4861]: I0309 09:26:59.455399 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/71107669-f5ae-4df6-a694-a643153ad6f4-dns-swift-storage-0\") pod \"dnsmasq-dns-7bd5679c8c-2ns4s\" (UID: \"71107669-f5ae-4df6-a694-a643153ad6f4\") " pod="openstack/dnsmasq-dns-7bd5679c8c-2ns4s"
Mar 09 09:26:59 crc kubenswrapper[4861]: I0309 09:26:59.455494 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71107669-f5ae-4df6-a694-a643153ad6f4-config\") pod \"dnsmasq-dns-7bd5679c8c-2ns4s\" (UID: \"71107669-f5ae-4df6-a694-a643153ad6f4\") " pod="openstack/dnsmasq-dns-7bd5679c8c-2ns4s"
Mar 09 09:26:59 crc kubenswrapper[4861]: I0309 09:26:59.455614 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/71107669-f5ae-4df6-a694-a643153ad6f4-ovsdbserver-nb\") pod \"dnsmasq-dns-7bd5679c8c-2ns4s\" (UID: \"71107669-f5ae-4df6-a694-a643153ad6f4\") " pod="openstack/dnsmasq-dns-7bd5679c8c-2ns4s"
Mar 09 09:26:59 crc kubenswrapper[4861]: I0309 09:26:59.455725 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hhmqv\" (UniqueName: \"kubernetes.io/projected/393b4059-d8bc-4bc2-ad01-3b609472c649-kube-api-access-hhmqv\") pod \"nova-scheduler-0\" (UID: \"393b4059-d8bc-4bc2-ad01-3b609472c649\") " pod="openstack/nova-scheduler-0"
Mar 09 09:26:59 crc kubenswrapper[4861]: I0309 09:26:59.479226 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/393b4059-d8bc-4bc2-ad01-3b609472c649-config-data\") pod \"nova-scheduler-0\" (UID: \"393b4059-d8bc-4bc2-ad01-3b609472c649\") " pod="openstack/nova-scheduler-0"
Mar 09
09:26:59 crc kubenswrapper[4861]: I0309 09:26:59.480140 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhmqv\" (UniqueName: \"kubernetes.io/projected/393b4059-d8bc-4bc2-ad01-3b609472c649-kube-api-access-hhmqv\") pod \"nova-scheduler-0\" (UID: \"393b4059-d8bc-4bc2-ad01-3b609472c649\") " pod="openstack/nova-scheduler-0" Mar 09 09:26:59 crc kubenswrapper[4861]: I0309 09:26:59.482043 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/393b4059-d8bc-4bc2-ad01-3b609472c649-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"393b4059-d8bc-4bc2-ad01-3b609472c649\") " pod="openstack/nova-scheduler-0" Mar 09 09:26:59 crc kubenswrapper[4861]: I0309 09:26:59.557501 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/71107669-f5ae-4df6-a694-a643153ad6f4-ovsdbserver-nb\") pod \"dnsmasq-dns-7bd5679c8c-2ns4s\" (UID: \"71107669-f5ae-4df6-a694-a643153ad6f4\") " pod="openstack/dnsmasq-dns-7bd5679c8c-2ns4s" Mar 09 09:26:59 crc kubenswrapper[4861]: I0309 09:26:59.557649 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/71107669-f5ae-4df6-a694-a643153ad6f4-dns-svc\") pod \"dnsmasq-dns-7bd5679c8c-2ns4s\" (UID: \"71107669-f5ae-4df6-a694-a643153ad6f4\") " pod="openstack/dnsmasq-dns-7bd5679c8c-2ns4s" Mar 09 09:26:59 crc kubenswrapper[4861]: I0309 09:26:59.557764 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-668hf\" (UniqueName: \"kubernetes.io/projected/71107669-f5ae-4df6-a694-a643153ad6f4-kube-api-access-668hf\") pod \"dnsmasq-dns-7bd5679c8c-2ns4s\" (UID: \"71107669-f5ae-4df6-a694-a643153ad6f4\") " pod="openstack/dnsmasq-dns-7bd5679c8c-2ns4s" Mar 09 09:26:59 crc kubenswrapper[4861]: I0309 09:26:59.557828 4861 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/71107669-f5ae-4df6-a694-a643153ad6f4-ovsdbserver-sb\") pod \"dnsmasq-dns-7bd5679c8c-2ns4s\" (UID: \"71107669-f5ae-4df6-a694-a643153ad6f4\") " pod="openstack/dnsmasq-dns-7bd5679c8c-2ns4s" Mar 09 09:26:59 crc kubenswrapper[4861]: I0309 09:26:59.557884 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/71107669-f5ae-4df6-a694-a643153ad6f4-dns-swift-storage-0\") pod \"dnsmasq-dns-7bd5679c8c-2ns4s\" (UID: \"71107669-f5ae-4df6-a694-a643153ad6f4\") " pod="openstack/dnsmasq-dns-7bd5679c8c-2ns4s" Mar 09 09:26:59 crc kubenswrapper[4861]: I0309 09:26:59.557943 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71107669-f5ae-4df6-a694-a643153ad6f4-config\") pod \"dnsmasq-dns-7bd5679c8c-2ns4s\" (UID: \"71107669-f5ae-4df6-a694-a643153ad6f4\") " pod="openstack/dnsmasq-dns-7bd5679c8c-2ns4s" Mar 09 09:26:59 crc kubenswrapper[4861]: I0309 09:26:59.559280 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/71107669-f5ae-4df6-a694-a643153ad6f4-ovsdbserver-nb\") pod \"dnsmasq-dns-7bd5679c8c-2ns4s\" (UID: \"71107669-f5ae-4df6-a694-a643153ad6f4\") " pod="openstack/dnsmasq-dns-7bd5679c8c-2ns4s" Mar 09 09:26:59 crc kubenswrapper[4861]: I0309 09:26:59.560546 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/71107669-f5ae-4df6-a694-a643153ad6f4-dns-swift-storage-0\") pod \"dnsmasq-dns-7bd5679c8c-2ns4s\" (UID: \"71107669-f5ae-4df6-a694-a643153ad6f4\") " pod="openstack/dnsmasq-dns-7bd5679c8c-2ns4s" Mar 09 09:26:59 crc kubenswrapper[4861]: I0309 09:26:59.560548 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/71107669-f5ae-4df6-a694-a643153ad6f4-dns-svc\") pod \"dnsmasq-dns-7bd5679c8c-2ns4s\" (UID: \"71107669-f5ae-4df6-a694-a643153ad6f4\") " pod="openstack/dnsmasq-dns-7bd5679c8c-2ns4s" Mar 09 09:26:59 crc kubenswrapper[4861]: I0309 09:26:59.562840 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71107669-f5ae-4df6-a694-a643153ad6f4-config\") pod \"dnsmasq-dns-7bd5679c8c-2ns4s\" (UID: \"71107669-f5ae-4df6-a694-a643153ad6f4\") " pod="openstack/dnsmasq-dns-7bd5679c8c-2ns4s" Mar 09 09:26:59 crc kubenswrapper[4861]: I0309 09:26:59.567306 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/71107669-f5ae-4df6-a694-a643153ad6f4-ovsdbserver-sb\") pod \"dnsmasq-dns-7bd5679c8c-2ns4s\" (UID: \"71107669-f5ae-4df6-a694-a643153ad6f4\") " pod="openstack/dnsmasq-dns-7bd5679c8c-2ns4s" Mar 09 09:26:59 crc kubenswrapper[4861]: I0309 09:26:59.589746 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-668hf\" (UniqueName: \"kubernetes.io/projected/71107669-f5ae-4df6-a694-a643153ad6f4-kube-api-access-668hf\") pod \"dnsmasq-dns-7bd5679c8c-2ns4s\" (UID: \"71107669-f5ae-4df6-a694-a643153ad6f4\") " pod="openstack/dnsmasq-dns-7bd5679c8c-2ns4s" Mar 09 09:26:59 crc kubenswrapper[4861]: I0309 09:26:59.590248 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 09 09:26:59 crc kubenswrapper[4861]: I0309 09:26:59.695006 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 09 09:26:59 crc kubenswrapper[4861]: I0309 09:26:59.731820 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7bd5679c8c-2ns4s" Mar 09 09:26:59 crc kubenswrapper[4861]: I0309 09:26:59.795188 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-sbjdd"] Mar 09 09:27:00 crc kubenswrapper[4861]: I0309 09:27:00.011010 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 09 09:27:00 crc kubenswrapper[4861]: I0309 09:27:00.091064 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-sf294"] Mar 09 09:27:00 crc kubenswrapper[4861]: I0309 09:27:00.092817 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-sf294" Mar 09 09:27:00 crc kubenswrapper[4861]: I0309 09:27:00.096540 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Mar 09 09:27:00 crc kubenswrapper[4861]: I0309 09:27:00.096969 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 09 09:27:00 crc kubenswrapper[4861]: I0309 09:27:00.106780 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-sf294"] Mar 09 09:27:00 crc kubenswrapper[4861]: I0309 09:27:00.117037 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 09 09:27:00 crc kubenswrapper[4861]: I0309 09:27:00.174290 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3c0062b-4a2c-451a-b683-eeeea965a54e-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-sf294\" (UID: \"b3c0062b-4a2c-451a-b683-eeeea965a54e\") " pod="openstack/nova-cell1-conductor-db-sync-sf294" Mar 09 09:27:00 crc kubenswrapper[4861]: I0309 09:27:00.174360 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3c0062b-4a2c-451a-b683-eeeea965a54e-config-data\") pod \"nova-cell1-conductor-db-sync-sf294\" (UID: \"b3c0062b-4a2c-451a-b683-eeeea965a54e\") " pod="openstack/nova-cell1-conductor-db-sync-sf294" Mar 09 09:27:00 crc kubenswrapper[4861]: I0309 09:27:00.174429 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrswl\" (UniqueName: \"kubernetes.io/projected/b3c0062b-4a2c-451a-b683-eeeea965a54e-kube-api-access-mrswl\") pod \"nova-cell1-conductor-db-sync-sf294\" (UID: \"b3c0062b-4a2c-451a-b683-eeeea965a54e\") " pod="openstack/nova-cell1-conductor-db-sync-sf294" Mar 09 09:27:00 crc kubenswrapper[4861]: I0309 09:27:00.174533 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b3c0062b-4a2c-451a-b683-eeeea965a54e-scripts\") pod \"nova-cell1-conductor-db-sync-sf294\" (UID: \"b3c0062b-4a2c-451a-b683-eeeea965a54e\") " pod="openstack/nova-cell1-conductor-db-sync-sf294" Mar 09 09:27:00 crc kubenswrapper[4861]: I0309 09:27:00.276500 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3c0062b-4a2c-451a-b683-eeeea965a54e-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-sf294\" (UID: \"b3c0062b-4a2c-451a-b683-eeeea965a54e\") " pod="openstack/nova-cell1-conductor-db-sync-sf294" Mar 09 09:27:00 crc kubenswrapper[4861]: I0309 09:27:00.276604 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3c0062b-4a2c-451a-b683-eeeea965a54e-config-data\") pod \"nova-cell1-conductor-db-sync-sf294\" (UID: \"b3c0062b-4a2c-451a-b683-eeeea965a54e\") " pod="openstack/nova-cell1-conductor-db-sync-sf294" Mar 09 09:27:00 crc kubenswrapper[4861]: I0309 09:27:00.276643 4861 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-mrswl\" (UniqueName: \"kubernetes.io/projected/b3c0062b-4a2c-451a-b683-eeeea965a54e-kube-api-access-mrswl\") pod \"nova-cell1-conductor-db-sync-sf294\" (UID: \"b3c0062b-4a2c-451a-b683-eeeea965a54e\") " pod="openstack/nova-cell1-conductor-db-sync-sf294" Mar 09 09:27:00 crc kubenswrapper[4861]: I0309 09:27:00.276727 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b3c0062b-4a2c-451a-b683-eeeea965a54e-scripts\") pod \"nova-cell1-conductor-db-sync-sf294\" (UID: \"b3c0062b-4a2c-451a-b683-eeeea965a54e\") " pod="openstack/nova-cell1-conductor-db-sync-sf294" Mar 09 09:27:00 crc kubenswrapper[4861]: I0309 09:27:00.286849 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3c0062b-4a2c-451a-b683-eeeea965a54e-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-sf294\" (UID: \"b3c0062b-4a2c-451a-b683-eeeea965a54e\") " pod="openstack/nova-cell1-conductor-db-sync-sf294" Mar 09 09:27:00 crc kubenswrapper[4861]: I0309 09:27:00.288414 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3c0062b-4a2c-451a-b683-eeeea965a54e-config-data\") pod \"nova-cell1-conductor-db-sync-sf294\" (UID: \"b3c0062b-4a2c-451a-b683-eeeea965a54e\") " pod="openstack/nova-cell1-conductor-db-sync-sf294" Mar 09 09:27:00 crc kubenswrapper[4861]: I0309 09:27:00.290753 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b3c0062b-4a2c-451a-b683-eeeea965a54e-scripts\") pod \"nova-cell1-conductor-db-sync-sf294\" (UID: \"b3c0062b-4a2c-451a-b683-eeeea965a54e\") " pod="openstack/nova-cell1-conductor-db-sync-sf294" Mar 09 09:27:00 crc kubenswrapper[4861]: I0309 09:27:00.292088 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/nova-metadata-0"] Mar 09 09:27:00 crc kubenswrapper[4861]: W0309 09:27:00.293397 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod486e3d0a_4fe7_4831_bded_2b16583f0498.slice/crio-b53940cbe8a147ee6f2b60ddf2aad4bcd879a9ff88f996cb2512845a4dab4a98 WatchSource:0}: Error finding container b53940cbe8a147ee6f2b60ddf2aad4bcd879a9ff88f996cb2512845a4dab4a98: Status 404 returned error can't find the container with id b53940cbe8a147ee6f2b60ddf2aad4bcd879a9ff88f996cb2512845a4dab4a98 Mar 09 09:27:00 crc kubenswrapper[4861]: I0309 09:27:00.294152 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrswl\" (UniqueName: \"kubernetes.io/projected/b3c0062b-4a2c-451a-b683-eeeea965a54e-kube-api-access-mrswl\") pod \"nova-cell1-conductor-db-sync-sf294\" (UID: \"b3c0062b-4a2c-451a-b683-eeeea965a54e\") " pod="openstack/nova-cell1-conductor-db-sync-sf294" Mar 09 09:27:00 crc kubenswrapper[4861]: I0309 09:27:00.456483 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 09 09:27:00 crc kubenswrapper[4861]: W0309 09:27:00.459139 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod393b4059_d8bc_4bc2_ad01_3b609472c649.slice/crio-65d5ee29454dabb5f9aa6cb0ca3d87929c73de1e855ad5bb2f6d20e50ed720bb WatchSource:0}: Error finding container 65d5ee29454dabb5f9aa6cb0ca3d87929c73de1e855ad5bb2f6d20e50ed720bb: Status 404 returned error can't find the container with id 65d5ee29454dabb5f9aa6cb0ca3d87929c73de1e855ad5bb2f6d20e50ed720bb Mar 09 09:27:00 crc kubenswrapper[4861]: I0309 09:27:00.482696 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7bd5679c8c-2ns4s"] Mar 09 09:27:00 crc kubenswrapper[4861]: I0309 09:27:00.482787 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-sf294" Mar 09 09:27:00 crc kubenswrapper[4861]: W0309 09:27:00.484277 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod71107669_f5ae_4df6_a694_a643153ad6f4.slice/crio-be2044ed451a8916fb2304cd4127b6dd157a5c7ee4195e491d44ccfa810879fe WatchSource:0}: Error finding container be2044ed451a8916fb2304cd4127b6dd157a5c7ee4195e491d44ccfa810879fe: Status 404 returned error can't find the container with id be2044ed451a8916fb2304cd4127b6dd157a5c7ee4195e491d44ccfa810879fe Mar 09 09:27:00 crc kubenswrapper[4861]: I0309 09:27:00.748239 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"393b4059-d8bc-4bc2-ad01-3b609472c649","Type":"ContainerStarted","Data":"65d5ee29454dabb5f9aa6cb0ca3d87929c73de1e855ad5bb2f6d20e50ed720bb"} Mar 09 09:27:00 crc kubenswrapper[4861]: I0309 09:27:00.754189 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-sbjdd" event={"ID":"17803e89-5e7e-4d37-b96f-53e26da13fc2","Type":"ContainerStarted","Data":"be077af5cfad01301f03211be785b590aa18994c7c2f198e213cf684d795b731"} Mar 09 09:27:00 crc kubenswrapper[4861]: I0309 09:27:00.754242 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-sbjdd" event={"ID":"17803e89-5e7e-4d37-b96f-53e26da13fc2","Type":"ContainerStarted","Data":"9b060681be8b6f731ff1d66ab772799a3d2b6174904f2ac77bf89e2b87a3ac4e"} Mar 09 09:27:00 crc kubenswrapper[4861]: I0309 09:27:00.757418 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bd5679c8c-2ns4s" event={"ID":"71107669-f5ae-4df6-a694-a643153ad6f4","Type":"ContainerStarted","Data":"be2044ed451a8916fb2304cd4127b6dd157a5c7ee4195e491d44ccfa810879fe"} Mar 09 09:27:00 crc kubenswrapper[4861]: I0309 09:27:00.759106 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell1-novncproxy-0" event={"ID":"05ca7d41-820a-463e-8129-a58a6b79542b","Type":"ContainerStarted","Data":"22731e9f85fc097dc86411cfa8a0169ee7ad7eeec061b49ac36778786da238c2"} Mar 09 09:27:00 crc kubenswrapper[4861]: I0309 09:27:00.760662 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"486e3d0a-4fe7-4831-bded-2b16583f0498","Type":"ContainerStarted","Data":"b53940cbe8a147ee6f2b60ddf2aad4bcd879a9ff88f996cb2512845a4dab4a98"} Mar 09 09:27:00 crc kubenswrapper[4861]: I0309 09:27:00.761803 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"07b7fc52-e101-4d6d-bb35-c460aba044df","Type":"ContainerStarted","Data":"099410a9d54c168bab70dd4f752581123f78655a624869ddc320c80096919580"} Mar 09 09:27:00 crc kubenswrapper[4861]: I0309 09:27:00.774395 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-sbjdd" podStartSLOduration=2.774364969 podStartE2EDuration="2.774364969s" podCreationTimestamp="2026-03-09 09:26:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:27:00.772254779 +0000 UTC m=+1263.857294180" watchObservedRunningTime="2026-03-09 09:27:00.774364969 +0000 UTC m=+1263.859404360" Mar 09 09:27:00 crc kubenswrapper[4861]: I0309 09:27:00.998428 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-sf294"] Mar 09 09:27:01 crc kubenswrapper[4861]: W0309 09:27:01.006052 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb3c0062b_4a2c_451a_b683_eeeea965a54e.slice/crio-acdff9ba253f2e782dce5ab9d35a2427d8f39fd13878312a82778eb0f0cd3c85 WatchSource:0}: Error finding container acdff9ba253f2e782dce5ab9d35a2427d8f39fd13878312a82778eb0f0cd3c85: Status 404 returned error can't find the container 
with id acdff9ba253f2e782dce5ab9d35a2427d8f39fd13878312a82778eb0f0cd3c85 Mar 09 09:27:01 crc kubenswrapper[4861]: I0309 09:27:01.781441 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-sf294" event={"ID":"b3c0062b-4a2c-451a-b683-eeeea965a54e","Type":"ContainerStarted","Data":"960f160c4e088103d19e3ef98910c17562f52e181ad6218e1d8e809c962675bc"} Mar 09 09:27:01 crc kubenswrapper[4861]: I0309 09:27:01.782084 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-sf294" event={"ID":"b3c0062b-4a2c-451a-b683-eeeea965a54e","Type":"ContainerStarted","Data":"acdff9ba253f2e782dce5ab9d35a2427d8f39fd13878312a82778eb0f0cd3c85"} Mar 09 09:27:01 crc kubenswrapper[4861]: I0309 09:27:01.795171 4861 generic.go:334] "Generic (PLEG): container finished" podID="71107669-f5ae-4df6-a694-a643153ad6f4" containerID="8a0d57d25fd49736c6c3dfa73f76c169f9aec799ea01849f4a3a184b99a520ad" exitCode=0 Mar 09 09:27:01 crc kubenswrapper[4861]: I0309 09:27:01.797290 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bd5679c8c-2ns4s" event={"ID":"71107669-f5ae-4df6-a694-a643153ad6f4","Type":"ContainerDied","Data":"8a0d57d25fd49736c6c3dfa73f76c169f9aec799ea01849f4a3a184b99a520ad"} Mar 09 09:27:01 crc kubenswrapper[4861]: I0309 09:27:01.844283 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-sf294" podStartSLOduration=1.8442608919999999 podStartE2EDuration="1.844260892s" podCreationTimestamp="2026-03-09 09:27:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:27:01.802909383 +0000 UTC m=+1264.887948794" watchObservedRunningTime="2026-03-09 09:27:01.844260892 +0000 UTC m=+1264.929300293" Mar 09 09:27:02 crc kubenswrapper[4861]: I0309 09:27:02.826436 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/nova-metadata-0"] Mar 09 09:27:02 crc kubenswrapper[4861]: I0309 09:27:02.865353 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 09 09:27:05 crc kubenswrapper[4861]: I0309 09:27:05.866387 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"05ca7d41-820a-463e-8129-a58a6b79542b","Type":"ContainerStarted","Data":"0e80459d4f54dac677247cecdba18abeef2c46f728648bfb355574affecdf80f"} Mar 09 09:27:05 crc kubenswrapper[4861]: I0309 09:27:05.867157 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="05ca7d41-820a-463e-8129-a58a6b79542b" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://0e80459d4f54dac677247cecdba18abeef2c46f728648bfb355574affecdf80f" gracePeriod=30 Mar 09 09:27:05 crc kubenswrapper[4861]: I0309 09:27:05.873915 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"486e3d0a-4fe7-4831-bded-2b16583f0498","Type":"ContainerStarted","Data":"d265ae62c678d5ef17fba6e5b85f3844fb75439daab98e16a1eb1c21c24b2459"} Mar 09 09:27:05 crc kubenswrapper[4861]: I0309 09:27:05.876238 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"393b4059-d8bc-4bc2-ad01-3b609472c649","Type":"ContainerStarted","Data":"6b1fc12d14ebc4101d976c7a9ea5231249ba7f816ea7005b7278f807293ba286"} Mar 09 09:27:05 crc kubenswrapper[4861]: I0309 09:27:05.889476 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"07b7fc52-e101-4d6d-bb35-c460aba044df","Type":"ContainerStarted","Data":"71a1e47dc513402ccf0aa5c02a3c15b43fb149b33f9816e57f8c03e04ce60b0e"} Mar 09 09:27:05 crc kubenswrapper[4861]: I0309 09:27:05.894873 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.439203385 
podStartE2EDuration="6.894853188s" podCreationTimestamp="2026-03-09 09:26:59 +0000 UTC" firstStartedPulling="2026-03-09 09:27:00.119327115 +0000 UTC m=+1263.204366516" lastFinishedPulling="2026-03-09 09:27:04.574976918 +0000 UTC m=+1267.660016319" observedRunningTime="2026-03-09 09:27:05.882326798 +0000 UTC m=+1268.967366219" watchObservedRunningTime="2026-03-09 09:27:05.894853188 +0000 UTC m=+1268.979892589" Mar 09 09:27:05 crc kubenswrapper[4861]: I0309 09:27:05.915941 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bd5679c8c-2ns4s" event={"ID":"71107669-f5ae-4df6-a694-a643153ad6f4","Type":"ContainerStarted","Data":"95b01cabcea7fff8d5731e8e974164abb98d11542a5c5b10358d763297daab7c"} Mar 09 09:27:05 crc kubenswrapper[4861]: I0309 09:27:05.916923 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.803101098 podStartE2EDuration="6.916904032s" podCreationTimestamp="2026-03-09 09:26:59 +0000 UTC" firstStartedPulling="2026-03-09 09:27:00.461402801 +0000 UTC m=+1263.546442192" lastFinishedPulling="2026-03-09 09:27:04.575205725 +0000 UTC m=+1267.660245126" observedRunningTime="2026-03-09 09:27:05.914338598 +0000 UTC m=+1268.999377999" watchObservedRunningTime="2026-03-09 09:27:05.916904032 +0000 UTC m=+1269.001943433" Mar 09 09:27:05 crc kubenswrapper[4861]: I0309 09:27:05.920718 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7bd5679c8c-2ns4s" Mar 09 09:27:06 crc kubenswrapper[4861]: I0309 09:27:06.931736 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"486e3d0a-4fe7-4831-bded-2b16583f0498","Type":"ContainerStarted","Data":"0f895ed6068ba47fe4b4b66ed822e0f55127f509d1d757a60ba73a8360ce07df"} Mar 09 09:27:06 crc kubenswrapper[4861]: I0309 09:27:06.931880 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" 
podUID="486e3d0a-4fe7-4831-bded-2b16583f0498" containerName="nova-metadata-log" containerID="cri-o://d265ae62c678d5ef17fba6e5b85f3844fb75439daab98e16a1eb1c21c24b2459" gracePeriod=30 Mar 09 09:27:06 crc kubenswrapper[4861]: I0309 09:27:06.931930 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="486e3d0a-4fe7-4831-bded-2b16583f0498" containerName="nova-metadata-metadata" containerID="cri-o://0f895ed6068ba47fe4b4b66ed822e0f55127f509d1d757a60ba73a8360ce07df" gracePeriod=30 Mar 09 09:27:06 crc kubenswrapper[4861]: I0309 09:27:06.935998 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"07b7fc52-e101-4d6d-bb35-c460aba044df","Type":"ContainerStarted","Data":"c96457be6df970ca4fee8bb8744ee3afbda43952dd085697c4dd34445bbf7770"} Mar 09 09:27:06 crc kubenswrapper[4861]: I0309 09:27:06.956577 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7bd5679c8c-2ns4s" podStartSLOduration=7.956556445 podStartE2EDuration="7.956556445s" podCreationTimestamp="2026-03-09 09:26:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:27:05.942726595 +0000 UTC m=+1269.027765996" watchObservedRunningTime="2026-03-09 09:27:06.956556445 +0000 UTC m=+1270.041595846" Mar 09 09:27:06 crc kubenswrapper[4861]: I0309 09:27:06.980814 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=4.433226487 podStartE2EDuration="8.980794972s" podCreationTimestamp="2026-03-09 09:26:58 +0000 UTC" firstStartedPulling="2026-03-09 09:27:00.027942198 +0000 UTC m=+1263.112981599" lastFinishedPulling="2026-03-09 09:27:04.575510683 +0000 UTC m=+1267.660550084" observedRunningTime="2026-03-09 09:27:06.978133206 +0000 UTC m=+1270.063172597" watchObservedRunningTime="2026-03-09 09:27:06.980794972 +0000 UTC 
m=+1270.065834383" Mar 09 09:27:06 crc kubenswrapper[4861]: I0309 09:27:06.987591 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=4.708482451 podStartE2EDuration="8.987567037s" podCreationTimestamp="2026-03-09 09:26:58 +0000 UTC" firstStartedPulling="2026-03-09 09:27:00.297342124 +0000 UTC m=+1263.382381525" lastFinishedPulling="2026-03-09 09:27:04.57642671 +0000 UTC m=+1267.661466111" observedRunningTime="2026-03-09 09:27:06.956220045 +0000 UTC m=+1270.041259446" watchObservedRunningTime="2026-03-09 09:27:06.987567037 +0000 UTC m=+1270.072606438" Mar 09 09:27:07 crc kubenswrapper[4861]: I0309 09:27:07.498906 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 09 09:27:07 crc kubenswrapper[4861]: I0309 09:27:07.673820 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/486e3d0a-4fe7-4831-bded-2b16583f0498-logs\") pod \"486e3d0a-4fe7-4831-bded-2b16583f0498\" (UID: \"486e3d0a-4fe7-4831-bded-2b16583f0498\") " Mar 09 09:27:07 crc kubenswrapper[4861]: I0309 09:27:07.673960 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/486e3d0a-4fe7-4831-bded-2b16583f0498-config-data\") pod \"486e3d0a-4fe7-4831-bded-2b16583f0498\" (UID: \"486e3d0a-4fe7-4831-bded-2b16583f0498\") " Mar 09 09:27:07 crc kubenswrapper[4861]: I0309 09:27:07.674313 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/486e3d0a-4fe7-4831-bded-2b16583f0498-logs" (OuterVolumeSpecName: "logs") pod "486e3d0a-4fe7-4831-bded-2b16583f0498" (UID: "486e3d0a-4fe7-4831-bded-2b16583f0498"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:27:07 crc kubenswrapper[4861]: I0309 09:27:07.675548 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/486e3d0a-4fe7-4831-bded-2b16583f0498-combined-ca-bundle\") pod \"486e3d0a-4fe7-4831-bded-2b16583f0498\" (UID: \"486e3d0a-4fe7-4831-bded-2b16583f0498\") " Mar 09 09:27:07 crc kubenswrapper[4861]: I0309 09:27:07.675662 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6szbc\" (UniqueName: \"kubernetes.io/projected/486e3d0a-4fe7-4831-bded-2b16583f0498-kube-api-access-6szbc\") pod \"486e3d0a-4fe7-4831-bded-2b16583f0498\" (UID: \"486e3d0a-4fe7-4831-bded-2b16583f0498\") " Mar 09 09:27:07 crc kubenswrapper[4861]: I0309 09:27:07.676650 4861 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/486e3d0a-4fe7-4831-bded-2b16583f0498-logs\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:07 crc kubenswrapper[4861]: I0309 09:27:07.690572 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/486e3d0a-4fe7-4831-bded-2b16583f0498-kube-api-access-6szbc" (OuterVolumeSpecName: "kube-api-access-6szbc") pod "486e3d0a-4fe7-4831-bded-2b16583f0498" (UID: "486e3d0a-4fe7-4831-bded-2b16583f0498"). InnerVolumeSpecName "kube-api-access-6szbc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:27:07 crc kubenswrapper[4861]: I0309 09:27:07.703142 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/486e3d0a-4fe7-4831-bded-2b16583f0498-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "486e3d0a-4fe7-4831-bded-2b16583f0498" (UID: "486e3d0a-4fe7-4831-bded-2b16583f0498"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:27:07 crc kubenswrapper[4861]: I0309 09:27:07.704596 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/486e3d0a-4fe7-4831-bded-2b16583f0498-config-data" (OuterVolumeSpecName: "config-data") pod "486e3d0a-4fe7-4831-bded-2b16583f0498" (UID: "486e3d0a-4fe7-4831-bded-2b16583f0498"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:27:07 crc kubenswrapper[4861]: I0309 09:27:07.783167 4861 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/486e3d0a-4fe7-4831-bded-2b16583f0498-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:07 crc kubenswrapper[4861]: I0309 09:27:07.783199 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6szbc\" (UniqueName: \"kubernetes.io/projected/486e3d0a-4fe7-4831-bded-2b16583f0498-kube-api-access-6szbc\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:07 crc kubenswrapper[4861]: I0309 09:27:07.783211 4861 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/486e3d0a-4fe7-4831-bded-2b16583f0498-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:07 crc kubenswrapper[4861]: I0309 09:27:07.946650 4861 generic.go:334] "Generic (PLEG): container finished" podID="486e3d0a-4fe7-4831-bded-2b16583f0498" containerID="0f895ed6068ba47fe4b4b66ed822e0f55127f509d1d757a60ba73a8360ce07df" exitCode=0 Mar 09 09:27:07 crc kubenswrapper[4861]: I0309 09:27:07.946682 4861 generic.go:334] "Generic (PLEG): container finished" podID="486e3d0a-4fe7-4831-bded-2b16583f0498" containerID="d265ae62c678d5ef17fba6e5b85f3844fb75439daab98e16a1eb1c21c24b2459" exitCode=143 Mar 09 09:27:07 crc kubenswrapper[4861]: I0309 09:27:07.946762 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 09 09:27:07 crc kubenswrapper[4861]: I0309 09:27:07.946800 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"486e3d0a-4fe7-4831-bded-2b16583f0498","Type":"ContainerDied","Data":"0f895ed6068ba47fe4b4b66ed822e0f55127f509d1d757a60ba73a8360ce07df"} Mar 09 09:27:07 crc kubenswrapper[4861]: I0309 09:27:07.946831 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"486e3d0a-4fe7-4831-bded-2b16583f0498","Type":"ContainerDied","Data":"d265ae62c678d5ef17fba6e5b85f3844fb75439daab98e16a1eb1c21c24b2459"} Mar 09 09:27:07 crc kubenswrapper[4861]: I0309 09:27:07.946846 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"486e3d0a-4fe7-4831-bded-2b16583f0498","Type":"ContainerDied","Data":"b53940cbe8a147ee6f2b60ddf2aad4bcd879a9ff88f996cb2512845a4dab4a98"} Mar 09 09:27:07 crc kubenswrapper[4861]: I0309 09:27:07.946863 4861 scope.go:117] "RemoveContainer" containerID="0f895ed6068ba47fe4b4b66ed822e0f55127f509d1d757a60ba73a8360ce07df" Mar 09 09:27:07 crc kubenswrapper[4861]: I0309 09:27:07.976789 4861 scope.go:117] "RemoveContainer" containerID="d265ae62c678d5ef17fba6e5b85f3844fb75439daab98e16a1eb1c21c24b2459" Mar 09 09:27:08 crc kubenswrapper[4861]: I0309 09:27:08.001837 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 09 09:27:08 crc kubenswrapper[4861]: I0309 09:27:08.015308 4861 scope.go:117] "RemoveContainer" containerID="0f895ed6068ba47fe4b4b66ed822e0f55127f509d1d757a60ba73a8360ce07df" Mar 09 09:27:08 crc kubenswrapper[4861]: E0309 09:27:08.016348 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f895ed6068ba47fe4b4b66ed822e0f55127f509d1d757a60ba73a8360ce07df\": container with ID starting with 0f895ed6068ba47fe4b4b66ed822e0f55127f509d1d757a60ba73a8360ce07df 
not found: ID does not exist" containerID="0f895ed6068ba47fe4b4b66ed822e0f55127f509d1d757a60ba73a8360ce07df" Mar 09 09:27:08 crc kubenswrapper[4861]: I0309 09:27:08.016485 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f895ed6068ba47fe4b4b66ed822e0f55127f509d1d757a60ba73a8360ce07df"} err="failed to get container status \"0f895ed6068ba47fe4b4b66ed822e0f55127f509d1d757a60ba73a8360ce07df\": rpc error: code = NotFound desc = could not find container \"0f895ed6068ba47fe4b4b66ed822e0f55127f509d1d757a60ba73a8360ce07df\": container with ID starting with 0f895ed6068ba47fe4b4b66ed822e0f55127f509d1d757a60ba73a8360ce07df not found: ID does not exist" Mar 09 09:27:08 crc kubenswrapper[4861]: I0309 09:27:08.016510 4861 scope.go:117] "RemoveContainer" containerID="d265ae62c678d5ef17fba6e5b85f3844fb75439daab98e16a1eb1c21c24b2459" Mar 09 09:27:08 crc kubenswrapper[4861]: E0309 09:27:08.017478 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d265ae62c678d5ef17fba6e5b85f3844fb75439daab98e16a1eb1c21c24b2459\": container with ID starting with d265ae62c678d5ef17fba6e5b85f3844fb75439daab98e16a1eb1c21c24b2459 not found: ID does not exist" containerID="d265ae62c678d5ef17fba6e5b85f3844fb75439daab98e16a1eb1c21c24b2459" Mar 09 09:27:08 crc kubenswrapper[4861]: I0309 09:27:08.017510 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d265ae62c678d5ef17fba6e5b85f3844fb75439daab98e16a1eb1c21c24b2459"} err="failed to get container status \"d265ae62c678d5ef17fba6e5b85f3844fb75439daab98e16a1eb1c21c24b2459\": rpc error: code = NotFound desc = could not find container \"d265ae62c678d5ef17fba6e5b85f3844fb75439daab98e16a1eb1c21c24b2459\": container with ID starting with d265ae62c678d5ef17fba6e5b85f3844fb75439daab98e16a1eb1c21c24b2459 not found: ID does not exist" Mar 09 09:27:08 crc kubenswrapper[4861]: I0309 
09:27:08.017525 4861 scope.go:117] "RemoveContainer" containerID="0f895ed6068ba47fe4b4b66ed822e0f55127f509d1d757a60ba73a8360ce07df" Mar 09 09:27:08 crc kubenswrapper[4861]: I0309 09:27:08.018179 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f895ed6068ba47fe4b4b66ed822e0f55127f509d1d757a60ba73a8360ce07df"} err="failed to get container status \"0f895ed6068ba47fe4b4b66ed822e0f55127f509d1d757a60ba73a8360ce07df\": rpc error: code = NotFound desc = could not find container \"0f895ed6068ba47fe4b4b66ed822e0f55127f509d1d757a60ba73a8360ce07df\": container with ID starting with 0f895ed6068ba47fe4b4b66ed822e0f55127f509d1d757a60ba73a8360ce07df not found: ID does not exist" Mar 09 09:27:08 crc kubenswrapper[4861]: I0309 09:27:08.018235 4861 scope.go:117] "RemoveContainer" containerID="d265ae62c678d5ef17fba6e5b85f3844fb75439daab98e16a1eb1c21c24b2459" Mar 09 09:27:08 crc kubenswrapper[4861]: I0309 09:27:08.018646 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d265ae62c678d5ef17fba6e5b85f3844fb75439daab98e16a1eb1c21c24b2459"} err="failed to get container status \"d265ae62c678d5ef17fba6e5b85f3844fb75439daab98e16a1eb1c21c24b2459\": rpc error: code = NotFound desc = could not find container \"d265ae62c678d5ef17fba6e5b85f3844fb75439daab98e16a1eb1c21c24b2459\": container with ID starting with d265ae62c678d5ef17fba6e5b85f3844fb75439daab98e16a1eb1c21c24b2459 not found: ID does not exist" Mar 09 09:27:08 crc kubenswrapper[4861]: I0309 09:27:08.028587 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 09 09:27:08 crc kubenswrapper[4861]: I0309 09:27:08.067459 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 09 09:27:08 crc kubenswrapper[4861]: E0309 09:27:08.069841 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="486e3d0a-4fe7-4831-bded-2b16583f0498" 
containerName="nova-metadata-log" Mar 09 09:27:08 crc kubenswrapper[4861]: I0309 09:27:08.069898 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="486e3d0a-4fe7-4831-bded-2b16583f0498" containerName="nova-metadata-log" Mar 09 09:27:08 crc kubenswrapper[4861]: E0309 09:27:08.069982 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="486e3d0a-4fe7-4831-bded-2b16583f0498" containerName="nova-metadata-metadata" Mar 09 09:27:08 crc kubenswrapper[4861]: I0309 09:27:08.069999 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="486e3d0a-4fe7-4831-bded-2b16583f0498" containerName="nova-metadata-metadata" Mar 09 09:27:08 crc kubenswrapper[4861]: I0309 09:27:08.070467 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="486e3d0a-4fe7-4831-bded-2b16583f0498" containerName="nova-metadata-metadata" Mar 09 09:27:08 crc kubenswrapper[4861]: I0309 09:27:08.070501 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="486e3d0a-4fe7-4831-bded-2b16583f0498" containerName="nova-metadata-log" Mar 09 09:27:08 crc kubenswrapper[4861]: I0309 09:27:08.073422 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 09 09:27:08 crc kubenswrapper[4861]: I0309 09:27:08.077006 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 09 09:27:08 crc kubenswrapper[4861]: I0309 09:27:08.083758 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 09 09:27:08 crc kubenswrapper[4861]: I0309 09:27:08.084883 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 09 09:27:08 crc kubenswrapper[4861]: I0309 09:27:08.090162 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d429c2d8-7dd5-4d76-8568-cdeb73242010-config-data\") pod \"nova-metadata-0\" (UID: \"d429c2d8-7dd5-4d76-8568-cdeb73242010\") " pod="openstack/nova-metadata-0" Mar 09 09:27:08 crc kubenswrapper[4861]: I0309 09:27:08.090230 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d429c2d8-7dd5-4d76-8568-cdeb73242010-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"d429c2d8-7dd5-4d76-8568-cdeb73242010\") " pod="openstack/nova-metadata-0" Mar 09 09:27:08 crc kubenswrapper[4861]: I0309 09:27:08.090394 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d429c2d8-7dd5-4d76-8568-cdeb73242010-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d429c2d8-7dd5-4d76-8568-cdeb73242010\") " pod="openstack/nova-metadata-0" Mar 09 09:27:08 crc kubenswrapper[4861]: I0309 09:27:08.090454 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d429c2d8-7dd5-4d76-8568-cdeb73242010-logs\") pod \"nova-metadata-0\" (UID: 
\"d429c2d8-7dd5-4d76-8568-cdeb73242010\") " pod="openstack/nova-metadata-0" Mar 09 09:27:08 crc kubenswrapper[4861]: I0309 09:27:08.090536 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzgkv\" (UniqueName: \"kubernetes.io/projected/d429c2d8-7dd5-4d76-8568-cdeb73242010-kube-api-access-zzgkv\") pod \"nova-metadata-0\" (UID: \"d429c2d8-7dd5-4d76-8568-cdeb73242010\") " pod="openstack/nova-metadata-0" Mar 09 09:27:08 crc kubenswrapper[4861]: I0309 09:27:08.192308 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d429c2d8-7dd5-4d76-8568-cdeb73242010-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d429c2d8-7dd5-4d76-8568-cdeb73242010\") " pod="openstack/nova-metadata-0" Mar 09 09:27:08 crc kubenswrapper[4861]: I0309 09:27:08.192407 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d429c2d8-7dd5-4d76-8568-cdeb73242010-logs\") pod \"nova-metadata-0\" (UID: \"d429c2d8-7dd5-4d76-8568-cdeb73242010\") " pod="openstack/nova-metadata-0" Mar 09 09:27:08 crc kubenswrapper[4861]: I0309 09:27:08.192481 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzgkv\" (UniqueName: \"kubernetes.io/projected/d429c2d8-7dd5-4d76-8568-cdeb73242010-kube-api-access-zzgkv\") pod \"nova-metadata-0\" (UID: \"d429c2d8-7dd5-4d76-8568-cdeb73242010\") " pod="openstack/nova-metadata-0" Mar 09 09:27:08 crc kubenswrapper[4861]: I0309 09:27:08.192570 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d429c2d8-7dd5-4d76-8568-cdeb73242010-config-data\") pod \"nova-metadata-0\" (UID: \"d429c2d8-7dd5-4d76-8568-cdeb73242010\") " pod="openstack/nova-metadata-0" Mar 09 09:27:08 crc kubenswrapper[4861]: I0309 09:27:08.192594 4861 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d429c2d8-7dd5-4d76-8568-cdeb73242010-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"d429c2d8-7dd5-4d76-8568-cdeb73242010\") " pod="openstack/nova-metadata-0" Mar 09 09:27:08 crc kubenswrapper[4861]: I0309 09:27:08.193120 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d429c2d8-7dd5-4d76-8568-cdeb73242010-logs\") pod \"nova-metadata-0\" (UID: \"d429c2d8-7dd5-4d76-8568-cdeb73242010\") " pod="openstack/nova-metadata-0" Mar 09 09:27:08 crc kubenswrapper[4861]: I0309 09:27:08.196491 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d429c2d8-7dd5-4d76-8568-cdeb73242010-config-data\") pod \"nova-metadata-0\" (UID: \"d429c2d8-7dd5-4d76-8568-cdeb73242010\") " pod="openstack/nova-metadata-0" Mar 09 09:27:08 crc kubenswrapper[4861]: I0309 09:27:08.197149 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d429c2d8-7dd5-4d76-8568-cdeb73242010-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"d429c2d8-7dd5-4d76-8568-cdeb73242010\") " pod="openstack/nova-metadata-0" Mar 09 09:27:08 crc kubenswrapper[4861]: I0309 09:27:08.210035 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d429c2d8-7dd5-4d76-8568-cdeb73242010-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d429c2d8-7dd5-4d76-8568-cdeb73242010\") " pod="openstack/nova-metadata-0" Mar 09 09:27:08 crc kubenswrapper[4861]: I0309 09:27:08.210466 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzgkv\" (UniqueName: \"kubernetes.io/projected/d429c2d8-7dd5-4d76-8568-cdeb73242010-kube-api-access-zzgkv\") pod 
\"nova-metadata-0\" (UID: \"d429c2d8-7dd5-4d76-8568-cdeb73242010\") " pod="openstack/nova-metadata-0" Mar 09 09:27:08 crc kubenswrapper[4861]: I0309 09:27:08.415399 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 09 09:27:08 crc kubenswrapper[4861]: W0309 09:27:08.895460 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd429c2d8_7dd5_4d76_8568_cdeb73242010.slice/crio-930fbc504ee008208b94a3675cba8a211dcc812c4e178da46456a7f8ae1295d8 WatchSource:0}: Error finding container 930fbc504ee008208b94a3675cba8a211dcc812c4e178da46456a7f8ae1295d8: Status 404 returned error can't find the container with id 930fbc504ee008208b94a3675cba8a211dcc812c4e178da46456a7f8ae1295d8 Mar 09 09:27:08 crc kubenswrapper[4861]: I0309 09:27:08.905050 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 09 09:27:08 crc kubenswrapper[4861]: I0309 09:27:08.957810 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d429c2d8-7dd5-4d76-8568-cdeb73242010","Type":"ContainerStarted","Data":"930fbc504ee008208b94a3675cba8a211dcc812c4e178da46456a7f8ae1295d8"} Mar 09 09:27:08 crc kubenswrapper[4861]: I0309 09:27:08.959705 4861 generic.go:334] "Generic (PLEG): container finished" podID="17803e89-5e7e-4d37-b96f-53e26da13fc2" containerID="be077af5cfad01301f03211be785b590aa18994c7c2f198e213cf684d795b731" exitCode=0 Mar 09 09:27:08 crc kubenswrapper[4861]: I0309 09:27:08.959757 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-sbjdd" event={"ID":"17803e89-5e7e-4d37-b96f-53e26da13fc2","Type":"ContainerDied","Data":"be077af5cfad01301f03211be785b590aa18994c7c2f198e213cf684d795b731"} Mar 09 09:27:09 crc kubenswrapper[4861]: I0309 09:27:09.282447 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/nova-api-0" Mar 09 09:27:09 crc kubenswrapper[4861]: I0309 09:27:09.282516 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 09 09:27:09 crc kubenswrapper[4861]: I0309 09:27:09.399644 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 09 09:27:09 crc kubenswrapper[4861]: I0309 09:27:09.671096 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="486e3d0a-4fe7-4831-bded-2b16583f0498" path="/var/lib/kubelet/pods/486e3d0a-4fe7-4831-bded-2b16583f0498/volumes" Mar 09 09:27:09 crc kubenswrapper[4861]: I0309 09:27:09.696712 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 09 09:27:09 crc kubenswrapper[4861]: I0309 09:27:09.696759 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 09 09:27:09 crc kubenswrapper[4861]: I0309 09:27:09.728024 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 09 09:27:09 crc kubenswrapper[4861]: I0309 09:27:09.880907 4861 scope.go:117] "RemoveContainer" containerID="585f3fac79c08536861720b7ac44790472fcc0efb6247b2bbeebe0f35028a3af" Mar 09 09:27:09 crc kubenswrapper[4861]: I0309 09:27:09.977937 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d429c2d8-7dd5-4d76-8568-cdeb73242010","Type":"ContainerStarted","Data":"77cc34933452aa9cb47250f188afa3e7a88d7dfbada7351b2b2e7656d7419941"} Mar 09 09:27:09 crc kubenswrapper[4861]: I0309 09:27:09.977988 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d429c2d8-7dd5-4d76-8568-cdeb73242010","Type":"ContainerStarted","Data":"c3af73625a6ab6498de700867ae14086535378dede19b6c25ba65630e6352e09"} Mar 09 09:27:10 crc kubenswrapper[4861]: I0309 09:27:10.014702 4861 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.014681195 podStartE2EDuration="3.014681195s" podCreationTimestamp="2026-03-09 09:27:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:27:10.003115913 +0000 UTC m=+1273.088155334" watchObservedRunningTime="2026-03-09 09:27:10.014681195 +0000 UTC m=+1273.099720606" Mar 09 09:27:10 crc kubenswrapper[4861]: I0309 09:27:10.030532 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 09 09:27:10 crc kubenswrapper[4861]: I0309 09:27:10.319342 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-sbjdd" Mar 09 09:27:10 crc kubenswrapper[4861]: I0309 09:27:10.365640 4861 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="07b7fc52-e101-4d6d-bb35-c460aba044df" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.193:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 09:27:10 crc kubenswrapper[4861]: I0309 09:27:10.365602 4861 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="07b7fc52-e101-4d6d-bb35-c460aba044df" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.193:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 09:27:10 crc kubenswrapper[4861]: I0309 09:27:10.437854 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17803e89-5e7e-4d37-b96f-53e26da13fc2-combined-ca-bundle\") pod \"17803e89-5e7e-4d37-b96f-53e26da13fc2\" (UID: \"17803e89-5e7e-4d37-b96f-53e26da13fc2\") " Mar 09 09:27:10 crc kubenswrapper[4861]: I0309 09:27:10.437943 4861 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17803e89-5e7e-4d37-b96f-53e26da13fc2-config-data\") pod \"17803e89-5e7e-4d37-b96f-53e26da13fc2\" (UID: \"17803e89-5e7e-4d37-b96f-53e26da13fc2\") " Mar 09 09:27:10 crc kubenswrapper[4861]: I0309 09:27:10.438062 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-49jct\" (UniqueName: \"kubernetes.io/projected/17803e89-5e7e-4d37-b96f-53e26da13fc2-kube-api-access-49jct\") pod \"17803e89-5e7e-4d37-b96f-53e26da13fc2\" (UID: \"17803e89-5e7e-4d37-b96f-53e26da13fc2\") " Mar 09 09:27:10 crc kubenswrapper[4861]: I0309 09:27:10.438291 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/17803e89-5e7e-4d37-b96f-53e26da13fc2-scripts\") pod \"17803e89-5e7e-4d37-b96f-53e26da13fc2\" (UID: \"17803e89-5e7e-4d37-b96f-53e26da13fc2\") " Mar 09 09:27:10 crc kubenswrapper[4861]: I0309 09:27:10.453439 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17803e89-5e7e-4d37-b96f-53e26da13fc2-scripts" (OuterVolumeSpecName: "scripts") pod "17803e89-5e7e-4d37-b96f-53e26da13fc2" (UID: "17803e89-5e7e-4d37-b96f-53e26da13fc2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:27:10 crc kubenswrapper[4861]: I0309 09:27:10.455591 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17803e89-5e7e-4d37-b96f-53e26da13fc2-kube-api-access-49jct" (OuterVolumeSpecName: "kube-api-access-49jct") pod "17803e89-5e7e-4d37-b96f-53e26da13fc2" (UID: "17803e89-5e7e-4d37-b96f-53e26da13fc2"). InnerVolumeSpecName "kube-api-access-49jct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:27:10 crc kubenswrapper[4861]: I0309 09:27:10.466018 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17803e89-5e7e-4d37-b96f-53e26da13fc2-config-data" (OuterVolumeSpecName: "config-data") pod "17803e89-5e7e-4d37-b96f-53e26da13fc2" (UID: "17803e89-5e7e-4d37-b96f-53e26da13fc2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:27:10 crc kubenswrapper[4861]: I0309 09:27:10.475977 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17803e89-5e7e-4d37-b96f-53e26da13fc2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "17803e89-5e7e-4d37-b96f-53e26da13fc2" (UID: "17803e89-5e7e-4d37-b96f-53e26da13fc2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:27:10 crc kubenswrapper[4861]: I0309 09:27:10.543811 4861 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/17803e89-5e7e-4d37-b96f-53e26da13fc2-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:10 crc kubenswrapper[4861]: I0309 09:27:10.544114 4861 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17803e89-5e7e-4d37-b96f-53e26da13fc2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:10 crc kubenswrapper[4861]: I0309 09:27:10.544126 4861 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17803e89-5e7e-4d37-b96f-53e26da13fc2-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:10 crc kubenswrapper[4861]: I0309 09:27:10.544135 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-49jct\" (UniqueName: \"kubernetes.io/projected/17803e89-5e7e-4d37-b96f-53e26da13fc2-kube-api-access-49jct\") on node \"crc\" DevicePath \"\"" Mar 
09 09:27:10 crc kubenswrapper[4861]: I0309 09:27:10.989033 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-sbjdd" event={"ID":"17803e89-5e7e-4d37-b96f-53e26da13fc2","Type":"ContainerDied","Data":"9b060681be8b6f731ff1d66ab772799a3d2b6174904f2ac77bf89e2b87a3ac4e"} Mar 09 09:27:10 crc kubenswrapper[4861]: I0309 09:27:10.989114 4861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9b060681be8b6f731ff1d66ab772799a3d2b6174904f2ac77bf89e2b87a3ac4e" Mar 09 09:27:10 crc kubenswrapper[4861]: I0309 09:27:10.991148 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-sbjdd" Mar 09 09:27:11 crc kubenswrapper[4861]: I0309 09:27:11.154055 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 09 09:27:11 crc kubenswrapper[4861]: I0309 09:27:11.154543 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="07b7fc52-e101-4d6d-bb35-c460aba044df" containerName="nova-api-log" containerID="cri-o://71a1e47dc513402ccf0aa5c02a3c15b43fb149b33f9816e57f8c03e04ce60b0e" gracePeriod=30 Mar 09 09:27:11 crc kubenswrapper[4861]: I0309 09:27:11.154704 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="07b7fc52-e101-4d6d-bb35-c460aba044df" containerName="nova-api-api" containerID="cri-o://c96457be6df970ca4fee8bb8744ee3afbda43952dd085697c4dd34445bbf7770" gracePeriod=30 Mar 09 09:27:11 crc kubenswrapper[4861]: I0309 09:27:11.179899 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 09 09:27:11 crc kubenswrapper[4861]: I0309 09:27:11.195913 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 09 09:27:11 crc kubenswrapper[4861]: I0309 09:27:11.999530 4861 generic.go:334] "Generic (PLEG): container finished" 
podID="07b7fc52-e101-4d6d-bb35-c460aba044df" containerID="71a1e47dc513402ccf0aa5c02a3c15b43fb149b33f9816e57f8c03e04ce60b0e" exitCode=143 Mar 09 09:27:12 crc kubenswrapper[4861]: I0309 09:27:11.999585 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"07b7fc52-e101-4d6d-bb35-c460aba044df","Type":"ContainerDied","Data":"71a1e47dc513402ccf0aa5c02a3c15b43fb149b33f9816e57f8c03e04ce60b0e"} Mar 09 09:27:12 crc kubenswrapper[4861]: I0309 09:27:11.999863 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="393b4059-d8bc-4bc2-ad01-3b609472c649" containerName="nova-scheduler-scheduler" containerID="cri-o://6b1fc12d14ebc4101d976c7a9ea5231249ba7f816ea7005b7278f807293ba286" gracePeriod=30 Mar 09 09:27:12 crc kubenswrapper[4861]: I0309 09:27:11.999900 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="d429c2d8-7dd5-4d76-8568-cdeb73242010" containerName="nova-metadata-log" containerID="cri-o://c3af73625a6ab6498de700867ae14086535378dede19b6c25ba65630e6352e09" gracePeriod=30 Mar 09 09:27:12 crc kubenswrapper[4861]: I0309 09:27:12.000049 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="d429c2d8-7dd5-4d76-8568-cdeb73242010" containerName="nova-metadata-metadata" containerID="cri-o://77cc34933452aa9cb47250f188afa3e7a88d7dfbada7351b2b2e7656d7419941" gracePeriod=30 Mar 09 09:27:12 crc kubenswrapper[4861]: I0309 09:27:12.581805 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 09 09:27:12 crc kubenswrapper[4861]: I0309 09:27:12.686472 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zzgkv\" (UniqueName: \"kubernetes.io/projected/d429c2d8-7dd5-4d76-8568-cdeb73242010-kube-api-access-zzgkv\") pod \"d429c2d8-7dd5-4d76-8568-cdeb73242010\" (UID: \"d429c2d8-7dd5-4d76-8568-cdeb73242010\") " Mar 09 09:27:12 crc kubenswrapper[4861]: I0309 09:27:12.686517 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d429c2d8-7dd5-4d76-8568-cdeb73242010-nova-metadata-tls-certs\") pod \"d429c2d8-7dd5-4d76-8568-cdeb73242010\" (UID: \"d429c2d8-7dd5-4d76-8568-cdeb73242010\") " Mar 09 09:27:12 crc kubenswrapper[4861]: I0309 09:27:12.686680 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d429c2d8-7dd5-4d76-8568-cdeb73242010-logs\") pod \"d429c2d8-7dd5-4d76-8568-cdeb73242010\" (UID: \"d429c2d8-7dd5-4d76-8568-cdeb73242010\") " Mar 09 09:27:12 crc kubenswrapper[4861]: I0309 09:27:12.686723 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d429c2d8-7dd5-4d76-8568-cdeb73242010-combined-ca-bundle\") pod \"d429c2d8-7dd5-4d76-8568-cdeb73242010\" (UID: \"d429c2d8-7dd5-4d76-8568-cdeb73242010\") " Mar 09 09:27:12 crc kubenswrapper[4861]: I0309 09:27:12.686856 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d429c2d8-7dd5-4d76-8568-cdeb73242010-config-data\") pod \"d429c2d8-7dd5-4d76-8568-cdeb73242010\" (UID: \"d429c2d8-7dd5-4d76-8568-cdeb73242010\") " Mar 09 09:27:12 crc kubenswrapper[4861]: I0309 09:27:12.687032 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/d429c2d8-7dd5-4d76-8568-cdeb73242010-logs" (OuterVolumeSpecName: "logs") pod "d429c2d8-7dd5-4d76-8568-cdeb73242010" (UID: "d429c2d8-7dd5-4d76-8568-cdeb73242010"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:27:12 crc kubenswrapper[4861]: I0309 09:27:12.687551 4861 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d429c2d8-7dd5-4d76-8568-cdeb73242010-logs\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:12 crc kubenswrapper[4861]: I0309 09:27:12.692033 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d429c2d8-7dd5-4d76-8568-cdeb73242010-kube-api-access-zzgkv" (OuterVolumeSpecName: "kube-api-access-zzgkv") pod "d429c2d8-7dd5-4d76-8568-cdeb73242010" (UID: "d429c2d8-7dd5-4d76-8568-cdeb73242010"). InnerVolumeSpecName "kube-api-access-zzgkv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:27:12 crc kubenswrapper[4861]: I0309 09:27:12.716462 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d429c2d8-7dd5-4d76-8568-cdeb73242010-config-data" (OuterVolumeSpecName: "config-data") pod "d429c2d8-7dd5-4d76-8568-cdeb73242010" (UID: "d429c2d8-7dd5-4d76-8568-cdeb73242010"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:27:12 crc kubenswrapper[4861]: I0309 09:27:12.722209 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d429c2d8-7dd5-4d76-8568-cdeb73242010-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d429c2d8-7dd5-4d76-8568-cdeb73242010" (UID: "d429c2d8-7dd5-4d76-8568-cdeb73242010"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:27:12 crc kubenswrapper[4861]: I0309 09:27:12.745202 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d429c2d8-7dd5-4d76-8568-cdeb73242010-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "d429c2d8-7dd5-4d76-8568-cdeb73242010" (UID: "d429c2d8-7dd5-4d76-8568-cdeb73242010"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:27:12 crc kubenswrapper[4861]: I0309 09:27:12.790966 4861 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d429c2d8-7dd5-4d76-8568-cdeb73242010-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:12 crc kubenswrapper[4861]: I0309 09:27:12.791193 4861 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d429c2d8-7dd5-4d76-8568-cdeb73242010-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:12 crc kubenswrapper[4861]: I0309 09:27:12.791215 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zzgkv\" (UniqueName: \"kubernetes.io/projected/d429c2d8-7dd5-4d76-8568-cdeb73242010-kube-api-access-zzgkv\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:12 crc kubenswrapper[4861]: I0309 09:27:12.791229 4861 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d429c2d8-7dd5-4d76-8568-cdeb73242010-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:13 crc kubenswrapper[4861]: I0309 09:27:13.012633 4861 generic.go:334] "Generic (PLEG): container finished" podID="d429c2d8-7dd5-4d76-8568-cdeb73242010" containerID="77cc34933452aa9cb47250f188afa3e7a88d7dfbada7351b2b2e7656d7419941" exitCode=0 Mar 09 09:27:13 crc kubenswrapper[4861]: I0309 09:27:13.014184 4861 generic.go:334] "Generic (PLEG): container finished" 
podID="d429c2d8-7dd5-4d76-8568-cdeb73242010" containerID="c3af73625a6ab6498de700867ae14086535378dede19b6c25ba65630e6352e09" exitCode=143 Mar 09 09:27:13 crc kubenswrapper[4861]: I0309 09:27:13.012699 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 09 09:27:13 crc kubenswrapper[4861]: I0309 09:27:13.012718 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d429c2d8-7dd5-4d76-8568-cdeb73242010","Type":"ContainerDied","Data":"77cc34933452aa9cb47250f188afa3e7a88d7dfbada7351b2b2e7656d7419941"} Mar 09 09:27:13 crc kubenswrapper[4861]: I0309 09:27:13.014433 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d429c2d8-7dd5-4d76-8568-cdeb73242010","Type":"ContainerDied","Data":"c3af73625a6ab6498de700867ae14086535378dede19b6c25ba65630e6352e09"} Mar 09 09:27:13 crc kubenswrapper[4861]: I0309 09:27:13.014462 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d429c2d8-7dd5-4d76-8568-cdeb73242010","Type":"ContainerDied","Data":"930fbc504ee008208b94a3675cba8a211dcc812c4e178da46456a7f8ae1295d8"} Mar 09 09:27:13 crc kubenswrapper[4861]: I0309 09:27:13.014482 4861 scope.go:117] "RemoveContainer" containerID="77cc34933452aa9cb47250f188afa3e7a88d7dfbada7351b2b2e7656d7419941" Mar 09 09:27:13 crc kubenswrapper[4861]: I0309 09:27:13.019110 4861 generic.go:334] "Generic (PLEG): container finished" podID="b3c0062b-4a2c-451a-b683-eeeea965a54e" containerID="960f160c4e088103d19e3ef98910c17562f52e181ad6218e1d8e809c962675bc" exitCode=0 Mar 09 09:27:13 crc kubenswrapper[4861]: I0309 09:27:13.019157 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-sf294" event={"ID":"b3c0062b-4a2c-451a-b683-eeeea965a54e","Type":"ContainerDied","Data":"960f160c4e088103d19e3ef98910c17562f52e181ad6218e1d8e809c962675bc"} Mar 09 09:27:13 crc 
kubenswrapper[4861]: I0309 09:27:13.061708 4861 scope.go:117] "RemoveContainer" containerID="c3af73625a6ab6498de700867ae14086535378dede19b6c25ba65630e6352e09" Mar 09 09:27:13 crc kubenswrapper[4861]: I0309 09:27:13.073836 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 09 09:27:13 crc kubenswrapper[4861]: I0309 09:27:13.086870 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 09 09:27:13 crc kubenswrapper[4861]: I0309 09:27:13.090259 4861 scope.go:117] "RemoveContainer" containerID="77cc34933452aa9cb47250f188afa3e7a88d7dfbada7351b2b2e7656d7419941" Mar 09 09:27:13 crc kubenswrapper[4861]: E0309 09:27:13.091323 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77cc34933452aa9cb47250f188afa3e7a88d7dfbada7351b2b2e7656d7419941\": container with ID starting with 77cc34933452aa9cb47250f188afa3e7a88d7dfbada7351b2b2e7656d7419941 not found: ID does not exist" containerID="77cc34933452aa9cb47250f188afa3e7a88d7dfbada7351b2b2e7656d7419941" Mar 09 09:27:13 crc kubenswrapper[4861]: I0309 09:27:13.091389 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77cc34933452aa9cb47250f188afa3e7a88d7dfbada7351b2b2e7656d7419941"} err="failed to get container status \"77cc34933452aa9cb47250f188afa3e7a88d7dfbada7351b2b2e7656d7419941\": rpc error: code = NotFound desc = could not find container \"77cc34933452aa9cb47250f188afa3e7a88d7dfbada7351b2b2e7656d7419941\": container with ID starting with 77cc34933452aa9cb47250f188afa3e7a88d7dfbada7351b2b2e7656d7419941 not found: ID does not exist" Mar 09 09:27:13 crc kubenswrapper[4861]: I0309 09:27:13.091422 4861 scope.go:117] "RemoveContainer" containerID="c3af73625a6ab6498de700867ae14086535378dede19b6c25ba65630e6352e09" Mar 09 09:27:13 crc kubenswrapper[4861]: E0309 09:27:13.091979 4861 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"c3af73625a6ab6498de700867ae14086535378dede19b6c25ba65630e6352e09\": container with ID starting with c3af73625a6ab6498de700867ae14086535378dede19b6c25ba65630e6352e09 not found: ID does not exist" containerID="c3af73625a6ab6498de700867ae14086535378dede19b6c25ba65630e6352e09" Mar 09 09:27:13 crc kubenswrapper[4861]: I0309 09:27:13.092060 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3af73625a6ab6498de700867ae14086535378dede19b6c25ba65630e6352e09"} err="failed to get container status \"c3af73625a6ab6498de700867ae14086535378dede19b6c25ba65630e6352e09\": rpc error: code = NotFound desc = could not find container \"c3af73625a6ab6498de700867ae14086535378dede19b6c25ba65630e6352e09\": container with ID starting with c3af73625a6ab6498de700867ae14086535378dede19b6c25ba65630e6352e09 not found: ID does not exist" Mar 09 09:27:13 crc kubenswrapper[4861]: I0309 09:27:13.092109 4861 scope.go:117] "RemoveContainer" containerID="77cc34933452aa9cb47250f188afa3e7a88d7dfbada7351b2b2e7656d7419941" Mar 09 09:27:13 crc kubenswrapper[4861]: I0309 09:27:13.092509 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77cc34933452aa9cb47250f188afa3e7a88d7dfbada7351b2b2e7656d7419941"} err="failed to get container status \"77cc34933452aa9cb47250f188afa3e7a88d7dfbada7351b2b2e7656d7419941\": rpc error: code = NotFound desc = could not find container \"77cc34933452aa9cb47250f188afa3e7a88d7dfbada7351b2b2e7656d7419941\": container with ID starting with 77cc34933452aa9cb47250f188afa3e7a88d7dfbada7351b2b2e7656d7419941 not found: ID does not exist" Mar 09 09:27:13 crc kubenswrapper[4861]: I0309 09:27:13.092539 4861 scope.go:117] "RemoveContainer" containerID="c3af73625a6ab6498de700867ae14086535378dede19b6c25ba65630e6352e09" Mar 09 09:27:13 crc kubenswrapper[4861]: I0309 09:27:13.092785 4861 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3af73625a6ab6498de700867ae14086535378dede19b6c25ba65630e6352e09"} err="failed to get container status \"c3af73625a6ab6498de700867ae14086535378dede19b6c25ba65630e6352e09\": rpc error: code = NotFound desc = could not find container \"c3af73625a6ab6498de700867ae14086535378dede19b6c25ba65630e6352e09\": container with ID starting with c3af73625a6ab6498de700867ae14086535378dede19b6c25ba65630e6352e09 not found: ID does not exist" Mar 09 09:27:13 crc kubenswrapper[4861]: I0309 09:27:13.118095 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 09 09:27:13 crc kubenswrapper[4861]: E0309 09:27:13.118930 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d429c2d8-7dd5-4d76-8568-cdeb73242010" containerName="nova-metadata-log" Mar 09 09:27:13 crc kubenswrapper[4861]: I0309 09:27:13.118968 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="d429c2d8-7dd5-4d76-8568-cdeb73242010" containerName="nova-metadata-log" Mar 09 09:27:13 crc kubenswrapper[4861]: E0309 09:27:13.118986 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d429c2d8-7dd5-4d76-8568-cdeb73242010" containerName="nova-metadata-metadata" Mar 09 09:27:13 crc kubenswrapper[4861]: I0309 09:27:13.118996 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="d429c2d8-7dd5-4d76-8568-cdeb73242010" containerName="nova-metadata-metadata" Mar 09 09:27:13 crc kubenswrapper[4861]: E0309 09:27:13.119055 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17803e89-5e7e-4d37-b96f-53e26da13fc2" containerName="nova-manage" Mar 09 09:27:13 crc kubenswrapper[4861]: I0309 09:27:13.119065 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="17803e89-5e7e-4d37-b96f-53e26da13fc2" containerName="nova-manage" Mar 09 09:27:13 crc kubenswrapper[4861]: I0309 09:27:13.119401 4861 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="17803e89-5e7e-4d37-b96f-53e26da13fc2" containerName="nova-manage" Mar 09 09:27:13 crc kubenswrapper[4861]: I0309 09:27:13.119443 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="d429c2d8-7dd5-4d76-8568-cdeb73242010" containerName="nova-metadata-metadata" Mar 09 09:27:13 crc kubenswrapper[4861]: I0309 09:27:13.119466 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="d429c2d8-7dd5-4d76-8568-cdeb73242010" containerName="nova-metadata-log" Mar 09 09:27:13 crc kubenswrapper[4861]: I0309 09:27:13.121236 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 09 09:27:13 crc kubenswrapper[4861]: I0309 09:27:13.131316 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 09 09:27:13 crc kubenswrapper[4861]: I0309 09:27:13.134495 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 09 09:27:13 crc kubenswrapper[4861]: I0309 09:27:13.141348 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 09 09:27:13 crc kubenswrapper[4861]: I0309 09:27:13.199450 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91be51cc-9158-4ade-b36c-cb7bc65b006e-config-data\") pod \"nova-metadata-0\" (UID: \"91be51cc-9158-4ade-b36c-cb7bc65b006e\") " pod="openstack/nova-metadata-0" Mar 09 09:27:13 crc kubenswrapper[4861]: I0309 09:27:13.199525 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdsc9\" (UniqueName: \"kubernetes.io/projected/91be51cc-9158-4ade-b36c-cb7bc65b006e-kube-api-access-pdsc9\") pod \"nova-metadata-0\" (UID: \"91be51cc-9158-4ade-b36c-cb7bc65b006e\") " pod="openstack/nova-metadata-0" Mar 09 09:27:13 crc kubenswrapper[4861]: I0309 09:27:13.199558 4861 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/91be51cc-9158-4ade-b36c-cb7bc65b006e-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"91be51cc-9158-4ade-b36c-cb7bc65b006e\") " pod="openstack/nova-metadata-0" Mar 09 09:27:13 crc kubenswrapper[4861]: I0309 09:27:13.199603 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91be51cc-9158-4ade-b36c-cb7bc65b006e-logs\") pod \"nova-metadata-0\" (UID: \"91be51cc-9158-4ade-b36c-cb7bc65b006e\") " pod="openstack/nova-metadata-0" Mar 09 09:27:13 crc kubenswrapper[4861]: I0309 09:27:13.199679 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91be51cc-9158-4ade-b36c-cb7bc65b006e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"91be51cc-9158-4ade-b36c-cb7bc65b006e\") " pod="openstack/nova-metadata-0" Mar 09 09:27:13 crc kubenswrapper[4861]: I0309 09:27:13.301689 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91be51cc-9158-4ade-b36c-cb7bc65b006e-config-data\") pod \"nova-metadata-0\" (UID: \"91be51cc-9158-4ade-b36c-cb7bc65b006e\") " pod="openstack/nova-metadata-0" Mar 09 09:27:13 crc kubenswrapper[4861]: I0309 09:27:13.301772 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdsc9\" (UniqueName: \"kubernetes.io/projected/91be51cc-9158-4ade-b36c-cb7bc65b006e-kube-api-access-pdsc9\") pod \"nova-metadata-0\" (UID: \"91be51cc-9158-4ade-b36c-cb7bc65b006e\") " pod="openstack/nova-metadata-0" Mar 09 09:27:13 crc kubenswrapper[4861]: I0309 09:27:13.301809 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/91be51cc-9158-4ade-b36c-cb7bc65b006e-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"91be51cc-9158-4ade-b36c-cb7bc65b006e\") " pod="openstack/nova-metadata-0" Mar 09 09:27:13 crc kubenswrapper[4861]: I0309 09:27:13.301856 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91be51cc-9158-4ade-b36c-cb7bc65b006e-logs\") pod \"nova-metadata-0\" (UID: \"91be51cc-9158-4ade-b36c-cb7bc65b006e\") " pod="openstack/nova-metadata-0" Mar 09 09:27:13 crc kubenswrapper[4861]: I0309 09:27:13.301891 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91be51cc-9158-4ade-b36c-cb7bc65b006e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"91be51cc-9158-4ade-b36c-cb7bc65b006e\") " pod="openstack/nova-metadata-0" Mar 09 09:27:13 crc kubenswrapper[4861]: I0309 09:27:13.302978 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91be51cc-9158-4ade-b36c-cb7bc65b006e-logs\") pod \"nova-metadata-0\" (UID: \"91be51cc-9158-4ade-b36c-cb7bc65b006e\") " pod="openstack/nova-metadata-0" Mar 09 09:27:13 crc kubenswrapper[4861]: I0309 09:27:13.306527 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91be51cc-9158-4ade-b36c-cb7bc65b006e-config-data\") pod \"nova-metadata-0\" (UID: \"91be51cc-9158-4ade-b36c-cb7bc65b006e\") " pod="openstack/nova-metadata-0" Mar 09 09:27:13 crc kubenswrapper[4861]: I0309 09:27:13.308944 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/91be51cc-9158-4ade-b36c-cb7bc65b006e-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"91be51cc-9158-4ade-b36c-cb7bc65b006e\") " pod="openstack/nova-metadata-0" Mar 09 09:27:13 crc kubenswrapper[4861]: 
I0309 09:27:13.309197 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91be51cc-9158-4ade-b36c-cb7bc65b006e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"91be51cc-9158-4ade-b36c-cb7bc65b006e\") " pod="openstack/nova-metadata-0" Mar 09 09:27:13 crc kubenswrapper[4861]: I0309 09:27:13.324031 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdsc9\" (UniqueName: \"kubernetes.io/projected/91be51cc-9158-4ade-b36c-cb7bc65b006e-kube-api-access-pdsc9\") pod \"nova-metadata-0\" (UID: \"91be51cc-9158-4ade-b36c-cb7bc65b006e\") " pod="openstack/nova-metadata-0" Mar 09 09:27:13 crc kubenswrapper[4861]: I0309 09:27:13.476982 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 09 09:27:13 crc kubenswrapper[4861]: I0309 09:27:13.684311 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d429c2d8-7dd5-4d76-8568-cdeb73242010" path="/var/lib/kubelet/pods/d429c2d8-7dd5-4d76-8568-cdeb73242010/volumes" Mar 09 09:27:13 crc kubenswrapper[4861]: I0309 09:27:13.688060 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 09 09:27:13 crc kubenswrapper[4861]: I0309 09:27:13.815962 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hhmqv\" (UniqueName: \"kubernetes.io/projected/393b4059-d8bc-4bc2-ad01-3b609472c649-kube-api-access-hhmqv\") pod \"393b4059-d8bc-4bc2-ad01-3b609472c649\" (UID: \"393b4059-d8bc-4bc2-ad01-3b609472c649\") " Mar 09 09:27:13 crc kubenswrapper[4861]: I0309 09:27:13.816023 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/393b4059-d8bc-4bc2-ad01-3b609472c649-combined-ca-bundle\") pod \"393b4059-d8bc-4bc2-ad01-3b609472c649\" (UID: \"393b4059-d8bc-4bc2-ad01-3b609472c649\") " Mar 09 09:27:13 crc kubenswrapper[4861]: I0309 09:27:13.816073 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/393b4059-d8bc-4bc2-ad01-3b609472c649-config-data\") pod \"393b4059-d8bc-4bc2-ad01-3b609472c649\" (UID: \"393b4059-d8bc-4bc2-ad01-3b609472c649\") " Mar 09 09:27:13 crc kubenswrapper[4861]: I0309 09:27:13.820675 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/393b4059-d8bc-4bc2-ad01-3b609472c649-kube-api-access-hhmqv" (OuterVolumeSpecName: "kube-api-access-hhmqv") pod "393b4059-d8bc-4bc2-ad01-3b609472c649" (UID: "393b4059-d8bc-4bc2-ad01-3b609472c649"). InnerVolumeSpecName "kube-api-access-hhmqv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:27:13 crc kubenswrapper[4861]: I0309 09:27:13.849172 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/393b4059-d8bc-4bc2-ad01-3b609472c649-config-data" (OuterVolumeSpecName: "config-data") pod "393b4059-d8bc-4bc2-ad01-3b609472c649" (UID: "393b4059-d8bc-4bc2-ad01-3b609472c649"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:27:13 crc kubenswrapper[4861]: I0309 09:27:13.849171 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/393b4059-d8bc-4bc2-ad01-3b609472c649-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "393b4059-d8bc-4bc2-ad01-3b609472c649" (UID: "393b4059-d8bc-4bc2-ad01-3b609472c649"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:27:13 crc kubenswrapper[4861]: I0309 09:27:13.917806 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hhmqv\" (UniqueName: \"kubernetes.io/projected/393b4059-d8bc-4bc2-ad01-3b609472c649-kube-api-access-hhmqv\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:13 crc kubenswrapper[4861]: I0309 09:27:13.918296 4861 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/393b4059-d8bc-4bc2-ad01-3b609472c649-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:13 crc kubenswrapper[4861]: I0309 09:27:13.918380 4861 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/393b4059-d8bc-4bc2-ad01-3b609472c649-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:13 crc kubenswrapper[4861]: I0309 09:27:13.949860 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 09 09:27:13 crc kubenswrapper[4861]: W0309 09:27:13.955841 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91be51cc_9158_4ade_b36c_cb7bc65b006e.slice/crio-68c6340eb22aae18335395601f5bab91668c7125ae3d77aa613158b2f9af0b26 WatchSource:0}: Error finding container 68c6340eb22aae18335395601f5bab91668c7125ae3d77aa613158b2f9af0b26: Status 404 returned error can't find the container with id 
68c6340eb22aae18335395601f5bab91668c7125ae3d77aa613158b2f9af0b26 Mar 09 09:27:14 crc kubenswrapper[4861]: I0309 09:27:14.029389 4861 generic.go:334] "Generic (PLEG): container finished" podID="393b4059-d8bc-4bc2-ad01-3b609472c649" containerID="6b1fc12d14ebc4101d976c7a9ea5231249ba7f816ea7005b7278f807293ba286" exitCode=0 Mar 09 09:27:14 crc kubenswrapper[4861]: I0309 09:27:14.029445 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 09 09:27:14 crc kubenswrapper[4861]: I0309 09:27:14.029444 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"393b4059-d8bc-4bc2-ad01-3b609472c649","Type":"ContainerDied","Data":"6b1fc12d14ebc4101d976c7a9ea5231249ba7f816ea7005b7278f807293ba286"} Mar 09 09:27:14 crc kubenswrapper[4861]: I0309 09:27:14.030510 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"393b4059-d8bc-4bc2-ad01-3b609472c649","Type":"ContainerDied","Data":"65d5ee29454dabb5f9aa6cb0ca3d87929c73de1e855ad5bb2f6d20e50ed720bb"} Mar 09 09:27:14 crc kubenswrapper[4861]: I0309 09:27:14.030532 4861 scope.go:117] "RemoveContainer" containerID="6b1fc12d14ebc4101d976c7a9ea5231249ba7f816ea7005b7278f807293ba286" Mar 09 09:27:14 crc kubenswrapper[4861]: I0309 09:27:14.032025 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"91be51cc-9158-4ade-b36c-cb7bc65b006e","Type":"ContainerStarted","Data":"68c6340eb22aae18335395601f5bab91668c7125ae3d77aa613158b2f9af0b26"} Mar 09 09:27:14 crc kubenswrapper[4861]: I0309 09:27:14.075758 4861 scope.go:117] "RemoveContainer" containerID="6b1fc12d14ebc4101d976c7a9ea5231249ba7f816ea7005b7278f807293ba286" Mar 09 09:27:14 crc kubenswrapper[4861]: E0309 09:27:14.080077 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"6b1fc12d14ebc4101d976c7a9ea5231249ba7f816ea7005b7278f807293ba286\": container with ID starting with 6b1fc12d14ebc4101d976c7a9ea5231249ba7f816ea7005b7278f807293ba286 not found: ID does not exist" containerID="6b1fc12d14ebc4101d976c7a9ea5231249ba7f816ea7005b7278f807293ba286" Mar 09 09:27:14 crc kubenswrapper[4861]: I0309 09:27:14.080146 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b1fc12d14ebc4101d976c7a9ea5231249ba7f816ea7005b7278f807293ba286"} err="failed to get container status \"6b1fc12d14ebc4101d976c7a9ea5231249ba7f816ea7005b7278f807293ba286\": rpc error: code = NotFound desc = could not find container \"6b1fc12d14ebc4101d976c7a9ea5231249ba7f816ea7005b7278f807293ba286\": container with ID starting with 6b1fc12d14ebc4101d976c7a9ea5231249ba7f816ea7005b7278f807293ba286 not found: ID does not exist" Mar 09 09:27:14 crc kubenswrapper[4861]: I0309 09:27:14.083763 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 09 09:27:14 crc kubenswrapper[4861]: I0309 09:27:14.092813 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 09 09:27:14 crc kubenswrapper[4861]: I0309 09:27:14.102200 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 09 09:27:14 crc kubenswrapper[4861]: E0309 09:27:14.102706 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="393b4059-d8bc-4bc2-ad01-3b609472c649" containerName="nova-scheduler-scheduler" Mar 09 09:27:14 crc kubenswrapper[4861]: I0309 09:27:14.102720 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="393b4059-d8bc-4bc2-ad01-3b609472c649" containerName="nova-scheduler-scheduler" Mar 09 09:27:14 crc kubenswrapper[4861]: I0309 09:27:14.102923 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="393b4059-d8bc-4bc2-ad01-3b609472c649" containerName="nova-scheduler-scheduler" Mar 09 09:27:14 crc kubenswrapper[4861]: I0309 
09:27:14.103695 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 09 09:27:14 crc kubenswrapper[4861]: I0309 09:27:14.111873 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 09 09:27:14 crc kubenswrapper[4861]: I0309 09:27:14.118013 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 09 09:27:14 crc kubenswrapper[4861]: I0309 09:27:14.223904 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72e92702-681e-4575-84da-71f26ef95ebf-config-data\") pod \"nova-scheduler-0\" (UID: \"72e92702-681e-4575-84da-71f26ef95ebf\") " pod="openstack/nova-scheduler-0" Mar 09 09:27:14 crc kubenswrapper[4861]: I0309 09:27:14.224335 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72e92702-681e-4575-84da-71f26ef95ebf-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"72e92702-681e-4575-84da-71f26ef95ebf\") " pod="openstack/nova-scheduler-0" Mar 09 09:27:14 crc kubenswrapper[4861]: I0309 09:27:14.224455 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xn4c\" (UniqueName: \"kubernetes.io/projected/72e92702-681e-4575-84da-71f26ef95ebf-kube-api-access-6xn4c\") pod \"nova-scheduler-0\" (UID: \"72e92702-681e-4575-84da-71f26ef95ebf\") " pod="openstack/nova-scheduler-0" Mar 09 09:27:14 crc kubenswrapper[4861]: I0309 09:27:14.340741 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72e92702-681e-4575-84da-71f26ef95ebf-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"72e92702-681e-4575-84da-71f26ef95ebf\") " pod="openstack/nova-scheduler-0" Mar 09 09:27:14 crc 
kubenswrapper[4861]: I0309 09:27:14.341013 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xn4c\" (UniqueName: \"kubernetes.io/projected/72e92702-681e-4575-84da-71f26ef95ebf-kube-api-access-6xn4c\") pod \"nova-scheduler-0\" (UID: \"72e92702-681e-4575-84da-71f26ef95ebf\") " pod="openstack/nova-scheduler-0" Mar 09 09:27:14 crc kubenswrapper[4861]: I0309 09:27:14.341285 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72e92702-681e-4575-84da-71f26ef95ebf-config-data\") pod \"nova-scheduler-0\" (UID: \"72e92702-681e-4575-84da-71f26ef95ebf\") " pod="openstack/nova-scheduler-0" Mar 09 09:27:14 crc kubenswrapper[4861]: I0309 09:27:14.345947 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72e92702-681e-4575-84da-71f26ef95ebf-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"72e92702-681e-4575-84da-71f26ef95ebf\") " pod="openstack/nova-scheduler-0" Mar 09 09:27:14 crc kubenswrapper[4861]: I0309 09:27:14.346004 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72e92702-681e-4575-84da-71f26ef95ebf-config-data\") pod \"nova-scheduler-0\" (UID: \"72e92702-681e-4575-84da-71f26ef95ebf\") " pod="openstack/nova-scheduler-0" Mar 09 09:27:14 crc kubenswrapper[4861]: I0309 09:27:14.370432 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xn4c\" (UniqueName: \"kubernetes.io/projected/72e92702-681e-4575-84da-71f26ef95ebf-kube-api-access-6xn4c\") pod \"nova-scheduler-0\" (UID: \"72e92702-681e-4575-84da-71f26ef95ebf\") " pod="openstack/nova-scheduler-0" Mar 09 09:27:14 crc kubenswrapper[4861]: I0309 09:27:14.372360 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-sf294" Mar 09 09:27:14 crc kubenswrapper[4861]: I0309 09:27:14.437652 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 09 09:27:14 crc kubenswrapper[4861]: I0309 09:27:14.549439 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3c0062b-4a2c-451a-b683-eeeea965a54e-combined-ca-bundle\") pod \"b3c0062b-4a2c-451a-b683-eeeea965a54e\" (UID: \"b3c0062b-4a2c-451a-b683-eeeea965a54e\") " Mar 09 09:27:14 crc kubenswrapper[4861]: I0309 09:27:14.549503 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mrswl\" (UniqueName: \"kubernetes.io/projected/b3c0062b-4a2c-451a-b683-eeeea965a54e-kube-api-access-mrswl\") pod \"b3c0062b-4a2c-451a-b683-eeeea965a54e\" (UID: \"b3c0062b-4a2c-451a-b683-eeeea965a54e\") " Mar 09 09:27:14 crc kubenswrapper[4861]: I0309 09:27:14.549547 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b3c0062b-4a2c-451a-b683-eeeea965a54e-scripts\") pod \"b3c0062b-4a2c-451a-b683-eeeea965a54e\" (UID: \"b3c0062b-4a2c-451a-b683-eeeea965a54e\") " Mar 09 09:27:14 crc kubenswrapper[4861]: I0309 09:27:14.549586 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3c0062b-4a2c-451a-b683-eeeea965a54e-config-data\") pod \"b3c0062b-4a2c-451a-b683-eeeea965a54e\" (UID: \"b3c0062b-4a2c-451a-b683-eeeea965a54e\") " Mar 09 09:27:14 crc kubenswrapper[4861]: I0309 09:27:14.555119 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3c0062b-4a2c-451a-b683-eeeea965a54e-scripts" (OuterVolumeSpecName: "scripts") pod "b3c0062b-4a2c-451a-b683-eeeea965a54e" (UID: "b3c0062b-4a2c-451a-b683-eeeea965a54e"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:27:14 crc kubenswrapper[4861]: I0309 09:27:14.559069 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3c0062b-4a2c-451a-b683-eeeea965a54e-kube-api-access-mrswl" (OuterVolumeSpecName: "kube-api-access-mrswl") pod "b3c0062b-4a2c-451a-b683-eeeea965a54e" (UID: "b3c0062b-4a2c-451a-b683-eeeea965a54e"). InnerVolumeSpecName "kube-api-access-mrswl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:27:14 crc kubenswrapper[4861]: I0309 09:27:14.577206 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3c0062b-4a2c-451a-b683-eeeea965a54e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b3c0062b-4a2c-451a-b683-eeeea965a54e" (UID: "b3c0062b-4a2c-451a-b683-eeeea965a54e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:27:14 crc kubenswrapper[4861]: I0309 09:27:14.581052 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3c0062b-4a2c-451a-b683-eeeea965a54e-config-data" (OuterVolumeSpecName: "config-data") pod "b3c0062b-4a2c-451a-b683-eeeea965a54e" (UID: "b3c0062b-4a2c-451a-b683-eeeea965a54e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:27:14 crc kubenswrapper[4861]: I0309 09:27:14.655454 4861 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3c0062b-4a2c-451a-b683-eeeea965a54e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:14 crc kubenswrapper[4861]: I0309 09:27:14.655719 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mrswl\" (UniqueName: \"kubernetes.io/projected/b3c0062b-4a2c-451a-b683-eeeea965a54e-kube-api-access-mrswl\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:14 crc kubenswrapper[4861]: I0309 09:27:14.655796 4861 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b3c0062b-4a2c-451a-b683-eeeea965a54e-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:14 crc kubenswrapper[4861]: I0309 09:27:14.655858 4861 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b3c0062b-4a2c-451a-b683-eeeea965a54e-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:14 crc kubenswrapper[4861]: I0309 09:27:14.733564 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7bd5679c8c-2ns4s" Mar 09 09:27:14 crc kubenswrapper[4861]: I0309 09:27:14.791579 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b8fcc65cc-466c6"] Mar 09 09:27:14 crc kubenswrapper[4861]: I0309 09:27:14.792233 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7b8fcc65cc-466c6" podUID="cc825c65-a951-464c-87d3-3e3bedee3e50" containerName="dnsmasq-dns" containerID="cri-o://0e949aca604757214fec9b436182444696e848dd9f9110091d4fb4c98eca8660" gracePeriod=10 Mar 09 09:27:14 crc kubenswrapper[4861]: I0309 09:27:14.910153 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 09 09:27:14 crc 
kubenswrapper[4861]: W0309 09:27:14.920702 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod72e92702_681e_4575_84da_71f26ef95ebf.slice/crio-552e811ffa96a97a38b9382431a9e2a2a1903ffec26e36aede014fbe3f480224 WatchSource:0}: Error finding container 552e811ffa96a97a38b9382431a9e2a2a1903ffec26e36aede014fbe3f480224: Status 404 returned error can't find the container with id 552e811ffa96a97a38b9382431a9e2a2a1903ffec26e36aede014fbe3f480224 Mar 09 09:27:15 crc kubenswrapper[4861]: I0309 09:27:15.047084 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-sf294" event={"ID":"b3c0062b-4a2c-451a-b683-eeeea965a54e","Type":"ContainerDied","Data":"acdff9ba253f2e782dce5ab9d35a2427d8f39fd13878312a82778eb0f0cd3c85"} Mar 09 09:27:15 crc kubenswrapper[4861]: I0309 09:27:15.047486 4861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="acdff9ba253f2e782dce5ab9d35a2427d8f39fd13878312a82778eb0f0cd3c85" Mar 09 09:27:15 crc kubenswrapper[4861]: I0309 09:27:15.047345 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-sf294" Mar 09 09:27:15 crc kubenswrapper[4861]: I0309 09:27:15.054837 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"91be51cc-9158-4ade-b36c-cb7bc65b006e","Type":"ContainerStarted","Data":"9223ada1a05d882dfb343dfc0c60faa4904ce5cbff03a162d9b13cfb54a4e2bf"} Mar 09 09:27:15 crc kubenswrapper[4861]: I0309 09:27:15.054891 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"91be51cc-9158-4ade-b36c-cb7bc65b006e","Type":"ContainerStarted","Data":"000c5555aae3e05ddcec2b60140e35dcbd36b7dccbfbf507e97c39d1c0d74722"} Mar 09 09:27:15 crc kubenswrapper[4861]: I0309 09:27:15.066548 4861 generic.go:334] "Generic (PLEG): container finished" podID="cc825c65-a951-464c-87d3-3e3bedee3e50" containerID="0e949aca604757214fec9b436182444696e848dd9f9110091d4fb4c98eca8660" exitCode=0 Mar 09 09:27:15 crc kubenswrapper[4861]: I0309 09:27:15.067103 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b8fcc65cc-466c6" event={"ID":"cc825c65-a951-464c-87d3-3e3bedee3e50","Type":"ContainerDied","Data":"0e949aca604757214fec9b436182444696e848dd9f9110091d4fb4c98eca8660"} Mar 09 09:27:15 crc kubenswrapper[4861]: I0309 09:27:15.088093 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"72e92702-681e-4575-84da-71f26ef95ebf","Type":"ContainerStarted","Data":"552e811ffa96a97a38b9382431a9e2a2a1903ffec26e36aede014fbe3f480224"} Mar 09 09:27:15 crc kubenswrapper[4861]: I0309 09:27:15.092704 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.092685122 podStartE2EDuration="2.092685122s" podCreationTimestamp="2026-03-09 09:27:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:27:15.078289048 +0000 
UTC m=+1278.163328449" watchObservedRunningTime="2026-03-09 09:27:15.092685122 +0000 UTC m=+1278.177724523" Mar 09 09:27:15 crc kubenswrapper[4861]: I0309 09:27:15.142547 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 09 09:27:15 crc kubenswrapper[4861]: E0309 09:27:15.143028 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3c0062b-4a2c-451a-b683-eeeea965a54e" containerName="nova-cell1-conductor-db-sync" Mar 09 09:27:15 crc kubenswrapper[4861]: I0309 09:27:15.143043 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3c0062b-4a2c-451a-b683-eeeea965a54e" containerName="nova-cell1-conductor-db-sync" Mar 09 09:27:15 crc kubenswrapper[4861]: I0309 09:27:15.143289 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3c0062b-4a2c-451a-b683-eeeea965a54e" containerName="nova-cell1-conductor-db-sync" Mar 09 09:27:15 crc kubenswrapper[4861]: I0309 09:27:15.143965 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 09 09:27:15 crc kubenswrapper[4861]: I0309 09:27:15.146133 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 09 09:27:15 crc kubenswrapper[4861]: I0309 09:27:15.166149 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5b057f4-8239-4e46-b205-81552d6cd5e6-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"c5b057f4-8239-4e46-b205-81552d6cd5e6\") " pod="openstack/nova-cell1-conductor-0" Mar 09 09:27:15 crc kubenswrapper[4861]: I0309 09:27:15.166220 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5b057f4-8239-4e46-b205-81552d6cd5e6-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"c5b057f4-8239-4e46-b205-81552d6cd5e6\") " pod="openstack/nova-cell1-conductor-0" Mar 09 09:27:15 crc kubenswrapper[4861]: I0309 09:27:15.166262 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xpgb\" (UniqueName: \"kubernetes.io/projected/c5b057f4-8239-4e46-b205-81552d6cd5e6-kube-api-access-9xpgb\") pod \"nova-cell1-conductor-0\" (UID: \"c5b057f4-8239-4e46-b205-81552d6cd5e6\") " pod="openstack/nova-cell1-conductor-0" Mar 09 09:27:15 crc kubenswrapper[4861]: I0309 09:27:15.176451 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 09 09:27:15 crc kubenswrapper[4861]: I0309 09:27:15.268619 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9xpgb\" (UniqueName: \"kubernetes.io/projected/c5b057f4-8239-4e46-b205-81552d6cd5e6-kube-api-access-9xpgb\") pod \"nova-cell1-conductor-0\" (UID: \"c5b057f4-8239-4e46-b205-81552d6cd5e6\") " pod="openstack/nova-cell1-conductor-0" Mar 09 
09:27:15 crc kubenswrapper[4861]: I0309 09:27:15.269043 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5b057f4-8239-4e46-b205-81552d6cd5e6-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"c5b057f4-8239-4e46-b205-81552d6cd5e6\") " pod="openstack/nova-cell1-conductor-0" Mar 09 09:27:15 crc kubenswrapper[4861]: I0309 09:27:15.269106 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5b057f4-8239-4e46-b205-81552d6cd5e6-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"c5b057f4-8239-4e46-b205-81552d6cd5e6\") " pod="openstack/nova-cell1-conductor-0" Mar 09 09:27:15 crc kubenswrapper[4861]: I0309 09:27:15.274118 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5b057f4-8239-4e46-b205-81552d6cd5e6-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"c5b057f4-8239-4e46-b205-81552d6cd5e6\") " pod="openstack/nova-cell1-conductor-0" Mar 09 09:27:15 crc kubenswrapper[4861]: I0309 09:27:15.274137 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5b057f4-8239-4e46-b205-81552d6cd5e6-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"c5b057f4-8239-4e46-b205-81552d6cd5e6\") " pod="openstack/nova-cell1-conductor-0" Mar 09 09:27:15 crc kubenswrapper[4861]: I0309 09:27:15.275827 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7b8fcc65cc-466c6" Mar 09 09:27:15 crc kubenswrapper[4861]: I0309 09:27:15.286699 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xpgb\" (UniqueName: \"kubernetes.io/projected/c5b057f4-8239-4e46-b205-81552d6cd5e6-kube-api-access-9xpgb\") pod \"nova-cell1-conductor-0\" (UID: \"c5b057f4-8239-4e46-b205-81552d6cd5e6\") " pod="openstack/nova-cell1-conductor-0" Mar 09 09:27:15 crc kubenswrapper[4861]: I0309 09:27:15.370570 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cc825c65-a951-464c-87d3-3e3bedee3e50-ovsdbserver-sb\") pod \"cc825c65-a951-464c-87d3-3e3bedee3e50\" (UID: \"cc825c65-a951-464c-87d3-3e3bedee3e50\") " Mar 09 09:27:15 crc kubenswrapper[4861]: I0309 09:27:15.370711 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cc825c65-a951-464c-87d3-3e3bedee3e50-dns-svc\") pod \"cc825c65-a951-464c-87d3-3e3bedee3e50\" (UID: \"cc825c65-a951-464c-87d3-3e3bedee3e50\") " Mar 09 09:27:15 crc kubenswrapper[4861]: I0309 09:27:15.370898 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cc825c65-a951-464c-87d3-3e3bedee3e50-dns-swift-storage-0\") pod \"cc825c65-a951-464c-87d3-3e3bedee3e50\" (UID: \"cc825c65-a951-464c-87d3-3e3bedee3e50\") " Mar 09 09:27:15 crc kubenswrapper[4861]: I0309 09:27:15.371006 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cc825c65-a951-464c-87d3-3e3bedee3e50-ovsdbserver-nb\") pod \"cc825c65-a951-464c-87d3-3e3bedee3e50\" (UID: \"cc825c65-a951-464c-87d3-3e3bedee3e50\") " Mar 09 09:27:15 crc kubenswrapper[4861]: I0309 09:27:15.371542 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc825c65-a951-464c-87d3-3e3bedee3e50-config\") pod \"cc825c65-a951-464c-87d3-3e3bedee3e50\" (UID: \"cc825c65-a951-464c-87d3-3e3bedee3e50\") " Mar 09 09:27:15 crc kubenswrapper[4861]: I0309 09:27:15.371628 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bldjp\" (UniqueName: \"kubernetes.io/projected/cc825c65-a951-464c-87d3-3e3bedee3e50-kube-api-access-bldjp\") pod \"cc825c65-a951-464c-87d3-3e3bedee3e50\" (UID: \"cc825c65-a951-464c-87d3-3e3bedee3e50\") " Mar 09 09:27:15 crc kubenswrapper[4861]: I0309 09:27:15.378589 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc825c65-a951-464c-87d3-3e3bedee3e50-kube-api-access-bldjp" (OuterVolumeSpecName: "kube-api-access-bldjp") pod "cc825c65-a951-464c-87d3-3e3bedee3e50" (UID: "cc825c65-a951-464c-87d3-3e3bedee3e50"). InnerVolumeSpecName "kube-api-access-bldjp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:27:15 crc kubenswrapper[4861]: I0309 09:27:15.422033 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc825c65-a951-464c-87d3-3e3bedee3e50-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "cc825c65-a951-464c-87d3-3e3bedee3e50" (UID: "cc825c65-a951-464c-87d3-3e3bedee3e50"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:27:15 crc kubenswrapper[4861]: I0309 09:27:15.423038 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc825c65-a951-464c-87d3-3e3bedee3e50-config" (OuterVolumeSpecName: "config") pod "cc825c65-a951-464c-87d3-3e3bedee3e50" (UID: "cc825c65-a951-464c-87d3-3e3bedee3e50"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:27:15 crc kubenswrapper[4861]: I0309 09:27:15.437021 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc825c65-a951-464c-87d3-3e3bedee3e50-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "cc825c65-a951-464c-87d3-3e3bedee3e50" (UID: "cc825c65-a951-464c-87d3-3e3bedee3e50"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:27:15 crc kubenswrapper[4861]: I0309 09:27:15.443789 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc825c65-a951-464c-87d3-3e3bedee3e50-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "cc825c65-a951-464c-87d3-3e3bedee3e50" (UID: "cc825c65-a951-464c-87d3-3e3bedee3e50"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:27:15 crc kubenswrapper[4861]: I0309 09:27:15.447742 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc825c65-a951-464c-87d3-3e3bedee3e50-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "cc825c65-a951-464c-87d3-3e3bedee3e50" (UID: "cc825c65-a951-464c-87d3-3e3bedee3e50"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:27:15 crc kubenswrapper[4861]: I0309 09:27:15.484273 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 09 09:27:15 crc kubenswrapper[4861]: I0309 09:27:15.484488 4861 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cc825c65-a951-464c-87d3-3e3bedee3e50-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:15 crc kubenswrapper[4861]: I0309 09:27:15.484518 4861 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cc825c65-a951-464c-87d3-3e3bedee3e50-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:15 crc kubenswrapper[4861]: I0309 09:27:15.484532 4861 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc825c65-a951-464c-87d3-3e3bedee3e50-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:15 crc kubenswrapper[4861]: I0309 09:27:15.484550 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bldjp\" (UniqueName: \"kubernetes.io/projected/cc825c65-a951-464c-87d3-3e3bedee3e50-kube-api-access-bldjp\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:15 crc kubenswrapper[4861]: I0309 09:27:15.484562 4861 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cc825c65-a951-464c-87d3-3e3bedee3e50-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:15 crc kubenswrapper[4861]: I0309 09:27:15.484571 4861 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cc825c65-a951-464c-87d3-3e3bedee3e50-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:15 crc kubenswrapper[4861]: I0309 09:27:15.686254 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="393b4059-d8bc-4bc2-ad01-3b609472c649" path="/var/lib/kubelet/pods/393b4059-d8bc-4bc2-ad01-3b609472c649/volumes" Mar 09 09:27:15 crc kubenswrapper[4861]: I0309 09:27:15.956187 4861 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 09 09:27:16 crc kubenswrapper[4861]: I0309 09:27:16.069224 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 09 09:27:16 crc kubenswrapper[4861]: I0309 09:27:16.093949 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07b7fc52-e101-4d6d-bb35-c460aba044df-config-data\") pod \"07b7fc52-e101-4d6d-bb35-c460aba044df\" (UID: \"07b7fc52-e101-4d6d-bb35-c460aba044df\") " Mar 09 09:27:16 crc kubenswrapper[4861]: I0309 09:27:16.094072 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07b7fc52-e101-4d6d-bb35-c460aba044df-combined-ca-bundle\") pod \"07b7fc52-e101-4d6d-bb35-c460aba044df\" (UID: \"07b7fc52-e101-4d6d-bb35-c460aba044df\") " Mar 09 09:27:16 crc kubenswrapper[4861]: I0309 09:27:16.094146 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n26zs\" (UniqueName: \"kubernetes.io/projected/07b7fc52-e101-4d6d-bb35-c460aba044df-kube-api-access-n26zs\") pod \"07b7fc52-e101-4d6d-bb35-c460aba044df\" (UID: \"07b7fc52-e101-4d6d-bb35-c460aba044df\") " Mar 09 09:27:16 crc kubenswrapper[4861]: I0309 09:27:16.094670 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/07b7fc52-e101-4d6d-bb35-c460aba044df-logs\") pod \"07b7fc52-e101-4d6d-bb35-c460aba044df\" (UID: \"07b7fc52-e101-4d6d-bb35-c460aba044df\") " Mar 09 09:27:16 crc kubenswrapper[4861]: I0309 09:27:16.095166 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/07b7fc52-e101-4d6d-bb35-c460aba044df-logs" (OuterVolumeSpecName: "logs") pod "07b7fc52-e101-4d6d-bb35-c460aba044df" (UID: "07b7fc52-e101-4d6d-bb35-c460aba044df"). 
InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:27:16 crc kubenswrapper[4861]: I0309 09:27:16.103720 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07b7fc52-e101-4d6d-bb35-c460aba044df-kube-api-access-n26zs" (OuterVolumeSpecName: "kube-api-access-n26zs") pod "07b7fc52-e101-4d6d-bb35-c460aba044df" (UID: "07b7fc52-e101-4d6d-bb35-c460aba044df"). InnerVolumeSpecName "kube-api-access-n26zs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:27:16 crc kubenswrapper[4861]: I0309 09:27:16.104673 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"c5b057f4-8239-4e46-b205-81552d6cd5e6","Type":"ContainerStarted","Data":"76b324f3c050c80e72e435fd5567999b9a1e0c017435f6f96112f70f834d9e10"} Mar 09 09:27:16 crc kubenswrapper[4861]: I0309 09:27:16.109404 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b8fcc65cc-466c6" event={"ID":"cc825c65-a951-464c-87d3-3e3bedee3e50","Type":"ContainerDied","Data":"2d5edf0d00d8c4bf40fb34ce56be026dce7ad18836868225a59e3de054ee7146"} Mar 09 09:27:16 crc kubenswrapper[4861]: I0309 09:27:16.109496 4861 scope.go:117] "RemoveContainer" containerID="0e949aca604757214fec9b436182444696e848dd9f9110091d4fb4c98eca8660" Mar 09 09:27:16 crc kubenswrapper[4861]: I0309 09:27:16.110827 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7b8fcc65cc-466c6" Mar 09 09:27:16 crc kubenswrapper[4861]: I0309 09:27:16.112523 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"72e92702-681e-4575-84da-71f26ef95ebf","Type":"ContainerStarted","Data":"d1afee61e833f56202c99d97a660af11019fc418c2d3ddf6c6cbdb58dfeee311"} Mar 09 09:27:16 crc kubenswrapper[4861]: I0309 09:27:16.116754 4861 generic.go:334] "Generic (PLEG): container finished" podID="07b7fc52-e101-4d6d-bb35-c460aba044df" containerID="c96457be6df970ca4fee8bb8744ee3afbda43952dd085697c4dd34445bbf7770" exitCode=0 Mar 09 09:27:16 crc kubenswrapper[4861]: I0309 09:27:16.119235 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 09 09:27:16 crc kubenswrapper[4861]: I0309 09:27:16.119525 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"07b7fc52-e101-4d6d-bb35-c460aba044df","Type":"ContainerDied","Data":"c96457be6df970ca4fee8bb8744ee3afbda43952dd085697c4dd34445bbf7770"} Mar 09 09:27:16 crc kubenswrapper[4861]: I0309 09:27:16.119565 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"07b7fc52-e101-4d6d-bb35-c460aba044df","Type":"ContainerDied","Data":"099410a9d54c168bab70dd4f752581123f78655a624869ddc320c80096919580"} Mar 09 09:27:16 crc kubenswrapper[4861]: I0309 09:27:16.134127 4861 scope.go:117] "RemoveContainer" containerID="76fdc8d787e68e9675eed7b6f837ce9957491cc93adce7380f304fc1866ed124" Mar 09 09:27:16 crc kubenswrapper[4861]: I0309 09:27:16.140888 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07b7fc52-e101-4d6d-bb35-c460aba044df-config-data" (OuterVolumeSpecName: "config-data") pod "07b7fc52-e101-4d6d-bb35-c460aba044df" (UID: "07b7fc52-e101-4d6d-bb35-c460aba044df"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:27:16 crc kubenswrapper[4861]: I0309 09:27:16.147161 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07b7fc52-e101-4d6d-bb35-c460aba044df-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "07b7fc52-e101-4d6d-bb35-c460aba044df" (UID: "07b7fc52-e101-4d6d-bb35-c460aba044df"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:27:16 crc kubenswrapper[4861]: I0309 09:27:16.149976 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.149958642 podStartE2EDuration="2.149958642s" podCreationTimestamp="2026-03-09 09:27:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:27:16.133038725 +0000 UTC m=+1279.218078156" watchObservedRunningTime="2026-03-09 09:27:16.149958642 +0000 UTC m=+1279.234998033" Mar 09 09:27:16 crc kubenswrapper[4861]: I0309 09:27:16.164836 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b8fcc65cc-466c6"] Mar 09 09:27:16 crc kubenswrapper[4861]: I0309 09:27:16.165837 4861 scope.go:117] "RemoveContainer" containerID="c96457be6df970ca4fee8bb8744ee3afbda43952dd085697c4dd34445bbf7770" Mar 09 09:27:16 crc kubenswrapper[4861]: I0309 09:27:16.173940 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7b8fcc65cc-466c6"] Mar 09 09:27:16 crc kubenswrapper[4861]: I0309 09:27:16.188883 4861 scope.go:117] "RemoveContainer" containerID="71a1e47dc513402ccf0aa5c02a3c15b43fb149b33f9816e57f8c03e04ce60b0e" Mar 09 09:27:16 crc kubenswrapper[4861]: I0309 09:27:16.201144 4861 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/07b7fc52-e101-4d6d-bb35-c460aba044df-logs\") on node \"crc\" DevicePath \"\"" Mar 09 
09:27:16 crc kubenswrapper[4861]: I0309 09:27:16.201172 4861 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07b7fc52-e101-4d6d-bb35-c460aba044df-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:16 crc kubenswrapper[4861]: I0309 09:27:16.201182 4861 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07b7fc52-e101-4d6d-bb35-c460aba044df-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:16 crc kubenswrapper[4861]: I0309 09:27:16.201192 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n26zs\" (UniqueName: \"kubernetes.io/projected/07b7fc52-e101-4d6d-bb35-c460aba044df-kube-api-access-n26zs\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:16 crc kubenswrapper[4861]: I0309 09:27:16.211495 4861 scope.go:117] "RemoveContainer" containerID="c96457be6df970ca4fee8bb8744ee3afbda43952dd085697c4dd34445bbf7770" Mar 09 09:27:16 crc kubenswrapper[4861]: E0309 09:27:16.211999 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c96457be6df970ca4fee8bb8744ee3afbda43952dd085697c4dd34445bbf7770\": container with ID starting with c96457be6df970ca4fee8bb8744ee3afbda43952dd085697c4dd34445bbf7770 not found: ID does not exist" containerID="c96457be6df970ca4fee8bb8744ee3afbda43952dd085697c4dd34445bbf7770" Mar 09 09:27:16 crc kubenswrapper[4861]: I0309 09:27:16.212044 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c96457be6df970ca4fee8bb8744ee3afbda43952dd085697c4dd34445bbf7770"} err="failed to get container status \"c96457be6df970ca4fee8bb8744ee3afbda43952dd085697c4dd34445bbf7770\": rpc error: code = NotFound desc = could not find container \"c96457be6df970ca4fee8bb8744ee3afbda43952dd085697c4dd34445bbf7770\": container with ID starting with 
c96457be6df970ca4fee8bb8744ee3afbda43952dd085697c4dd34445bbf7770 not found: ID does not exist" Mar 09 09:27:16 crc kubenswrapper[4861]: I0309 09:27:16.212070 4861 scope.go:117] "RemoveContainer" containerID="71a1e47dc513402ccf0aa5c02a3c15b43fb149b33f9816e57f8c03e04ce60b0e" Mar 09 09:27:16 crc kubenswrapper[4861]: E0309 09:27:16.212662 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71a1e47dc513402ccf0aa5c02a3c15b43fb149b33f9816e57f8c03e04ce60b0e\": container with ID starting with 71a1e47dc513402ccf0aa5c02a3c15b43fb149b33f9816e57f8c03e04ce60b0e not found: ID does not exist" containerID="71a1e47dc513402ccf0aa5c02a3c15b43fb149b33f9816e57f8c03e04ce60b0e" Mar 09 09:27:16 crc kubenswrapper[4861]: I0309 09:27:16.212701 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71a1e47dc513402ccf0aa5c02a3c15b43fb149b33f9816e57f8c03e04ce60b0e"} err="failed to get container status \"71a1e47dc513402ccf0aa5c02a3c15b43fb149b33f9816e57f8c03e04ce60b0e\": rpc error: code = NotFound desc = could not find container \"71a1e47dc513402ccf0aa5c02a3c15b43fb149b33f9816e57f8c03e04ce60b0e\": container with ID starting with 71a1e47dc513402ccf0aa5c02a3c15b43fb149b33f9816e57f8c03e04ce60b0e not found: ID does not exist" Mar 09 09:27:16 crc kubenswrapper[4861]: I0309 09:27:16.233514 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 09 09:27:16 crc kubenswrapper[4861]: I0309 09:27:16.455809 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 09 09:27:16 crc kubenswrapper[4861]: I0309 09:27:16.464952 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 09 09:27:16 crc kubenswrapper[4861]: I0309 09:27:16.475414 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 09 09:27:16 crc kubenswrapper[4861]: E0309 09:27:16.475892 
4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc825c65-a951-464c-87d3-3e3bedee3e50" containerName="init"
Mar 09 09:27:16 crc kubenswrapper[4861]: I0309 09:27:16.475907 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc825c65-a951-464c-87d3-3e3bedee3e50" containerName="init"
Mar 09 09:27:16 crc kubenswrapper[4861]: E0309 09:27:16.475917 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07b7fc52-e101-4d6d-bb35-c460aba044df" containerName="nova-api-api"
Mar 09 09:27:16 crc kubenswrapper[4861]: I0309 09:27:16.475927 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="07b7fc52-e101-4d6d-bb35-c460aba044df" containerName="nova-api-api"
Mar 09 09:27:16 crc kubenswrapper[4861]: E0309 09:27:16.475949 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07b7fc52-e101-4d6d-bb35-c460aba044df" containerName="nova-api-log"
Mar 09 09:27:16 crc kubenswrapper[4861]: I0309 09:27:16.475957 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="07b7fc52-e101-4d6d-bb35-c460aba044df" containerName="nova-api-log"
Mar 09 09:27:16 crc kubenswrapper[4861]: E0309 09:27:16.475994 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc825c65-a951-464c-87d3-3e3bedee3e50" containerName="dnsmasq-dns"
Mar 09 09:27:16 crc kubenswrapper[4861]: I0309 09:27:16.476001 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc825c65-a951-464c-87d3-3e3bedee3e50" containerName="dnsmasq-dns"
Mar 09 09:27:16 crc kubenswrapper[4861]: I0309 09:27:16.476158 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc825c65-a951-464c-87d3-3e3bedee3e50" containerName="dnsmasq-dns"
Mar 09 09:27:16 crc kubenswrapper[4861]: I0309 09:27:16.476171 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="07b7fc52-e101-4d6d-bb35-c460aba044df" containerName="nova-api-log"
Mar 09 09:27:16 crc kubenswrapper[4861]: I0309 09:27:16.476185 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="07b7fc52-e101-4d6d-bb35-c460aba044df" containerName="nova-api-api"
Mar 09 09:27:16 crc kubenswrapper[4861]: I0309 09:27:16.477155 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 09 09:27:16 crc kubenswrapper[4861]: I0309 09:27:16.479431 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Mar 09 09:27:16 crc kubenswrapper[4861]: I0309 09:27:16.494494 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 09 09:27:16 crc kubenswrapper[4861]: I0309 09:27:16.506743 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20184bd6-aec8-44d4-85de-4dddbc61ce6d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"20184bd6-aec8-44d4-85de-4dddbc61ce6d\") " pod="openstack/nova-api-0"
Mar 09 09:27:16 crc kubenswrapper[4861]: I0309 09:27:16.506806 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kt5tt\" (UniqueName: \"kubernetes.io/projected/20184bd6-aec8-44d4-85de-4dddbc61ce6d-kube-api-access-kt5tt\") pod \"nova-api-0\" (UID: \"20184bd6-aec8-44d4-85de-4dddbc61ce6d\") " pod="openstack/nova-api-0"
Mar 09 09:27:16 crc kubenswrapper[4861]: I0309 09:27:16.506857 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20184bd6-aec8-44d4-85de-4dddbc61ce6d-config-data\") pod \"nova-api-0\" (UID: \"20184bd6-aec8-44d4-85de-4dddbc61ce6d\") " pod="openstack/nova-api-0"
Mar 09 09:27:16 crc kubenswrapper[4861]: I0309 09:27:16.506955 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/20184bd6-aec8-44d4-85de-4dddbc61ce6d-logs\") pod \"nova-api-0\" (UID: \"20184bd6-aec8-44d4-85de-4dddbc61ce6d\") " pod="openstack/nova-api-0"
Mar 09 09:27:16 crc kubenswrapper[4861]: I0309 09:27:16.608600 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20184bd6-aec8-44d4-85de-4dddbc61ce6d-config-data\") pod \"nova-api-0\" (UID: \"20184bd6-aec8-44d4-85de-4dddbc61ce6d\") " pod="openstack/nova-api-0"
Mar 09 09:27:16 crc kubenswrapper[4861]: I0309 09:27:16.608753 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/20184bd6-aec8-44d4-85de-4dddbc61ce6d-logs\") pod \"nova-api-0\" (UID: \"20184bd6-aec8-44d4-85de-4dddbc61ce6d\") " pod="openstack/nova-api-0"
Mar 09 09:27:16 crc kubenswrapper[4861]: I0309 09:27:16.608890 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20184bd6-aec8-44d4-85de-4dddbc61ce6d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"20184bd6-aec8-44d4-85de-4dddbc61ce6d\") " pod="openstack/nova-api-0"
Mar 09 09:27:16 crc kubenswrapper[4861]: I0309 09:27:16.608925 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kt5tt\" (UniqueName: \"kubernetes.io/projected/20184bd6-aec8-44d4-85de-4dddbc61ce6d-kube-api-access-kt5tt\") pod \"nova-api-0\" (UID: \"20184bd6-aec8-44d4-85de-4dddbc61ce6d\") " pod="openstack/nova-api-0"
Mar 09 09:27:16 crc kubenswrapper[4861]: I0309 09:27:16.609468 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/20184bd6-aec8-44d4-85de-4dddbc61ce6d-logs\") pod \"nova-api-0\" (UID: \"20184bd6-aec8-44d4-85de-4dddbc61ce6d\") " pod="openstack/nova-api-0"
Mar 09 09:27:16 crc kubenswrapper[4861]: I0309 09:27:16.613177 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20184bd6-aec8-44d4-85de-4dddbc61ce6d-config-data\") pod \"nova-api-0\" (UID: \"20184bd6-aec8-44d4-85de-4dddbc61ce6d\") " pod="openstack/nova-api-0"
Mar 09 09:27:16 crc kubenswrapper[4861]: I0309 09:27:16.616860 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20184bd6-aec8-44d4-85de-4dddbc61ce6d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"20184bd6-aec8-44d4-85de-4dddbc61ce6d\") " pod="openstack/nova-api-0"
Mar 09 09:27:16 crc kubenswrapper[4861]: I0309 09:27:16.629147 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kt5tt\" (UniqueName: \"kubernetes.io/projected/20184bd6-aec8-44d4-85de-4dddbc61ce6d-kube-api-access-kt5tt\") pod \"nova-api-0\" (UID: \"20184bd6-aec8-44d4-85de-4dddbc61ce6d\") " pod="openstack/nova-api-0"
Mar 09 09:27:16 crc kubenswrapper[4861]: I0309 09:27:16.858005 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 09 09:27:17 crc kubenswrapper[4861]: I0309 09:27:17.136664 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"c5b057f4-8239-4e46-b205-81552d6cd5e6","Type":"ContainerStarted","Data":"9868644642e50c30aafcfbce8764ca943b02f16c20482bf93cc790c305bcb29a"}
Mar 09 09:27:17 crc kubenswrapper[4861]: I0309 09:27:17.137061 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0"
Mar 09 09:27:17 crc kubenswrapper[4861]: I0309 09:27:17.160726 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.160703774 podStartE2EDuration="2.160703774s" podCreationTimestamp="2026-03-09 09:27:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:27:17.151313923 +0000 UTC m=+1280.236353324" watchObservedRunningTime="2026-03-09 09:27:17.160703774 +0000 UTC m=+1280.245743175"
Mar 09 09:27:17 crc kubenswrapper[4861]: I0309 09:27:17.317818 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 09 09:27:17 crc kubenswrapper[4861]: W0309 09:27:17.319282 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod20184bd6_aec8_44d4_85de_4dddbc61ce6d.slice/crio-b406f09f91198dd265106ee5f2a6761e95a46d738e5d82a74b18f6ba4f3baae4 WatchSource:0}: Error finding container b406f09f91198dd265106ee5f2a6761e95a46d738e5d82a74b18f6ba4f3baae4: Status 404 returned error can't find the container with id b406f09f91198dd265106ee5f2a6761e95a46d738e5d82a74b18f6ba4f3baae4
Mar 09 09:27:17 crc kubenswrapper[4861]: I0309 09:27:17.670177 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07b7fc52-e101-4d6d-bb35-c460aba044df" path="/var/lib/kubelet/pods/07b7fc52-e101-4d6d-bb35-c460aba044df/volumes"
Mar 09 09:27:17 crc kubenswrapper[4861]: I0309 09:27:17.671118 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc825c65-a951-464c-87d3-3e3bedee3e50" path="/var/lib/kubelet/pods/cc825c65-a951-464c-87d3-3e3bedee3e50/volumes"
Mar 09 09:27:18 crc kubenswrapper[4861]: I0309 09:27:18.155833 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"20184bd6-aec8-44d4-85de-4dddbc61ce6d","Type":"ContainerStarted","Data":"6d0d12d972c0219d9d0f4fb2d357ec09c2b975ff09c50e9eddb5fd6e4b0f69af"}
Mar 09 09:27:18 crc kubenswrapper[4861]: I0309 09:27:18.155890 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"20184bd6-aec8-44d4-85de-4dddbc61ce6d","Type":"ContainerStarted","Data":"dc164e711d9c74be59a234b63af3a7cb3e2515ccf152e71fd97cb37c82c5514f"}
Mar 09 09:27:18 crc kubenswrapper[4861]: I0309 09:27:18.155907 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"20184bd6-aec8-44d4-85de-4dddbc61ce6d","Type":"ContainerStarted","Data":"b406f09f91198dd265106ee5f2a6761e95a46d738e5d82a74b18f6ba4f3baae4"}
Mar 09 09:27:18 crc kubenswrapper[4861]: I0309 09:27:18.182067 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.18204611 podStartE2EDuration="2.18204611s" podCreationTimestamp="2026-03-09 09:27:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:27:18.17161344 +0000 UTC m=+1281.256652851" watchObservedRunningTime="2026-03-09 09:27:18.18204611 +0000 UTC m=+1281.267085511"
Mar 09 09:27:18 crc kubenswrapper[4861]: I0309 09:27:18.477479 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Mar 09 09:27:18 crc kubenswrapper[4861]: I0309 09:27:18.477840 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Mar 09 09:27:19 crc kubenswrapper[4861]: I0309 09:27:19.449412 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Mar 09 09:27:19 crc kubenswrapper[4861]: I0309 09:27:19.887516 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 09 09:27:19 crc kubenswrapper[4861]: I0309 09:27:19.888031 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="8f114ae0-31de-4f17-9bad-1bc0b895d006" containerName="kube-state-metrics" containerID="cri-o://786aa50f7103d9a4ad3e60d0286ae85151dd2f5fb70f4204d1465f5132daf446" gracePeriod=30
Mar 09 09:27:20 crc kubenswrapper[4861]: I0309 09:27:20.195823 4861 generic.go:334] "Generic (PLEG): container finished" podID="8f114ae0-31de-4f17-9bad-1bc0b895d006" containerID="786aa50f7103d9a4ad3e60d0286ae85151dd2f5fb70f4204d1465f5132daf446" exitCode=2
Mar 09 09:27:20 crc kubenswrapper[4861]: I0309 09:27:20.195870 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"8f114ae0-31de-4f17-9bad-1bc0b895d006","Type":"ContainerDied","Data":"786aa50f7103d9a4ad3e60d0286ae85151dd2f5fb70f4204d1465f5132daf446"}
Mar 09 09:27:20 crc kubenswrapper[4861]: I0309 09:27:20.390027 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Mar 09 09:27:20 crc kubenswrapper[4861]: I0309 09:27:20.479134 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tl7ph\" (UniqueName: \"kubernetes.io/projected/8f114ae0-31de-4f17-9bad-1bc0b895d006-kube-api-access-tl7ph\") pod \"8f114ae0-31de-4f17-9bad-1bc0b895d006\" (UID: \"8f114ae0-31de-4f17-9bad-1bc0b895d006\") "
Mar 09 09:27:20 crc kubenswrapper[4861]: I0309 09:27:20.484793 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f114ae0-31de-4f17-9bad-1bc0b895d006-kube-api-access-tl7ph" (OuterVolumeSpecName: "kube-api-access-tl7ph") pod "8f114ae0-31de-4f17-9bad-1bc0b895d006" (UID: "8f114ae0-31de-4f17-9bad-1bc0b895d006"). InnerVolumeSpecName "kube-api-access-tl7ph". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:27:20 crc kubenswrapper[4861]: I0309 09:27:20.580773 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tl7ph\" (UniqueName: \"kubernetes.io/projected/8f114ae0-31de-4f17-9bad-1bc0b895d006-kube-api-access-tl7ph\") on node \"crc\" DevicePath \"\""
Mar 09 09:27:21 crc kubenswrapper[4861]: I0309 09:27:21.207244 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"8f114ae0-31de-4f17-9bad-1bc0b895d006","Type":"ContainerDied","Data":"2cd6331216667402a7eb60f0f498e5b4faa400327cae2822314b04d94b25a544"}
Mar 09 09:27:21 crc kubenswrapper[4861]: I0309 09:27:21.207290 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Mar 09 09:27:21 crc kubenswrapper[4861]: I0309 09:27:21.207308 4861 scope.go:117] "RemoveContainer" containerID="786aa50f7103d9a4ad3e60d0286ae85151dd2f5fb70f4204d1465f5132daf446"
Mar 09 09:27:21 crc kubenswrapper[4861]: I0309 09:27:21.271509 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 09 09:27:21 crc kubenswrapper[4861]: I0309 09:27:21.283621 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 09 09:27:21 crc kubenswrapper[4861]: I0309 09:27:21.293115 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 09 09:27:21 crc kubenswrapper[4861]: E0309 09:27:21.293560 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f114ae0-31de-4f17-9bad-1bc0b895d006" containerName="kube-state-metrics"
Mar 09 09:27:21 crc kubenswrapper[4861]: I0309 09:27:21.293583 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f114ae0-31de-4f17-9bad-1bc0b895d006" containerName="kube-state-metrics"
Mar 09 09:27:21 crc kubenswrapper[4861]: I0309 09:27:21.293765 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f114ae0-31de-4f17-9bad-1bc0b895d006" containerName="kube-state-metrics"
Mar 09 09:27:21 crc kubenswrapper[4861]: I0309 09:27:21.294624 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Mar 09 09:27:21 crc kubenswrapper[4861]: I0309 09:27:21.296864 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config"
Mar 09 09:27:21 crc kubenswrapper[4861]: I0309 09:27:21.297586 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc"
Mar 09 09:27:21 crc kubenswrapper[4861]: I0309 09:27:21.308421 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 09 09:27:21 crc kubenswrapper[4861]: I0309 09:27:21.395396 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a9682bd-f0fc-47d6-9a66-e35fb0630f44-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"5a9682bd-f0fc-47d6-9a66-e35fb0630f44\") " pod="openstack/kube-state-metrics-0"
Mar 09 09:27:21 crc kubenswrapper[4861]: I0309 09:27:21.395515 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a9682bd-f0fc-47d6-9a66-e35fb0630f44-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"5a9682bd-f0fc-47d6-9a66-e35fb0630f44\") " pod="openstack/kube-state-metrics-0"
Mar 09 09:27:21 crc kubenswrapper[4861]: I0309 09:27:21.395592 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/5a9682bd-f0fc-47d6-9a66-e35fb0630f44-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"5a9682bd-f0fc-47d6-9a66-e35fb0630f44\") " pod="openstack/kube-state-metrics-0"
Mar 09 09:27:21 crc kubenswrapper[4861]: I0309 09:27:21.395637 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmld7\" (UniqueName: \"kubernetes.io/projected/5a9682bd-f0fc-47d6-9a66-e35fb0630f44-kube-api-access-gmld7\") pod \"kube-state-metrics-0\" (UID: \"5a9682bd-f0fc-47d6-9a66-e35fb0630f44\") " pod="openstack/kube-state-metrics-0"
Mar 09 09:27:21 crc kubenswrapper[4861]: I0309 09:27:21.497435 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a9682bd-f0fc-47d6-9a66-e35fb0630f44-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"5a9682bd-f0fc-47d6-9a66-e35fb0630f44\") " pod="openstack/kube-state-metrics-0"
Mar 09 09:27:21 crc kubenswrapper[4861]: I0309 09:27:21.497546 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a9682bd-f0fc-47d6-9a66-e35fb0630f44-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"5a9682bd-f0fc-47d6-9a66-e35fb0630f44\") " pod="openstack/kube-state-metrics-0"
Mar 09 09:27:21 crc kubenswrapper[4861]: I0309 09:27:21.497569 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/5a9682bd-f0fc-47d6-9a66-e35fb0630f44-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"5a9682bd-f0fc-47d6-9a66-e35fb0630f44\") " pod="openstack/kube-state-metrics-0"
Mar 09 09:27:21 crc kubenswrapper[4861]: I0309 09:27:21.498133 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gmld7\" (UniqueName: \"kubernetes.io/projected/5a9682bd-f0fc-47d6-9a66-e35fb0630f44-kube-api-access-gmld7\") pod \"kube-state-metrics-0\" (UID: \"5a9682bd-f0fc-47d6-9a66-e35fb0630f44\") " pod="openstack/kube-state-metrics-0"
Mar 09 crc kubenswrapper[4861]: I0309 09:27:21.513341 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/5a9682bd-f0fc-47d6-9a66-e35fb0630f44-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"5a9682bd-f0fc-47d6-9a66-e35fb0630f44\") " pod="openstack/kube-state-metrics-0"
Mar 09 09:27:21 crc kubenswrapper[4861]: I0309 09:27:21.513459 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5a9682bd-f0fc-47d6-9a66-e35fb0630f44-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"5a9682bd-f0fc-47d6-9a66-e35fb0630f44\") " pod="openstack/kube-state-metrics-0"
Mar 09 09:27:21 crc kubenswrapper[4861]: I0309 09:27:21.513996 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/5a9682bd-f0fc-47d6-9a66-e35fb0630f44-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"5a9682bd-f0fc-47d6-9a66-e35fb0630f44\") " pod="openstack/kube-state-metrics-0"
Mar 09 09:27:21 crc kubenswrapper[4861]: I0309 09:27:21.516345 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmld7\" (UniqueName: \"kubernetes.io/projected/5a9682bd-f0fc-47d6-9a66-e35fb0630f44-kube-api-access-gmld7\") pod \"kube-state-metrics-0\" (UID: \"5a9682bd-f0fc-47d6-9a66-e35fb0630f44\") " pod="openstack/kube-state-metrics-0"
Mar 09 09:27:21 crc kubenswrapper[4861]: I0309 09:27:21.619779 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Mar 09 09:27:21 crc kubenswrapper[4861]: I0309 09:27:21.669959 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f114ae0-31de-4f17-9bad-1bc0b895d006" path="/var/lib/kubelet/pods/8f114ae0-31de-4f17-9bad-1bc0b895d006/volumes"
Mar 09 09:27:21 crc kubenswrapper[4861]: I0309 09:27:21.687076 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 09 09:27:21 crc kubenswrapper[4861]: I0309 09:27:21.687526 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="edbe6993-7926-4a78-9227-b7c85af7ec66" containerName="proxy-httpd" containerID="cri-o://7f801cb67a90ced05cbc62eaecaa06f1ab05d667be9d41dd2eb4d9888ebafdea" gracePeriod=30
Mar 09 09:27:21 crc kubenswrapper[4861]: I0309 09:27:21.687608 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="edbe6993-7926-4a78-9227-b7c85af7ec66" containerName="sg-core" containerID="cri-o://e87f7e0e851f4944b2b3370f531360c735836531f36248b68167ac059e5e4495" gracePeriod=30
Mar 09 09:27:21 crc kubenswrapper[4861]: I0309 09:27:21.687735 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="edbe6993-7926-4a78-9227-b7c85af7ec66" containerName="ceilometer-central-agent" containerID="cri-o://9a6976fb3b15c1a33bf9f11c55266c90e65d8bc0b256d7961d1f06a0053a3d1f" gracePeriod=30
Mar 09 09:27:21 crc kubenswrapper[4861]: I0309 09:27:21.687741 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="edbe6993-7926-4a78-9227-b7c85af7ec66" containerName="ceilometer-notification-agent" containerID="cri-o://6360fe2180cbf6227ebf7a75a027983d6faae1823563d9854f335c4cb2aa39bc" gracePeriod=30
Mar 09 09:27:22 crc kubenswrapper[4861]: I0309 09:27:22.124060 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 09 09:27:22 crc kubenswrapper[4861]: W0309 09:27:22.128291 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5a9682bd_f0fc_47d6_9a66_e35fb0630f44.slice/crio-80f3494c1747ec567731a58e01d0f72060ed584c70b37f2978147ed0175018cc WatchSource:0}: Error finding container 80f3494c1747ec567731a58e01d0f72060ed584c70b37f2978147ed0175018cc: Status 404 returned error can't find the container with id 80f3494c1747ec567731a58e01d0f72060ed584c70b37f2978147ed0175018cc
Mar 09 09:27:22 crc kubenswrapper[4861]: I0309 09:27:22.225536 4861 generic.go:334] "Generic (PLEG): container finished" podID="edbe6993-7926-4a78-9227-b7c85af7ec66" containerID="7f801cb67a90ced05cbc62eaecaa06f1ab05d667be9d41dd2eb4d9888ebafdea" exitCode=0
Mar 09 09:27:22 crc kubenswrapper[4861]: I0309 09:27:22.225572 4861 generic.go:334] "Generic (PLEG): container finished" podID="edbe6993-7926-4a78-9227-b7c85af7ec66" containerID="e87f7e0e851f4944b2b3370f531360c735836531f36248b68167ac059e5e4495" exitCode=2
Mar 09 09:27:22 crc kubenswrapper[4861]: I0309 09:27:22.225581 4861 generic.go:334] "Generic (PLEG): container finished" podID="edbe6993-7926-4a78-9227-b7c85af7ec66" containerID="9a6976fb3b15c1a33bf9f11c55266c90e65d8bc0b256d7961d1f06a0053a3d1f" exitCode=0
Mar 09 09:27:22 crc kubenswrapper[4861]: I0309 09:27:22.225624 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"edbe6993-7926-4a78-9227-b7c85af7ec66","Type":"ContainerDied","Data":"7f801cb67a90ced05cbc62eaecaa06f1ab05d667be9d41dd2eb4d9888ebafdea"}
Mar 09 09:27:22 crc kubenswrapper[4861]: I0309 09:27:22.225670 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"edbe6993-7926-4a78-9227-b7c85af7ec66","Type":"ContainerDied","Data":"e87f7e0e851f4944b2b3370f531360c735836531f36248b68167ac059e5e4495"}
Mar 09 09:27:22 crc kubenswrapper[4861]: I0309 09:27:22.225690 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"edbe6993-7926-4a78-9227-b7c85af7ec66","Type":"ContainerDied","Data":"9a6976fb3b15c1a33bf9f11c55266c90e65d8bc0b256d7961d1f06a0053a3d1f"}
Mar 09 09:27:22 crc kubenswrapper[4861]: I0309 09:27:22.229273 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"5a9682bd-f0fc-47d6-9a66-e35fb0630f44","Type":"ContainerStarted","Data":"80f3494c1747ec567731a58e01d0f72060ed584c70b37f2978147ed0175018cc"}
Mar 09 09:27:23 crc kubenswrapper[4861]: I0309 09:27:23.238258 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"5a9682bd-f0fc-47d6-9a66-e35fb0630f44","Type":"ContainerStarted","Data":"2b0f2f2295e241a92e9e3842fb3eb0facaee442454c12d14159a76af54273ef9"}
Mar 09 09:27:23 crc kubenswrapper[4861]: I0309 09:27:23.238623 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0"
Mar 09 09:27:23 crc kubenswrapper[4861]: I0309 09:27:23.265267 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=1.8295176579999999 podStartE2EDuration="2.265247967s" podCreationTimestamp="2026-03-09 09:27:21 +0000 UTC" firstStartedPulling="2026-03-09 09:27:22.130769946 +0000 UTC m=+1285.215809347" lastFinishedPulling="2026-03-09 09:27:22.566500255 +0000 UTC m=+1285.651539656" observedRunningTime="2026-03-09 09:27:23.25390526 +0000 UTC m=+1286.338944661" watchObservedRunningTime="2026-03-09 09:27:23.265247967 +0000 UTC m=+1286.350287368"
Mar 09 09:27:23 crc kubenswrapper[4861]: I0309 09:27:23.478311 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Mar 09 09:27:23 crc kubenswrapper[4861]: I0309 09:27:23.478555 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Mar 09 09:27:24 crc kubenswrapper[4861]: I0309 09:27:24.438438 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Mar 09 09:27:24 crc kubenswrapper[4861]: I0309 09:27:24.470463 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Mar 09 09:27:24 crc kubenswrapper[4861]: I0309 09:27:24.489556 4861 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="91be51cc-9158-4ade-b36c-cb7bc65b006e" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.200:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 09 09:27:24 crc kubenswrapper[4861]: I0309 09:27:24.489573 4861 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="91be51cc-9158-4ade-b36c-cb7bc65b006e" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.200:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 09 09:27:25 crc kubenswrapper[4861]: I0309 09:27:25.296349 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Mar 09 09:27:25 crc kubenswrapper[4861]: I0309 09:27:25.518510 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0"
Mar 09 09:27:26 crc kubenswrapper[4861]: I0309 09:27:26.859433 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Mar 09 09:27:26 crc kubenswrapper[4861]: I0309 09:27:26.859484 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Mar 09 09:27:27 crc kubenswrapper[4861]: I0309 09:27:27.279290 4861 generic.go:334] "Generic (PLEG): container finished" podID="edbe6993-7926-4a78-9227-b7c85af7ec66" containerID="6360fe2180cbf6227ebf7a75a027983d6faae1823563d9854f335c4cb2aa39bc" exitCode=0
Mar 09 09:27:27 crc kubenswrapper[4861]: I0309 09:27:27.279335 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"edbe6993-7926-4a78-9227-b7c85af7ec66","Type":"ContainerDied","Data":"6360fe2180cbf6227ebf7a75a027983d6faae1823563d9854f335c4cb2aa39bc"}
Mar 09 09:27:27 crc kubenswrapper[4861]: I0309 09:27:27.279361 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"edbe6993-7926-4a78-9227-b7c85af7ec66","Type":"ContainerDied","Data":"98dd934ab934e64c0938dada01153f7f71df7b0fb60d9ac456eec54a7110323e"}
Mar 09 09:27:27 crc kubenswrapper[4861]: I0309 09:27:27.279391 4861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="98dd934ab934e64c0938dada01153f7f71df7b0fb60d9ac456eec54a7110323e"
Mar 09 09:27:27 crc kubenswrapper[4861]: I0309 09:27:27.316412 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 09 09:27:27 crc kubenswrapper[4861]: I0309 09:27:27.410843 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/edbe6993-7926-4a78-9227-b7c85af7ec66-config-data\") pod \"edbe6993-7926-4a78-9227-b7c85af7ec66\" (UID: \"edbe6993-7926-4a78-9227-b7c85af7ec66\") "
Mar 09 09:27:27 crc kubenswrapper[4861]: I0309 09:27:27.410897 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/edbe6993-7926-4a78-9227-b7c85af7ec66-sg-core-conf-yaml\") pod \"edbe6993-7926-4a78-9227-b7c85af7ec66\" (UID: \"edbe6993-7926-4a78-9227-b7c85af7ec66\") "
Mar 09 09:27:27 crc kubenswrapper[4861]: I0309 09:27:27.410966 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edbe6993-7926-4a78-9227-b7c85af7ec66-combined-ca-bundle\") pod \"edbe6993-7926-4a78-9227-b7c85af7ec66\" (UID: \"edbe6993-7926-4a78-9227-b7c85af7ec66\") "
Mar 09 09:27:27 crc kubenswrapper[4861]: I0309 09:27:27.411004 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/edbe6993-7926-4a78-9227-b7c85af7ec66-scripts\") pod \"edbe6993-7926-4a78-9227-b7c85af7ec66\" (UID: \"edbe6993-7926-4a78-9227-b7c85af7ec66\") "
Mar 09 09:27:27 crc kubenswrapper[4861]: I0309 09:27:27.411044 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ljwqp\" (UniqueName: \"kubernetes.io/projected/edbe6993-7926-4a78-9227-b7c85af7ec66-kube-api-access-ljwqp\") pod \"edbe6993-7926-4a78-9227-b7c85af7ec66\" (UID: \"edbe6993-7926-4a78-9227-b7c85af7ec66\") "
Mar 09 09:27:27 crc kubenswrapper[4861]: I0309 09:27:27.411220 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/edbe6993-7926-4a78-9227-b7c85af7ec66-log-httpd\") pod \"edbe6993-7926-4a78-9227-b7c85af7ec66\" (UID: \"edbe6993-7926-4a78-9227-b7c85af7ec66\") "
Mar 09 09:27:27 crc kubenswrapper[4861]: I0309 09:27:27.411246 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/edbe6993-7926-4a78-9227-b7c85af7ec66-run-httpd\") pod \"edbe6993-7926-4a78-9227-b7c85af7ec66\" (UID: \"edbe6993-7926-4a78-9227-b7c85af7ec66\") "
Mar 09 09:27:27 crc kubenswrapper[4861]: I0309 09:27:27.411928 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/edbe6993-7926-4a78-9227-b7c85af7ec66-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "edbe6993-7926-4a78-9227-b7c85af7ec66" (UID: "edbe6993-7926-4a78-9227-b7c85af7ec66"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 09:27:27 crc kubenswrapper[4861]: I0309 09:27:27.411956 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/edbe6993-7926-4a78-9227-b7c85af7ec66-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "edbe6993-7926-4a78-9227-b7c85af7ec66" (UID: "edbe6993-7926-4a78-9227-b7c85af7ec66"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 09:27:27 crc kubenswrapper[4861]: I0309 09:27:27.413783 4861 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/edbe6993-7926-4a78-9227-b7c85af7ec66-log-httpd\") on node \"crc\" DevicePath \"\""
Mar 09 09:27:27 crc kubenswrapper[4861]: I0309 09:27:27.413810 4861 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/edbe6993-7926-4a78-9227-b7c85af7ec66-run-httpd\") on node \"crc\" DevicePath \"\""
Mar 09 09:27:27 crc kubenswrapper[4861]: I0309 09:27:27.419535 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/edbe6993-7926-4a78-9227-b7c85af7ec66-scripts" (OuterVolumeSpecName: "scripts") pod "edbe6993-7926-4a78-9227-b7c85af7ec66" (UID: "edbe6993-7926-4a78-9227-b7c85af7ec66"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:27:27 crc kubenswrapper[4861]: I0309 09:27:27.420641 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/edbe6993-7926-4a78-9227-b7c85af7ec66-kube-api-access-ljwqp" (OuterVolumeSpecName: "kube-api-access-ljwqp") pod "edbe6993-7926-4a78-9227-b7c85af7ec66" (UID: "edbe6993-7926-4a78-9227-b7c85af7ec66"). InnerVolumeSpecName "kube-api-access-ljwqp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:27:27 crc kubenswrapper[4861]: I0309 09:27:27.450520 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/edbe6993-7926-4a78-9227-b7c85af7ec66-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "edbe6993-7926-4a78-9227-b7c85af7ec66" (UID: "edbe6993-7926-4a78-9227-b7c85af7ec66"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:27:27 crc kubenswrapper[4861]: I0309 09:27:27.516122 4861 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/edbe6993-7926-4a78-9227-b7c85af7ec66-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Mar 09 09:27:27 crc kubenswrapper[4861]: I0309 09:27:27.516152 4861 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/edbe6993-7926-4a78-9227-b7c85af7ec66-scripts\") on node \"crc\" DevicePath \"\""
Mar 09 09:27:27 crc kubenswrapper[4861]: I0309 09:27:27.516162 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ljwqp\" (UniqueName: \"kubernetes.io/projected/edbe6993-7926-4a78-9227-b7c85af7ec66-kube-api-access-ljwqp\") on node \"crc\" DevicePath \"\""
Mar 09 09:27:27 crc kubenswrapper[4861]: I0309 09:27:27.538182 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/edbe6993-7926-4a78-9227-b7c85af7ec66-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "edbe6993-7926-4a78-9227-b7c85af7ec66" (UID: "edbe6993-7926-4a78-9227-b7c85af7ec66"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:27:27 crc kubenswrapper[4861]: I0309 09:27:27.557675 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/edbe6993-7926-4a78-9227-b7c85af7ec66-config-data" (OuterVolumeSpecName: "config-data") pod "edbe6993-7926-4a78-9227-b7c85af7ec66" (UID: "edbe6993-7926-4a78-9227-b7c85af7ec66"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:27:27 crc kubenswrapper[4861]: I0309 09:27:27.618072 4861 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/edbe6993-7926-4a78-9227-b7c85af7ec66-config-data\") on node \"crc\" DevicePath \"\""
Mar 09 09:27:27 crc kubenswrapper[4861]: I0309 09:27:27.618107 4861 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edbe6993-7926-4a78-9227-b7c85af7ec66-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 09 09:27:27 crc kubenswrapper[4861]: I0309 09:27:27.942835 4861 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="20184bd6-aec8-44d4-85de-4dddbc61ce6d" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.203:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 09 09:27:27 crc kubenswrapper[4861]: I0309 09:27:27.943544 4861 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="20184bd6-aec8-44d4-85de-4dddbc61ce6d" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.203:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 09 09:27:28 crc kubenswrapper[4861]: I0309 09:27:28.289828 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 09 09:27:28 crc kubenswrapper[4861]: I0309 09:27:28.311673 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 09 09:27:28 crc kubenswrapper[4861]: I0309 09:27:28.321753 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Mar 09 09:27:28 crc kubenswrapper[4861]: I0309 09:27:28.340402 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Mar 09 09:27:28 crc kubenswrapper[4861]: E0309 09:27:28.341090 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edbe6993-7926-4a78-9227-b7c85af7ec66" containerName="ceilometer-notification-agent"
Mar 09 09:27:28 crc kubenswrapper[4861]: I0309 09:27:28.341221 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="edbe6993-7926-4a78-9227-b7c85af7ec66" containerName="ceilometer-notification-agent"
Mar 09 09:27:28 crc kubenswrapper[4861]: E0309 09:27:28.341317 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edbe6993-7926-4a78-9227-b7c85af7ec66" containerName="ceilometer-central-agent"
Mar 09 09:27:28 crc kubenswrapper[4861]: I0309 09:27:28.341401 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="edbe6993-7926-4a78-9227-b7c85af7ec66" containerName="ceilometer-central-agent"
Mar 09 09:27:28 crc kubenswrapper[4861]: E0309 09:27:28.341499 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edbe6993-7926-4a78-9227-b7c85af7ec66" containerName="sg-core"
Mar 09 09:27:28 crc kubenswrapper[4861]: I0309 09:27:28.341575 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="edbe6993-7926-4a78-9227-b7c85af7ec66" containerName="sg-core"
Mar 09 09:27:28 crc kubenswrapper[4861]: E0309 09:27:28.341655 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edbe6993-7926-4a78-9227-b7c85af7ec66" containerName="proxy-httpd"
Mar 09 09:27:28 crc kubenswrapper[4861]: I0309 09:27:28.341735 4861 
state_mem.go:107] "Deleted CPUSet assignment" podUID="edbe6993-7926-4a78-9227-b7c85af7ec66" containerName="proxy-httpd" Mar 09 09:27:28 crc kubenswrapper[4861]: I0309 09:27:28.342000 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="edbe6993-7926-4a78-9227-b7c85af7ec66" containerName="sg-core" Mar 09 09:27:28 crc kubenswrapper[4861]: I0309 09:27:28.342090 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="edbe6993-7926-4a78-9227-b7c85af7ec66" containerName="proxy-httpd" Mar 09 09:27:28 crc kubenswrapper[4861]: I0309 09:27:28.342152 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="edbe6993-7926-4a78-9227-b7c85af7ec66" containerName="ceilometer-central-agent" Mar 09 09:27:28 crc kubenswrapper[4861]: I0309 09:27:28.342218 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="edbe6993-7926-4a78-9227-b7c85af7ec66" containerName="ceilometer-notification-agent" Mar 09 09:27:28 crc kubenswrapper[4861]: I0309 09:27:28.344335 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 09 09:27:28 crc kubenswrapper[4861]: I0309 09:27:28.346195 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 09 09:27:28 crc kubenswrapper[4861]: I0309 09:27:28.347076 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 09 09:27:28 crc kubenswrapper[4861]: I0309 09:27:28.351196 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 09 09:27:28 crc kubenswrapper[4861]: I0309 09:27:28.365640 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 09 09:27:28 crc kubenswrapper[4861]: I0309 09:27:28.432693 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c333f575-244d-4a42-858d-1ec027498b09-log-httpd\") pod \"ceilometer-0\" (UID: \"c333f575-244d-4a42-858d-1ec027498b09\") " pod="openstack/ceilometer-0" Mar 09 09:27:28 crc kubenswrapper[4861]: I0309 09:27:28.432734 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c333f575-244d-4a42-858d-1ec027498b09-run-httpd\") pod \"ceilometer-0\" (UID: \"c333f575-244d-4a42-858d-1ec027498b09\") " pod="openstack/ceilometer-0" Mar 09 09:27:28 crc kubenswrapper[4861]: I0309 09:27:28.432759 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c333f575-244d-4a42-858d-1ec027498b09-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c333f575-244d-4a42-858d-1ec027498b09\") " pod="openstack/ceilometer-0" Mar 09 09:27:28 crc kubenswrapper[4861]: I0309 09:27:28.432812 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-fxqkq\" (UniqueName: \"kubernetes.io/projected/c333f575-244d-4a42-858d-1ec027498b09-kube-api-access-fxqkq\") pod \"ceilometer-0\" (UID: \"c333f575-244d-4a42-858d-1ec027498b09\") " pod="openstack/ceilometer-0" Mar 09 09:27:28 crc kubenswrapper[4861]: I0309 09:27:28.432844 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c333f575-244d-4a42-858d-1ec027498b09-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c333f575-244d-4a42-858d-1ec027498b09\") " pod="openstack/ceilometer-0" Mar 09 09:27:28 crc kubenswrapper[4861]: I0309 09:27:28.432872 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c333f575-244d-4a42-858d-1ec027498b09-config-data\") pod \"ceilometer-0\" (UID: \"c333f575-244d-4a42-858d-1ec027498b09\") " pod="openstack/ceilometer-0" Mar 09 09:27:28 crc kubenswrapper[4861]: I0309 09:27:28.432920 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c333f575-244d-4a42-858d-1ec027498b09-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"c333f575-244d-4a42-858d-1ec027498b09\") " pod="openstack/ceilometer-0" Mar 09 09:27:28 crc kubenswrapper[4861]: I0309 09:27:28.432969 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c333f575-244d-4a42-858d-1ec027498b09-scripts\") pod \"ceilometer-0\" (UID: \"c333f575-244d-4a42-858d-1ec027498b09\") " pod="openstack/ceilometer-0" Mar 09 09:27:28 crc kubenswrapper[4861]: I0309 09:27:28.534615 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c333f575-244d-4a42-858d-1ec027498b09-scripts\") pod \"ceilometer-0\" (UID: 
\"c333f575-244d-4a42-858d-1ec027498b09\") " pod="openstack/ceilometer-0" Mar 09 09:27:28 crc kubenswrapper[4861]: I0309 09:27:28.534785 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c333f575-244d-4a42-858d-1ec027498b09-log-httpd\") pod \"ceilometer-0\" (UID: \"c333f575-244d-4a42-858d-1ec027498b09\") " pod="openstack/ceilometer-0" Mar 09 09:27:28 crc kubenswrapper[4861]: I0309 09:27:28.534840 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c333f575-244d-4a42-858d-1ec027498b09-run-httpd\") pod \"ceilometer-0\" (UID: \"c333f575-244d-4a42-858d-1ec027498b09\") " pod="openstack/ceilometer-0" Mar 09 09:27:28 crc kubenswrapper[4861]: I0309 09:27:28.534857 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c333f575-244d-4a42-858d-1ec027498b09-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c333f575-244d-4a42-858d-1ec027498b09\") " pod="openstack/ceilometer-0" Mar 09 09:27:28 crc kubenswrapper[4861]: I0309 09:27:28.534905 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxqkq\" (UniqueName: \"kubernetes.io/projected/c333f575-244d-4a42-858d-1ec027498b09-kube-api-access-fxqkq\") pod \"ceilometer-0\" (UID: \"c333f575-244d-4a42-858d-1ec027498b09\") " pod="openstack/ceilometer-0" Mar 09 09:27:28 crc kubenswrapper[4861]: I0309 09:27:28.534931 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c333f575-244d-4a42-858d-1ec027498b09-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c333f575-244d-4a42-858d-1ec027498b09\") " pod="openstack/ceilometer-0" Mar 09 09:27:28 crc kubenswrapper[4861]: I0309 09:27:28.534953 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c333f575-244d-4a42-858d-1ec027498b09-config-data\") pod \"ceilometer-0\" (UID: \"c333f575-244d-4a42-858d-1ec027498b09\") " pod="openstack/ceilometer-0" Mar 09 09:27:28 crc kubenswrapper[4861]: I0309 09:27:28.534993 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c333f575-244d-4a42-858d-1ec027498b09-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"c333f575-244d-4a42-858d-1ec027498b09\") " pod="openstack/ceilometer-0" Mar 09 09:27:28 crc kubenswrapper[4861]: I0309 09:27:28.535791 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c333f575-244d-4a42-858d-1ec027498b09-log-httpd\") pod \"ceilometer-0\" (UID: \"c333f575-244d-4a42-858d-1ec027498b09\") " pod="openstack/ceilometer-0" Mar 09 09:27:28 crc kubenswrapper[4861]: I0309 09:27:28.536086 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c333f575-244d-4a42-858d-1ec027498b09-run-httpd\") pod \"ceilometer-0\" (UID: \"c333f575-244d-4a42-858d-1ec027498b09\") " pod="openstack/ceilometer-0" Mar 09 09:27:28 crc kubenswrapper[4861]: I0309 09:27:28.541553 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c333f575-244d-4a42-858d-1ec027498b09-scripts\") pod \"ceilometer-0\" (UID: \"c333f575-244d-4a42-858d-1ec027498b09\") " pod="openstack/ceilometer-0" Mar 09 09:27:28 crc kubenswrapper[4861]: I0309 09:27:28.542571 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c333f575-244d-4a42-858d-1ec027498b09-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c333f575-244d-4a42-858d-1ec027498b09\") " pod="openstack/ceilometer-0" Mar 09 09:27:28 crc kubenswrapper[4861]: I0309 
09:27:28.543941 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c333f575-244d-4a42-858d-1ec027498b09-config-data\") pod \"ceilometer-0\" (UID: \"c333f575-244d-4a42-858d-1ec027498b09\") " pod="openstack/ceilometer-0" Mar 09 09:27:28 crc kubenswrapper[4861]: I0309 09:27:28.553578 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxqkq\" (UniqueName: \"kubernetes.io/projected/c333f575-244d-4a42-858d-1ec027498b09-kube-api-access-fxqkq\") pod \"ceilometer-0\" (UID: \"c333f575-244d-4a42-858d-1ec027498b09\") " pod="openstack/ceilometer-0" Mar 09 09:27:28 crc kubenswrapper[4861]: I0309 09:27:28.554633 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c333f575-244d-4a42-858d-1ec027498b09-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"c333f575-244d-4a42-858d-1ec027498b09\") " pod="openstack/ceilometer-0" Mar 09 09:27:28 crc kubenswrapper[4861]: I0309 09:27:28.554923 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c333f575-244d-4a42-858d-1ec027498b09-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c333f575-244d-4a42-858d-1ec027498b09\") " pod="openstack/ceilometer-0" Mar 09 09:27:28 crc kubenswrapper[4861]: I0309 09:27:28.662698 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 09 09:27:28 crc kubenswrapper[4861]: I0309 09:27:28.970513 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 09 09:27:29 crc kubenswrapper[4861]: I0309 09:27:29.299865 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c333f575-244d-4a42-858d-1ec027498b09","Type":"ContainerStarted","Data":"e2cd8f5aeca5dac08198a0d415e3d5bf6cb5826b110f45944662518b14162db6"} Mar 09 09:27:29 crc kubenswrapper[4861]: I0309 09:27:29.669893 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="edbe6993-7926-4a78-9227-b7c85af7ec66" path="/var/lib/kubelet/pods/edbe6993-7926-4a78-9227-b7c85af7ec66/volumes" Mar 09 09:27:30 crc kubenswrapper[4861]: I0309 09:27:30.318172 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c333f575-244d-4a42-858d-1ec027498b09","Type":"ContainerStarted","Data":"df06d12fe08e20a1c28e93aba07d4d7bf85f76f36b603c6e7f21340c6840f683"} Mar 09 09:27:31 crc kubenswrapper[4861]: I0309 09:27:31.334861 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c333f575-244d-4a42-858d-1ec027498b09","Type":"ContainerStarted","Data":"7e61571dbef933d3c8205144f8e6ececaf787f303d735834b150cb1c94b91d0b"} Mar 09 09:27:31 crc kubenswrapper[4861]: I0309 09:27:31.334922 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c333f575-244d-4a42-858d-1ec027498b09","Type":"ContainerStarted","Data":"1ab06074cdb3cbac611d220841fc24b92e7401d909421e0b20951a2433d09dc0"} Mar 09 09:27:31 crc kubenswrapper[4861]: I0309 09:27:31.636662 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 09 09:27:33 crc kubenswrapper[4861]: I0309 09:27:33.354835 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"c333f575-244d-4a42-858d-1ec027498b09","Type":"ContainerStarted","Data":"52f19ab41428d2be595489f2ea3d1f7b990ef050b76b6835762a8a1fb8b491df"} Mar 09 09:27:33 crc kubenswrapper[4861]: I0309 09:27:33.355332 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 09 09:27:33 crc kubenswrapper[4861]: I0309 09:27:33.382791 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.672337738 podStartE2EDuration="5.382766554s" podCreationTimestamp="2026-03-09 09:27:28 +0000 UTC" firstStartedPulling="2026-03-09 09:27:28.983547144 +0000 UTC m=+1292.068586545" lastFinishedPulling="2026-03-09 09:27:32.69397596 +0000 UTC m=+1295.779015361" observedRunningTime="2026-03-09 09:27:33.37461934 +0000 UTC m=+1296.459658761" watchObservedRunningTime="2026-03-09 09:27:33.382766554 +0000 UTC m=+1296.467805955" Mar 09 09:27:33 crc kubenswrapper[4861]: I0309 09:27:33.486868 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 09 09:27:33 crc kubenswrapper[4861]: I0309 09:27:33.486993 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 09 09:27:33 crc kubenswrapper[4861]: I0309 09:27:33.496474 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 09 09:27:33 crc kubenswrapper[4861]: I0309 09:27:33.501409 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 09 09:27:36 crc kubenswrapper[4861]: I0309 09:27:36.303483 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 09 09:27:36 crc kubenswrapper[4861]: I0309 09:27:36.379337 4861 generic.go:334] "Generic (PLEG): container finished" podID="05ca7d41-820a-463e-8129-a58a6b79542b" containerID="0e80459d4f54dac677247cecdba18abeef2c46f728648bfb355574affecdf80f" exitCode=137 Mar 09 09:27:36 crc kubenswrapper[4861]: I0309 09:27:36.379387 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"05ca7d41-820a-463e-8129-a58a6b79542b","Type":"ContainerDied","Data":"0e80459d4f54dac677247cecdba18abeef2c46f728648bfb355574affecdf80f"} Mar 09 09:27:36 crc kubenswrapper[4861]: I0309 09:27:36.379432 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"05ca7d41-820a-463e-8129-a58a6b79542b","Type":"ContainerDied","Data":"22731e9f85fc097dc86411cfa8a0169ee7ad7eeec061b49ac36778786da238c2"} Mar 09 09:27:36 crc kubenswrapper[4861]: I0309 09:27:36.379432 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 09 09:27:36 crc kubenswrapper[4861]: I0309 09:27:36.379453 4861 scope.go:117] "RemoveContainer" containerID="0e80459d4f54dac677247cecdba18abeef2c46f728648bfb355574affecdf80f" Mar 09 09:27:36 crc kubenswrapper[4861]: I0309 09:27:36.407156 4861 scope.go:117] "RemoveContainer" containerID="0e80459d4f54dac677247cecdba18abeef2c46f728648bfb355574affecdf80f" Mar 09 09:27:36 crc kubenswrapper[4861]: E0309 09:27:36.407582 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e80459d4f54dac677247cecdba18abeef2c46f728648bfb355574affecdf80f\": container with ID starting with 0e80459d4f54dac677247cecdba18abeef2c46f728648bfb355574affecdf80f not found: ID does not exist" containerID="0e80459d4f54dac677247cecdba18abeef2c46f728648bfb355574affecdf80f" Mar 09 09:27:36 crc kubenswrapper[4861]: I0309 09:27:36.407636 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e80459d4f54dac677247cecdba18abeef2c46f728648bfb355574affecdf80f"} err="failed to get container status \"0e80459d4f54dac677247cecdba18abeef2c46f728648bfb355574affecdf80f\": rpc error: code = NotFound desc = could not find container \"0e80459d4f54dac677247cecdba18abeef2c46f728648bfb355574affecdf80f\": container with ID starting with 0e80459d4f54dac677247cecdba18abeef2c46f728648bfb355574affecdf80f not found: ID does not exist" Mar 09 09:27:36 crc kubenswrapper[4861]: I0309 09:27:36.410565 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05ca7d41-820a-463e-8129-a58a6b79542b-combined-ca-bundle\") pod \"05ca7d41-820a-463e-8129-a58a6b79542b\" (UID: \"05ca7d41-820a-463e-8129-a58a6b79542b\") " Mar 09 09:27:36 crc kubenswrapper[4861]: I0309 09:27:36.410648 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/05ca7d41-820a-463e-8129-a58a6b79542b-config-data\") pod \"05ca7d41-820a-463e-8129-a58a6b79542b\" (UID: \"05ca7d41-820a-463e-8129-a58a6b79542b\") " Mar 09 09:27:36 crc kubenswrapper[4861]: I0309 09:27:36.410693 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d24ts\" (UniqueName: \"kubernetes.io/projected/05ca7d41-820a-463e-8129-a58a6b79542b-kube-api-access-d24ts\") pod \"05ca7d41-820a-463e-8129-a58a6b79542b\" (UID: \"05ca7d41-820a-463e-8129-a58a6b79542b\") " Mar 09 09:27:36 crc kubenswrapper[4861]: I0309 09:27:36.417489 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05ca7d41-820a-463e-8129-a58a6b79542b-kube-api-access-d24ts" (OuterVolumeSpecName: "kube-api-access-d24ts") pod "05ca7d41-820a-463e-8129-a58a6b79542b" (UID: "05ca7d41-820a-463e-8129-a58a6b79542b"). InnerVolumeSpecName "kube-api-access-d24ts". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:27:36 crc kubenswrapper[4861]: I0309 09:27:36.440656 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05ca7d41-820a-463e-8129-a58a6b79542b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "05ca7d41-820a-463e-8129-a58a6b79542b" (UID: "05ca7d41-820a-463e-8129-a58a6b79542b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:27:36 crc kubenswrapper[4861]: I0309 09:27:36.441917 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05ca7d41-820a-463e-8129-a58a6b79542b-config-data" (OuterVolumeSpecName: "config-data") pod "05ca7d41-820a-463e-8129-a58a6b79542b" (UID: "05ca7d41-820a-463e-8129-a58a6b79542b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:27:36 crc kubenswrapper[4861]: I0309 09:27:36.513321 4861 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05ca7d41-820a-463e-8129-a58a6b79542b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:36 crc kubenswrapper[4861]: I0309 09:27:36.513361 4861 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05ca7d41-820a-463e-8129-a58a6b79542b-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:36 crc kubenswrapper[4861]: I0309 09:27:36.513396 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d24ts\" (UniqueName: \"kubernetes.io/projected/05ca7d41-820a-463e-8129-a58a6b79542b-kube-api-access-d24ts\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:36 crc kubenswrapper[4861]: I0309 09:27:36.743528 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 09 09:27:36 crc kubenswrapper[4861]: I0309 09:27:36.758474 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 09 09:27:36 crc kubenswrapper[4861]: I0309 09:27:36.770213 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 09 09:27:36 crc kubenswrapper[4861]: E0309 09:27:36.770874 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05ca7d41-820a-463e-8129-a58a6b79542b" containerName="nova-cell1-novncproxy-novncproxy" Mar 09 09:27:36 crc kubenswrapper[4861]: I0309 09:27:36.770901 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="05ca7d41-820a-463e-8129-a58a6b79542b" containerName="nova-cell1-novncproxy-novncproxy" Mar 09 09:27:36 crc kubenswrapper[4861]: I0309 09:27:36.771251 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="05ca7d41-820a-463e-8129-a58a6b79542b" containerName="nova-cell1-novncproxy-novncproxy" Mar 09 
09:27:36 crc kubenswrapper[4861]: I0309 09:27:36.772132 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 09 09:27:36 crc kubenswrapper[4861]: I0309 09:27:36.782790 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Mar 09 09:27:36 crc kubenswrapper[4861]: I0309 09:27:36.782988 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 09 09:27:36 crc kubenswrapper[4861]: I0309 09:27:36.783516 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Mar 09 09:27:36 crc kubenswrapper[4861]: I0309 09:27:36.806679 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 09 09:27:36 crc kubenswrapper[4861]: I0309 09:27:36.820775 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tg27q\" (UniqueName: \"kubernetes.io/projected/dd2e86d2-700f-4fd8-b89b-84bd5a09069d-kube-api-access-tg27q\") pod \"nova-cell1-novncproxy-0\" (UID: \"dd2e86d2-700f-4fd8-b89b-84bd5a09069d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 09 09:27:36 crc kubenswrapper[4861]: I0309 09:27:36.821006 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd2e86d2-700f-4fd8-b89b-84bd5a09069d-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"dd2e86d2-700f-4fd8-b89b-84bd5a09069d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 09 09:27:36 crc kubenswrapper[4861]: I0309 09:27:36.821093 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd2e86d2-700f-4fd8-b89b-84bd5a09069d-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: 
\"dd2e86d2-700f-4fd8-b89b-84bd5a09069d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 09 09:27:36 crc kubenswrapper[4861]: I0309 09:27:36.821220 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd2e86d2-700f-4fd8-b89b-84bd5a09069d-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"dd2e86d2-700f-4fd8-b89b-84bd5a09069d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 09 09:27:36 crc kubenswrapper[4861]: I0309 09:27:36.821324 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd2e86d2-700f-4fd8-b89b-84bd5a09069d-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"dd2e86d2-700f-4fd8-b89b-84bd5a09069d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 09 09:27:36 crc kubenswrapper[4861]: I0309 09:27:36.864001 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 09 09:27:36 crc kubenswrapper[4861]: I0309 09:27:36.865085 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 09 09:27:36 crc kubenswrapper[4861]: I0309 09:27:36.865961 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 09 09:27:36 crc kubenswrapper[4861]: I0309 09:27:36.871338 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 09 09:27:36 crc kubenswrapper[4861]: I0309 09:27:36.923557 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd2e86d2-700f-4fd8-b89b-84bd5a09069d-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"dd2e86d2-700f-4fd8-b89b-84bd5a09069d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 09 09:27:36 crc kubenswrapper[4861]: I0309 
09:27:36.923619 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd2e86d2-700f-4fd8-b89b-84bd5a09069d-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"dd2e86d2-700f-4fd8-b89b-84bd5a09069d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 09 09:27:36 crc kubenswrapper[4861]: I0309 09:27:36.923679 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd2e86d2-700f-4fd8-b89b-84bd5a09069d-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"dd2e86d2-700f-4fd8-b89b-84bd5a09069d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 09 09:27:36 crc kubenswrapper[4861]: I0309 09:27:36.923724 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd2e86d2-700f-4fd8-b89b-84bd5a09069d-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"dd2e86d2-700f-4fd8-b89b-84bd5a09069d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 09 09:27:36 crc kubenswrapper[4861]: I0309 09:27:36.923832 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tg27q\" (UniqueName: \"kubernetes.io/projected/dd2e86d2-700f-4fd8-b89b-84bd5a09069d-kube-api-access-tg27q\") pod \"nova-cell1-novncproxy-0\" (UID: \"dd2e86d2-700f-4fd8-b89b-84bd5a09069d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 09 09:27:36 crc kubenswrapper[4861]: I0309 09:27:36.928234 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd2e86d2-700f-4fd8-b89b-84bd5a09069d-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"dd2e86d2-700f-4fd8-b89b-84bd5a09069d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 09 09:27:36 crc kubenswrapper[4861]: I0309 09:27:36.928880 4861 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd2e86d2-700f-4fd8-b89b-84bd5a09069d-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"dd2e86d2-700f-4fd8-b89b-84bd5a09069d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 09 09:27:36 crc kubenswrapper[4861]: I0309 09:27:36.931698 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd2e86d2-700f-4fd8-b89b-84bd5a09069d-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"dd2e86d2-700f-4fd8-b89b-84bd5a09069d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 09 09:27:36 crc kubenswrapper[4861]: I0309 09:27:36.941130 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd2e86d2-700f-4fd8-b89b-84bd5a09069d-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"dd2e86d2-700f-4fd8-b89b-84bd5a09069d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 09 09:27:36 crc kubenswrapper[4861]: I0309 09:27:36.957024 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tg27q\" (UniqueName: \"kubernetes.io/projected/dd2e86d2-700f-4fd8-b89b-84bd5a09069d-kube-api-access-tg27q\") pod \"nova-cell1-novncproxy-0\" (UID: \"dd2e86d2-700f-4fd8-b89b-84bd5a09069d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 09 09:27:37 crc kubenswrapper[4861]: I0309 09:27:37.097480 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 09 09:27:37 crc kubenswrapper[4861]: I0309 09:27:37.391319 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 09 09:27:37 crc kubenswrapper[4861]: I0309 09:27:37.394485 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 09 09:27:37 crc kubenswrapper[4861]: I0309 09:27:37.600032 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7749c44969-c57rg"] Mar 09 09:27:37 crc kubenswrapper[4861]: I0309 09:27:37.620110 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7749c44969-c57rg" Mar 09 09:27:37 crc kubenswrapper[4861]: I0309 09:27:37.628469 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 09 09:27:37 crc kubenswrapper[4861]: I0309 09:27:37.647345 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7749c44969-c57rg"] Mar 09 09:27:37 crc kubenswrapper[4861]: I0309 09:27:37.713813 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05ca7d41-820a-463e-8129-a58a6b79542b" path="/var/lib/kubelet/pods/05ca7d41-820a-463e-8129-a58a6b79542b/volumes" Mar 09 09:27:37 crc kubenswrapper[4861]: I0309 09:27:37.757180 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfhxw\" (UniqueName: \"kubernetes.io/projected/18b03e1a-4375-4f3e-ab0d-17b8498c0146-kube-api-access-lfhxw\") pod \"dnsmasq-dns-7749c44969-c57rg\" (UID: \"18b03e1a-4375-4f3e-ab0d-17b8498c0146\") " pod="openstack/dnsmasq-dns-7749c44969-c57rg" Mar 09 09:27:37 crc kubenswrapper[4861]: I0309 09:27:37.757236 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18b03e1a-4375-4f3e-ab0d-17b8498c0146-config\") pod 
\"dnsmasq-dns-7749c44969-c57rg\" (UID: \"18b03e1a-4375-4f3e-ab0d-17b8498c0146\") " pod="openstack/dnsmasq-dns-7749c44969-c57rg" Mar 09 09:27:37 crc kubenswrapper[4861]: I0309 09:27:37.757427 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/18b03e1a-4375-4f3e-ab0d-17b8498c0146-dns-svc\") pod \"dnsmasq-dns-7749c44969-c57rg\" (UID: \"18b03e1a-4375-4f3e-ab0d-17b8498c0146\") " pod="openstack/dnsmasq-dns-7749c44969-c57rg" Mar 09 09:27:37 crc kubenswrapper[4861]: I0309 09:27:37.757489 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/18b03e1a-4375-4f3e-ab0d-17b8498c0146-ovsdbserver-nb\") pod \"dnsmasq-dns-7749c44969-c57rg\" (UID: \"18b03e1a-4375-4f3e-ab0d-17b8498c0146\") " pod="openstack/dnsmasq-dns-7749c44969-c57rg" Mar 09 09:27:37 crc kubenswrapper[4861]: I0309 09:27:37.757592 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/18b03e1a-4375-4f3e-ab0d-17b8498c0146-ovsdbserver-sb\") pod \"dnsmasq-dns-7749c44969-c57rg\" (UID: \"18b03e1a-4375-4f3e-ab0d-17b8498c0146\") " pod="openstack/dnsmasq-dns-7749c44969-c57rg" Mar 09 09:27:37 crc kubenswrapper[4861]: I0309 09:27:37.757806 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/18b03e1a-4375-4f3e-ab0d-17b8498c0146-dns-swift-storage-0\") pod \"dnsmasq-dns-7749c44969-c57rg\" (UID: \"18b03e1a-4375-4f3e-ab0d-17b8498c0146\") " pod="openstack/dnsmasq-dns-7749c44969-c57rg" Mar 09 09:27:37 crc kubenswrapper[4861]: I0309 09:27:37.859583 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/18b03e1a-4375-4f3e-ab0d-17b8498c0146-ovsdbserver-nb\") pod \"dnsmasq-dns-7749c44969-c57rg\" (UID: \"18b03e1a-4375-4f3e-ab0d-17b8498c0146\") " pod="openstack/dnsmasq-dns-7749c44969-c57rg" Mar 09 09:27:37 crc kubenswrapper[4861]: I0309 09:27:37.859653 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/18b03e1a-4375-4f3e-ab0d-17b8498c0146-ovsdbserver-sb\") pod \"dnsmasq-dns-7749c44969-c57rg\" (UID: \"18b03e1a-4375-4f3e-ab0d-17b8498c0146\") " pod="openstack/dnsmasq-dns-7749c44969-c57rg" Mar 09 09:27:37 crc kubenswrapper[4861]: I0309 09:27:37.859727 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/18b03e1a-4375-4f3e-ab0d-17b8498c0146-dns-swift-storage-0\") pod \"dnsmasq-dns-7749c44969-c57rg\" (UID: \"18b03e1a-4375-4f3e-ab0d-17b8498c0146\") " pod="openstack/dnsmasq-dns-7749c44969-c57rg" Mar 09 09:27:37 crc kubenswrapper[4861]: I0309 09:27:37.859808 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfhxw\" (UniqueName: \"kubernetes.io/projected/18b03e1a-4375-4f3e-ab0d-17b8498c0146-kube-api-access-lfhxw\") pod \"dnsmasq-dns-7749c44969-c57rg\" (UID: \"18b03e1a-4375-4f3e-ab0d-17b8498c0146\") " pod="openstack/dnsmasq-dns-7749c44969-c57rg" Mar 09 09:27:37 crc kubenswrapper[4861]: I0309 09:27:37.859830 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18b03e1a-4375-4f3e-ab0d-17b8498c0146-config\") pod \"dnsmasq-dns-7749c44969-c57rg\" (UID: \"18b03e1a-4375-4f3e-ab0d-17b8498c0146\") " pod="openstack/dnsmasq-dns-7749c44969-c57rg" Mar 09 09:27:37 crc kubenswrapper[4861]: I0309 09:27:37.859859 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/18b03e1a-4375-4f3e-ab0d-17b8498c0146-dns-svc\") pod \"dnsmasq-dns-7749c44969-c57rg\" (UID: \"18b03e1a-4375-4f3e-ab0d-17b8498c0146\") " pod="openstack/dnsmasq-dns-7749c44969-c57rg" Mar 09 09:27:37 crc kubenswrapper[4861]: I0309 09:27:37.860659 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/18b03e1a-4375-4f3e-ab0d-17b8498c0146-ovsdbserver-nb\") pod \"dnsmasq-dns-7749c44969-c57rg\" (UID: \"18b03e1a-4375-4f3e-ab0d-17b8498c0146\") " pod="openstack/dnsmasq-dns-7749c44969-c57rg" Mar 09 09:27:37 crc kubenswrapper[4861]: I0309 09:27:37.860661 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/18b03e1a-4375-4f3e-ab0d-17b8498c0146-dns-swift-storage-0\") pod \"dnsmasq-dns-7749c44969-c57rg\" (UID: \"18b03e1a-4375-4f3e-ab0d-17b8498c0146\") " pod="openstack/dnsmasq-dns-7749c44969-c57rg" Mar 09 09:27:37 crc kubenswrapper[4861]: I0309 09:27:37.860811 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/18b03e1a-4375-4f3e-ab0d-17b8498c0146-ovsdbserver-sb\") pod \"dnsmasq-dns-7749c44969-c57rg\" (UID: \"18b03e1a-4375-4f3e-ab0d-17b8498c0146\") " pod="openstack/dnsmasq-dns-7749c44969-c57rg" Mar 09 09:27:37 crc kubenswrapper[4861]: I0309 09:27:37.860960 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/18b03e1a-4375-4f3e-ab0d-17b8498c0146-dns-svc\") pod \"dnsmasq-dns-7749c44969-c57rg\" (UID: \"18b03e1a-4375-4f3e-ab0d-17b8498c0146\") " pod="openstack/dnsmasq-dns-7749c44969-c57rg" Mar 09 09:27:37 crc kubenswrapper[4861]: I0309 09:27:37.861188 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18b03e1a-4375-4f3e-ab0d-17b8498c0146-config\") pod \"dnsmasq-dns-7749c44969-c57rg\" 
(UID: \"18b03e1a-4375-4f3e-ab0d-17b8498c0146\") " pod="openstack/dnsmasq-dns-7749c44969-c57rg" Mar 09 09:27:37 crc kubenswrapper[4861]: I0309 09:27:37.881224 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfhxw\" (UniqueName: \"kubernetes.io/projected/18b03e1a-4375-4f3e-ab0d-17b8498c0146-kube-api-access-lfhxw\") pod \"dnsmasq-dns-7749c44969-c57rg\" (UID: \"18b03e1a-4375-4f3e-ab0d-17b8498c0146\") " pod="openstack/dnsmasq-dns-7749c44969-c57rg" Mar 09 09:27:38 crc kubenswrapper[4861]: I0309 09:27:38.018170 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7749c44969-c57rg" Mar 09 09:27:38 crc kubenswrapper[4861]: I0309 09:27:38.399988 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"dd2e86d2-700f-4fd8-b89b-84bd5a09069d","Type":"ContainerStarted","Data":"af6983cf89727896cb30c5a213a2fed66be3887868d29747c7869580c0eedd79"} Mar 09 09:27:38 crc kubenswrapper[4861]: I0309 09:27:38.400287 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"dd2e86d2-700f-4fd8-b89b-84bd5a09069d","Type":"ContainerStarted","Data":"20bb624794b2139092dd86562dd26d0927b25ed84a3658898f1fe1c1a34f33ad"} Mar 09 09:27:38 crc kubenswrapper[4861]: I0309 09:27:38.541019 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.540994888 podStartE2EDuration="2.540994888s" podCreationTimestamp="2026-03-09 09:27:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:27:38.436481463 +0000 UTC m=+1301.521520884" watchObservedRunningTime="2026-03-09 09:27:38.540994888 +0000 UTC m=+1301.626034289" Mar 09 09:27:38 crc kubenswrapper[4861]: I0309 09:27:38.543595 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/dnsmasq-dns-7749c44969-c57rg"] Mar 09 09:27:38 crc kubenswrapper[4861]: W0309 09:27:38.577709 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod18b03e1a_4375_4f3e_ab0d_17b8498c0146.slice/crio-da607d03e9a0b78997a0f1f883081a617b3123991eda13760c2fc678683fb87c WatchSource:0}: Error finding container da607d03e9a0b78997a0f1f883081a617b3123991eda13760c2fc678683fb87c: Status 404 returned error can't find the container with id da607d03e9a0b78997a0f1f883081a617b3123991eda13760c2fc678683fb87c Mar 09 09:27:39 crc kubenswrapper[4861]: I0309 09:27:39.420443 4861 generic.go:334] "Generic (PLEG): container finished" podID="18b03e1a-4375-4f3e-ab0d-17b8498c0146" containerID="e140a5deae214509e351f908f68fc1d89ef3f4da51f96fa3c7b2a1a6d756b530" exitCode=0 Mar 09 09:27:39 crc kubenswrapper[4861]: I0309 09:27:39.420510 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7749c44969-c57rg" event={"ID":"18b03e1a-4375-4f3e-ab0d-17b8498c0146","Type":"ContainerDied","Data":"e140a5deae214509e351f908f68fc1d89ef3f4da51f96fa3c7b2a1a6d756b530"} Mar 09 09:27:39 crc kubenswrapper[4861]: I0309 09:27:39.421035 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7749c44969-c57rg" event={"ID":"18b03e1a-4375-4f3e-ab0d-17b8498c0146","Type":"ContainerStarted","Data":"da607d03e9a0b78997a0f1f883081a617b3123991eda13760c2fc678683fb87c"} Mar 09 09:27:40 crc kubenswrapper[4861]: I0309 09:27:40.178578 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 09 09:27:40 crc kubenswrapper[4861]: I0309 09:27:40.432110 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7749c44969-c57rg" event={"ID":"18b03e1a-4375-4f3e-ab0d-17b8498c0146","Type":"ContainerStarted","Data":"fc2a8891cd531d8b1ecf94dd1508de6fdce3bec1719a0a4fdeb47b8730d816f4"} Mar 09 09:27:40 crc kubenswrapper[4861]: I0309 09:27:40.432257 
4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="20184bd6-aec8-44d4-85de-4dddbc61ce6d" containerName="nova-api-log" containerID="cri-o://dc164e711d9c74be59a234b63af3a7cb3e2515ccf152e71fd97cb37c82c5514f" gracePeriod=30 Mar 09 09:27:40 crc kubenswrapper[4861]: I0309 09:27:40.432328 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="20184bd6-aec8-44d4-85de-4dddbc61ce6d" containerName="nova-api-api" containerID="cri-o://6d0d12d972c0219d9d0f4fb2d357ec09c2b975ff09c50e9eddb5fd6e4b0f69af" gracePeriod=30 Mar 09 09:27:40 crc kubenswrapper[4861]: I0309 09:27:40.480087 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 09 09:27:40 crc kubenswrapper[4861]: I0309 09:27:40.480429 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c333f575-244d-4a42-858d-1ec027498b09" containerName="ceilometer-central-agent" containerID="cri-o://df06d12fe08e20a1c28e93aba07d4d7bf85f76f36b603c6e7f21340c6840f683" gracePeriod=30 Mar 09 09:27:40 crc kubenswrapper[4861]: I0309 09:27:40.480496 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c333f575-244d-4a42-858d-1ec027498b09" containerName="sg-core" containerID="cri-o://7e61571dbef933d3c8205144f8e6ececaf787f303d735834b150cb1c94b91d0b" gracePeriod=30 Mar 09 09:27:40 crc kubenswrapper[4861]: I0309 09:27:40.480563 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c333f575-244d-4a42-858d-1ec027498b09" containerName="ceilometer-notification-agent" containerID="cri-o://1ab06074cdb3cbac611d220841fc24b92e7401d909421e0b20951a2433d09dc0" gracePeriod=30 Mar 09 09:27:40 crc kubenswrapper[4861]: I0309 09:27:40.480557 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="c333f575-244d-4a42-858d-1ec027498b09" containerName="proxy-httpd" containerID="cri-o://52f19ab41428d2be595489f2ea3d1f7b990ef050b76b6835762a8a1fb8b491df" gracePeriod=30 Mar 09 09:27:40 crc kubenswrapper[4861]: I0309 09:27:40.483674 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7749c44969-c57rg" podStartSLOduration=3.483656535 podStartE2EDuration="3.483656535s" podCreationTimestamp="2026-03-09 09:27:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:27:40.47758512 +0000 UTC m=+1303.562624521" watchObservedRunningTime="2026-03-09 09:27:40.483656535 +0000 UTC m=+1303.568695936" Mar 09 09:27:41 crc kubenswrapper[4861]: I0309 09:27:41.416068 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 09 09:27:41 crc kubenswrapper[4861]: I0309 09:27:41.442963 4861 generic.go:334] "Generic (PLEG): container finished" podID="20184bd6-aec8-44d4-85de-4dddbc61ce6d" containerID="dc164e711d9c74be59a234b63af3a7cb3e2515ccf152e71fd97cb37c82c5514f" exitCode=143 Mar 09 09:27:41 crc kubenswrapper[4861]: I0309 09:27:41.443079 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"20184bd6-aec8-44d4-85de-4dddbc61ce6d","Type":"ContainerDied","Data":"dc164e711d9c74be59a234b63af3a7cb3e2515ccf152e71fd97cb37c82c5514f"} Mar 09 09:27:41 crc kubenswrapper[4861]: I0309 09:27:41.446324 4861 generic.go:334] "Generic (PLEG): container finished" podID="c333f575-244d-4a42-858d-1ec027498b09" containerID="52f19ab41428d2be595489f2ea3d1f7b990ef050b76b6835762a8a1fb8b491df" exitCode=0 Mar 09 09:27:41 crc kubenswrapper[4861]: I0309 09:27:41.446359 4861 generic.go:334] "Generic (PLEG): container finished" podID="c333f575-244d-4a42-858d-1ec027498b09" containerID="7e61571dbef933d3c8205144f8e6ececaf787f303d735834b150cb1c94b91d0b" exitCode=2 Mar 09 09:27:41 
crc kubenswrapper[4861]: I0309 09:27:41.446387 4861 generic.go:334] "Generic (PLEG): container finished" podID="c333f575-244d-4a42-858d-1ec027498b09" containerID="1ab06074cdb3cbac611d220841fc24b92e7401d909421e0b20951a2433d09dc0" exitCode=0 Mar 09 09:27:41 crc kubenswrapper[4861]: I0309 09:27:41.446398 4861 generic.go:334] "Generic (PLEG): container finished" podID="c333f575-244d-4a42-858d-1ec027498b09" containerID="df06d12fe08e20a1c28e93aba07d4d7bf85f76f36b603c6e7f21340c6840f683" exitCode=0 Mar 09 09:27:41 crc kubenswrapper[4861]: I0309 09:27:41.446770 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 09 09:27:41 crc kubenswrapper[4861]: I0309 09:27:41.447238 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c333f575-244d-4a42-858d-1ec027498b09","Type":"ContainerDied","Data":"52f19ab41428d2be595489f2ea3d1f7b990ef050b76b6835762a8a1fb8b491df"} Mar 09 09:27:41 crc kubenswrapper[4861]: I0309 09:27:41.447269 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c333f575-244d-4a42-858d-1ec027498b09","Type":"ContainerDied","Data":"7e61571dbef933d3c8205144f8e6ececaf787f303d735834b150cb1c94b91d0b"} Mar 09 09:27:41 crc kubenswrapper[4861]: I0309 09:27:41.447281 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c333f575-244d-4a42-858d-1ec027498b09","Type":"ContainerDied","Data":"1ab06074cdb3cbac611d220841fc24b92e7401d909421e0b20951a2433d09dc0"} Mar 09 09:27:41 crc kubenswrapper[4861]: I0309 09:27:41.447293 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c333f575-244d-4a42-858d-1ec027498b09","Type":"ContainerDied","Data":"df06d12fe08e20a1c28e93aba07d4d7bf85f76f36b603c6e7f21340c6840f683"} Mar 09 09:27:41 crc kubenswrapper[4861]: I0309 09:27:41.447303 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"c333f575-244d-4a42-858d-1ec027498b09","Type":"ContainerDied","Data":"e2cd8f5aeca5dac08198a0d415e3d5bf6cb5826b110f45944662518b14162db6"} Mar 09 09:27:41 crc kubenswrapper[4861]: I0309 09:27:41.447319 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7749c44969-c57rg" Mar 09 09:27:41 crc kubenswrapper[4861]: I0309 09:27:41.447335 4861 scope.go:117] "RemoveContainer" containerID="52f19ab41428d2be595489f2ea3d1f7b990ef050b76b6835762a8a1fb8b491df" Mar 09 09:27:41 crc kubenswrapper[4861]: I0309 09:27:41.480587 4861 scope.go:117] "RemoveContainer" containerID="7e61571dbef933d3c8205144f8e6ececaf787f303d735834b150cb1c94b91d0b" Mar 09 09:27:41 crc kubenswrapper[4861]: I0309 09:27:41.525579 4861 scope.go:117] "RemoveContainer" containerID="1ab06074cdb3cbac611d220841fc24b92e7401d909421e0b20951a2433d09dc0" Mar 09 09:27:41 crc kubenswrapper[4861]: I0309 09:27:41.553971 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c333f575-244d-4a42-858d-1ec027498b09-combined-ca-bundle\") pod \"c333f575-244d-4a42-858d-1ec027498b09\" (UID: \"c333f575-244d-4a42-858d-1ec027498b09\") " Mar 09 09:27:41 crc kubenswrapper[4861]: I0309 09:27:41.554045 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fxqkq\" (UniqueName: \"kubernetes.io/projected/c333f575-244d-4a42-858d-1ec027498b09-kube-api-access-fxqkq\") pod \"c333f575-244d-4a42-858d-1ec027498b09\" (UID: \"c333f575-244d-4a42-858d-1ec027498b09\") " Mar 09 09:27:41 crc kubenswrapper[4861]: I0309 09:27:41.554076 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c333f575-244d-4a42-858d-1ec027498b09-log-httpd\") pod \"c333f575-244d-4a42-858d-1ec027498b09\" (UID: \"c333f575-244d-4a42-858d-1ec027498b09\") " Mar 09 09:27:41 crc 
kubenswrapper[4861]: I0309 09:27:41.554096 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c333f575-244d-4a42-858d-1ec027498b09-sg-core-conf-yaml\") pod \"c333f575-244d-4a42-858d-1ec027498b09\" (UID: \"c333f575-244d-4a42-858d-1ec027498b09\") " Mar 09 09:27:41 crc kubenswrapper[4861]: I0309 09:27:41.554139 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c333f575-244d-4a42-858d-1ec027498b09-config-data\") pod \"c333f575-244d-4a42-858d-1ec027498b09\" (UID: \"c333f575-244d-4a42-858d-1ec027498b09\") " Mar 09 09:27:41 crc kubenswrapper[4861]: I0309 09:27:41.554182 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c333f575-244d-4a42-858d-1ec027498b09-ceilometer-tls-certs\") pod \"c333f575-244d-4a42-858d-1ec027498b09\" (UID: \"c333f575-244d-4a42-858d-1ec027498b09\") " Mar 09 09:27:41 crc kubenswrapper[4861]: I0309 09:27:41.554286 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c333f575-244d-4a42-858d-1ec027498b09-scripts\") pod \"c333f575-244d-4a42-858d-1ec027498b09\" (UID: \"c333f575-244d-4a42-858d-1ec027498b09\") " Mar 09 09:27:41 crc kubenswrapper[4861]: I0309 09:27:41.554395 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c333f575-244d-4a42-858d-1ec027498b09-run-httpd\") pod \"c333f575-244d-4a42-858d-1ec027498b09\" (UID: \"c333f575-244d-4a42-858d-1ec027498b09\") " Mar 09 09:27:41 crc kubenswrapper[4861]: I0309 09:27:41.555940 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c333f575-244d-4a42-858d-1ec027498b09-log-httpd" (OuterVolumeSpecName: "log-httpd") pod 
"c333f575-244d-4a42-858d-1ec027498b09" (UID: "c333f575-244d-4a42-858d-1ec027498b09"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:27:41 crc kubenswrapper[4861]: I0309 09:27:41.566144 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c333f575-244d-4a42-858d-1ec027498b09-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "c333f575-244d-4a42-858d-1ec027498b09" (UID: "c333f575-244d-4a42-858d-1ec027498b09"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:27:41 crc kubenswrapper[4861]: I0309 09:27:41.587856 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c333f575-244d-4a42-858d-1ec027498b09-kube-api-access-fxqkq" (OuterVolumeSpecName: "kube-api-access-fxqkq") pod "c333f575-244d-4a42-858d-1ec027498b09" (UID: "c333f575-244d-4a42-858d-1ec027498b09"). InnerVolumeSpecName "kube-api-access-fxqkq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:27:41 crc kubenswrapper[4861]: I0309 09:27:41.598541 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c333f575-244d-4a42-858d-1ec027498b09-scripts" (OuterVolumeSpecName: "scripts") pod "c333f575-244d-4a42-858d-1ec027498b09" (UID: "c333f575-244d-4a42-858d-1ec027498b09"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:27:41 crc kubenswrapper[4861]: I0309 09:27:41.639547 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c333f575-244d-4a42-858d-1ec027498b09-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "c333f575-244d-4a42-858d-1ec027498b09" (UID: "c333f575-244d-4a42-858d-1ec027498b09"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:27:41 crc kubenswrapper[4861]: I0309 09:27:41.660161 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fxqkq\" (UniqueName: \"kubernetes.io/projected/c333f575-244d-4a42-858d-1ec027498b09-kube-api-access-fxqkq\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:41 crc kubenswrapper[4861]: I0309 09:27:41.660278 4861 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c333f575-244d-4a42-858d-1ec027498b09-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:41 crc kubenswrapper[4861]: I0309 09:27:41.660295 4861 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c333f575-244d-4a42-858d-1ec027498b09-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:41 crc kubenswrapper[4861]: I0309 09:27:41.660307 4861 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c333f575-244d-4a42-858d-1ec027498b09-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:41 crc kubenswrapper[4861]: I0309 09:27:41.660320 4861 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c333f575-244d-4a42-858d-1ec027498b09-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:41 crc kubenswrapper[4861]: I0309 09:27:41.726242 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c333f575-244d-4a42-858d-1ec027498b09-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "c333f575-244d-4a42-858d-1ec027498b09" (UID: "c333f575-244d-4a42-858d-1ec027498b09"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:27:41 crc kubenswrapper[4861]: I0309 09:27:41.740532 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c333f575-244d-4a42-858d-1ec027498b09-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c333f575-244d-4a42-858d-1ec027498b09" (UID: "c333f575-244d-4a42-858d-1ec027498b09"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:27:41 crc kubenswrapper[4861]: I0309 09:27:41.762895 4861 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c333f575-244d-4a42-858d-1ec027498b09-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:41 crc kubenswrapper[4861]: I0309 09:27:41.762924 4861 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/c333f575-244d-4a42-858d-1ec027498b09-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:41 crc kubenswrapper[4861]: I0309 09:27:41.782497 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c333f575-244d-4a42-858d-1ec027498b09-config-data" (OuterVolumeSpecName: "config-data") pod "c333f575-244d-4a42-858d-1ec027498b09" (UID: "c333f575-244d-4a42-858d-1ec027498b09"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:27:41 crc kubenswrapper[4861]: I0309 09:27:41.841275 4861 scope.go:117] "RemoveContainer" containerID="df06d12fe08e20a1c28e93aba07d4d7bf85f76f36b603c6e7f21340c6840f683" Mar 09 09:27:41 crc kubenswrapper[4861]: I0309 09:27:41.862818 4861 scope.go:117] "RemoveContainer" containerID="52f19ab41428d2be595489f2ea3d1f7b990ef050b76b6835762a8a1fb8b491df" Mar 09 09:27:41 crc kubenswrapper[4861]: E0309 09:27:41.863298 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52f19ab41428d2be595489f2ea3d1f7b990ef050b76b6835762a8a1fb8b491df\": container with ID starting with 52f19ab41428d2be595489f2ea3d1f7b990ef050b76b6835762a8a1fb8b491df not found: ID does not exist" containerID="52f19ab41428d2be595489f2ea3d1f7b990ef050b76b6835762a8a1fb8b491df" Mar 09 09:27:41 crc kubenswrapper[4861]: I0309 09:27:41.863340 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52f19ab41428d2be595489f2ea3d1f7b990ef050b76b6835762a8a1fb8b491df"} err="failed to get container status \"52f19ab41428d2be595489f2ea3d1f7b990ef050b76b6835762a8a1fb8b491df\": rpc error: code = NotFound desc = could not find container \"52f19ab41428d2be595489f2ea3d1f7b990ef050b76b6835762a8a1fb8b491df\": container with ID starting with 52f19ab41428d2be595489f2ea3d1f7b990ef050b76b6835762a8a1fb8b491df not found: ID does not exist" Mar 09 09:27:41 crc kubenswrapper[4861]: I0309 09:27:41.863378 4861 scope.go:117] "RemoveContainer" containerID="7e61571dbef933d3c8205144f8e6ececaf787f303d735834b150cb1c94b91d0b" Mar 09 09:27:41 crc kubenswrapper[4861]: E0309 09:27:41.863698 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e61571dbef933d3c8205144f8e6ececaf787f303d735834b150cb1c94b91d0b\": container with ID starting with 
7e61571dbef933d3c8205144f8e6ececaf787f303d735834b150cb1c94b91d0b not found: ID does not exist" containerID="7e61571dbef933d3c8205144f8e6ececaf787f303d735834b150cb1c94b91d0b" Mar 09 09:27:41 crc kubenswrapper[4861]: I0309 09:27:41.863720 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e61571dbef933d3c8205144f8e6ececaf787f303d735834b150cb1c94b91d0b"} err="failed to get container status \"7e61571dbef933d3c8205144f8e6ececaf787f303d735834b150cb1c94b91d0b\": rpc error: code = NotFound desc = could not find container \"7e61571dbef933d3c8205144f8e6ececaf787f303d735834b150cb1c94b91d0b\": container with ID starting with 7e61571dbef933d3c8205144f8e6ececaf787f303d735834b150cb1c94b91d0b not found: ID does not exist" Mar 09 09:27:41 crc kubenswrapper[4861]: I0309 09:27:41.863734 4861 scope.go:117] "RemoveContainer" containerID="1ab06074cdb3cbac611d220841fc24b92e7401d909421e0b20951a2433d09dc0" Mar 09 09:27:41 crc kubenswrapper[4861]: E0309 09:27:41.864069 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ab06074cdb3cbac611d220841fc24b92e7401d909421e0b20951a2433d09dc0\": container with ID starting with 1ab06074cdb3cbac611d220841fc24b92e7401d909421e0b20951a2433d09dc0 not found: ID does not exist" containerID="1ab06074cdb3cbac611d220841fc24b92e7401d909421e0b20951a2433d09dc0" Mar 09 09:27:41 crc kubenswrapper[4861]: I0309 09:27:41.864092 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ab06074cdb3cbac611d220841fc24b92e7401d909421e0b20951a2433d09dc0"} err="failed to get container status \"1ab06074cdb3cbac611d220841fc24b92e7401d909421e0b20951a2433d09dc0\": rpc error: code = NotFound desc = could not find container \"1ab06074cdb3cbac611d220841fc24b92e7401d909421e0b20951a2433d09dc0\": container with ID starting with 1ab06074cdb3cbac611d220841fc24b92e7401d909421e0b20951a2433d09dc0 not found: ID does not 
exist" Mar 09 09:27:41 crc kubenswrapper[4861]: I0309 09:27:41.864104 4861 scope.go:117] "RemoveContainer" containerID="df06d12fe08e20a1c28e93aba07d4d7bf85f76f36b603c6e7f21340c6840f683" Mar 09 09:27:41 crc kubenswrapper[4861]: I0309 09:27:41.864143 4861 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c333f575-244d-4a42-858d-1ec027498b09-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:41 crc kubenswrapper[4861]: E0309 09:27:41.864427 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df06d12fe08e20a1c28e93aba07d4d7bf85f76f36b603c6e7f21340c6840f683\": container with ID starting with df06d12fe08e20a1c28e93aba07d4d7bf85f76f36b603c6e7f21340c6840f683 not found: ID does not exist" containerID="df06d12fe08e20a1c28e93aba07d4d7bf85f76f36b603c6e7f21340c6840f683" Mar 09 09:27:41 crc kubenswrapper[4861]: I0309 09:27:41.864460 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df06d12fe08e20a1c28e93aba07d4d7bf85f76f36b603c6e7f21340c6840f683"} err="failed to get container status \"df06d12fe08e20a1c28e93aba07d4d7bf85f76f36b603c6e7f21340c6840f683\": rpc error: code = NotFound desc = could not find container \"df06d12fe08e20a1c28e93aba07d4d7bf85f76f36b603c6e7f21340c6840f683\": container with ID starting with df06d12fe08e20a1c28e93aba07d4d7bf85f76f36b603c6e7f21340c6840f683 not found: ID does not exist" Mar 09 09:27:41 crc kubenswrapper[4861]: I0309 09:27:41.864482 4861 scope.go:117] "RemoveContainer" containerID="52f19ab41428d2be595489f2ea3d1f7b990ef050b76b6835762a8a1fb8b491df" Mar 09 09:27:41 crc kubenswrapper[4861]: I0309 09:27:41.864753 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52f19ab41428d2be595489f2ea3d1f7b990ef050b76b6835762a8a1fb8b491df"} err="failed to get container status 
\"52f19ab41428d2be595489f2ea3d1f7b990ef050b76b6835762a8a1fb8b491df\": rpc error: code = NotFound desc = could not find container \"52f19ab41428d2be595489f2ea3d1f7b990ef050b76b6835762a8a1fb8b491df\": container with ID starting with 52f19ab41428d2be595489f2ea3d1f7b990ef050b76b6835762a8a1fb8b491df not found: ID does not exist" Mar 09 09:27:41 crc kubenswrapper[4861]: I0309 09:27:41.864773 4861 scope.go:117] "RemoveContainer" containerID="7e61571dbef933d3c8205144f8e6ececaf787f303d735834b150cb1c94b91d0b" Mar 09 09:27:41 crc kubenswrapper[4861]: I0309 09:27:41.865017 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e61571dbef933d3c8205144f8e6ececaf787f303d735834b150cb1c94b91d0b"} err="failed to get container status \"7e61571dbef933d3c8205144f8e6ececaf787f303d735834b150cb1c94b91d0b\": rpc error: code = NotFound desc = could not find container \"7e61571dbef933d3c8205144f8e6ececaf787f303d735834b150cb1c94b91d0b\": container with ID starting with 7e61571dbef933d3c8205144f8e6ececaf787f303d735834b150cb1c94b91d0b not found: ID does not exist" Mar 09 09:27:41 crc kubenswrapper[4861]: I0309 09:27:41.865036 4861 scope.go:117] "RemoveContainer" containerID="1ab06074cdb3cbac611d220841fc24b92e7401d909421e0b20951a2433d09dc0" Mar 09 09:27:41 crc kubenswrapper[4861]: I0309 09:27:41.865348 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ab06074cdb3cbac611d220841fc24b92e7401d909421e0b20951a2433d09dc0"} err="failed to get container status \"1ab06074cdb3cbac611d220841fc24b92e7401d909421e0b20951a2433d09dc0\": rpc error: code = NotFound desc = could not find container \"1ab06074cdb3cbac611d220841fc24b92e7401d909421e0b20951a2433d09dc0\": container with ID starting with 1ab06074cdb3cbac611d220841fc24b92e7401d909421e0b20951a2433d09dc0 not found: ID does not exist" Mar 09 09:27:41 crc kubenswrapper[4861]: I0309 09:27:41.865425 4861 scope.go:117] "RemoveContainer" 
containerID="df06d12fe08e20a1c28e93aba07d4d7bf85f76f36b603c6e7f21340c6840f683" Mar 09 09:27:41 crc kubenswrapper[4861]: I0309 09:27:41.865680 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df06d12fe08e20a1c28e93aba07d4d7bf85f76f36b603c6e7f21340c6840f683"} err="failed to get container status \"df06d12fe08e20a1c28e93aba07d4d7bf85f76f36b603c6e7f21340c6840f683\": rpc error: code = NotFound desc = could not find container \"df06d12fe08e20a1c28e93aba07d4d7bf85f76f36b603c6e7f21340c6840f683\": container with ID starting with df06d12fe08e20a1c28e93aba07d4d7bf85f76f36b603c6e7f21340c6840f683 not found: ID does not exist" Mar 09 09:27:41 crc kubenswrapper[4861]: I0309 09:27:41.865701 4861 scope.go:117] "RemoveContainer" containerID="52f19ab41428d2be595489f2ea3d1f7b990ef050b76b6835762a8a1fb8b491df" Mar 09 09:27:41 crc kubenswrapper[4861]: I0309 09:27:41.865940 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52f19ab41428d2be595489f2ea3d1f7b990ef050b76b6835762a8a1fb8b491df"} err="failed to get container status \"52f19ab41428d2be595489f2ea3d1f7b990ef050b76b6835762a8a1fb8b491df\": rpc error: code = NotFound desc = could not find container \"52f19ab41428d2be595489f2ea3d1f7b990ef050b76b6835762a8a1fb8b491df\": container with ID starting with 52f19ab41428d2be595489f2ea3d1f7b990ef050b76b6835762a8a1fb8b491df not found: ID does not exist" Mar 09 09:27:41 crc kubenswrapper[4861]: I0309 09:27:41.865966 4861 scope.go:117] "RemoveContainer" containerID="7e61571dbef933d3c8205144f8e6ececaf787f303d735834b150cb1c94b91d0b" Mar 09 09:27:41 crc kubenswrapper[4861]: I0309 09:27:41.866235 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e61571dbef933d3c8205144f8e6ececaf787f303d735834b150cb1c94b91d0b"} err="failed to get container status \"7e61571dbef933d3c8205144f8e6ececaf787f303d735834b150cb1c94b91d0b\": rpc error: code = NotFound desc = could 
not find container \"7e61571dbef933d3c8205144f8e6ececaf787f303d735834b150cb1c94b91d0b\": container with ID starting with 7e61571dbef933d3c8205144f8e6ececaf787f303d735834b150cb1c94b91d0b not found: ID does not exist" Mar 09 09:27:41 crc kubenswrapper[4861]: I0309 09:27:41.866257 4861 scope.go:117] "RemoveContainer" containerID="1ab06074cdb3cbac611d220841fc24b92e7401d909421e0b20951a2433d09dc0" Mar 09 09:27:41 crc kubenswrapper[4861]: I0309 09:27:41.866533 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ab06074cdb3cbac611d220841fc24b92e7401d909421e0b20951a2433d09dc0"} err="failed to get container status \"1ab06074cdb3cbac611d220841fc24b92e7401d909421e0b20951a2433d09dc0\": rpc error: code = NotFound desc = could not find container \"1ab06074cdb3cbac611d220841fc24b92e7401d909421e0b20951a2433d09dc0\": container with ID starting with 1ab06074cdb3cbac611d220841fc24b92e7401d909421e0b20951a2433d09dc0 not found: ID does not exist" Mar 09 09:27:41 crc kubenswrapper[4861]: I0309 09:27:41.866559 4861 scope.go:117] "RemoveContainer" containerID="df06d12fe08e20a1c28e93aba07d4d7bf85f76f36b603c6e7f21340c6840f683" Mar 09 09:27:41 crc kubenswrapper[4861]: I0309 09:27:41.866772 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df06d12fe08e20a1c28e93aba07d4d7bf85f76f36b603c6e7f21340c6840f683"} err="failed to get container status \"df06d12fe08e20a1c28e93aba07d4d7bf85f76f36b603c6e7f21340c6840f683\": rpc error: code = NotFound desc = could not find container \"df06d12fe08e20a1c28e93aba07d4d7bf85f76f36b603c6e7f21340c6840f683\": container with ID starting with df06d12fe08e20a1c28e93aba07d4d7bf85f76f36b603c6e7f21340c6840f683 not found: ID does not exist" Mar 09 09:27:41 crc kubenswrapper[4861]: I0309 09:27:41.866791 4861 scope.go:117] "RemoveContainer" containerID="52f19ab41428d2be595489f2ea3d1f7b990ef050b76b6835762a8a1fb8b491df" Mar 09 09:27:41 crc kubenswrapper[4861]: I0309 
09:27:41.866977 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52f19ab41428d2be595489f2ea3d1f7b990ef050b76b6835762a8a1fb8b491df"} err="failed to get container status \"52f19ab41428d2be595489f2ea3d1f7b990ef050b76b6835762a8a1fb8b491df\": rpc error: code = NotFound desc = could not find container \"52f19ab41428d2be595489f2ea3d1f7b990ef050b76b6835762a8a1fb8b491df\": container with ID starting with 52f19ab41428d2be595489f2ea3d1f7b990ef050b76b6835762a8a1fb8b491df not found: ID does not exist" Mar 09 09:27:41 crc kubenswrapper[4861]: I0309 09:27:41.866995 4861 scope.go:117] "RemoveContainer" containerID="7e61571dbef933d3c8205144f8e6ececaf787f303d735834b150cb1c94b91d0b" Mar 09 09:27:41 crc kubenswrapper[4861]: I0309 09:27:41.867210 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e61571dbef933d3c8205144f8e6ececaf787f303d735834b150cb1c94b91d0b"} err="failed to get container status \"7e61571dbef933d3c8205144f8e6ececaf787f303d735834b150cb1c94b91d0b\": rpc error: code = NotFound desc = could not find container \"7e61571dbef933d3c8205144f8e6ececaf787f303d735834b150cb1c94b91d0b\": container with ID starting with 7e61571dbef933d3c8205144f8e6ececaf787f303d735834b150cb1c94b91d0b not found: ID does not exist" Mar 09 09:27:41 crc kubenswrapper[4861]: I0309 09:27:41.867231 4861 scope.go:117] "RemoveContainer" containerID="1ab06074cdb3cbac611d220841fc24b92e7401d909421e0b20951a2433d09dc0" Mar 09 09:27:41 crc kubenswrapper[4861]: I0309 09:27:41.867706 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ab06074cdb3cbac611d220841fc24b92e7401d909421e0b20951a2433d09dc0"} err="failed to get container status \"1ab06074cdb3cbac611d220841fc24b92e7401d909421e0b20951a2433d09dc0\": rpc error: code = NotFound desc = could not find container \"1ab06074cdb3cbac611d220841fc24b92e7401d909421e0b20951a2433d09dc0\": container with ID starting with 
1ab06074cdb3cbac611d220841fc24b92e7401d909421e0b20951a2433d09dc0 not found: ID does not exist" Mar 09 09:27:41 crc kubenswrapper[4861]: I0309 09:27:41.867729 4861 scope.go:117] "RemoveContainer" containerID="df06d12fe08e20a1c28e93aba07d4d7bf85f76f36b603c6e7f21340c6840f683" Mar 09 09:27:41 crc kubenswrapper[4861]: I0309 09:27:41.867964 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df06d12fe08e20a1c28e93aba07d4d7bf85f76f36b603c6e7f21340c6840f683"} err="failed to get container status \"df06d12fe08e20a1c28e93aba07d4d7bf85f76f36b603c6e7f21340c6840f683\": rpc error: code = NotFound desc = could not find container \"df06d12fe08e20a1c28e93aba07d4d7bf85f76f36b603c6e7f21340c6840f683\": container with ID starting with df06d12fe08e20a1c28e93aba07d4d7bf85f76f36b603c6e7f21340c6840f683 not found: ID does not exist" Mar 09 09:27:42 crc kubenswrapper[4861]: I0309 09:27:42.081443 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 09 09:27:42 crc kubenswrapper[4861]: I0309 09:27:42.094663 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 09 09:27:42 crc kubenswrapper[4861]: I0309 09:27:42.099798 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 09 09:27:42 crc kubenswrapper[4861]: I0309 09:27:42.107952 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 09 09:27:42 crc kubenswrapper[4861]: E0309 09:27:42.108435 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c333f575-244d-4a42-858d-1ec027498b09" containerName="proxy-httpd" Mar 09 09:27:42 crc kubenswrapper[4861]: I0309 09:27:42.108458 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="c333f575-244d-4a42-858d-1ec027498b09" containerName="proxy-httpd" Mar 09 09:27:42 crc kubenswrapper[4861]: E0309 09:27:42.108485 4861 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="c333f575-244d-4a42-858d-1ec027498b09" containerName="sg-core" Mar 09 09:27:42 crc kubenswrapper[4861]: I0309 09:27:42.108493 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="c333f575-244d-4a42-858d-1ec027498b09" containerName="sg-core" Mar 09 09:27:42 crc kubenswrapper[4861]: E0309 09:27:42.108513 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c333f575-244d-4a42-858d-1ec027498b09" containerName="ceilometer-central-agent" Mar 09 09:27:42 crc kubenswrapper[4861]: I0309 09:27:42.108521 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="c333f575-244d-4a42-858d-1ec027498b09" containerName="ceilometer-central-agent" Mar 09 09:27:42 crc kubenswrapper[4861]: E0309 09:27:42.108531 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c333f575-244d-4a42-858d-1ec027498b09" containerName="ceilometer-notification-agent" Mar 09 09:27:42 crc kubenswrapper[4861]: I0309 09:27:42.108538 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="c333f575-244d-4a42-858d-1ec027498b09" containerName="ceilometer-notification-agent" Mar 09 09:27:42 crc kubenswrapper[4861]: I0309 09:27:42.108738 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="c333f575-244d-4a42-858d-1ec027498b09" containerName="ceilometer-notification-agent" Mar 09 09:27:42 crc kubenswrapper[4861]: I0309 09:27:42.108757 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="c333f575-244d-4a42-858d-1ec027498b09" containerName="proxy-httpd" Mar 09 09:27:42 crc kubenswrapper[4861]: I0309 09:27:42.108772 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="c333f575-244d-4a42-858d-1ec027498b09" containerName="sg-core" Mar 09 09:27:42 crc kubenswrapper[4861]: I0309 09:27:42.108785 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="c333f575-244d-4a42-858d-1ec027498b09" containerName="ceilometer-central-agent" Mar 09 09:27:42 crc kubenswrapper[4861]: I0309 09:27:42.110424 4861 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 09 09:27:42 crc kubenswrapper[4861]: I0309 09:27:42.112248 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 09 09:27:42 crc kubenswrapper[4861]: I0309 09:27:42.113807 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 09 09:27:42 crc kubenswrapper[4861]: I0309 09:27:42.129285 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 09 09:27:42 crc kubenswrapper[4861]: I0309 09:27:42.154247 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 09 09:27:42 crc kubenswrapper[4861]: I0309 09:27:42.270879 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d75f0a0-d481-480b-9b00-15349a64be9d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4d75f0a0-d481-480b-9b00-15349a64be9d\") " pod="openstack/ceilometer-0" Mar 09 09:27:42 crc kubenswrapper[4861]: I0309 09:27:42.271498 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4d75f0a0-d481-480b-9b00-15349a64be9d-run-httpd\") pod \"ceilometer-0\" (UID: \"4d75f0a0-d481-480b-9b00-15349a64be9d\") " pod="openstack/ceilometer-0" Mar 09 09:27:42 crc kubenswrapper[4861]: I0309 09:27:42.271641 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d75f0a0-d481-480b-9b00-15349a64be9d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"4d75f0a0-d481-480b-9b00-15349a64be9d\") " pod="openstack/ceilometer-0" Mar 09 09:27:42 crc kubenswrapper[4861]: I0309 09:27:42.271872 4861 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4d75f0a0-d481-480b-9b00-15349a64be9d-log-httpd\") pod \"ceilometer-0\" (UID: \"4d75f0a0-d481-480b-9b00-15349a64be9d\") " pod="openstack/ceilometer-0" Mar 09 09:27:42 crc kubenswrapper[4861]: I0309 09:27:42.272097 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4d75f0a0-d481-480b-9b00-15349a64be9d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4d75f0a0-d481-480b-9b00-15349a64be9d\") " pod="openstack/ceilometer-0" Mar 09 09:27:42 crc kubenswrapper[4861]: I0309 09:27:42.272247 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4d75f0a0-d481-480b-9b00-15349a64be9d-scripts\") pod \"ceilometer-0\" (UID: \"4d75f0a0-d481-480b-9b00-15349a64be9d\") " pod="openstack/ceilometer-0" Mar 09 09:27:42 crc kubenswrapper[4861]: I0309 09:27:42.272399 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6khrc\" (UniqueName: \"kubernetes.io/projected/4d75f0a0-d481-480b-9b00-15349a64be9d-kube-api-access-6khrc\") pod \"ceilometer-0\" (UID: \"4d75f0a0-d481-480b-9b00-15349a64be9d\") " pod="openstack/ceilometer-0" Mar 09 09:27:42 crc kubenswrapper[4861]: I0309 09:27:42.272489 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d75f0a0-d481-480b-9b00-15349a64be9d-config-data\") pod \"ceilometer-0\" (UID: \"4d75f0a0-d481-480b-9b00-15349a64be9d\") " pod="openstack/ceilometer-0" Mar 09 09:27:42 crc kubenswrapper[4861]: I0309 09:27:42.374406 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/4d75f0a0-d481-480b-9b00-15349a64be9d-log-httpd\") pod \"ceilometer-0\" (UID: \"4d75f0a0-d481-480b-9b00-15349a64be9d\") " pod="openstack/ceilometer-0" Mar 09 09:27:42 crc kubenswrapper[4861]: I0309 09:27:42.374497 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4d75f0a0-d481-480b-9b00-15349a64be9d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4d75f0a0-d481-480b-9b00-15349a64be9d\") " pod="openstack/ceilometer-0" Mar 09 09:27:42 crc kubenswrapper[4861]: I0309 09:27:42.374546 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4d75f0a0-d481-480b-9b00-15349a64be9d-scripts\") pod \"ceilometer-0\" (UID: \"4d75f0a0-d481-480b-9b00-15349a64be9d\") " pod="openstack/ceilometer-0" Mar 09 09:27:42 crc kubenswrapper[4861]: I0309 09:27:42.374615 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6khrc\" (UniqueName: \"kubernetes.io/projected/4d75f0a0-d481-480b-9b00-15349a64be9d-kube-api-access-6khrc\") pod \"ceilometer-0\" (UID: \"4d75f0a0-d481-480b-9b00-15349a64be9d\") " pod="openstack/ceilometer-0" Mar 09 09:27:42 crc kubenswrapper[4861]: I0309 09:27:42.374993 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4d75f0a0-d481-480b-9b00-15349a64be9d-log-httpd\") pod \"ceilometer-0\" (UID: \"4d75f0a0-d481-480b-9b00-15349a64be9d\") " pod="openstack/ceilometer-0" Mar 09 09:27:42 crc kubenswrapper[4861]: I0309 09:27:42.375305 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d75f0a0-d481-480b-9b00-15349a64be9d-config-data\") pod \"ceilometer-0\" (UID: \"4d75f0a0-d481-480b-9b00-15349a64be9d\") " pod="openstack/ceilometer-0" Mar 09 09:27:42 crc kubenswrapper[4861]: I0309 09:27:42.375350 
4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d75f0a0-d481-480b-9b00-15349a64be9d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4d75f0a0-d481-480b-9b00-15349a64be9d\") " pod="openstack/ceilometer-0" Mar 09 09:27:42 crc kubenswrapper[4861]: I0309 09:27:42.375423 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4d75f0a0-d481-480b-9b00-15349a64be9d-run-httpd\") pod \"ceilometer-0\" (UID: \"4d75f0a0-d481-480b-9b00-15349a64be9d\") " pod="openstack/ceilometer-0" Mar 09 09:27:42 crc kubenswrapper[4861]: I0309 09:27:42.375467 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d75f0a0-d481-480b-9b00-15349a64be9d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"4d75f0a0-d481-480b-9b00-15349a64be9d\") " pod="openstack/ceilometer-0" Mar 09 09:27:42 crc kubenswrapper[4861]: I0309 09:27:42.376007 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4d75f0a0-d481-480b-9b00-15349a64be9d-run-httpd\") pod \"ceilometer-0\" (UID: \"4d75f0a0-d481-480b-9b00-15349a64be9d\") " pod="openstack/ceilometer-0" Mar 09 09:27:42 crc kubenswrapper[4861]: I0309 09:27:42.380633 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d75f0a0-d481-480b-9b00-15349a64be9d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"4d75f0a0-d481-480b-9b00-15349a64be9d\") " pod="openstack/ceilometer-0" Mar 09 09:27:42 crc kubenswrapper[4861]: I0309 09:27:42.380856 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4d75f0a0-d481-480b-9b00-15349a64be9d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"4d75f0a0-d481-480b-9b00-15349a64be9d\") " pod="openstack/ceilometer-0" Mar 09 09:27:42 crc kubenswrapper[4861]: I0309 09:27:42.381070 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4d75f0a0-d481-480b-9b00-15349a64be9d-scripts\") pod \"ceilometer-0\" (UID: \"4d75f0a0-d481-480b-9b00-15349a64be9d\") " pod="openstack/ceilometer-0" Mar 09 09:27:42 crc kubenswrapper[4861]: I0309 09:27:42.381549 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d75f0a0-d481-480b-9b00-15349a64be9d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4d75f0a0-d481-480b-9b00-15349a64be9d\") " pod="openstack/ceilometer-0" Mar 09 09:27:42 crc kubenswrapper[4861]: I0309 09:27:42.381921 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d75f0a0-d481-480b-9b00-15349a64be9d-config-data\") pod \"ceilometer-0\" (UID: \"4d75f0a0-d481-480b-9b00-15349a64be9d\") " pod="openstack/ceilometer-0" Mar 09 09:27:42 crc kubenswrapper[4861]: I0309 09:27:42.392088 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6khrc\" (UniqueName: \"kubernetes.io/projected/4d75f0a0-d481-480b-9b00-15349a64be9d-kube-api-access-6khrc\") pod \"ceilometer-0\" (UID: \"4d75f0a0-d481-480b-9b00-15349a64be9d\") " pod="openstack/ceilometer-0" Mar 09 09:27:42 crc kubenswrapper[4861]: I0309 09:27:42.432624 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 09 09:27:42 crc kubenswrapper[4861]: I0309 09:27:42.733575 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 09 09:27:42 crc kubenswrapper[4861]: I0309 09:27:42.919328 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 09 09:27:42 crc kubenswrapper[4861]: W0309 09:27:42.920336 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4d75f0a0_d481_480b_9b00_15349a64be9d.slice/crio-5fee09b271a024929bc606b64c0b245bf45695184b534bd857bc197e58aff359 WatchSource:0}: Error finding container 5fee09b271a024929bc606b64c0b245bf45695184b534bd857bc197e58aff359: Status 404 returned error can't find the container with id 5fee09b271a024929bc606b64c0b245bf45695184b534bd857bc197e58aff359 Mar 09 09:27:43 crc kubenswrapper[4861]: I0309 09:27:43.474228 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4d75f0a0-d481-480b-9b00-15349a64be9d","Type":"ContainerStarted","Data":"5fee09b271a024929bc606b64c0b245bf45695184b534bd857bc197e58aff359"} Mar 09 09:27:43 crc kubenswrapper[4861]: I0309 09:27:43.668813 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c333f575-244d-4a42-858d-1ec027498b09" path="/var/lib/kubelet/pods/c333f575-244d-4a42-858d-1ec027498b09/volumes" Mar 09 09:27:44 crc kubenswrapper[4861]: I0309 09:27:44.132455 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 09 09:27:44 crc kubenswrapper[4861]: I0309 09:27:44.310313 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/20184bd6-aec8-44d4-85de-4dddbc61ce6d-logs\") pod \"20184bd6-aec8-44d4-85de-4dddbc61ce6d\" (UID: \"20184bd6-aec8-44d4-85de-4dddbc61ce6d\") " Mar 09 09:27:44 crc kubenswrapper[4861]: I0309 09:27:44.310753 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20184bd6-aec8-44d4-85de-4dddbc61ce6d-config-data\") pod \"20184bd6-aec8-44d4-85de-4dddbc61ce6d\" (UID: \"20184bd6-aec8-44d4-85de-4dddbc61ce6d\") " Mar 09 09:27:44 crc kubenswrapper[4861]: I0309 09:27:44.310794 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20184bd6-aec8-44d4-85de-4dddbc61ce6d-combined-ca-bundle\") pod \"20184bd6-aec8-44d4-85de-4dddbc61ce6d\" (UID: \"20184bd6-aec8-44d4-85de-4dddbc61ce6d\") " Mar 09 09:27:44 crc kubenswrapper[4861]: I0309 09:27:44.310819 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20184bd6-aec8-44d4-85de-4dddbc61ce6d-logs" (OuterVolumeSpecName: "logs") pod "20184bd6-aec8-44d4-85de-4dddbc61ce6d" (UID: "20184bd6-aec8-44d4-85de-4dddbc61ce6d"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:27:44 crc kubenswrapper[4861]: I0309 09:27:44.310926 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kt5tt\" (UniqueName: \"kubernetes.io/projected/20184bd6-aec8-44d4-85de-4dddbc61ce6d-kube-api-access-kt5tt\") pod \"20184bd6-aec8-44d4-85de-4dddbc61ce6d\" (UID: \"20184bd6-aec8-44d4-85de-4dddbc61ce6d\") " Mar 09 09:27:44 crc kubenswrapper[4861]: I0309 09:27:44.311401 4861 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/20184bd6-aec8-44d4-85de-4dddbc61ce6d-logs\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:44 crc kubenswrapper[4861]: I0309 09:27:44.316750 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20184bd6-aec8-44d4-85de-4dddbc61ce6d-kube-api-access-kt5tt" (OuterVolumeSpecName: "kube-api-access-kt5tt") pod "20184bd6-aec8-44d4-85de-4dddbc61ce6d" (UID: "20184bd6-aec8-44d4-85de-4dddbc61ce6d"). InnerVolumeSpecName "kube-api-access-kt5tt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:27:44 crc kubenswrapper[4861]: I0309 09:27:44.343548 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20184bd6-aec8-44d4-85de-4dddbc61ce6d-config-data" (OuterVolumeSpecName: "config-data") pod "20184bd6-aec8-44d4-85de-4dddbc61ce6d" (UID: "20184bd6-aec8-44d4-85de-4dddbc61ce6d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:27:44 crc kubenswrapper[4861]: I0309 09:27:44.346642 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20184bd6-aec8-44d4-85de-4dddbc61ce6d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "20184bd6-aec8-44d4-85de-4dddbc61ce6d" (UID: "20184bd6-aec8-44d4-85de-4dddbc61ce6d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:27:44 crc kubenswrapper[4861]: I0309 09:27:44.412860 4861 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20184bd6-aec8-44d4-85de-4dddbc61ce6d-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:44 crc kubenswrapper[4861]: I0309 09:27:44.412895 4861 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20184bd6-aec8-44d4-85de-4dddbc61ce6d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:44 crc kubenswrapper[4861]: I0309 09:27:44.412907 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kt5tt\" (UniqueName: \"kubernetes.io/projected/20184bd6-aec8-44d4-85de-4dddbc61ce6d-kube-api-access-kt5tt\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:44 crc kubenswrapper[4861]: I0309 09:27:44.484628 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4d75f0a0-d481-480b-9b00-15349a64be9d","Type":"ContainerStarted","Data":"2c0ce83ebcc4ed98d53ac2703b80efa2b5f7c7eada192f4e831ffc43b09f1273"} Mar 09 09:27:44 crc kubenswrapper[4861]: I0309 09:27:44.487030 4861 generic.go:334] "Generic (PLEG): container finished" podID="20184bd6-aec8-44d4-85de-4dddbc61ce6d" containerID="6d0d12d972c0219d9d0f4fb2d357ec09c2b975ff09c50e9eddb5fd6e4b0f69af" exitCode=0 Mar 09 09:27:44 crc kubenswrapper[4861]: I0309 09:27:44.487063 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"20184bd6-aec8-44d4-85de-4dddbc61ce6d","Type":"ContainerDied","Data":"6d0d12d972c0219d9d0f4fb2d357ec09c2b975ff09c50e9eddb5fd6e4b0f69af"} Mar 09 09:27:44 crc kubenswrapper[4861]: I0309 09:27:44.487082 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"20184bd6-aec8-44d4-85de-4dddbc61ce6d","Type":"ContainerDied","Data":"b406f09f91198dd265106ee5f2a6761e95a46d738e5d82a74b18f6ba4f3baae4"} Mar 09 09:27:44 crc kubenswrapper[4861]: I0309 09:27:44.487098 4861 scope.go:117] "RemoveContainer" containerID="6d0d12d972c0219d9d0f4fb2d357ec09c2b975ff09c50e9eddb5fd6e4b0f69af" Mar 09 09:27:44 crc kubenswrapper[4861]: I0309 09:27:44.487111 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 09 09:27:44 crc kubenswrapper[4861]: I0309 09:27:44.508984 4861 scope.go:117] "RemoveContainer" containerID="dc164e711d9c74be59a234b63af3a7cb3e2515ccf152e71fd97cb37c82c5514f" Mar 09 09:27:44 crc kubenswrapper[4861]: I0309 09:27:44.523914 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 09 09:27:44 crc kubenswrapper[4861]: I0309 09:27:44.530402 4861 scope.go:117] "RemoveContainer" containerID="6d0d12d972c0219d9d0f4fb2d357ec09c2b975ff09c50e9eddb5fd6e4b0f69af" Mar 09 09:27:44 crc kubenswrapper[4861]: E0309 09:27:44.531038 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d0d12d972c0219d9d0f4fb2d357ec09c2b975ff09c50e9eddb5fd6e4b0f69af\": container with ID starting with 6d0d12d972c0219d9d0f4fb2d357ec09c2b975ff09c50e9eddb5fd6e4b0f69af not found: ID does not exist" containerID="6d0d12d972c0219d9d0f4fb2d357ec09c2b975ff09c50e9eddb5fd6e4b0f69af" Mar 09 09:27:44 crc kubenswrapper[4861]: I0309 09:27:44.531172 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d0d12d972c0219d9d0f4fb2d357ec09c2b975ff09c50e9eddb5fd6e4b0f69af"} err="failed to get container status \"6d0d12d972c0219d9d0f4fb2d357ec09c2b975ff09c50e9eddb5fd6e4b0f69af\": rpc error: code = NotFound desc = could not find container \"6d0d12d972c0219d9d0f4fb2d357ec09c2b975ff09c50e9eddb5fd6e4b0f69af\": container with ID starting with 
6d0d12d972c0219d9d0f4fb2d357ec09c2b975ff09c50e9eddb5fd6e4b0f69af not found: ID does not exist"
Mar 09 09:27:44 crc kubenswrapper[4861]: I0309 09:27:44.531278 4861 scope.go:117] "RemoveContainer" containerID="dc164e711d9c74be59a234b63af3a7cb3e2515ccf152e71fd97cb37c82c5514f"
Mar 09 09:27:44 crc kubenswrapper[4861]: E0309 09:27:44.531972 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc164e711d9c74be59a234b63af3a7cb3e2515ccf152e71fd97cb37c82c5514f\": container with ID starting with dc164e711d9c74be59a234b63af3a7cb3e2515ccf152e71fd97cb37c82c5514f not found: ID does not exist" containerID="dc164e711d9c74be59a234b63af3a7cb3e2515ccf152e71fd97cb37c82c5514f"
Mar 09 09:27:44 crc kubenswrapper[4861]: I0309 09:27:44.532072 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc164e711d9c74be59a234b63af3a7cb3e2515ccf152e71fd97cb37c82c5514f"} err="failed to get container status \"dc164e711d9c74be59a234b63af3a7cb3e2515ccf152e71fd97cb37c82c5514f\": rpc error: code = NotFound desc = could not find container \"dc164e711d9c74be59a234b63af3a7cb3e2515ccf152e71fd97cb37c82c5514f\": container with ID starting with dc164e711d9c74be59a234b63af3a7cb3e2515ccf152e71fd97cb37c82c5514f not found: ID does not exist"
Mar 09 09:27:44 crc kubenswrapper[4861]: I0309 09:27:44.540261 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Mar 09 09:27:44 crc kubenswrapper[4861]: I0309 09:27:44.553556 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Mar 09 09:27:44 crc kubenswrapper[4861]: E0309 09:27:44.554043 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20184bd6-aec8-44d4-85de-4dddbc61ce6d" containerName="nova-api-api"
Mar 09 09:27:44 crc kubenswrapper[4861]: I0309 09:27:44.554062 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="20184bd6-aec8-44d4-85de-4dddbc61ce6d" containerName="nova-api-api"
Mar 09 09:27:44 crc kubenswrapper[4861]: E0309 09:27:44.554087 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20184bd6-aec8-44d4-85de-4dddbc61ce6d" containerName="nova-api-log"
Mar 09 09:27:44 crc kubenswrapper[4861]: I0309 09:27:44.554095 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="20184bd6-aec8-44d4-85de-4dddbc61ce6d" containerName="nova-api-log"
Mar 09 09:27:44 crc kubenswrapper[4861]: I0309 09:27:44.554276 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="20184bd6-aec8-44d4-85de-4dddbc61ce6d" containerName="nova-api-api"
Mar 09 09:27:44 crc kubenswrapper[4861]: I0309 09:27:44.554300 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="20184bd6-aec8-44d4-85de-4dddbc61ce6d" containerName="nova-api-log"
Mar 09 09:27:44 crc kubenswrapper[4861]: I0309 09:27:44.555285 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 09 09:27:44 crc kubenswrapper[4861]: I0309 09:27:44.557777 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Mar 09 09:27:44 crc kubenswrapper[4861]: I0309 09:27:44.557977 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc"
Mar 09 09:27:44 crc kubenswrapper[4861]: I0309 09:27:44.559827 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc"
Mar 09 09:27:44 crc kubenswrapper[4861]: I0309 09:27:44.569414 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 09 09:27:44 crc kubenswrapper[4861]: I0309 09:27:44.718802 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98hwc\" (UniqueName: \"kubernetes.io/projected/c284b70e-4d7d-4a48-9061-acfd4c1aed1e-kube-api-access-98hwc\") pod \"nova-api-0\" (UID: \"c284b70e-4d7d-4a48-9061-acfd4c1aed1e\") " pod="openstack/nova-api-0"
Mar 09 09:27:44 crc kubenswrapper[4861]: I0309 09:27:44.718875 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c284b70e-4d7d-4a48-9061-acfd4c1aed1e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c284b70e-4d7d-4a48-9061-acfd4c1aed1e\") " pod="openstack/nova-api-0"
Mar 09 09:27:44 crc kubenswrapper[4861]: I0309 09:27:44.718973 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c284b70e-4d7d-4a48-9061-acfd4c1aed1e-logs\") pod \"nova-api-0\" (UID: \"c284b70e-4d7d-4a48-9061-acfd4c1aed1e\") " pod="openstack/nova-api-0"
Mar 09 09:27:44 crc kubenswrapper[4861]: I0309 09:27:44.719201 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c284b70e-4d7d-4a48-9061-acfd4c1aed1e-public-tls-certs\") pod \"nova-api-0\" (UID: \"c284b70e-4d7d-4a48-9061-acfd4c1aed1e\") " pod="openstack/nova-api-0"
Mar 09 09:27:44 crc kubenswrapper[4861]: I0309 09:27:44.719385 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c284b70e-4d7d-4a48-9061-acfd4c1aed1e-internal-tls-certs\") pod \"nova-api-0\" (UID: \"c284b70e-4d7d-4a48-9061-acfd4c1aed1e\") " pod="openstack/nova-api-0"
Mar 09 09:27:44 crc kubenswrapper[4861]: I0309 09:27:44.719464 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c284b70e-4d7d-4a48-9061-acfd4c1aed1e-config-data\") pod \"nova-api-0\" (UID: \"c284b70e-4d7d-4a48-9061-acfd4c1aed1e\") " pod="openstack/nova-api-0"
Mar 09 09:27:44 crc kubenswrapper[4861]: I0309 09:27:44.821356 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c284b70e-4d7d-4a48-9061-acfd4c1aed1e-internal-tls-certs\") pod \"nova-api-0\" (UID: \"c284b70e-4d7d-4a48-9061-acfd4c1aed1e\") " pod="openstack/nova-api-0"
Mar 09 09:27:44 crc kubenswrapper[4861]: I0309 09:27:44.822129 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c284b70e-4d7d-4a48-9061-acfd4c1aed1e-config-data\") pod \"nova-api-0\" (UID: \"c284b70e-4d7d-4a48-9061-acfd4c1aed1e\") " pod="openstack/nova-api-0"
Mar 09 09:27:44 crc kubenswrapper[4861]: I0309 09:27:44.822166 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98hwc\" (UniqueName: \"kubernetes.io/projected/c284b70e-4d7d-4a48-9061-acfd4c1aed1e-kube-api-access-98hwc\") pod \"nova-api-0\" (UID: \"c284b70e-4d7d-4a48-9061-acfd4c1aed1e\") " pod="openstack/nova-api-0"
Mar 09 09:27:44 crc kubenswrapper[4861]: I0309 09:27:44.822198 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c284b70e-4d7d-4a48-9061-acfd4c1aed1e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c284b70e-4d7d-4a48-9061-acfd4c1aed1e\") " pod="openstack/nova-api-0"
Mar 09 09:27:44 crc kubenswrapper[4861]: I0309 09:27:44.822315 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c284b70e-4d7d-4a48-9061-acfd4c1aed1e-logs\") pod \"nova-api-0\" (UID: \"c284b70e-4d7d-4a48-9061-acfd4c1aed1e\") " pod="openstack/nova-api-0"
Mar 09 09:27:44 crc kubenswrapper[4861]: I0309 09:27:44.822421 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c284b70e-4d7d-4a48-9061-acfd4c1aed1e-public-tls-certs\") pod \"nova-api-0\" (UID: \"c284b70e-4d7d-4a48-9061-acfd4c1aed1e\") " pod="openstack/nova-api-0"
Mar 09 09:27:44 crc kubenswrapper[4861]: I0309 09:27:44.823703 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c284b70e-4d7d-4a48-9061-acfd4c1aed1e-logs\") pod \"nova-api-0\" (UID: \"c284b70e-4d7d-4a48-9061-acfd4c1aed1e\") " pod="openstack/nova-api-0"
Mar 09 09:27:44 crc kubenswrapper[4861]: I0309 09:27:44.825791 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c284b70e-4d7d-4a48-9061-acfd4c1aed1e-internal-tls-certs\") pod \"nova-api-0\" (UID: \"c284b70e-4d7d-4a48-9061-acfd4c1aed1e\") " pod="openstack/nova-api-0"
Mar 09 09:27:44 crc kubenswrapper[4861]: I0309 09:27:44.826319 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c284b70e-4d7d-4a48-9061-acfd4c1aed1e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"c284b70e-4d7d-4a48-9061-acfd4c1aed1e\") " pod="openstack/nova-api-0"
Mar 09 09:27:44 crc kubenswrapper[4861]: I0309 09:27:44.826785 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c284b70e-4d7d-4a48-9061-acfd4c1aed1e-public-tls-certs\") pod \"nova-api-0\" (UID: \"c284b70e-4d7d-4a48-9061-acfd4c1aed1e\") " pod="openstack/nova-api-0"
Mar 09 09:27:44 crc kubenswrapper[4861]: I0309 09:27:44.834940 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c284b70e-4d7d-4a48-9061-acfd4c1aed1e-config-data\") pod \"nova-api-0\" (UID: \"c284b70e-4d7d-4a48-9061-acfd4c1aed1e\") " pod="openstack/nova-api-0"
Mar 09 09:27:44 crc kubenswrapper[4861]: I0309 09:27:44.838961 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98hwc\" (UniqueName: \"kubernetes.io/projected/c284b70e-4d7d-4a48-9061-acfd4c1aed1e-kube-api-access-98hwc\") pod \"nova-api-0\" (UID: \"c284b70e-4d7d-4a48-9061-acfd4c1aed1e\") " pod="openstack/nova-api-0"
Mar 09 09:27:44 crc kubenswrapper[4861]: I0309 09:27:44.892512 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 09 09:27:45 crc kubenswrapper[4861]: I0309 09:27:45.517836 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4d75f0a0-d481-480b-9b00-15349a64be9d","Type":"ContainerStarted","Data":"604da89ae58f6538e63da6891fb568004fb049a527d833f392bfa867e05cd4ee"}
Mar 09 09:27:45 crc kubenswrapper[4861]: I0309 09:27:45.518452 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4d75f0a0-d481-480b-9b00-15349a64be9d","Type":"ContainerStarted","Data":"8eb6d4fc488f7adfd2c20aea34c63fb8a54cb4da8bde9af3629a2684ef70a2fd"}
Mar 09 09:27:45 crc kubenswrapper[4861]: I0309 09:27:45.593662 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 09 09:27:45 crc kubenswrapper[4861]: I0309 09:27:45.673875 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20184bd6-aec8-44d4-85de-4dddbc61ce6d" path="/var/lib/kubelet/pods/20184bd6-aec8-44d4-85de-4dddbc61ce6d/volumes"
Mar 09 09:27:46 crc kubenswrapper[4861]: I0309 09:27:46.530413 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c284b70e-4d7d-4a48-9061-acfd4c1aed1e","Type":"ContainerStarted","Data":"3598e98bff57ee3e426eae390226c7f498407cfd6550bf4b3c4bc2d62d3cb7ac"}
Mar 09 09:27:46 crc kubenswrapper[4861]: I0309 09:27:46.530465 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c284b70e-4d7d-4a48-9061-acfd4c1aed1e","Type":"ContainerStarted","Data":"016f78ce0f2846310b2843e7584d10c5ff53f4878d11c014c13c8917b24e9190"}
Mar 09 09:27:46 crc kubenswrapper[4861]: I0309 09:27:46.530480 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c284b70e-4d7d-4a48-9061-acfd4c1aed1e","Type":"ContainerStarted","Data":"bd8bf4421d19098432f1a0a3588b71df4b1a3eb693783e59cdf50511a516ab44"}
Mar 09 09:27:46 crc kubenswrapper[4861]: I0309 09:27:46.557043 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.557022281 podStartE2EDuration="2.557022281s" podCreationTimestamp="2026-03-09 09:27:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:27:46.550335589 +0000 UTC m=+1309.635374990" watchObservedRunningTime="2026-03-09 09:27:46.557022281 +0000 UTC m=+1309.642061682"
Mar 09 09:27:47 crc kubenswrapper[4861]: I0309 09:27:47.100861 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0"
Mar 09 09:27:47 crc kubenswrapper[4861]: I0309 09:27:47.129722 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0"
Mar 09 09:27:47 crc kubenswrapper[4861]: I0309 09:27:47.553150 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4d75f0a0-d481-480b-9b00-15349a64be9d","Type":"ContainerStarted","Data":"2d9ab6c1c96bbc721b04df5406d0664c61ed3c2cfe5f67798a10fd5565e2736f"}
Mar 09 09:27:47 crc kubenswrapper[4861]: I0309 09:27:47.553640 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4d75f0a0-d481-480b-9b00-15349a64be9d" containerName="ceilometer-central-agent" containerID="cri-o://2c0ce83ebcc4ed98d53ac2703b80efa2b5f7c7eada192f4e831ffc43b09f1273" gracePeriod=30
Mar 09 09:27:47 crc kubenswrapper[4861]: I0309 09:27:47.553748 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4d75f0a0-d481-480b-9b00-15349a64be9d" containerName="proxy-httpd" containerID="cri-o://2d9ab6c1c96bbc721b04df5406d0664c61ed3c2cfe5f67798a10fd5565e2736f" gracePeriod=30
Mar 09 09:27:47 crc kubenswrapper[4861]: I0309 09:27:47.553784 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4d75f0a0-d481-480b-9b00-15349a64be9d" containerName="sg-core" containerID="cri-o://604da89ae58f6538e63da6891fb568004fb049a527d833f392bfa867e05cd4ee" gracePeriod=30
Mar 09 09:27:47 crc kubenswrapper[4861]: I0309 09:27:47.553816 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4d75f0a0-d481-480b-9b00-15349a64be9d" containerName="ceilometer-notification-agent" containerID="cri-o://8eb6d4fc488f7adfd2c20aea34c63fb8a54cb4da8bde9af3629a2684ef70a2fd" gracePeriod=30
Mar 09 09:27:47 crc kubenswrapper[4861]: I0309 09:27:47.572168 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0"
Mar 09 09:27:47 crc kubenswrapper[4861]: I0309 09:27:47.588857 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.368162963 podStartE2EDuration="5.588833879s" podCreationTimestamp="2026-03-09 09:27:42 +0000 UTC" firstStartedPulling="2026-03-09 09:27:42.923983381 +0000 UTC m=+1306.009022782" lastFinishedPulling="2026-03-09 09:27:47.144654297 +0000 UTC m=+1310.229693698" observedRunningTime="2026-03-09 09:27:47.583482745 +0000 UTC m=+1310.668522146" watchObservedRunningTime="2026-03-09 09:27:47.588833879 +0000 UTC m=+1310.673873280"
Mar 09 09:27:47 crc kubenswrapper[4861]: I0309 09:27:47.744845 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-84rm5"]
Mar 09 09:27:47 crc kubenswrapper[4861]: I0309 09:27:47.746106 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-84rm5"
Mar 09 09:27:47 crc kubenswrapper[4861]: I0309 09:27:47.751711 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data"
Mar 09 09:27:47 crc kubenswrapper[4861]: I0309 09:27:47.752460 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts"
Mar 09 09:27:47 crc kubenswrapper[4861]: I0309 09:27:47.756636 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-84rm5"]
Mar 09 09:27:47 crc kubenswrapper[4861]: I0309 09:27:47.886848 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6l89\" (UniqueName: \"kubernetes.io/projected/e81aeea2-beec-4987-b527-db644692cb14-kube-api-access-d6l89\") pod \"nova-cell1-cell-mapping-84rm5\" (UID: \"e81aeea2-beec-4987-b527-db644692cb14\") " pod="openstack/nova-cell1-cell-mapping-84rm5"
Mar 09 09:27:47 crc kubenswrapper[4861]: I0309 09:27:47.886949 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e81aeea2-beec-4987-b527-db644692cb14-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-84rm5\" (UID: \"e81aeea2-beec-4987-b527-db644692cb14\") " pod="openstack/nova-cell1-cell-mapping-84rm5"
Mar 09 09:27:47 crc kubenswrapper[4861]: I0309 09:27:47.887053 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e81aeea2-beec-4987-b527-db644692cb14-scripts\") pod \"nova-cell1-cell-mapping-84rm5\" (UID: \"e81aeea2-beec-4987-b527-db644692cb14\") " pod="openstack/nova-cell1-cell-mapping-84rm5"
Mar 09 09:27:47 crc kubenswrapper[4861]: I0309 09:27:47.887299 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e81aeea2-beec-4987-b527-db644692cb14-config-data\") pod \"nova-cell1-cell-mapping-84rm5\" (UID: \"e81aeea2-beec-4987-b527-db644692cb14\") " pod="openstack/nova-cell1-cell-mapping-84rm5"
Mar 09 09:27:47 crc kubenswrapper[4861]: I0309 09:27:47.989268 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e81aeea2-beec-4987-b527-db644692cb14-scripts\") pod \"nova-cell1-cell-mapping-84rm5\" (UID: \"e81aeea2-beec-4987-b527-db644692cb14\") " pod="openstack/nova-cell1-cell-mapping-84rm5"
Mar 09 09:27:47 crc kubenswrapper[4861]: I0309 09:27:47.989339 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e81aeea2-beec-4987-b527-db644692cb14-config-data\") pod \"nova-cell1-cell-mapping-84rm5\" (UID: \"e81aeea2-beec-4987-b527-db644692cb14\") " pod="openstack/nova-cell1-cell-mapping-84rm5"
Mar 09 09:27:47 crc kubenswrapper[4861]: I0309 09:27:47.989420 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6l89\" (UniqueName: \"kubernetes.io/projected/e81aeea2-beec-4987-b527-db644692cb14-kube-api-access-d6l89\") pod \"nova-cell1-cell-mapping-84rm5\" (UID: \"e81aeea2-beec-4987-b527-db644692cb14\") " pod="openstack/nova-cell1-cell-mapping-84rm5"
Mar 09 09:27:47 crc kubenswrapper[4861]: I0309 09:27:47.989485 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e81aeea2-beec-4987-b527-db644692cb14-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-84rm5\" (UID: \"e81aeea2-beec-4987-b527-db644692cb14\") " pod="openstack/nova-cell1-cell-mapping-84rm5"
Mar 09 09:27:47 crc kubenswrapper[4861]: I0309 09:27:47.993750 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e81aeea2-beec-4987-b527-db644692cb14-scripts\") pod \"nova-cell1-cell-mapping-84rm5\" (UID: \"e81aeea2-beec-4987-b527-db644692cb14\") " pod="openstack/nova-cell1-cell-mapping-84rm5"
Mar 09 09:27:47 crc kubenswrapper[4861]: I0309 09:27:47.993880 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e81aeea2-beec-4987-b527-db644692cb14-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-84rm5\" (UID: \"e81aeea2-beec-4987-b527-db644692cb14\") " pod="openstack/nova-cell1-cell-mapping-84rm5"
Mar 09 09:27:47 crc kubenswrapper[4861]: I0309 09:27:47.994302 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e81aeea2-beec-4987-b527-db644692cb14-config-data\") pod \"nova-cell1-cell-mapping-84rm5\" (UID: \"e81aeea2-beec-4987-b527-db644692cb14\") " pod="openstack/nova-cell1-cell-mapping-84rm5"
Mar 09 09:27:48 crc kubenswrapper[4861]: I0309 09:27:48.007075 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6l89\" (UniqueName: \"kubernetes.io/projected/e81aeea2-beec-4987-b527-db644692cb14-kube-api-access-d6l89\") pod \"nova-cell1-cell-mapping-84rm5\" (UID: \"e81aeea2-beec-4987-b527-db644692cb14\") " pod="openstack/nova-cell1-cell-mapping-84rm5"
Mar 09 09:27:48 crc kubenswrapper[4861]: I0309 09:27:48.020512 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7749c44969-c57rg"
Mar 09 09:27:48 crc kubenswrapper[4861]: I0309 09:27:48.099487 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-84rm5"
Mar 09 09:27:48 crc kubenswrapper[4861]: I0309 09:27:48.100588 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7bd5679c8c-2ns4s"]
Mar 09 09:27:48 crc kubenswrapper[4861]: I0309 09:27:48.100961 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7bd5679c8c-2ns4s" podUID="71107669-f5ae-4df6-a694-a643153ad6f4" containerName="dnsmasq-dns" containerID="cri-o://95b01cabcea7fff8d5731e8e974164abb98d11542a5c5b10358d763297daab7c" gracePeriod=10
Mar 09 09:27:48 crc kubenswrapper[4861]: I0309 09:27:48.568466 4861 generic.go:334] "Generic (PLEG): container finished" podID="4d75f0a0-d481-480b-9b00-15349a64be9d" containerID="2d9ab6c1c96bbc721b04df5406d0664c61ed3c2cfe5f67798a10fd5565e2736f" exitCode=0
Mar 09 09:27:48 crc kubenswrapper[4861]: I0309 09:27:48.568756 4861 generic.go:334] "Generic (PLEG): container finished" podID="4d75f0a0-d481-480b-9b00-15349a64be9d" containerID="604da89ae58f6538e63da6891fb568004fb049a527d833f392bfa867e05cd4ee" exitCode=2
Mar 09 09:27:48 crc kubenswrapper[4861]: I0309 09:27:48.568770 4861 generic.go:334] "Generic (PLEG): container finished" podID="4d75f0a0-d481-480b-9b00-15349a64be9d" containerID="8eb6d4fc488f7adfd2c20aea34c63fb8a54cb4da8bde9af3629a2684ef70a2fd" exitCode=0
Mar 09 09:27:48 crc kubenswrapper[4861]: I0309 09:27:48.568578 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4d75f0a0-d481-480b-9b00-15349a64be9d","Type":"ContainerDied","Data":"2d9ab6c1c96bbc721b04df5406d0664c61ed3c2cfe5f67798a10fd5565e2736f"}
Mar 09 09:27:48 crc kubenswrapper[4861]: I0309 09:27:48.568896 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4d75f0a0-d481-480b-9b00-15349a64be9d","Type":"ContainerDied","Data":"604da89ae58f6538e63da6891fb568004fb049a527d833f392bfa867e05cd4ee"}
Mar 09 09:27:48 crc kubenswrapper[4861]: I0309 09:27:48.568910 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4d75f0a0-d481-480b-9b00-15349a64be9d","Type":"ContainerDied","Data":"8eb6d4fc488f7adfd2c20aea34c63fb8a54cb4da8bde9af3629a2684ef70a2fd"}
Mar 09 09:27:48 crc kubenswrapper[4861]: I0309 09:27:48.572012 4861 generic.go:334] "Generic (PLEG): container finished" podID="71107669-f5ae-4df6-a694-a643153ad6f4" containerID="95b01cabcea7fff8d5731e8e974164abb98d11542a5c5b10358d763297daab7c" exitCode=0
Mar 09 09:27:48 crc kubenswrapper[4861]: I0309 09:27:48.572361 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bd5679c8c-2ns4s" event={"ID":"71107669-f5ae-4df6-a694-a643153ad6f4","Type":"ContainerDied","Data":"95b01cabcea7fff8d5731e8e974164abb98d11542a5c5b10358d763297daab7c"}
Mar 09 09:27:49 crc kubenswrapper[4861]: I0309 09:27:49.175812 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-84rm5"]
Mar 09 09:27:49 crc kubenswrapper[4861]: I0309 09:27:49.430765 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7bd5679c8c-2ns4s"
Mar 09 09:27:49 crc kubenswrapper[4861]: I0309 09:27:49.535690 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/71107669-f5ae-4df6-a694-a643153ad6f4-dns-swift-storage-0\") pod \"71107669-f5ae-4df6-a694-a643153ad6f4\" (UID: \"71107669-f5ae-4df6-a694-a643153ad6f4\") "
Mar 09 09:27:49 crc kubenswrapper[4861]: I0309 09:27:49.535777 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/71107669-f5ae-4df6-a694-a643153ad6f4-ovsdbserver-sb\") pod \"71107669-f5ae-4df6-a694-a643153ad6f4\" (UID: \"71107669-f5ae-4df6-a694-a643153ad6f4\") "
Mar 09 09:27:49 crc kubenswrapper[4861]: I0309 09:27:49.535806 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-668hf\" (UniqueName: \"kubernetes.io/projected/71107669-f5ae-4df6-a694-a643153ad6f4-kube-api-access-668hf\") pod \"71107669-f5ae-4df6-a694-a643153ad6f4\" (UID: \"71107669-f5ae-4df6-a694-a643153ad6f4\") "
Mar 09 09:27:49 crc kubenswrapper[4861]: I0309 09:27:49.535888 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71107669-f5ae-4df6-a694-a643153ad6f4-config\") pod \"71107669-f5ae-4df6-a694-a643153ad6f4\" (UID: \"71107669-f5ae-4df6-a694-a643153ad6f4\") "
Mar 09 09:27:49 crc kubenswrapper[4861]: I0309 09:27:49.535949 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/71107669-f5ae-4df6-a694-a643153ad6f4-ovsdbserver-nb\") pod \"71107669-f5ae-4df6-a694-a643153ad6f4\" (UID: \"71107669-f5ae-4df6-a694-a643153ad6f4\") "
Mar 09 09:27:49 crc kubenswrapper[4861]: I0309 09:27:49.535986 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/71107669-f5ae-4df6-a694-a643153ad6f4-dns-svc\") pod \"71107669-f5ae-4df6-a694-a643153ad6f4\" (UID: \"71107669-f5ae-4df6-a694-a643153ad6f4\") "
Mar 09 09:27:49 crc kubenswrapper[4861]: I0309 09:27:49.562667 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71107669-f5ae-4df6-a694-a643153ad6f4-kube-api-access-668hf" (OuterVolumeSpecName: "kube-api-access-668hf") pod "71107669-f5ae-4df6-a694-a643153ad6f4" (UID: "71107669-f5ae-4df6-a694-a643153ad6f4"). InnerVolumeSpecName "kube-api-access-668hf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:27:49 crc kubenswrapper[4861]: I0309 09:27:49.607964 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bd5679c8c-2ns4s" event={"ID":"71107669-f5ae-4df6-a694-a643153ad6f4","Type":"ContainerDied","Data":"be2044ed451a8916fb2304cd4127b6dd157a5c7ee4195e491d44ccfa810879fe"}
Mar 09 09:27:49 crc kubenswrapper[4861]: I0309 09:27:49.608086 4861 scope.go:117] "RemoveContainer" containerID="95b01cabcea7fff8d5731e8e974164abb98d11542a5c5b10358d763297daab7c"
Mar 09 09:27:49 crc kubenswrapper[4861]: I0309 09:27:49.608262 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7bd5679c8c-2ns4s"
Mar 09 09:27:49 crc kubenswrapper[4861]: I0309 09:27:49.618946 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-84rm5" event={"ID":"e81aeea2-beec-4987-b527-db644692cb14","Type":"ContainerStarted","Data":"7aa755ba2b534ec33b73942a0dcad7c6918ecf98e43652615d68efcdc114b08e"}
Mar 09 09:27:49 crc kubenswrapper[4861]: I0309 09:27:49.643426 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-668hf\" (UniqueName: \"kubernetes.io/projected/71107669-f5ae-4df6-a694-a643153ad6f4-kube-api-access-668hf\") on node \"crc\" DevicePath \"\""
Mar 09 09:27:49 crc kubenswrapper[4861]: I0309 09:27:49.664737 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71107669-f5ae-4df6-a694-a643153ad6f4-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "71107669-f5ae-4df6-a694-a643153ad6f4" (UID: "71107669-f5ae-4df6-a694-a643153ad6f4"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:27:49 crc kubenswrapper[4861]: I0309 09:27:49.664824 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71107669-f5ae-4df6-a694-a643153ad6f4-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "71107669-f5ae-4df6-a694-a643153ad6f4" (UID: "71107669-f5ae-4df6-a694-a643153ad6f4"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:27:49 crc kubenswrapper[4861]: I0309 09:27:49.664861 4861 scope.go:117] "RemoveContainer" containerID="8a0d57d25fd49736c6c3dfa73f76c169f9aec799ea01849f4a3a184b99a520ad"
Mar 09 09:27:49 crc kubenswrapper[4861]: I0309 09:27:49.671538 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71107669-f5ae-4df6-a694-a643153ad6f4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "71107669-f5ae-4df6-a694-a643153ad6f4" (UID: "71107669-f5ae-4df6-a694-a643153ad6f4"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:27:49 crc kubenswrapper[4861]: I0309 09:27:49.672812 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71107669-f5ae-4df6-a694-a643153ad6f4-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "71107669-f5ae-4df6-a694-a643153ad6f4" (UID: "71107669-f5ae-4df6-a694-a643153ad6f4"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:27:49 crc kubenswrapper[4861]: I0309 09:27:49.683062 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71107669-f5ae-4df6-a694-a643153ad6f4-config" (OuterVolumeSpecName: "config") pod "71107669-f5ae-4df6-a694-a643153ad6f4" (UID: "71107669-f5ae-4df6-a694-a643153ad6f4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:27:49 crc kubenswrapper[4861]: I0309 09:27:49.755553 4861 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/71107669-f5ae-4df6-a694-a643153ad6f4-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 09 09:27:49 crc kubenswrapper[4861]: I0309 09:27:49.755628 4861 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/71107669-f5ae-4df6-a694-a643153ad6f4-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 09 09:27:49 crc kubenswrapper[4861]: I0309 09:27:49.755641 4861 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/71107669-f5ae-4df6-a694-a643153ad6f4-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Mar 09 09:27:49 crc kubenswrapper[4861]: I0309 09:27:49.755650 4861 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/71107669-f5ae-4df6-a694-a643153ad6f4-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 09 09:27:49 crc kubenswrapper[4861]: I0309 09:27:49.755660 4861 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71107669-f5ae-4df6-a694-a643153ad6f4-config\") on node \"crc\" DevicePath \"\""
Mar 09 09:27:49 crc kubenswrapper[4861]: I0309 09:27:49.942644 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7bd5679c8c-2ns4s"]
Mar 09 09:27:49 crc kubenswrapper[4861]: I0309 09:27:49.954016 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7bd5679c8c-2ns4s"]
Mar 09 09:27:50 crc kubenswrapper[4861]: I0309 09:27:50.632570 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-84rm5" event={"ID":"e81aeea2-beec-4987-b527-db644692cb14","Type":"ContainerStarted","Data":"6a6e0147999328f0e670af6e9773adc78a928a0f8004229f1539591977be0d16"}
Mar 09 09:27:50 crc kubenswrapper[4861]: I0309 09:27:50.650787 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-84rm5" podStartSLOduration=3.650768418 podStartE2EDuration="3.650768418s" podCreationTimestamp="2026-03-09 09:27:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:27:50.648334558 +0000 UTC m=+1313.733373959" watchObservedRunningTime="2026-03-09 09:27:50.650768418 +0000 UTC m=+1313.735807819"
Mar 09 09:27:51 crc kubenswrapper[4861]: I0309 09:27:51.277699 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 09 09:27:51 crc kubenswrapper[4861]: I0309 09:27:51.389795 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4d75f0a0-d481-480b-9b00-15349a64be9d-run-httpd\") pod \"4d75f0a0-d481-480b-9b00-15349a64be9d\" (UID: \"4d75f0a0-d481-480b-9b00-15349a64be9d\") "
Mar 09 09:27:51 crc kubenswrapper[4861]: I0309 09:27:51.390029 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d75f0a0-d481-480b-9b00-15349a64be9d-ceilometer-tls-certs\") pod \"4d75f0a0-d481-480b-9b00-15349a64be9d\" (UID: \"4d75f0a0-d481-480b-9b00-15349a64be9d\") "
Mar 09 09:27:51 crc kubenswrapper[4861]: I0309 09:27:51.390101 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4d75f0a0-d481-480b-9b00-15349a64be9d-log-httpd\") pod \"4d75f0a0-d481-480b-9b00-15349a64be9d\" (UID: \"4d75f0a0-d481-480b-9b00-15349a64be9d\") "
Mar 09 09:27:51 crc kubenswrapper[4861]: I0309 09:27:51.390158 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d75f0a0-d481-480b-9b00-15349a64be9d-config-data\") pod \"4d75f0a0-d481-480b-9b00-15349a64be9d\" (UID: \"4d75f0a0-d481-480b-9b00-15349a64be9d\") "
Mar 09 09:27:51 crc kubenswrapper[4861]: I0309 09:27:51.390228 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4d75f0a0-d481-480b-9b00-15349a64be9d-sg-core-conf-yaml\") pod \"4d75f0a0-d481-480b-9b00-15349a64be9d\" (UID: \"4d75f0a0-d481-480b-9b00-15349a64be9d\") "
Mar 09 09:27:51 crc kubenswrapper[4861]: I0309 09:27:51.390353 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6khrc\" (UniqueName: \"kubernetes.io/projected/4d75f0a0-d481-480b-9b00-15349a64be9d-kube-api-access-6khrc\") pod \"4d75f0a0-d481-480b-9b00-15349a64be9d\" (UID: \"4d75f0a0-d481-480b-9b00-15349a64be9d\") "
Mar 09 09:27:51 crc kubenswrapper[4861]: I0309 09:27:51.390536 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4d75f0a0-d481-480b-9b00-15349a64be9d-scripts\") pod \"4d75f0a0-d481-480b-9b00-15349a64be9d\" (UID: \"4d75f0a0-d481-480b-9b00-15349a64be9d\") "
Mar 09 09:27:51 crc kubenswrapper[4861]: I0309 09:27:51.390627 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d75f0a0-d481-480b-9b00-15349a64be9d-combined-ca-bundle\") pod \"4d75f0a0-d481-480b-9b00-15349a64be9d\" (UID: \"4d75f0a0-d481-480b-9b00-15349a64be9d\") "
Mar 09 09:27:51 crc kubenswrapper[4861]: I0309 09:27:51.391323 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d75f0a0-d481-480b-9b00-15349a64be9d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "4d75f0a0-d481-480b-9b00-15349a64be9d" (UID: "4d75f0a0-d481-480b-9b00-15349a64be9d"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 09:27:51 crc kubenswrapper[4861]: I0309 09:27:51.391863 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d75f0a0-d481-480b-9b00-15349a64be9d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "4d75f0a0-d481-480b-9b00-15349a64be9d" (UID: "4d75f0a0-d481-480b-9b00-15349a64be9d"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 09:27:51 crc kubenswrapper[4861]: I0309 09:27:51.392205 4861 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4d75f0a0-d481-480b-9b00-15349a64be9d-run-httpd\") on node \"crc\" DevicePath \"\""
Mar 09 09:27:51 crc kubenswrapper[4861]: I0309 09:27:51.392243 4861 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4d75f0a0-d481-480b-9b00-15349a64be9d-log-httpd\") on node \"crc\" DevicePath \"\""
Mar 09 09:27:51 crc kubenswrapper[4861]: I0309 09:27:51.399552 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d75f0a0-d481-480b-9b00-15349a64be9d-kube-api-access-6khrc" (OuterVolumeSpecName: "kube-api-access-6khrc") pod "4d75f0a0-d481-480b-9b00-15349a64be9d" (UID: "4d75f0a0-d481-480b-9b00-15349a64be9d"). InnerVolumeSpecName "kube-api-access-6khrc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:27:51 crc kubenswrapper[4861]: I0309 09:27:51.401814 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d75f0a0-d481-480b-9b00-15349a64be9d-scripts" (OuterVolumeSpecName: "scripts") pod "4d75f0a0-d481-480b-9b00-15349a64be9d" (UID: "4d75f0a0-d481-480b-9b00-15349a64be9d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:27:51 crc kubenswrapper[4861]: I0309 09:27:51.427477 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d75f0a0-d481-480b-9b00-15349a64be9d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "4d75f0a0-d481-480b-9b00-15349a64be9d" (UID: "4d75f0a0-d481-480b-9b00-15349a64be9d"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:27:51 crc kubenswrapper[4861]: I0309 09:27:51.454569 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d75f0a0-d481-480b-9b00-15349a64be9d-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "4d75f0a0-d481-480b-9b00-15349a64be9d" (UID: "4d75f0a0-d481-480b-9b00-15349a64be9d"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:27:51 crc kubenswrapper[4861]: I0309 09:27:51.486475 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d75f0a0-d481-480b-9b00-15349a64be9d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4d75f0a0-d481-480b-9b00-15349a64be9d" (UID: "4d75f0a0-d481-480b-9b00-15349a64be9d"). InnerVolumeSpecName "combined-ca-bundle".
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:27:51 crc kubenswrapper[4861]: I0309 09:27:51.494257 4861 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4d75f0a0-d481-480b-9b00-15349a64be9d-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:51 crc kubenswrapper[4861]: I0309 09:27:51.494298 4861 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d75f0a0-d481-480b-9b00-15349a64be9d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:51 crc kubenswrapper[4861]: I0309 09:27:51.494314 4861 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d75f0a0-d481-480b-9b00-15349a64be9d-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:51 crc kubenswrapper[4861]: I0309 09:27:51.494326 4861 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4d75f0a0-d481-480b-9b00-15349a64be9d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:51 crc kubenswrapper[4861]: I0309 09:27:51.494337 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6khrc\" (UniqueName: \"kubernetes.io/projected/4d75f0a0-d481-480b-9b00-15349a64be9d-kube-api-access-6khrc\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:51 crc kubenswrapper[4861]: I0309 09:27:51.513631 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d75f0a0-d481-480b-9b00-15349a64be9d-config-data" (OuterVolumeSpecName: "config-data") pod "4d75f0a0-d481-480b-9b00-15349a64be9d" (UID: "4d75f0a0-d481-480b-9b00-15349a64be9d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:27:51 crc kubenswrapper[4861]: I0309 09:27:51.596744 4861 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4d75f0a0-d481-480b-9b00-15349a64be9d-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 09:27:51 crc kubenswrapper[4861]: I0309 09:27:51.647168 4861 generic.go:334] "Generic (PLEG): container finished" podID="4d75f0a0-d481-480b-9b00-15349a64be9d" containerID="2c0ce83ebcc4ed98d53ac2703b80efa2b5f7c7eada192f4e831ffc43b09f1273" exitCode=0 Mar 09 09:27:51 crc kubenswrapper[4861]: I0309 09:27:51.647389 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4d75f0a0-d481-480b-9b00-15349a64be9d","Type":"ContainerDied","Data":"2c0ce83ebcc4ed98d53ac2703b80efa2b5f7c7eada192f4e831ffc43b09f1273"} Mar 09 09:27:51 crc kubenswrapper[4861]: I0309 09:27:51.647438 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4d75f0a0-d481-480b-9b00-15349a64be9d","Type":"ContainerDied","Data":"5fee09b271a024929bc606b64c0b245bf45695184b534bd857bc197e58aff359"} Mar 09 09:27:51 crc kubenswrapper[4861]: I0309 09:27:51.647459 4861 scope.go:117] "RemoveContainer" containerID="2d9ab6c1c96bbc721b04df5406d0664c61ed3c2cfe5f67798a10fd5565e2736f" Mar 09 09:27:51 crc kubenswrapper[4861]: I0309 09:27:51.647801 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 09 09:27:51 crc kubenswrapper[4861]: I0309 09:27:51.672632 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71107669-f5ae-4df6-a694-a643153ad6f4" path="/var/lib/kubelet/pods/71107669-f5ae-4df6-a694-a643153ad6f4/volumes" Mar 09 09:27:51 crc kubenswrapper[4861]: I0309 09:27:51.673781 4861 scope.go:117] "RemoveContainer" containerID="604da89ae58f6538e63da6891fb568004fb049a527d833f392bfa867e05cd4ee" Mar 09 09:27:51 crc kubenswrapper[4861]: I0309 09:27:51.703396 4861 scope.go:117] "RemoveContainer" containerID="8eb6d4fc488f7adfd2c20aea34c63fb8a54cb4da8bde9af3629a2684ef70a2fd" Mar 09 09:27:51 crc kubenswrapper[4861]: I0309 09:27:51.712469 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 09 09:27:51 crc kubenswrapper[4861]: I0309 09:27:51.728850 4861 scope.go:117] "RemoveContainer" containerID="2c0ce83ebcc4ed98d53ac2703b80efa2b5f7c7eada192f4e831ffc43b09f1273" Mar 09 09:27:51 crc kubenswrapper[4861]: I0309 09:27:51.751091 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 09 09:27:51 crc kubenswrapper[4861]: I0309 09:27:51.760631 4861 scope.go:117] "RemoveContainer" containerID="2d9ab6c1c96bbc721b04df5406d0664c61ed3c2cfe5f67798a10fd5565e2736f" Mar 09 09:27:51 crc kubenswrapper[4861]: E0309 09:27:51.764100 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d9ab6c1c96bbc721b04df5406d0664c61ed3c2cfe5f67798a10fd5565e2736f\": container with ID starting with 2d9ab6c1c96bbc721b04df5406d0664c61ed3c2cfe5f67798a10fd5565e2736f not found: ID does not exist" containerID="2d9ab6c1c96bbc721b04df5406d0664c61ed3c2cfe5f67798a10fd5565e2736f" Mar 09 09:27:51 crc kubenswrapper[4861]: I0309 09:27:51.764165 4861 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"2d9ab6c1c96bbc721b04df5406d0664c61ed3c2cfe5f67798a10fd5565e2736f"} err="failed to get container status \"2d9ab6c1c96bbc721b04df5406d0664c61ed3c2cfe5f67798a10fd5565e2736f\": rpc error: code = NotFound desc = could not find container \"2d9ab6c1c96bbc721b04df5406d0664c61ed3c2cfe5f67798a10fd5565e2736f\": container with ID starting with 2d9ab6c1c96bbc721b04df5406d0664c61ed3c2cfe5f67798a10fd5565e2736f not found: ID does not exist" Mar 09 09:27:51 crc kubenswrapper[4861]: I0309 09:27:51.764193 4861 scope.go:117] "RemoveContainer" containerID="604da89ae58f6538e63da6891fb568004fb049a527d833f392bfa867e05cd4ee" Mar 09 09:27:51 crc kubenswrapper[4861]: E0309 09:27:51.764989 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"604da89ae58f6538e63da6891fb568004fb049a527d833f392bfa867e05cd4ee\": container with ID starting with 604da89ae58f6538e63da6891fb568004fb049a527d833f392bfa867e05cd4ee not found: ID does not exist" containerID="604da89ae58f6538e63da6891fb568004fb049a527d833f392bfa867e05cd4ee" Mar 09 09:27:51 crc kubenswrapper[4861]: I0309 09:27:51.765038 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"604da89ae58f6538e63da6891fb568004fb049a527d833f392bfa867e05cd4ee"} err="failed to get container status \"604da89ae58f6538e63da6891fb568004fb049a527d833f392bfa867e05cd4ee\": rpc error: code = NotFound desc = could not find container \"604da89ae58f6538e63da6891fb568004fb049a527d833f392bfa867e05cd4ee\": container with ID starting with 604da89ae58f6538e63da6891fb568004fb049a527d833f392bfa867e05cd4ee not found: ID does not exist" Mar 09 09:27:51 crc kubenswrapper[4861]: I0309 09:27:51.765082 4861 scope.go:117] "RemoveContainer" containerID="8eb6d4fc488f7adfd2c20aea34c63fb8a54cb4da8bde9af3629a2684ef70a2fd" Mar 09 09:27:51 crc kubenswrapper[4861]: E0309 09:27:51.765639 4861 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"8eb6d4fc488f7adfd2c20aea34c63fb8a54cb4da8bde9af3629a2684ef70a2fd\": container with ID starting with 8eb6d4fc488f7adfd2c20aea34c63fb8a54cb4da8bde9af3629a2684ef70a2fd not found: ID does not exist" containerID="8eb6d4fc488f7adfd2c20aea34c63fb8a54cb4da8bde9af3629a2684ef70a2fd" Mar 09 09:27:51 crc kubenswrapper[4861]: I0309 09:27:51.765672 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8eb6d4fc488f7adfd2c20aea34c63fb8a54cb4da8bde9af3629a2684ef70a2fd"} err="failed to get container status \"8eb6d4fc488f7adfd2c20aea34c63fb8a54cb4da8bde9af3629a2684ef70a2fd\": rpc error: code = NotFound desc = could not find container \"8eb6d4fc488f7adfd2c20aea34c63fb8a54cb4da8bde9af3629a2684ef70a2fd\": container with ID starting with 8eb6d4fc488f7adfd2c20aea34c63fb8a54cb4da8bde9af3629a2684ef70a2fd not found: ID does not exist" Mar 09 09:27:51 crc kubenswrapper[4861]: I0309 09:27:51.765690 4861 scope.go:117] "RemoveContainer" containerID="2c0ce83ebcc4ed98d53ac2703b80efa2b5f7c7eada192f4e831ffc43b09f1273" Mar 09 09:27:51 crc kubenswrapper[4861]: E0309 09:27:51.766094 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c0ce83ebcc4ed98d53ac2703b80efa2b5f7c7eada192f4e831ffc43b09f1273\": container with ID starting with 2c0ce83ebcc4ed98d53ac2703b80efa2b5f7c7eada192f4e831ffc43b09f1273 not found: ID does not exist" containerID="2c0ce83ebcc4ed98d53ac2703b80efa2b5f7c7eada192f4e831ffc43b09f1273" Mar 09 09:27:51 crc kubenswrapper[4861]: I0309 09:27:51.766143 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c0ce83ebcc4ed98d53ac2703b80efa2b5f7c7eada192f4e831ffc43b09f1273"} err="failed to get container status \"2c0ce83ebcc4ed98d53ac2703b80efa2b5f7c7eada192f4e831ffc43b09f1273\": rpc error: code = NotFound desc = could not find container 
\"2c0ce83ebcc4ed98d53ac2703b80efa2b5f7c7eada192f4e831ffc43b09f1273\": container with ID starting with 2c0ce83ebcc4ed98d53ac2703b80efa2b5f7c7eada192f4e831ffc43b09f1273 not found: ID does not exist" Mar 09 09:27:51 crc kubenswrapper[4861]: I0309 09:27:51.771832 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 09 09:27:51 crc kubenswrapper[4861]: E0309 09:27:51.772749 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71107669-f5ae-4df6-a694-a643153ad6f4" containerName="dnsmasq-dns" Mar 09 09:27:51 crc kubenswrapper[4861]: I0309 09:27:51.772771 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="71107669-f5ae-4df6-a694-a643153ad6f4" containerName="dnsmasq-dns" Mar 09 09:27:51 crc kubenswrapper[4861]: E0309 09:27:51.772804 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71107669-f5ae-4df6-a694-a643153ad6f4" containerName="init" Mar 09 09:27:51 crc kubenswrapper[4861]: I0309 09:27:51.772811 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="71107669-f5ae-4df6-a694-a643153ad6f4" containerName="init" Mar 09 09:27:51 crc kubenswrapper[4861]: E0309 09:27:51.772834 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d75f0a0-d481-480b-9b00-15349a64be9d" containerName="ceilometer-central-agent" Mar 09 09:27:51 crc kubenswrapper[4861]: I0309 09:27:51.772840 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d75f0a0-d481-480b-9b00-15349a64be9d" containerName="ceilometer-central-agent" Mar 09 09:27:51 crc kubenswrapper[4861]: E0309 09:27:51.772857 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d75f0a0-d481-480b-9b00-15349a64be9d" containerName="proxy-httpd" Mar 09 09:27:51 crc kubenswrapper[4861]: I0309 09:27:51.772863 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d75f0a0-d481-480b-9b00-15349a64be9d" containerName="proxy-httpd" Mar 09 09:27:51 crc kubenswrapper[4861]: E0309 09:27:51.772877 4861 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="4d75f0a0-d481-480b-9b00-15349a64be9d" containerName="ceilometer-notification-agent" Mar 09 09:27:51 crc kubenswrapper[4861]: I0309 09:27:51.772884 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d75f0a0-d481-480b-9b00-15349a64be9d" containerName="ceilometer-notification-agent" Mar 09 09:27:51 crc kubenswrapper[4861]: E0309 09:27:51.772892 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d75f0a0-d481-480b-9b00-15349a64be9d" containerName="sg-core" Mar 09 09:27:51 crc kubenswrapper[4861]: I0309 09:27:51.772898 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d75f0a0-d481-480b-9b00-15349a64be9d" containerName="sg-core" Mar 09 09:27:51 crc kubenswrapper[4861]: I0309 09:27:51.774001 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d75f0a0-d481-480b-9b00-15349a64be9d" containerName="sg-core" Mar 09 09:27:51 crc kubenswrapper[4861]: I0309 09:27:51.774030 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d75f0a0-d481-480b-9b00-15349a64be9d" containerName="proxy-httpd" Mar 09 09:27:51 crc kubenswrapper[4861]: I0309 09:27:51.774042 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d75f0a0-d481-480b-9b00-15349a64be9d" containerName="ceilometer-central-agent" Mar 09 09:27:51 crc kubenswrapper[4861]: I0309 09:27:51.774053 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="71107669-f5ae-4df6-a694-a643153ad6f4" containerName="dnsmasq-dns" Mar 09 09:27:51 crc kubenswrapper[4861]: I0309 09:27:51.774071 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d75f0a0-d481-480b-9b00-15349a64be9d" containerName="ceilometer-notification-agent" Mar 09 09:27:51 crc kubenswrapper[4861]: I0309 09:27:51.777454 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 09 09:27:51 crc kubenswrapper[4861]: I0309 09:27:51.780930 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 09 09:27:51 crc kubenswrapper[4861]: I0309 09:27:51.782123 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 09 09:27:51 crc kubenswrapper[4861]: I0309 09:27:51.782383 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 09 09:27:51 crc kubenswrapper[4861]: I0309 09:27:51.786590 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 09 09:27:51 crc kubenswrapper[4861]: I0309 09:27:51.906174 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75432149-8e10-4aae-8ad4-fbf3b5a10063-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"75432149-8e10-4aae-8ad4-fbf3b5a10063\") " pod="openstack/ceilometer-0" Mar 09 09:27:51 crc kubenswrapper[4861]: I0309 09:27:51.906223 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/75432149-8e10-4aae-8ad4-fbf3b5a10063-run-httpd\") pod \"ceilometer-0\" (UID: \"75432149-8e10-4aae-8ad4-fbf3b5a10063\") " pod="openstack/ceilometer-0" Mar 09 09:27:51 crc kubenswrapper[4861]: I0309 09:27:51.906256 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/75432149-8e10-4aae-8ad4-fbf3b5a10063-log-httpd\") pod \"ceilometer-0\" (UID: \"75432149-8e10-4aae-8ad4-fbf3b5a10063\") " pod="openstack/ceilometer-0" Mar 09 09:27:51 crc kubenswrapper[4861]: I0309 09:27:51.906315 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/75432149-8e10-4aae-8ad4-fbf3b5a10063-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"75432149-8e10-4aae-8ad4-fbf3b5a10063\") " pod="openstack/ceilometer-0" Mar 09 09:27:51 crc kubenswrapper[4861]: I0309 09:27:51.906355 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/75432149-8e10-4aae-8ad4-fbf3b5a10063-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"75432149-8e10-4aae-8ad4-fbf3b5a10063\") " pod="openstack/ceilometer-0" Mar 09 09:27:51 crc kubenswrapper[4861]: I0309 09:27:51.906460 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6zh9\" (UniqueName: \"kubernetes.io/projected/75432149-8e10-4aae-8ad4-fbf3b5a10063-kube-api-access-m6zh9\") pod \"ceilometer-0\" (UID: \"75432149-8e10-4aae-8ad4-fbf3b5a10063\") " pod="openstack/ceilometer-0" Mar 09 09:27:51 crc kubenswrapper[4861]: I0309 09:27:51.906942 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/75432149-8e10-4aae-8ad4-fbf3b5a10063-scripts\") pod \"ceilometer-0\" (UID: \"75432149-8e10-4aae-8ad4-fbf3b5a10063\") " pod="openstack/ceilometer-0" Mar 09 09:27:51 crc kubenswrapper[4861]: I0309 09:27:51.907158 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75432149-8e10-4aae-8ad4-fbf3b5a10063-config-data\") pod \"ceilometer-0\" (UID: \"75432149-8e10-4aae-8ad4-fbf3b5a10063\") " pod="openstack/ceilometer-0" Mar 09 09:27:52 crc kubenswrapper[4861]: I0309 09:27:52.009090 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/75432149-8e10-4aae-8ad4-fbf3b5a10063-sg-core-conf-yaml\") pod \"ceilometer-0\" 
(UID: \"75432149-8e10-4aae-8ad4-fbf3b5a10063\") " pod="openstack/ceilometer-0" Mar 09 09:27:52 crc kubenswrapper[4861]: I0309 09:27:52.009158 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6zh9\" (UniqueName: \"kubernetes.io/projected/75432149-8e10-4aae-8ad4-fbf3b5a10063-kube-api-access-m6zh9\") pod \"ceilometer-0\" (UID: \"75432149-8e10-4aae-8ad4-fbf3b5a10063\") " pod="openstack/ceilometer-0" Mar 09 09:27:52 crc kubenswrapper[4861]: I0309 09:27:52.009237 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/75432149-8e10-4aae-8ad4-fbf3b5a10063-scripts\") pod \"ceilometer-0\" (UID: \"75432149-8e10-4aae-8ad4-fbf3b5a10063\") " pod="openstack/ceilometer-0" Mar 09 09:27:52 crc kubenswrapper[4861]: I0309 09:27:52.009274 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75432149-8e10-4aae-8ad4-fbf3b5a10063-config-data\") pod \"ceilometer-0\" (UID: \"75432149-8e10-4aae-8ad4-fbf3b5a10063\") " pod="openstack/ceilometer-0" Mar 09 09:27:52 crc kubenswrapper[4861]: I0309 09:27:52.009443 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75432149-8e10-4aae-8ad4-fbf3b5a10063-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"75432149-8e10-4aae-8ad4-fbf3b5a10063\") " pod="openstack/ceilometer-0" Mar 09 09:27:52 crc kubenswrapper[4861]: I0309 09:27:52.009487 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/75432149-8e10-4aae-8ad4-fbf3b5a10063-run-httpd\") pod \"ceilometer-0\" (UID: \"75432149-8e10-4aae-8ad4-fbf3b5a10063\") " pod="openstack/ceilometer-0" Mar 09 09:27:52 crc kubenswrapper[4861]: I0309 09:27:52.009531 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/75432149-8e10-4aae-8ad4-fbf3b5a10063-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"75432149-8e10-4aae-8ad4-fbf3b5a10063\") " pod="openstack/ceilometer-0" Mar 09 09:27:52 crc kubenswrapper[4861]: I0309 09:27:52.009563 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/75432149-8e10-4aae-8ad4-fbf3b5a10063-log-httpd\") pod \"ceilometer-0\" (UID: \"75432149-8e10-4aae-8ad4-fbf3b5a10063\") " pod="openstack/ceilometer-0" Mar 09 09:27:52 crc kubenswrapper[4861]: I0309 09:27:52.010266 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/75432149-8e10-4aae-8ad4-fbf3b5a10063-run-httpd\") pod \"ceilometer-0\" (UID: \"75432149-8e10-4aae-8ad4-fbf3b5a10063\") " pod="openstack/ceilometer-0" Mar 09 09:27:52 crc kubenswrapper[4861]: I0309 09:27:52.010948 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/75432149-8e10-4aae-8ad4-fbf3b5a10063-log-httpd\") pod \"ceilometer-0\" (UID: \"75432149-8e10-4aae-8ad4-fbf3b5a10063\") " pod="openstack/ceilometer-0" Mar 09 09:27:52 crc kubenswrapper[4861]: I0309 09:27:52.014184 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/75432149-8e10-4aae-8ad4-fbf3b5a10063-config-data\") pod \"ceilometer-0\" (UID: \"75432149-8e10-4aae-8ad4-fbf3b5a10063\") " pod="openstack/ceilometer-0" Mar 09 09:27:52 crc kubenswrapper[4861]: I0309 09:27:52.014656 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/75432149-8e10-4aae-8ad4-fbf3b5a10063-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"75432149-8e10-4aae-8ad4-fbf3b5a10063\") " pod="openstack/ceilometer-0" Mar 09 09:27:52 crc kubenswrapper[4861]: I0309 
09:27:52.014785 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/75432149-8e10-4aae-8ad4-fbf3b5a10063-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"75432149-8e10-4aae-8ad4-fbf3b5a10063\") " pod="openstack/ceilometer-0" Mar 09 09:27:52 crc kubenswrapper[4861]: I0309 09:27:52.014872 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/75432149-8e10-4aae-8ad4-fbf3b5a10063-scripts\") pod \"ceilometer-0\" (UID: \"75432149-8e10-4aae-8ad4-fbf3b5a10063\") " pod="openstack/ceilometer-0" Mar 09 09:27:52 crc kubenswrapper[4861]: I0309 09:27:52.015008 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/75432149-8e10-4aae-8ad4-fbf3b5a10063-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"75432149-8e10-4aae-8ad4-fbf3b5a10063\") " pod="openstack/ceilometer-0" Mar 09 09:27:52 crc kubenswrapper[4861]: I0309 09:27:52.031835 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6zh9\" (UniqueName: \"kubernetes.io/projected/75432149-8e10-4aae-8ad4-fbf3b5a10063-kube-api-access-m6zh9\") pod \"ceilometer-0\" (UID: \"75432149-8e10-4aae-8ad4-fbf3b5a10063\") " pod="openstack/ceilometer-0" Mar 09 09:27:52 crc kubenswrapper[4861]: I0309 09:27:52.104800 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 09 09:27:52 crc kubenswrapper[4861]: I0309 09:27:52.669443 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 09 09:27:53 crc kubenswrapper[4861]: I0309 09:27:53.676209 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d75f0a0-d481-480b-9b00-15349a64be9d" path="/var/lib/kubelet/pods/4d75f0a0-d481-480b-9b00-15349a64be9d/volumes" Mar 09 09:27:53 crc kubenswrapper[4861]: I0309 09:27:53.678018 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"75432149-8e10-4aae-8ad4-fbf3b5a10063","Type":"ContainerStarted","Data":"60ee5266abf1fc7d8b7f5a47fb5e010d02e17dc1416db6584de0ec68ab727197"} Mar 09 09:27:53 crc kubenswrapper[4861]: I0309 09:27:53.678102 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"75432149-8e10-4aae-8ad4-fbf3b5a10063","Type":"ContainerStarted","Data":"dc78d298e03d6fb3f2dc222013536ea6f4da434ec7aa20b3845961df96bd7212"} Mar 09 09:27:54 crc kubenswrapper[4861]: I0309 09:27:54.608099 4861 patch_prober.go:28] interesting pod/machine-config-daemon-5g7gc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 09:27:54 crc kubenswrapper[4861]: I0309 09:27:54.608810 4861 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 09:27:54 crc kubenswrapper[4861]: I0309 09:27:54.687543 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"75432149-8e10-4aae-8ad4-fbf3b5a10063","Type":"ContainerStarted","Data":"79d04fe2896604d73371ba9e26e67c43f91e351f3d8e2dd0aa99200cf78b80fd"} Mar 09 09:27:54 crc kubenswrapper[4861]: I0309 09:27:54.893648 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 09 09:27:54 crc kubenswrapper[4861]: I0309 09:27:54.894030 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 09 09:27:55 crc kubenswrapper[4861]: I0309 09:27:55.703322 4861 generic.go:334] "Generic (PLEG): container finished" podID="e81aeea2-beec-4987-b527-db644692cb14" containerID="6a6e0147999328f0e670af6e9773adc78a928a0f8004229f1539591977be0d16" exitCode=0 Mar 09 09:27:55 crc kubenswrapper[4861]: I0309 09:27:55.703424 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-84rm5" event={"ID":"e81aeea2-beec-4987-b527-db644692cb14","Type":"ContainerDied","Data":"6a6e0147999328f0e670af6e9773adc78a928a0f8004229f1539591977be0d16"} Mar 09 09:27:55 crc kubenswrapper[4861]: I0309 09:27:55.707230 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"75432149-8e10-4aae-8ad4-fbf3b5a10063","Type":"ContainerStarted","Data":"7313c6ce42b8c7130c34ddbc61c12565523909564f58aeb4b4f89fca2abd5000"} Mar 09 09:27:55 crc kubenswrapper[4861]: I0309 09:27:55.917816 4861 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="c284b70e-4d7d-4a48-9061-acfd4c1aed1e" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.209:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 09 09:27:55 crc kubenswrapper[4861]: I0309 09:27:55.918258 4861 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="c284b70e-4d7d-4a48-9061-acfd4c1aed1e" containerName="nova-api-api" probeResult="failure" output="Get 
\"https://10.217.0.209:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 09 09:27:57 crc kubenswrapper[4861]: I0309 09:27:57.144186 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-84rm5"
Mar 09 09:27:57 crc kubenswrapper[4861]: I0309 09:27:57.220311 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e81aeea2-beec-4987-b527-db644692cb14-combined-ca-bundle\") pod \"e81aeea2-beec-4987-b527-db644692cb14\" (UID: \"e81aeea2-beec-4987-b527-db644692cb14\") "
Mar 09 09:27:57 crc kubenswrapper[4861]: I0309 09:27:57.220522 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e81aeea2-beec-4987-b527-db644692cb14-scripts\") pod \"e81aeea2-beec-4987-b527-db644692cb14\" (UID: \"e81aeea2-beec-4987-b527-db644692cb14\") "
Mar 09 09:27:57 crc kubenswrapper[4861]: I0309 09:27:57.221634 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6l89\" (UniqueName: \"kubernetes.io/projected/e81aeea2-beec-4987-b527-db644692cb14-kube-api-access-d6l89\") pod \"e81aeea2-beec-4987-b527-db644692cb14\" (UID: \"e81aeea2-beec-4987-b527-db644692cb14\") "
Mar 09 09:27:57 crc kubenswrapper[4861]: I0309 09:27:57.222335 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e81aeea2-beec-4987-b527-db644692cb14-config-data\") pod \"e81aeea2-beec-4987-b527-db644692cb14\" (UID: \"e81aeea2-beec-4987-b527-db644692cb14\") "
Mar 09 09:27:57 crc kubenswrapper[4861]: I0309 09:27:57.226640 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e81aeea2-beec-4987-b527-db644692cb14-kube-api-access-d6l89" (OuterVolumeSpecName: "kube-api-access-d6l89") pod "e81aeea2-beec-4987-b527-db644692cb14" (UID: "e81aeea2-beec-4987-b527-db644692cb14"). InnerVolumeSpecName "kube-api-access-d6l89". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:27:57 crc kubenswrapper[4861]: I0309 09:27:57.226732 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e81aeea2-beec-4987-b527-db644692cb14-scripts" (OuterVolumeSpecName: "scripts") pod "e81aeea2-beec-4987-b527-db644692cb14" (UID: "e81aeea2-beec-4987-b527-db644692cb14"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:27:57 crc kubenswrapper[4861]: I0309 09:27:57.250930 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e81aeea2-beec-4987-b527-db644692cb14-config-data" (OuterVolumeSpecName: "config-data") pod "e81aeea2-beec-4987-b527-db644692cb14" (UID: "e81aeea2-beec-4987-b527-db644692cb14"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:27:57 crc kubenswrapper[4861]: I0309 09:27:57.253927 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e81aeea2-beec-4987-b527-db644692cb14-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e81aeea2-beec-4987-b527-db644692cb14" (UID: "e81aeea2-beec-4987-b527-db644692cb14"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:27:57 crc kubenswrapper[4861]: I0309 09:27:57.324799 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6l89\" (UniqueName: \"kubernetes.io/projected/e81aeea2-beec-4987-b527-db644692cb14-kube-api-access-d6l89\") on node \"crc\" DevicePath \"\""
Mar 09 09:27:57 crc kubenswrapper[4861]: I0309 09:27:57.324845 4861 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e81aeea2-beec-4987-b527-db644692cb14-config-data\") on node \"crc\" DevicePath \"\""
Mar 09 09:27:57 crc kubenswrapper[4861]: I0309 09:27:57.324860 4861 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e81aeea2-beec-4987-b527-db644692cb14-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 09 09:27:57 crc kubenswrapper[4861]: I0309 09:27:57.324871 4861 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e81aeea2-beec-4987-b527-db644692cb14-scripts\") on node \"crc\" DevicePath \"\""
Mar 09 09:27:57 crc kubenswrapper[4861]: I0309 09:27:57.745144 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-84rm5" event={"ID":"e81aeea2-beec-4987-b527-db644692cb14","Type":"ContainerDied","Data":"7aa755ba2b534ec33b73942a0dcad7c6918ecf98e43652615d68efcdc114b08e"}
Mar 09 09:27:57 crc kubenswrapper[4861]: I0309 09:27:57.745202 4861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7aa755ba2b534ec33b73942a0dcad7c6918ecf98e43652615d68efcdc114b08e"
Mar 09 09:27:57 crc kubenswrapper[4861]: I0309 09:27:57.745161 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-84rm5"
Mar 09 09:27:57 crc kubenswrapper[4861]: I0309 09:27:57.753088 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"75432149-8e10-4aae-8ad4-fbf3b5a10063","Type":"ContainerStarted","Data":"1ccc90500a5abddc655e2cfe48098c51d57a63a2436ee985f9c1b603f38afef5"}
Mar 09 09:27:57 crc kubenswrapper[4861]: I0309 09:27:57.753321 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Mar 09 09:27:57 crc kubenswrapper[4861]: I0309 09:27:57.780599 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.910428432 podStartE2EDuration="6.7805806s" podCreationTimestamp="2026-03-09 09:27:51 +0000 UTC" firstStartedPulling="2026-03-09 09:27:52.702412678 +0000 UTC m=+1315.787452079" lastFinishedPulling="2026-03-09 09:27:56.572564846 +0000 UTC m=+1319.657604247" observedRunningTime="2026-03-09 09:27:57.778668615 +0000 UTC m=+1320.863708016" watchObservedRunningTime="2026-03-09 09:27:57.7805806 +0000 UTC m=+1320.865620001"
Mar 09 09:27:57 crc kubenswrapper[4861]: I0309 09:27:57.935658 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Mar 09 09:27:57 crc kubenswrapper[4861]: I0309 09:27:57.936188 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="c284b70e-4d7d-4a48-9061-acfd4c1aed1e" containerName="nova-api-log" containerID="cri-o://016f78ce0f2846310b2843e7584d10c5ff53f4878d11c014c13c8917b24e9190" gracePeriod=30
Mar 09 09:27:57 crc kubenswrapper[4861]: I0309 09:27:57.936242 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="c284b70e-4d7d-4a48-9061-acfd4c1aed1e" containerName="nova-api-api" containerID="cri-o://3598e98bff57ee3e426eae390226c7f498407cfd6550bf4b3c4bc2d62d3cb7ac" gracePeriod=30
Mar 09 09:27:57 crc kubenswrapper[4861]: I0309 09:27:57.959450 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 09 09:27:57 crc kubenswrapper[4861]: I0309 09:27:57.959695 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="72e92702-681e-4575-84da-71f26ef95ebf" containerName="nova-scheduler-scheduler" containerID="cri-o://d1afee61e833f56202c99d97a660af11019fc418c2d3ddf6c6cbdb58dfeee311" gracePeriod=30
Mar 09 09:27:57 crc kubenswrapper[4861]: I0309 09:27:57.991427 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Mar 09 09:27:57 crc kubenswrapper[4861]: I0309 09:27:57.999245 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="91be51cc-9158-4ade-b36c-cb7bc65b006e" containerName="nova-metadata-log" containerID="cri-o://000c5555aae3e05ddcec2b60140e35dcbd36b7dccbfbf507e97c39d1c0d74722" gracePeriod=30
Mar 09 09:27:57 crc kubenswrapper[4861]: I0309 09:27:57.999316 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="91be51cc-9158-4ade-b36c-cb7bc65b006e" containerName="nova-metadata-metadata" containerID="cri-o://9223ada1a05d882dfb343dfc0c60faa4904ce5cbff03a162d9b13cfb54a4e2bf" gracePeriod=30
Mar 09 09:27:58 crc kubenswrapper[4861]: I0309 09:27:58.765008 4861 generic.go:334] "Generic (PLEG): container finished" podID="72e92702-681e-4575-84da-71f26ef95ebf" containerID="d1afee61e833f56202c99d97a660af11019fc418c2d3ddf6c6cbdb58dfeee311" exitCode=0
Mar 09 09:27:58 crc kubenswrapper[4861]: I0309 09:27:58.765077 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"72e92702-681e-4575-84da-71f26ef95ebf","Type":"ContainerDied","Data":"d1afee61e833f56202c99d97a660af11019fc418c2d3ddf6c6cbdb58dfeee311"}
Mar 09 09:27:58 crc kubenswrapper[4861]: I0309 09:27:58.766688 4861 generic.go:334] "Generic (PLEG): container finished" podID="c284b70e-4d7d-4a48-9061-acfd4c1aed1e" containerID="016f78ce0f2846310b2843e7584d10c5ff53f4878d11c014c13c8917b24e9190" exitCode=143
Mar 09 09:27:58 crc kubenswrapper[4861]: I0309 09:27:58.766734 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c284b70e-4d7d-4a48-9061-acfd4c1aed1e","Type":"ContainerDied","Data":"016f78ce0f2846310b2843e7584d10c5ff53f4878d11c014c13c8917b24e9190"}
Mar 09 09:27:58 crc kubenswrapper[4861]: I0309 09:27:58.769047 4861 generic.go:334] "Generic (PLEG): container finished" podID="91be51cc-9158-4ade-b36c-cb7bc65b006e" containerID="000c5555aae3e05ddcec2b60140e35dcbd36b7dccbfbf507e97c39d1c0d74722" exitCode=143
Mar 09 09:27:58 crc kubenswrapper[4861]: I0309 09:27:58.770070 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"91be51cc-9158-4ade-b36c-cb7bc65b006e","Type":"ContainerDied","Data":"000c5555aae3e05ddcec2b60140e35dcbd36b7dccbfbf507e97c39d1c0d74722"}
Mar 09 09:27:58 crc kubenswrapper[4861]: I0309 09:27:58.898450 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 09 09:27:59 crc kubenswrapper[4861]: I0309 09:27:59.057706 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6xn4c\" (UniqueName: \"kubernetes.io/projected/72e92702-681e-4575-84da-71f26ef95ebf-kube-api-access-6xn4c\") pod \"72e92702-681e-4575-84da-71f26ef95ebf\" (UID: \"72e92702-681e-4575-84da-71f26ef95ebf\") "
Mar 09 09:27:59 crc kubenswrapper[4861]: I0309 09:27:59.057853 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72e92702-681e-4575-84da-71f26ef95ebf-combined-ca-bundle\") pod \"72e92702-681e-4575-84da-71f26ef95ebf\" (UID: \"72e92702-681e-4575-84da-71f26ef95ebf\") "
Mar 09 09:27:59 crc kubenswrapper[4861]: I0309 09:27:59.057910 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72e92702-681e-4575-84da-71f26ef95ebf-config-data\") pod \"72e92702-681e-4575-84da-71f26ef95ebf\" (UID: \"72e92702-681e-4575-84da-71f26ef95ebf\") "
Mar 09 09:27:59 crc kubenswrapper[4861]: I0309 09:27:59.071175 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72e92702-681e-4575-84da-71f26ef95ebf-kube-api-access-6xn4c" (OuterVolumeSpecName: "kube-api-access-6xn4c") pod "72e92702-681e-4575-84da-71f26ef95ebf" (UID: "72e92702-681e-4575-84da-71f26ef95ebf"). InnerVolumeSpecName "kube-api-access-6xn4c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:27:59 crc kubenswrapper[4861]: I0309 09:27:59.125659 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72e92702-681e-4575-84da-71f26ef95ebf-config-data" (OuterVolumeSpecName: "config-data") pod "72e92702-681e-4575-84da-71f26ef95ebf" (UID: "72e92702-681e-4575-84da-71f26ef95ebf"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:27:59 crc kubenswrapper[4861]: I0309 09:27:59.129473 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72e92702-681e-4575-84da-71f26ef95ebf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "72e92702-681e-4575-84da-71f26ef95ebf" (UID: "72e92702-681e-4575-84da-71f26ef95ebf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:27:59 crc kubenswrapper[4861]: I0309 09:27:59.160794 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6xn4c\" (UniqueName: \"kubernetes.io/projected/72e92702-681e-4575-84da-71f26ef95ebf-kube-api-access-6xn4c\") on node \"crc\" DevicePath \"\""
Mar 09 09:27:59 crc kubenswrapper[4861]: I0309 09:27:59.160870 4861 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72e92702-681e-4575-84da-71f26ef95ebf-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 09 09:27:59 crc kubenswrapper[4861]: I0309 09:27:59.160886 4861 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72e92702-681e-4575-84da-71f26ef95ebf-config-data\") on node \"crc\" DevicePath \"\""
Mar 09 09:27:59 crc kubenswrapper[4861]: E0309 09:27:59.569297 4861 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode81aeea2_beec_4987_b527_db644692cb14.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode81aeea2_beec_4987_b527_db644692cb14.slice/crio-7aa755ba2b534ec33b73942a0dcad7c6918ecf98e43652615d68efcdc114b08e\": RecentStats: unable to find data in memory cache]"
Mar 09 09:27:59 crc kubenswrapper[4861]: I0309 09:27:59.779616 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"72e92702-681e-4575-84da-71f26ef95ebf","Type":"ContainerDied","Data":"552e811ffa96a97a38b9382431a9e2a2a1903ffec26e36aede014fbe3f480224"}
Mar 09 09:27:59 crc kubenswrapper[4861]: I0309 09:27:59.779672 4861 scope.go:117] "RemoveContainer" containerID="d1afee61e833f56202c99d97a660af11019fc418c2d3ddf6c6cbdb58dfeee311"
Mar 09 09:27:59 crc kubenswrapper[4861]: I0309 09:27:59.779686 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 09 09:27:59 crc kubenswrapper[4861]: I0309 09:27:59.806337 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 09 09:27:59 crc kubenswrapper[4861]: I0309 09:27:59.820829 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 09 09:27:59 crc kubenswrapper[4861]: I0309 09:27:59.838929 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Mar 09 09:27:59 crc kubenswrapper[4861]: E0309 09:27:59.839278 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72e92702-681e-4575-84da-71f26ef95ebf" containerName="nova-scheduler-scheduler"
Mar 09 09:27:59 crc kubenswrapper[4861]: I0309 09:27:59.839295 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="72e92702-681e-4575-84da-71f26ef95ebf" containerName="nova-scheduler-scheduler"
Mar 09 09:27:59 crc kubenswrapper[4861]: E0309 09:27:59.839305 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e81aeea2-beec-4987-b527-db644692cb14" containerName="nova-manage"
Mar 09 09:27:59 crc kubenswrapper[4861]: I0309 09:27:59.839311 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="e81aeea2-beec-4987-b527-db644692cb14" containerName="nova-manage"
Mar 09 09:27:59 crc kubenswrapper[4861]: I0309 09:27:59.839503 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="e81aeea2-beec-4987-b527-db644692cb14" containerName="nova-manage"
Mar 09 09:27:59 crc kubenswrapper[4861]: I0309 09:27:59.839521 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="72e92702-681e-4575-84da-71f26ef95ebf" containerName="nova-scheduler-scheduler"
Mar 09 09:27:59 crc kubenswrapper[4861]: I0309 09:27:59.845749 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 09 09:27:59 crc kubenswrapper[4861]: I0309 09:27:59.848777 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Mar 09 09:27:59 crc kubenswrapper[4861]: I0309 09:27:59.854933 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 09 09:27:59 crc kubenswrapper[4861]: I0309 09:27:59.978743 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a9f9492-a68d-4f37-bffc-4f13ebe23db7-config-data\") pod \"nova-scheduler-0\" (UID: \"0a9f9492-a68d-4f37-bffc-4f13ebe23db7\") " pod="openstack/nova-scheduler-0"
Mar 09 09:27:59 crc kubenswrapper[4861]: I0309 09:27:59.978845 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-864z6\" (UniqueName: \"kubernetes.io/projected/0a9f9492-a68d-4f37-bffc-4f13ebe23db7-kube-api-access-864z6\") pod \"nova-scheduler-0\" (UID: \"0a9f9492-a68d-4f37-bffc-4f13ebe23db7\") " pod="openstack/nova-scheduler-0"
Mar 09 09:27:59 crc kubenswrapper[4861]: I0309 09:27:59.978912 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a9f9492-a68d-4f37-bffc-4f13ebe23db7-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"0a9f9492-a68d-4f37-bffc-4f13ebe23db7\") " pod="openstack/nova-scheduler-0"
Mar 09 09:28:00 crc kubenswrapper[4861]: I0309 09:28:00.080806 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a9f9492-a68d-4f37-bffc-4f13ebe23db7-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"0a9f9492-a68d-4f37-bffc-4f13ebe23db7\") " pod="openstack/nova-scheduler-0"
Mar 09 09:28:00 crc kubenswrapper[4861]: I0309 09:28:00.080942 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a9f9492-a68d-4f37-bffc-4f13ebe23db7-config-data\") pod \"nova-scheduler-0\" (UID: \"0a9f9492-a68d-4f37-bffc-4f13ebe23db7\") " pod="openstack/nova-scheduler-0"
Mar 09 09:28:00 crc kubenswrapper[4861]: I0309 09:28:00.081020 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-864z6\" (UniqueName: \"kubernetes.io/projected/0a9f9492-a68d-4f37-bffc-4f13ebe23db7-kube-api-access-864z6\") pod \"nova-scheduler-0\" (UID: \"0a9f9492-a68d-4f37-bffc-4f13ebe23db7\") " pod="openstack/nova-scheduler-0"
Mar 09 09:28:00 crc kubenswrapper[4861]: I0309 09:28:00.087316 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a9f9492-a68d-4f37-bffc-4f13ebe23db7-config-data\") pod \"nova-scheduler-0\" (UID: \"0a9f9492-a68d-4f37-bffc-4f13ebe23db7\") " pod="openstack/nova-scheduler-0"
Mar 09 09:28:00 crc kubenswrapper[4861]: I0309 09:28:00.096214 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a9f9492-a68d-4f37-bffc-4f13ebe23db7-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"0a9f9492-a68d-4f37-bffc-4f13ebe23db7\") " pod="openstack/nova-scheduler-0"
Mar 09 09:28:00 crc kubenswrapper[4861]: I0309 09:28:00.117099 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-864z6\" (UniqueName: \"kubernetes.io/projected/0a9f9492-a68d-4f37-bffc-4f13ebe23db7-kube-api-access-864z6\") pod \"nova-scheduler-0\" (UID: \"0a9f9492-a68d-4f37-bffc-4f13ebe23db7\") " pod="openstack/nova-scheduler-0"
Mar 09 09:28:00 crc kubenswrapper[4861]: I0309 09:28:00.141905 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550808-4spcd"]
Mar 09 09:28:00 crc kubenswrapper[4861]: I0309 09:28:00.143885 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550808-4spcd"
Mar 09 09:28:00 crc kubenswrapper[4861]: I0309 09:28:00.146807 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 09 09:28:00 crc kubenswrapper[4861]: I0309 09:28:00.146889 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 09 09:28:00 crc kubenswrapper[4861]: I0309 09:28:00.147342 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k86l8"
Mar 09 09:28:00 crc kubenswrapper[4861]: I0309 09:28:00.155703 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550808-4spcd"]
Mar 09 09:28:00 crc kubenswrapper[4861]: I0309 09:28:00.164000 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 09 09:28:00 crc kubenswrapper[4861]: I0309 09:28:00.290215 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnw7f\" (UniqueName: \"kubernetes.io/projected/50e9c52c-8e13-491e-bf32-3daa6bf663bb-kube-api-access-tnw7f\") pod \"auto-csr-approver-29550808-4spcd\" (UID: \"50e9c52c-8e13-491e-bf32-3daa6bf663bb\") " pod="openshift-infra/auto-csr-approver-29550808-4spcd"
Mar 09 09:28:00 crc kubenswrapper[4861]: I0309 09:28:00.392411 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tnw7f\" (UniqueName: \"kubernetes.io/projected/50e9c52c-8e13-491e-bf32-3daa6bf663bb-kube-api-access-tnw7f\") pod \"auto-csr-approver-29550808-4spcd\" (UID: \"50e9c52c-8e13-491e-bf32-3daa6bf663bb\") " pod="openshift-infra/auto-csr-approver-29550808-4spcd"
Mar 09 09:28:00 crc kubenswrapper[4861]: I0309 09:28:00.423335 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnw7f\" (UniqueName: \"kubernetes.io/projected/50e9c52c-8e13-491e-bf32-3daa6bf663bb-kube-api-access-tnw7f\") pod \"auto-csr-approver-29550808-4spcd\" (UID: \"50e9c52c-8e13-491e-bf32-3daa6bf663bb\") " pod="openshift-infra/auto-csr-approver-29550808-4spcd"
Mar 09 09:28:00 crc kubenswrapper[4861]: I0309 09:28:00.595782 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550808-4spcd"
Mar 09 09:28:00 crc kubenswrapper[4861]: I0309 09:28:00.609420 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 09 09:28:00 crc kubenswrapper[4861]: I0309 09:28:00.794741 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0a9f9492-a68d-4f37-bffc-4f13ebe23db7","Type":"ContainerStarted","Data":"08ec076b7d8f369f7ba1a8f3fb8d1857ed0589eb424780fa28935cf6bc4b29ba"}
Mar 09 09:28:01 crc kubenswrapper[4861]: I0309 09:28:01.053560 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550808-4spcd"]
Mar 09 09:28:01 crc kubenswrapper[4861]: W0309 09:28:01.054482 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod50e9c52c_8e13_491e_bf32_3daa6bf663bb.slice/crio-fa5ff73d89a4eb649d1cca09caa6c7c822f64170202ee03bf4e90c9663da1fd6 WatchSource:0}: Error finding container fa5ff73d89a4eb649d1cca09caa6c7c822f64170202ee03bf4e90c9663da1fd6: Status 404 returned error can't find the container with id fa5ff73d89a4eb649d1cca09caa6c7c822f64170202ee03bf4e90c9663da1fd6
Mar 09 09:28:01 crc kubenswrapper[4861]: I0309 09:28:01.148862 4861 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="91be51cc-9158-4ade-b36c-cb7bc65b006e" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.200:8775/\": read tcp 10.217.0.2:48680->10.217.0.200:8775: read: connection reset by peer"
Mar 09 09:28:01 crc kubenswrapper[4861]: I0309 09:28:01.148862 4861 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="91be51cc-9158-4ade-b36c-cb7bc65b006e" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.200:8775/\": read tcp 10.217.0.2:48688->10.217.0.200:8775: read: connection reset by peer"
Mar 09 09:28:01 crc kubenswrapper[4861]: I0309 09:28:01.595422 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 09 09:28:01 crc kubenswrapper[4861]: I0309 09:28:01.609254 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 09 09:28:01 crc kubenswrapper[4861]: I0309 09:28:01.707766 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72e92702-681e-4575-84da-71f26ef95ebf" path="/var/lib/kubelet/pods/72e92702-681e-4575-84da-71f26ef95ebf/volumes"
Mar 09 09:28:01 crc kubenswrapper[4861]: I0309 09:28:01.722042 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c284b70e-4d7d-4a48-9061-acfd4c1aed1e-logs\") pod \"c284b70e-4d7d-4a48-9061-acfd4c1aed1e\" (UID: \"c284b70e-4d7d-4a48-9061-acfd4c1aed1e\") "
Mar 09 09:28:01 crc kubenswrapper[4861]: I0309 09:28:01.722113 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91be51cc-9158-4ade-b36c-cb7bc65b006e-logs\") pod \"91be51cc-9158-4ade-b36c-cb7bc65b006e\" (UID: \"91be51cc-9158-4ade-b36c-cb7bc65b006e\") "
Mar 09 09:28:01 crc kubenswrapper[4861]: I0309 09:28:01.722142 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c284b70e-4d7d-4a48-9061-acfd4c1aed1e-combined-ca-bundle\") pod \"c284b70e-4d7d-4a48-9061-acfd4c1aed1e\" (UID: \"c284b70e-4d7d-4a48-9061-acfd4c1aed1e\") "
Mar 09 09:28:01 crc kubenswrapper[4861]: I0309 09:28:01.722161 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c284b70e-4d7d-4a48-9061-acfd4c1aed1e-internal-tls-certs\") pod \"c284b70e-4d7d-4a48-9061-acfd4c1aed1e\" (UID: \"c284b70e-4d7d-4a48-9061-acfd4c1aed1e\") "
Mar 09 09:28:01 crc kubenswrapper[4861]: I0309 09:28:01.722178 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c284b70e-4d7d-4a48-9061-acfd4c1aed1e-public-tls-certs\") pod \"c284b70e-4d7d-4a48-9061-acfd4c1aed1e\" (UID: \"c284b70e-4d7d-4a48-9061-acfd4c1aed1e\") "
Mar 09 09:28:01 crc kubenswrapper[4861]: I0309 09:28:01.722221 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pdsc9\" (UniqueName: \"kubernetes.io/projected/91be51cc-9158-4ade-b36c-cb7bc65b006e-kube-api-access-pdsc9\") pod \"91be51cc-9158-4ade-b36c-cb7bc65b006e\" (UID: \"91be51cc-9158-4ade-b36c-cb7bc65b006e\") "
Mar 09 09:28:01 crc kubenswrapper[4861]: I0309 09:28:01.722270 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91be51cc-9158-4ade-b36c-cb7bc65b006e-config-data\") pod \"91be51cc-9158-4ade-b36c-cb7bc65b006e\" (UID: \"91be51cc-9158-4ade-b36c-cb7bc65b006e\") "
Mar 09 09:28:01 crc kubenswrapper[4861]: I0309 09:28:01.722289 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-98hwc\" (UniqueName: \"kubernetes.io/projected/c284b70e-4d7d-4a48-9061-acfd4c1aed1e-kube-api-access-98hwc\") pod \"c284b70e-4d7d-4a48-9061-acfd4c1aed1e\" (UID: \"c284b70e-4d7d-4a48-9061-acfd4c1aed1e\") "
Mar 09 09:28:01 crc kubenswrapper[4861]: I0309 09:28:01.722321 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c284b70e-4d7d-4a48-9061-acfd4c1aed1e-config-data\") pod \"c284b70e-4d7d-4a48-9061-acfd4c1aed1e\" (UID: \"c284b70e-4d7d-4a48-9061-acfd4c1aed1e\") "
Mar 09 09:28:01 crc kubenswrapper[4861]: I0309 09:28:01.722358 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91be51cc-9158-4ade-b36c-cb7bc65b006e-combined-ca-bundle\") pod \"91be51cc-9158-4ade-b36c-cb7bc65b006e\" (UID: \"91be51cc-9158-4ade-b36c-cb7bc65b006e\") "
Mar 09 09:28:01 crc kubenswrapper[4861]: I0309 09:28:01.722464 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/91be51cc-9158-4ade-b36c-cb7bc65b006e-nova-metadata-tls-certs\") pod \"91be51cc-9158-4ade-b36c-cb7bc65b006e\" (UID: \"91be51cc-9158-4ade-b36c-cb7bc65b006e\") "
Mar 09 09:28:01 crc kubenswrapper[4861]: I0309 09:28:01.724567 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c284b70e-4d7d-4a48-9061-acfd4c1aed1e-logs" (OuterVolumeSpecName: "logs") pod "c284b70e-4d7d-4a48-9061-acfd4c1aed1e" (UID: "c284b70e-4d7d-4a48-9061-acfd4c1aed1e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 09:28:01 crc kubenswrapper[4861]: I0309 09:28:01.735294 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/91be51cc-9158-4ade-b36c-cb7bc65b006e-logs" (OuterVolumeSpecName: "logs") pod "91be51cc-9158-4ade-b36c-cb7bc65b006e" (UID: "91be51cc-9158-4ade-b36c-cb7bc65b006e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 09:28:01 crc kubenswrapper[4861]: I0309 09:28:01.735598 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91be51cc-9158-4ade-b36c-cb7bc65b006e-kube-api-access-pdsc9" (OuterVolumeSpecName: "kube-api-access-pdsc9") pod "91be51cc-9158-4ade-b36c-cb7bc65b006e" (UID: "91be51cc-9158-4ade-b36c-cb7bc65b006e"). InnerVolumeSpecName "kube-api-access-pdsc9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:28:01 crc kubenswrapper[4861]: I0309 09:28:01.760198 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c284b70e-4d7d-4a48-9061-acfd4c1aed1e-kube-api-access-98hwc" (OuterVolumeSpecName: "kube-api-access-98hwc") pod "c284b70e-4d7d-4a48-9061-acfd4c1aed1e" (UID: "c284b70e-4d7d-4a48-9061-acfd4c1aed1e"). InnerVolumeSpecName "kube-api-access-98hwc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:28:01 crc kubenswrapper[4861]: I0309 09:28:01.781055 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91be51cc-9158-4ade-b36c-cb7bc65b006e-config-data" (OuterVolumeSpecName: "config-data") pod "91be51cc-9158-4ade-b36c-cb7bc65b006e" (UID: "91be51cc-9158-4ade-b36c-cb7bc65b006e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:28:01 crc kubenswrapper[4861]: I0309 09:28:01.803132 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91be51cc-9158-4ade-b36c-cb7bc65b006e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "91be51cc-9158-4ade-b36c-cb7bc65b006e" (UID: "91be51cc-9158-4ade-b36c-cb7bc65b006e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:28:01 crc kubenswrapper[4861]: I0309 09:28:01.819021 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0a9f9492-a68d-4f37-bffc-4f13ebe23db7","Type":"ContainerStarted","Data":"9d9caf22cd82fbc29b00aba9d2b2e6aa2947f08681ec5266728390021dfafa80"}
Mar 09 09:28:01 crc kubenswrapper[4861]: I0309 09:28:01.828634 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c284b70e-4d7d-4a48-9061-acfd4c1aed1e","Type":"ContainerDied","Data":"3598e98bff57ee3e426eae390226c7f498407cfd6550bf4b3c4bc2d62d3cb7ac"}
Mar 09 09:28:01 crc kubenswrapper[4861]: I0309 09:28:01.828593 4861 generic.go:334] "Generic (PLEG): container finished" podID="c284b70e-4d7d-4a48-9061-acfd4c1aed1e" containerID="3598e98bff57ee3e426eae390226c7f498407cfd6550bf4b3c4bc2d62d3cb7ac" exitCode=0
Mar 09 09:28:01 crc kubenswrapper[4861]: I0309 09:28:01.828752 4861 scope.go:117] "RemoveContainer" containerID="3598e98bff57ee3e426eae390226c7f498407cfd6550bf4b3c4bc2d62d3cb7ac"
Mar 09 09:28:01 crc kubenswrapper[4861]: I0309 09:28:01.828919 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"c284b70e-4d7d-4a48-9061-acfd4c1aed1e","Type":"ContainerDied","Data":"bd8bf4421d19098432f1a0a3588b71df4b1a3eb693783e59cdf50511a516ab44"}
Mar 09 09:28:01 crc kubenswrapper[4861]: I0309 09:28:01.828963 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 09 09:28:01 crc kubenswrapper[4861]: I0309 09:28:01.831868 4861 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c284b70e-4d7d-4a48-9061-acfd4c1aed1e-logs\") on node \"crc\" DevicePath \"\""
Mar 09 09:28:01 crc kubenswrapper[4861]: I0309 09:28:01.839253 4861 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91be51cc-9158-4ade-b36c-cb7bc65b006e-logs\") on node \"crc\" DevicePath \"\""
Mar 09 09:28:01 crc kubenswrapper[4861]: I0309 09:28:01.840661 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pdsc9\" (UniqueName: \"kubernetes.io/projected/91be51cc-9158-4ade-b36c-cb7bc65b006e-kube-api-access-pdsc9\") on node \"crc\" DevicePath \"\""
Mar 09 09:28:01 crc kubenswrapper[4861]: I0309 09:28:01.840798 4861 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91be51cc-9158-4ade-b36c-cb7bc65b006e-config-data\") on node \"crc\" DevicePath \"\""
Mar 09 09:28:01 crc kubenswrapper[4861]: I0309 09:28:01.840888 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-98hwc\" (UniqueName: \"kubernetes.io/projected/c284b70e-4d7d-4a48-9061-acfd4c1aed1e-kube-api-access-98hwc\") on node \"crc\" DevicePath \"\""
Mar 09 09:28:01 crc kubenswrapper[4861]: I0309 09:28:01.841052 4861 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91be51cc-9158-4ade-b36c-cb7bc65b006e-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 09 09:28:01 crc kubenswrapper[4861]: I0309 09:28:01.832798 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91be51cc-9158-4ade-b36c-cb7bc65b006e-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "91be51cc-9158-4ade-b36c-cb7bc65b006e" (UID: "91be51cc-9158-4ade-b36c-cb7bc65b006e"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:28:01 crc kubenswrapper[4861]: I0309 09:28:01.848590 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.848572186 podStartE2EDuration="2.848572186s" podCreationTimestamp="2026-03-09 09:27:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:28:01.848135214 +0000 UTC m=+1324.933174615" watchObservedRunningTime="2026-03-09 09:28:01.848572186 +0000 UTC m=+1324.933611607"
Mar 09 09:28:01 crc kubenswrapper[4861]: I0309 09:28:01.850919 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550808-4spcd" event={"ID":"50e9c52c-8e13-491e-bf32-3daa6bf663bb","Type":"ContainerStarted","Data":"fa5ff73d89a4eb649d1cca09caa6c7c822f64170202ee03bf4e90c9663da1fd6"}
Mar 09 09:28:01 crc kubenswrapper[4861]: I0309 09:28:01.856587 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c284b70e-4d7d-4a48-9061-acfd4c1aed1e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c284b70e-4d7d-4a48-9061-acfd4c1aed1e" (UID: "c284b70e-4d7d-4a48-9061-acfd4c1aed1e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:28:01 crc kubenswrapper[4861]: I0309 09:28:01.856619 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c284b70e-4d7d-4a48-9061-acfd4c1aed1e-config-data" (OuterVolumeSpecName: "config-data") pod "c284b70e-4d7d-4a48-9061-acfd4c1aed1e" (UID: "c284b70e-4d7d-4a48-9061-acfd4c1aed1e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:28:01 crc kubenswrapper[4861]: I0309 09:28:01.865147 4861 generic.go:334] "Generic (PLEG): container finished" podID="91be51cc-9158-4ade-b36c-cb7bc65b006e" containerID="9223ada1a05d882dfb343dfc0c60faa4904ce5cbff03a162d9b13cfb54a4e2bf" exitCode=0
Mar 09 09:28:01 crc kubenswrapper[4861]: I0309 09:28:01.865279 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 09 09:28:01 crc kubenswrapper[4861]: I0309 09:28:01.865303 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"91be51cc-9158-4ade-b36c-cb7bc65b006e","Type":"ContainerDied","Data":"9223ada1a05d882dfb343dfc0c60faa4904ce5cbff03a162d9b13cfb54a4e2bf"}
Mar 09 09:28:01 crc kubenswrapper[4861]: I0309 09:28:01.866073 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"91be51cc-9158-4ade-b36c-cb7bc65b006e","Type":"ContainerDied","Data":"68c6340eb22aae18335395601f5bab91668c7125ae3d77aa613158b2f9af0b26"}
Mar 09 09:28:01 crc kubenswrapper[4861]: I0309 09:28:01.878518 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c284b70e-4d7d-4a48-9061-acfd4c1aed1e-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "c284b70e-4d7d-4a48-9061-acfd4c1aed1e" (UID: "c284b70e-4d7d-4a48-9061-acfd4c1aed1e"). InnerVolumeSpecName "internal-tls-certs".
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:28:01 crc kubenswrapper[4861]: I0309 09:28:01.902394 4861 scope.go:117] "RemoveContainer" containerID="016f78ce0f2846310b2843e7584d10c5ff53f4878d11c014c13c8917b24e9190" Mar 09 09:28:01 crc kubenswrapper[4861]: I0309 09:28:01.902678 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c284b70e-4d7d-4a48-9061-acfd4c1aed1e-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "c284b70e-4d7d-4a48-9061-acfd4c1aed1e" (UID: "c284b70e-4d7d-4a48-9061-acfd4c1aed1e"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:28:01 crc kubenswrapper[4861]: I0309 09:28:01.930667 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 09 09:28:01 crc kubenswrapper[4861]: I0309 09:28:01.943252 4861 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c284b70e-4d7d-4a48-9061-acfd4c1aed1e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 09:28:01 crc kubenswrapper[4861]: I0309 09:28:01.943295 4861 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c284b70e-4d7d-4a48-9061-acfd4c1aed1e-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 09:28:01 crc kubenswrapper[4861]: I0309 09:28:01.943308 4861 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c284b70e-4d7d-4a48-9061-acfd4c1aed1e-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 09:28:01 crc kubenswrapper[4861]: I0309 09:28:01.943319 4861 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c284b70e-4d7d-4a48-9061-acfd4c1aed1e-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 09:28:01 crc kubenswrapper[4861]: I0309 09:28:01.943331 4861 reconciler_common.go:293] 
"Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/91be51cc-9158-4ade-b36c-cb7bc65b006e-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 09:28:01 crc kubenswrapper[4861]: I0309 09:28:01.945895 4861 scope.go:117] "RemoveContainer" containerID="3598e98bff57ee3e426eae390226c7f498407cfd6550bf4b3c4bc2d62d3cb7ac" Mar 09 09:28:01 crc kubenswrapper[4861]: E0309 09:28:01.946437 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3598e98bff57ee3e426eae390226c7f498407cfd6550bf4b3c4bc2d62d3cb7ac\": container with ID starting with 3598e98bff57ee3e426eae390226c7f498407cfd6550bf4b3c4bc2d62d3cb7ac not found: ID does not exist" containerID="3598e98bff57ee3e426eae390226c7f498407cfd6550bf4b3c4bc2d62d3cb7ac" Mar 09 09:28:01 crc kubenswrapper[4861]: I0309 09:28:01.949780 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 09 09:28:01 crc kubenswrapper[4861]: I0309 09:28:01.946470 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3598e98bff57ee3e426eae390226c7f498407cfd6550bf4b3c4bc2d62d3cb7ac"} err="failed to get container status \"3598e98bff57ee3e426eae390226c7f498407cfd6550bf4b3c4bc2d62d3cb7ac\": rpc error: code = NotFound desc = could not find container \"3598e98bff57ee3e426eae390226c7f498407cfd6550bf4b3c4bc2d62d3cb7ac\": container with ID starting with 3598e98bff57ee3e426eae390226c7f498407cfd6550bf4b3c4bc2d62d3cb7ac not found: ID does not exist" Mar 09 09:28:01 crc kubenswrapper[4861]: I0309 09:28:01.959566 4861 scope.go:117] "RemoveContainer" containerID="016f78ce0f2846310b2843e7584d10c5ff53f4878d11c014c13c8917b24e9190" Mar 09 09:28:01 crc kubenswrapper[4861]: I0309 09:28:01.960413 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 09 09:28:01 crc kubenswrapper[4861]: E0309 09:28:01.960880 4861 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="91be51cc-9158-4ade-b36c-cb7bc65b006e" containerName="nova-metadata-metadata" Mar 09 09:28:01 crc kubenswrapper[4861]: I0309 09:28:01.960895 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="91be51cc-9158-4ade-b36c-cb7bc65b006e" containerName="nova-metadata-metadata" Mar 09 09:28:01 crc kubenswrapper[4861]: E0309 09:28:01.960909 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91be51cc-9158-4ade-b36c-cb7bc65b006e" containerName="nova-metadata-log" Mar 09 09:28:01 crc kubenswrapper[4861]: I0309 09:28:01.960917 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="91be51cc-9158-4ade-b36c-cb7bc65b006e" containerName="nova-metadata-log" Mar 09 09:28:01 crc kubenswrapper[4861]: E0309 09:28:01.960939 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c284b70e-4d7d-4a48-9061-acfd4c1aed1e" containerName="nova-api-log" Mar 09 09:28:01 crc kubenswrapper[4861]: I0309 09:28:01.960946 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="c284b70e-4d7d-4a48-9061-acfd4c1aed1e" containerName="nova-api-log" Mar 09 09:28:01 crc kubenswrapper[4861]: E0309 09:28:01.960955 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c284b70e-4d7d-4a48-9061-acfd4c1aed1e" containerName="nova-api-api" Mar 09 09:28:01 crc kubenswrapper[4861]: I0309 09:28:01.960962 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="c284b70e-4d7d-4a48-9061-acfd4c1aed1e" containerName="nova-api-api" Mar 09 09:28:01 crc kubenswrapper[4861]: I0309 09:28:01.961149 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="91be51cc-9158-4ade-b36c-cb7bc65b006e" containerName="nova-metadata-metadata" Mar 09 09:28:01 crc kubenswrapper[4861]: I0309 09:28:01.961170 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="c284b70e-4d7d-4a48-9061-acfd4c1aed1e" containerName="nova-api-api" Mar 09 09:28:01 crc kubenswrapper[4861]: I0309 09:28:01.961179 4861 
memory_manager.go:354] "RemoveStaleState removing state" podUID="91be51cc-9158-4ade-b36c-cb7bc65b006e" containerName="nova-metadata-log" Mar 09 09:28:01 crc kubenswrapper[4861]: I0309 09:28:01.961193 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="c284b70e-4d7d-4a48-9061-acfd4c1aed1e" containerName="nova-api-log" Mar 09 09:28:01 crc kubenswrapper[4861]: I0309 09:28:01.963044 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 09 09:28:01 crc kubenswrapper[4861]: E0309 09:28:01.963676 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"016f78ce0f2846310b2843e7584d10c5ff53f4878d11c014c13c8917b24e9190\": container with ID starting with 016f78ce0f2846310b2843e7584d10c5ff53f4878d11c014c13c8917b24e9190 not found: ID does not exist" containerID="016f78ce0f2846310b2843e7584d10c5ff53f4878d11c014c13c8917b24e9190" Mar 09 09:28:01 crc kubenswrapper[4861]: I0309 09:28:01.964353 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"016f78ce0f2846310b2843e7584d10c5ff53f4878d11c014c13c8917b24e9190"} err="failed to get container status \"016f78ce0f2846310b2843e7584d10c5ff53f4878d11c014c13c8917b24e9190\": rpc error: code = NotFound desc = could not find container \"016f78ce0f2846310b2843e7584d10c5ff53f4878d11c014c13c8917b24e9190\": container with ID starting with 016f78ce0f2846310b2843e7584d10c5ff53f4878d11c014c13c8917b24e9190 not found: ID does not exist" Mar 09 09:28:01 crc kubenswrapper[4861]: I0309 09:28:01.964453 4861 scope.go:117] "RemoveContainer" containerID="9223ada1a05d882dfb343dfc0c60faa4904ce5cbff03a162d9b13cfb54a4e2bf" Mar 09 09:28:01 crc kubenswrapper[4861]: I0309 09:28:01.970486 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 09 09:28:01 crc kubenswrapper[4861]: I0309 09:28:01.970763 4861 
reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 09 09:28:01 crc kubenswrapper[4861]: I0309 09:28:01.972167 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 09 09:28:01 crc kubenswrapper[4861]: I0309 09:28:01.996619 4861 scope.go:117] "RemoveContainer" containerID="000c5555aae3e05ddcec2b60140e35dcbd36b7dccbfbf507e97c39d1c0d74722" Mar 09 09:28:02 crc kubenswrapper[4861]: I0309 09:28:02.028558 4861 scope.go:117] "RemoveContainer" containerID="9223ada1a05d882dfb343dfc0c60faa4904ce5cbff03a162d9b13cfb54a4e2bf" Mar 09 09:28:02 crc kubenswrapper[4861]: E0309 09:28:02.032200 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9223ada1a05d882dfb343dfc0c60faa4904ce5cbff03a162d9b13cfb54a4e2bf\": container with ID starting with 9223ada1a05d882dfb343dfc0c60faa4904ce5cbff03a162d9b13cfb54a4e2bf not found: ID does not exist" containerID="9223ada1a05d882dfb343dfc0c60faa4904ce5cbff03a162d9b13cfb54a4e2bf" Mar 09 09:28:02 crc kubenswrapper[4861]: I0309 09:28:02.032245 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9223ada1a05d882dfb343dfc0c60faa4904ce5cbff03a162d9b13cfb54a4e2bf"} err="failed to get container status \"9223ada1a05d882dfb343dfc0c60faa4904ce5cbff03a162d9b13cfb54a4e2bf\": rpc error: code = NotFound desc = could not find container \"9223ada1a05d882dfb343dfc0c60faa4904ce5cbff03a162d9b13cfb54a4e2bf\": container with ID starting with 9223ada1a05d882dfb343dfc0c60faa4904ce5cbff03a162d9b13cfb54a4e2bf not found: ID does not exist" Mar 09 09:28:02 crc kubenswrapper[4861]: I0309 09:28:02.032273 4861 scope.go:117] "RemoveContainer" containerID="000c5555aae3e05ddcec2b60140e35dcbd36b7dccbfbf507e97c39d1c0d74722" Mar 09 09:28:02 crc kubenswrapper[4861]: E0309 09:28:02.032553 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code 
= NotFound desc = could not find container \"000c5555aae3e05ddcec2b60140e35dcbd36b7dccbfbf507e97c39d1c0d74722\": container with ID starting with 000c5555aae3e05ddcec2b60140e35dcbd36b7dccbfbf507e97c39d1c0d74722 not found: ID does not exist" containerID="000c5555aae3e05ddcec2b60140e35dcbd36b7dccbfbf507e97c39d1c0d74722" Mar 09 09:28:02 crc kubenswrapper[4861]: I0309 09:28:02.032590 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"000c5555aae3e05ddcec2b60140e35dcbd36b7dccbfbf507e97c39d1c0d74722"} err="failed to get container status \"000c5555aae3e05ddcec2b60140e35dcbd36b7dccbfbf507e97c39d1c0d74722\": rpc error: code = NotFound desc = could not find container \"000c5555aae3e05ddcec2b60140e35dcbd36b7dccbfbf507e97c39d1c0d74722\": container with ID starting with 000c5555aae3e05ddcec2b60140e35dcbd36b7dccbfbf507e97c39d1c0d74722 not found: ID does not exist" Mar 09 09:28:02 crc kubenswrapper[4861]: I0309 09:28:02.044761 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d5c1be9-8604-4565-9175-703ff865c6eb-logs\") pod \"nova-metadata-0\" (UID: \"9d5c1be9-8604-4565-9175-703ff865c6eb\") " pod="openstack/nova-metadata-0" Mar 09 09:28:02 crc kubenswrapper[4861]: I0309 09:28:02.044876 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d5c1be9-8604-4565-9175-703ff865c6eb-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"9d5c1be9-8604-4565-9175-703ff865c6eb\") " pod="openstack/nova-metadata-0" Mar 09 09:28:02 crc kubenswrapper[4861]: I0309 09:28:02.044975 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d5c1be9-8604-4565-9175-703ff865c6eb-config-data\") pod \"nova-metadata-0\" (UID: 
\"9d5c1be9-8604-4565-9175-703ff865c6eb\") " pod="openstack/nova-metadata-0" Mar 09 09:28:02 crc kubenswrapper[4861]: I0309 09:28:02.045002 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d5c1be9-8604-4565-9175-703ff865c6eb-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9d5c1be9-8604-4565-9175-703ff865c6eb\") " pod="openstack/nova-metadata-0" Mar 09 09:28:02 crc kubenswrapper[4861]: I0309 09:28:02.045048 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j425d\" (UniqueName: \"kubernetes.io/projected/9d5c1be9-8604-4565-9175-703ff865c6eb-kube-api-access-j425d\") pod \"nova-metadata-0\" (UID: \"9d5c1be9-8604-4565-9175-703ff865c6eb\") " pod="openstack/nova-metadata-0" Mar 09 09:28:02 crc kubenswrapper[4861]: I0309 09:28:02.146440 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d5c1be9-8604-4565-9175-703ff865c6eb-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"9d5c1be9-8604-4565-9175-703ff865c6eb\") " pod="openstack/nova-metadata-0" Mar 09 09:28:02 crc kubenswrapper[4861]: I0309 09:28:02.146547 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d5c1be9-8604-4565-9175-703ff865c6eb-config-data\") pod \"nova-metadata-0\" (UID: \"9d5c1be9-8604-4565-9175-703ff865c6eb\") " pod="openstack/nova-metadata-0" Mar 09 09:28:02 crc kubenswrapper[4861]: I0309 09:28:02.146570 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d5c1be9-8604-4565-9175-703ff865c6eb-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9d5c1be9-8604-4565-9175-703ff865c6eb\") " pod="openstack/nova-metadata-0" Mar 09 09:28:02 crc kubenswrapper[4861]: 
I0309 09:28:02.146603 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j425d\" (UniqueName: \"kubernetes.io/projected/9d5c1be9-8604-4565-9175-703ff865c6eb-kube-api-access-j425d\") pod \"nova-metadata-0\" (UID: \"9d5c1be9-8604-4565-9175-703ff865c6eb\") " pod="openstack/nova-metadata-0" Mar 09 09:28:02 crc kubenswrapper[4861]: I0309 09:28:02.146649 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d5c1be9-8604-4565-9175-703ff865c6eb-logs\") pod \"nova-metadata-0\" (UID: \"9d5c1be9-8604-4565-9175-703ff865c6eb\") " pod="openstack/nova-metadata-0" Mar 09 09:28:02 crc kubenswrapper[4861]: I0309 09:28:02.147073 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d5c1be9-8604-4565-9175-703ff865c6eb-logs\") pod \"nova-metadata-0\" (UID: \"9d5c1be9-8604-4565-9175-703ff865c6eb\") " pod="openstack/nova-metadata-0" Mar 09 09:28:02 crc kubenswrapper[4861]: I0309 09:28:02.152167 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d5c1be9-8604-4565-9175-703ff865c6eb-config-data\") pod \"nova-metadata-0\" (UID: \"9d5c1be9-8604-4565-9175-703ff865c6eb\") " pod="openstack/nova-metadata-0" Mar 09 09:28:02 crc kubenswrapper[4861]: I0309 09:28:02.154891 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d5c1be9-8604-4565-9175-703ff865c6eb-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9d5c1be9-8604-4565-9175-703ff865c6eb\") " pod="openstack/nova-metadata-0" Mar 09 09:28:02 crc kubenswrapper[4861]: I0309 09:28:02.155431 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/9d5c1be9-8604-4565-9175-703ff865c6eb-nova-metadata-tls-certs\") pod 
\"nova-metadata-0\" (UID: \"9d5c1be9-8604-4565-9175-703ff865c6eb\") " pod="openstack/nova-metadata-0" Mar 09 09:28:02 crc kubenswrapper[4861]: I0309 09:28:02.168306 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j425d\" (UniqueName: \"kubernetes.io/projected/9d5c1be9-8604-4565-9175-703ff865c6eb-kube-api-access-j425d\") pod \"nova-metadata-0\" (UID: \"9d5c1be9-8604-4565-9175-703ff865c6eb\") " pod="openstack/nova-metadata-0" Mar 09 09:28:02 crc kubenswrapper[4861]: I0309 09:28:02.288184 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 09 09:28:02 crc kubenswrapper[4861]: I0309 09:28:02.289548 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 09 09:28:02 crc kubenswrapper[4861]: I0309 09:28:02.299227 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 09 09:28:02 crc kubenswrapper[4861]: I0309 09:28:02.317035 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 09 09:28:02 crc kubenswrapper[4861]: I0309 09:28:02.319765 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 09 09:28:02 crc kubenswrapper[4861]: I0309 09:28:02.322487 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 09 09:28:02 crc kubenswrapper[4861]: I0309 09:28:02.323326 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 09 09:28:02 crc kubenswrapper[4861]: I0309 09:28:02.323771 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 09 09:28:02 crc kubenswrapper[4861]: I0309 09:28:02.347622 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 09 09:28:02 crc kubenswrapper[4861]: I0309 09:28:02.453436 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7f9f9a6-c593-4015-833b-ef237f492b70-internal-tls-certs\") pod \"nova-api-0\" (UID: \"a7f9f9a6-c593-4015-833b-ef237f492b70\") " pod="openstack/nova-api-0" Mar 09 09:28:02 crc kubenswrapper[4861]: I0309 09:28:02.453787 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fl8rx\" (UniqueName: \"kubernetes.io/projected/a7f9f9a6-c593-4015-833b-ef237f492b70-kube-api-access-fl8rx\") pod \"nova-api-0\" (UID: \"a7f9f9a6-c593-4015-833b-ef237f492b70\") " pod="openstack/nova-api-0" Mar 09 09:28:02 crc kubenswrapper[4861]: I0309 09:28:02.453830 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7f9f9a6-c593-4015-833b-ef237f492b70-config-data\") pod \"nova-api-0\" (UID: \"a7f9f9a6-c593-4015-833b-ef237f492b70\") " pod="openstack/nova-api-0" Mar 09 09:28:02 crc kubenswrapper[4861]: I0309 09:28:02.453936 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/a7f9f9a6-c593-4015-833b-ef237f492b70-logs\") pod \"nova-api-0\" (UID: \"a7f9f9a6-c593-4015-833b-ef237f492b70\") " pod="openstack/nova-api-0" Mar 09 09:28:02 crc kubenswrapper[4861]: I0309 09:28:02.453953 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7f9f9a6-c593-4015-833b-ef237f492b70-public-tls-certs\") pod \"nova-api-0\" (UID: \"a7f9f9a6-c593-4015-833b-ef237f492b70\") " pod="openstack/nova-api-0" Mar 09 09:28:02 crc kubenswrapper[4861]: I0309 09:28:02.453974 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7f9f9a6-c593-4015-833b-ef237f492b70-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a7f9f9a6-c593-4015-833b-ef237f492b70\") " pod="openstack/nova-api-0" Mar 09 09:28:02 crc kubenswrapper[4861]: I0309 09:28:02.556973 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a7f9f9a6-c593-4015-833b-ef237f492b70-logs\") pod \"nova-api-0\" (UID: \"a7f9f9a6-c593-4015-833b-ef237f492b70\") " pod="openstack/nova-api-0" Mar 09 09:28:02 crc kubenswrapper[4861]: I0309 09:28:02.557018 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7f9f9a6-c593-4015-833b-ef237f492b70-public-tls-certs\") pod \"nova-api-0\" (UID: \"a7f9f9a6-c593-4015-833b-ef237f492b70\") " pod="openstack/nova-api-0" Mar 09 09:28:02 crc kubenswrapper[4861]: I0309 09:28:02.557040 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7f9f9a6-c593-4015-833b-ef237f492b70-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a7f9f9a6-c593-4015-833b-ef237f492b70\") " pod="openstack/nova-api-0" Mar 09 09:28:02 crc 
kubenswrapper[4861]: I0309 09:28:02.557094 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7f9f9a6-c593-4015-833b-ef237f492b70-internal-tls-certs\") pod \"nova-api-0\" (UID: \"a7f9f9a6-c593-4015-833b-ef237f492b70\") " pod="openstack/nova-api-0" Mar 09 09:28:02 crc kubenswrapper[4861]: I0309 09:28:02.557479 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a7f9f9a6-c593-4015-833b-ef237f492b70-logs\") pod \"nova-api-0\" (UID: \"a7f9f9a6-c593-4015-833b-ef237f492b70\") " pod="openstack/nova-api-0" Mar 09 09:28:02 crc kubenswrapper[4861]: I0309 09:28:02.557598 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fl8rx\" (UniqueName: \"kubernetes.io/projected/a7f9f9a6-c593-4015-833b-ef237f492b70-kube-api-access-fl8rx\") pod \"nova-api-0\" (UID: \"a7f9f9a6-c593-4015-833b-ef237f492b70\") " pod="openstack/nova-api-0" Mar 09 09:28:02 crc kubenswrapper[4861]: I0309 09:28:02.557635 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7f9f9a6-c593-4015-833b-ef237f492b70-config-data\") pod \"nova-api-0\" (UID: \"a7f9f9a6-c593-4015-833b-ef237f492b70\") " pod="openstack/nova-api-0" Mar 09 09:28:02 crc kubenswrapper[4861]: I0309 09:28:02.571229 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7f9f9a6-c593-4015-833b-ef237f492b70-public-tls-certs\") pod \"nova-api-0\" (UID: \"a7f9f9a6-c593-4015-833b-ef237f492b70\") " pod="openstack/nova-api-0" Mar 09 09:28:02 crc kubenswrapper[4861]: I0309 09:28:02.573688 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7f9f9a6-c593-4015-833b-ef237f492b70-config-data\") pod \"nova-api-0\" (UID: 
\"a7f9f9a6-c593-4015-833b-ef237f492b70\") " pod="openstack/nova-api-0" Mar 09 09:28:02 crc kubenswrapper[4861]: I0309 09:28:02.573714 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7f9f9a6-c593-4015-833b-ef237f492b70-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a7f9f9a6-c593-4015-833b-ef237f492b70\") " pod="openstack/nova-api-0" Mar 09 09:28:02 crc kubenswrapper[4861]: I0309 09:28:02.574269 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7f9f9a6-c593-4015-833b-ef237f492b70-internal-tls-certs\") pod \"nova-api-0\" (UID: \"a7f9f9a6-c593-4015-833b-ef237f492b70\") " pod="openstack/nova-api-0" Mar 09 09:28:02 crc kubenswrapper[4861]: I0309 09:28:02.578392 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fl8rx\" (UniqueName: \"kubernetes.io/projected/a7f9f9a6-c593-4015-833b-ef237f492b70-kube-api-access-fl8rx\") pod \"nova-api-0\" (UID: \"a7f9f9a6-c593-4015-833b-ef237f492b70\") " pod="openstack/nova-api-0" Mar 09 09:28:02 crc kubenswrapper[4861]: I0309 09:28:02.750162 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 09 09:28:02 crc kubenswrapper[4861]: I0309 09:28:02.835643 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 09 09:28:02 crc kubenswrapper[4861]: W0309 09:28:02.842559 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9d5c1be9_8604_4565_9175_703ff865c6eb.slice/crio-9a0c3b572bfbe7426e60e21aefcd2e194c5927f5168ca0944fa815f31c681d84 WatchSource:0}: Error finding container 9a0c3b572bfbe7426e60e21aefcd2e194c5927f5168ca0944fa815f31c681d84: Status 404 returned error can't find the container with id 9a0c3b572bfbe7426e60e21aefcd2e194c5927f5168ca0944fa815f31c681d84 Mar 09 09:28:02 crc kubenswrapper[4861]: I0309 09:28:02.880887 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9d5c1be9-8604-4565-9175-703ff865c6eb","Type":"ContainerStarted","Data":"9a0c3b572bfbe7426e60e21aefcd2e194c5927f5168ca0944fa815f31c681d84"} Mar 09 09:28:02 crc kubenswrapper[4861]: I0309 09:28:02.885324 4861 generic.go:334] "Generic (PLEG): container finished" podID="50e9c52c-8e13-491e-bf32-3daa6bf663bb" containerID="75910fe5cac8dcbedce2876d4f517bf5a25ee5904f405755b0e71469fa4c4c39" exitCode=0 Mar 09 09:28:02 crc kubenswrapper[4861]: I0309 09:28:02.885562 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550808-4spcd" event={"ID":"50e9c52c-8e13-491e-bf32-3daa6bf663bb","Type":"ContainerDied","Data":"75910fe5cac8dcbedce2876d4f517bf5a25ee5904f405755b0e71469fa4c4c39"} Mar 09 09:28:03 crc kubenswrapper[4861]: I0309 09:28:03.289805 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 09 09:28:03 crc kubenswrapper[4861]: I0309 09:28:03.712689 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91be51cc-9158-4ade-b36c-cb7bc65b006e" 
path="/var/lib/kubelet/pods/91be51cc-9158-4ade-b36c-cb7bc65b006e/volumes" Mar 09 09:28:03 crc kubenswrapper[4861]: I0309 09:28:03.713818 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c284b70e-4d7d-4a48-9061-acfd4c1aed1e" path="/var/lib/kubelet/pods/c284b70e-4d7d-4a48-9061-acfd4c1aed1e/volumes" Mar 09 09:28:03 crc kubenswrapper[4861]: I0309 09:28:03.896706 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a7f9f9a6-c593-4015-833b-ef237f492b70","Type":"ContainerStarted","Data":"57c453eb2b24ef9e46bd47fd626c480af9d0ab59496402c091f01dc44e620966"} Mar 09 09:28:03 crc kubenswrapper[4861]: I0309 09:28:03.896757 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a7f9f9a6-c593-4015-833b-ef237f492b70","Type":"ContainerStarted","Data":"49236ea68cb21b7ad55fa94eca6bb8d628a9982ce727f02bf55d42fd8491a4a3"} Mar 09 09:28:03 crc kubenswrapper[4861]: I0309 09:28:03.896769 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a7f9f9a6-c593-4015-833b-ef237f492b70","Type":"ContainerStarted","Data":"cae980585dcd64e6a92bfc1bd0b4f1e7cb533e0f61a34f065111954f5fda337a"} Mar 09 09:28:03 crc kubenswrapper[4861]: I0309 09:28:03.900582 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9d5c1be9-8604-4565-9175-703ff865c6eb","Type":"ContainerStarted","Data":"009fa359188fbb5692ec79c406ed2bbdc4bc0a2749acddeede0e5fa4136b4859"} Mar 09 09:28:03 crc kubenswrapper[4861]: I0309 09:28:03.900620 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9d5c1be9-8604-4565-9175-703ff865c6eb","Type":"ContainerStarted","Data":"14d24ddd86da9768f368af0b0c932fa01711d5dc7c76c5e57304715f4582f074"} Mar 09 09:28:03 crc kubenswrapper[4861]: I0309 09:28:03.931417 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" 
podStartSLOduration=2.931390643 podStartE2EDuration="2.931390643s" podCreationTimestamp="2026-03-09 09:28:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:28:03.91912339 +0000 UTC m=+1327.004162801" watchObservedRunningTime="2026-03-09 09:28:03.931390643 +0000 UTC m=+1327.016430054" Mar 09 09:28:04 crc kubenswrapper[4861]: I0309 09:28:04.309789 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550808-4spcd" Mar 09 09:28:04 crc kubenswrapper[4861]: I0309 09:28:04.410593 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tnw7f\" (UniqueName: \"kubernetes.io/projected/50e9c52c-8e13-491e-bf32-3daa6bf663bb-kube-api-access-tnw7f\") pod \"50e9c52c-8e13-491e-bf32-3daa6bf663bb\" (UID: \"50e9c52c-8e13-491e-bf32-3daa6bf663bb\") " Mar 09 09:28:04 crc kubenswrapper[4861]: I0309 09:28:04.418784 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50e9c52c-8e13-491e-bf32-3daa6bf663bb-kube-api-access-tnw7f" (OuterVolumeSpecName: "kube-api-access-tnw7f") pod "50e9c52c-8e13-491e-bf32-3daa6bf663bb" (UID: "50e9c52c-8e13-491e-bf32-3daa6bf663bb"). InnerVolumeSpecName "kube-api-access-tnw7f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:28:04 crc kubenswrapper[4861]: I0309 09:28:04.513247 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tnw7f\" (UniqueName: \"kubernetes.io/projected/50e9c52c-8e13-491e-bf32-3daa6bf663bb-kube-api-access-tnw7f\") on node \"crc\" DevicePath \"\"" Mar 09 09:28:04 crc kubenswrapper[4861]: I0309 09:28:04.909927 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550808-4spcd" event={"ID":"50e9c52c-8e13-491e-bf32-3daa6bf663bb","Type":"ContainerDied","Data":"fa5ff73d89a4eb649d1cca09caa6c7c822f64170202ee03bf4e90c9663da1fd6"} Mar 09 09:28:04 crc kubenswrapper[4861]: I0309 09:28:04.910292 4861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fa5ff73d89a4eb649d1cca09caa6c7c822f64170202ee03bf4e90c9663da1fd6" Mar 09 09:28:04 crc kubenswrapper[4861]: I0309 09:28:04.909980 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550808-4spcd" Mar 09 09:28:04 crc kubenswrapper[4861]: I0309 09:28:04.933352 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.933334681 podStartE2EDuration="2.933334681s" podCreationTimestamp="2026-03-09 09:28:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:28:04.929747589 +0000 UTC m=+1328.014786990" watchObservedRunningTime="2026-03-09 09:28:04.933334681 +0000 UTC m=+1328.018374082" Mar 09 09:28:05 crc kubenswrapper[4861]: I0309 09:28:05.164461 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 09 09:28:05 crc kubenswrapper[4861]: I0309 09:28:05.389525 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550802-7tqjn"] Mar 09 09:28:05 crc 
kubenswrapper[4861]: I0309 09:28:05.398117 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550802-7tqjn"] Mar 09 09:28:05 crc kubenswrapper[4861]: I0309 09:28:05.670030 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d6f6548-9254-43a1-b236-d393886fb553" path="/var/lib/kubelet/pods/7d6f6548-9254-43a1-b236-d393886fb553/volumes" Mar 09 09:28:06 crc kubenswrapper[4861]: I0309 09:28:06.475342 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-mmdfr"] Mar 09 09:28:06 crc kubenswrapper[4861]: E0309 09:28:06.482085 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50e9c52c-8e13-491e-bf32-3daa6bf663bb" containerName="oc" Mar 09 09:28:06 crc kubenswrapper[4861]: I0309 09:28:06.482174 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="50e9c52c-8e13-491e-bf32-3daa6bf663bb" containerName="oc" Mar 09 09:28:06 crc kubenswrapper[4861]: I0309 09:28:06.482443 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="50e9c52c-8e13-491e-bf32-3daa6bf663bb" containerName="oc" Mar 09 09:28:06 crc kubenswrapper[4861]: I0309 09:28:06.483803 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mmdfr" Mar 09 09:28:06 crc kubenswrapper[4861]: I0309 09:28:06.497966 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mmdfr"] Mar 09 09:28:06 crc kubenswrapper[4861]: I0309 09:28:06.541316 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/379c2378-7f4c-402c-af79-d6b0996dcac3-utilities\") pod \"redhat-operators-mmdfr\" (UID: \"379c2378-7f4c-402c-af79-d6b0996dcac3\") " pod="openshift-marketplace/redhat-operators-mmdfr" Mar 09 09:28:06 crc kubenswrapper[4861]: I0309 09:28:06.541516 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/379c2378-7f4c-402c-af79-d6b0996dcac3-catalog-content\") pod \"redhat-operators-mmdfr\" (UID: \"379c2378-7f4c-402c-af79-d6b0996dcac3\") " pod="openshift-marketplace/redhat-operators-mmdfr" Mar 09 09:28:06 crc kubenswrapper[4861]: I0309 09:28:06.541559 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phjt2\" (UniqueName: \"kubernetes.io/projected/379c2378-7f4c-402c-af79-d6b0996dcac3-kube-api-access-phjt2\") pod \"redhat-operators-mmdfr\" (UID: \"379c2378-7f4c-402c-af79-d6b0996dcac3\") " pod="openshift-marketplace/redhat-operators-mmdfr" Mar 09 09:28:06 crc kubenswrapper[4861]: I0309 09:28:06.643417 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/379c2378-7f4c-402c-af79-d6b0996dcac3-utilities\") pod \"redhat-operators-mmdfr\" (UID: \"379c2378-7f4c-402c-af79-d6b0996dcac3\") " pod="openshift-marketplace/redhat-operators-mmdfr" Mar 09 09:28:06 crc kubenswrapper[4861]: I0309 09:28:06.643547 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/379c2378-7f4c-402c-af79-d6b0996dcac3-catalog-content\") pod \"redhat-operators-mmdfr\" (UID: \"379c2378-7f4c-402c-af79-d6b0996dcac3\") " pod="openshift-marketplace/redhat-operators-mmdfr" Mar 09 09:28:06 crc kubenswrapper[4861]: I0309 09:28:06.643590 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-phjt2\" (UniqueName: \"kubernetes.io/projected/379c2378-7f4c-402c-af79-d6b0996dcac3-kube-api-access-phjt2\") pod \"redhat-operators-mmdfr\" (UID: \"379c2378-7f4c-402c-af79-d6b0996dcac3\") " pod="openshift-marketplace/redhat-operators-mmdfr" Mar 09 09:28:06 crc kubenswrapper[4861]: I0309 09:28:06.643975 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/379c2378-7f4c-402c-af79-d6b0996dcac3-utilities\") pod \"redhat-operators-mmdfr\" (UID: \"379c2378-7f4c-402c-af79-d6b0996dcac3\") " pod="openshift-marketplace/redhat-operators-mmdfr" Mar 09 09:28:06 crc kubenswrapper[4861]: I0309 09:28:06.644046 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/379c2378-7f4c-402c-af79-d6b0996dcac3-catalog-content\") pod \"redhat-operators-mmdfr\" (UID: \"379c2378-7f4c-402c-af79-d6b0996dcac3\") " pod="openshift-marketplace/redhat-operators-mmdfr" Mar 09 09:28:06 crc kubenswrapper[4861]: I0309 09:28:06.664195 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-phjt2\" (UniqueName: \"kubernetes.io/projected/379c2378-7f4c-402c-af79-d6b0996dcac3-kube-api-access-phjt2\") pod \"redhat-operators-mmdfr\" (UID: \"379c2378-7f4c-402c-af79-d6b0996dcac3\") " pod="openshift-marketplace/redhat-operators-mmdfr" Mar 09 09:28:06 crc kubenswrapper[4861]: I0309 09:28:06.834672 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mmdfr" Mar 09 09:28:07 crc kubenswrapper[4861]: I0309 09:28:07.289795 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 09 09:28:07 crc kubenswrapper[4861]: I0309 09:28:07.290700 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 09 09:28:07 crc kubenswrapper[4861]: I0309 09:28:07.325270 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mmdfr"] Mar 09 09:28:08 crc kubenswrapper[4861]: I0309 09:28:08.120840 4861 generic.go:334] "Generic (PLEG): container finished" podID="379c2378-7f4c-402c-af79-d6b0996dcac3" containerID="c18a92d677bd7dbfbc4f09e60597631bcfad90a45fef48e18855957c7eadd709" exitCode=0 Mar 09 09:28:08 crc kubenswrapper[4861]: I0309 09:28:08.121002 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mmdfr" event={"ID":"379c2378-7f4c-402c-af79-d6b0996dcac3","Type":"ContainerDied","Data":"c18a92d677bd7dbfbc4f09e60597631bcfad90a45fef48e18855957c7eadd709"} Mar 09 09:28:08 crc kubenswrapper[4861]: I0309 09:28:08.121700 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mmdfr" event={"ID":"379c2378-7f4c-402c-af79-d6b0996dcac3","Type":"ContainerStarted","Data":"5b2b63dc5b67d4b04c7935f828d1d60b0be0193468c0b77b46692f0b917744be"} Mar 09 09:28:09 crc kubenswrapper[4861]: I0309 09:28:09.133280 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mmdfr" event={"ID":"379c2378-7f4c-402c-af79-d6b0996dcac3","Type":"ContainerStarted","Data":"48f906d13752529add49e639f53b4060e19162ca9ea057695f8cda449df4cd1a"} Mar 09 09:28:09 crc kubenswrapper[4861]: E0309 09:28:09.816586 4861 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode81aeea2_beec_4987_b527_db644692cb14.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode81aeea2_beec_4987_b527_db644692cb14.slice/crio-7aa755ba2b534ec33b73942a0dcad7c6918ecf98e43652615d68efcdc114b08e\": RecentStats: unable to find data in memory cache]" Mar 09 09:28:10 crc kubenswrapper[4861]: I0309 09:28:10.164473 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 09 09:28:10 crc kubenswrapper[4861]: I0309 09:28:10.192853 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 09 09:28:10 crc kubenswrapper[4861]: I0309 09:28:10.451949 4861 scope.go:117] "RemoveContainer" containerID="ff7c9025092c9f3c0f4abb7501ffbadeb30751bc3f3e130b40b0e00b37b51d64" Mar 09 09:28:11 crc kubenswrapper[4861]: I0309 09:28:11.153116 4861 generic.go:334] "Generic (PLEG): container finished" podID="379c2378-7f4c-402c-af79-d6b0996dcac3" containerID="48f906d13752529add49e639f53b4060e19162ca9ea057695f8cda449df4cd1a" exitCode=0 Mar 09 09:28:11 crc kubenswrapper[4861]: I0309 09:28:11.153190 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mmdfr" event={"ID":"379c2378-7f4c-402c-af79-d6b0996dcac3","Type":"ContainerDied","Data":"48f906d13752529add49e639f53b4060e19162ca9ea057695f8cda449df4cd1a"} Mar 09 09:28:11 crc kubenswrapper[4861]: I0309 09:28:11.182173 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 09 09:28:12 crc kubenswrapper[4861]: I0309 09:28:12.290034 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 09 09:28:12 crc kubenswrapper[4861]: I0309 09:28:12.290091 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/nova-metadata-0" Mar 09 09:28:12 crc kubenswrapper[4861]: I0309 09:28:12.752022 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 09 09:28:12 crc kubenswrapper[4861]: I0309 09:28:12.752843 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 09 09:28:13 crc kubenswrapper[4861]: I0309 09:28:13.174317 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mmdfr" event={"ID":"379c2378-7f4c-402c-af79-d6b0996dcac3","Type":"ContainerStarted","Data":"2b418ccea885e78071dee4874397a4fea350a5143e95eb4b5022f394cf1a158f"} Mar 09 09:28:13 crc kubenswrapper[4861]: I0309 09:28:13.191036 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-mmdfr" podStartSLOduration=2.741614042 podStartE2EDuration="7.191018714s" podCreationTimestamp="2026-03-09 09:28:06 +0000 UTC" firstStartedPulling="2026-03-09 09:28:08.124691153 +0000 UTC m=+1331.209730564" lastFinishedPulling="2026-03-09 09:28:12.574095845 +0000 UTC m=+1335.659135236" observedRunningTime="2026-03-09 09:28:13.189563802 +0000 UTC m=+1336.274603203" watchObservedRunningTime="2026-03-09 09:28:13.191018714 +0000 UTC m=+1336.276058115" Mar 09 09:28:13 crc kubenswrapper[4861]: I0309 09:28:13.304515 4861 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="9d5c1be9-8604-4565-9175-703ff865c6eb" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.214:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 09 09:28:13 crc kubenswrapper[4861]: I0309 09:28:13.304516 4861 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="9d5c1be9-8604-4565-9175-703ff865c6eb" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.214:8775/\": 
net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 09 09:28:13 crc kubenswrapper[4861]: I0309 09:28:13.768642 4861 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="a7f9f9a6-c593-4015-833b-ef237f492b70" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.215:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 09 09:28:13 crc kubenswrapper[4861]: I0309 09:28:13.768869 4861 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="a7f9f9a6-c593-4015-833b-ef237f492b70" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.215:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 09 09:28:16 crc kubenswrapper[4861]: I0309 09:28:16.834802 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-mmdfr" Mar 09 09:28:16 crc kubenswrapper[4861]: I0309 09:28:16.835328 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-mmdfr" Mar 09 09:28:17 crc kubenswrapper[4861]: I0309 09:28:17.896959 4861 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-mmdfr" podUID="379c2378-7f4c-402c-af79-d6b0996dcac3" containerName="registry-server" probeResult="failure" output=< Mar 09 09:28:17 crc kubenswrapper[4861]: timeout: failed to connect service ":50051" within 1s Mar 09 09:28:17 crc kubenswrapper[4861]: > Mar 09 09:28:20 crc kubenswrapper[4861]: E0309 09:28:20.056459 4861 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode81aeea2_beec_4987_b527_db644692cb14.slice/crio-7aa755ba2b534ec33b73942a0dcad7c6918ecf98e43652615d68efcdc114b08e\": RecentStats: unable to find data in memory 
cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode81aeea2_beec_4987_b527_db644692cb14.slice\": RecentStats: unable to find data in memory cache]" Mar 09 09:28:22 crc kubenswrapper[4861]: I0309 09:28:22.116436 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 09 09:28:22 crc kubenswrapper[4861]: I0309 09:28:22.294812 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 09 09:28:22 crc kubenswrapper[4861]: I0309 09:28:22.295327 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 09 09:28:22 crc kubenswrapper[4861]: I0309 09:28:22.303590 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 09 09:28:22 crc kubenswrapper[4861]: I0309 09:28:22.760591 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 09 09:28:22 crc kubenswrapper[4861]: I0309 09:28:22.760649 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 09 09:28:22 crc kubenswrapper[4861]: I0309 09:28:22.761339 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 09 09:28:22 crc kubenswrapper[4861]: I0309 09:28:22.761797 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 09 09:28:22 crc kubenswrapper[4861]: I0309 09:28:22.768728 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 09 09:28:22 crc kubenswrapper[4861]: I0309 09:28:22.770565 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 09 09:28:23 crc kubenswrapper[4861]: I0309 09:28:23.269469 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 09 
09:28:24 crc kubenswrapper[4861]: I0309 09:28:24.605580 4861 patch_prober.go:28] interesting pod/machine-config-daemon-5g7gc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 09:28:24 crc kubenswrapper[4861]: I0309 09:28:24.605912 4861 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 09:28:27 crc kubenswrapper[4861]: I0309 09:28:27.879853 4861 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-mmdfr" podUID="379c2378-7f4c-402c-af79-d6b0996dcac3" containerName="registry-server" probeResult="failure" output=< Mar 09 09:28:27 crc kubenswrapper[4861]: timeout: failed to connect service ":50051" within 1s Mar 09 09:28:27 crc kubenswrapper[4861]: > Mar 09 09:28:30 crc kubenswrapper[4861]: E0309 09:28:30.274675 4861 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode81aeea2_beec_4987_b527_db644692cb14.slice/crio-7aa755ba2b534ec33b73942a0dcad7c6918ecf98e43652615d68efcdc114b08e\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode81aeea2_beec_4987_b527_db644692cb14.slice\": RecentStats: unable to find data in memory cache]" Mar 09 09:28:30 crc kubenswrapper[4861]: I0309 09:28:30.733666 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 09 09:28:31 crc kubenswrapper[4861]: I0309 09:28:31.601768 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/rabbitmq-cell1-server-0"] Mar 09 09:28:35 crc kubenswrapper[4861]: I0309 09:28:35.018858 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="b9b83355-ea40-4408-9b77-c717df91e1a9" containerName="rabbitmq" containerID="cri-o://346826633ece68c2b0831d42c9df1b8a5515d5eb3dc4b95444c645750c0ab2c1" gracePeriod=604796 Mar 09 09:28:35 crc kubenswrapper[4861]: I0309 09:28:35.559448 4861 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="b9b83355-ea40-4408-9b77-c717df91e1a9" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.102:5671: connect: connection refused" Mar 09 09:28:35 crc kubenswrapper[4861]: I0309 09:28:35.824538 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="03452acf-c21f-4d68-a813-772c30604a60" containerName="rabbitmq" containerID="cri-o://e064de58e735f17e2c137502c0da87ac3828eb3e26b02f3443abe788b0ffe732" gracePeriod=604796 Mar 09 09:28:37 crc kubenswrapper[4861]: I0309 09:28:37.885290 4861 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-mmdfr" podUID="379c2378-7f4c-402c-af79-d6b0996dcac3" containerName="registry-server" probeResult="failure" output=< Mar 09 09:28:37 crc kubenswrapper[4861]: timeout: failed to connect service ":50051" within 1s Mar 09 09:28:37 crc kubenswrapper[4861]: > Mar 09 09:28:40 crc kubenswrapper[4861]: E0309 09:28:40.518918 4861 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode81aeea2_beec_4987_b527_db644692cb14.slice/crio-7aa755ba2b534ec33b73942a0dcad7c6918ecf98e43652615d68efcdc114b08e\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode81aeea2_beec_4987_b527_db644692cb14.slice\": 
RecentStats: unable to find data in memory cache]" Mar 09 09:28:41 crc kubenswrapper[4861]: I0309 09:28:41.414955 4861 generic.go:334] "Generic (PLEG): container finished" podID="b9b83355-ea40-4408-9b77-c717df91e1a9" containerID="346826633ece68c2b0831d42c9df1b8a5515d5eb3dc4b95444c645750c0ab2c1" exitCode=0 Mar 09 09:28:41 crc kubenswrapper[4861]: I0309 09:28:41.415007 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"b9b83355-ea40-4408-9b77-c717df91e1a9","Type":"ContainerDied","Data":"346826633ece68c2b0831d42c9df1b8a5515d5eb3dc4b95444c645750c0ab2c1"} Mar 09 09:28:41 crc kubenswrapper[4861]: I0309 09:28:41.735016 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 09 09:28:41 crc kubenswrapper[4861]: I0309 09:28:41.854880 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b9b83355-ea40-4408-9b77-c717df91e1a9-rabbitmq-plugins\") pod \"b9b83355-ea40-4408-9b77-c717df91e1a9\" (UID: \"b9b83355-ea40-4408-9b77-c717df91e1a9\") " Mar 09 09:28:41 crc kubenswrapper[4861]: I0309 09:28:41.854962 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b9b83355-ea40-4408-9b77-c717df91e1a9-server-conf\") pod \"b9b83355-ea40-4408-9b77-c717df91e1a9\" (UID: \"b9b83355-ea40-4408-9b77-c717df91e1a9\") " Mar 09 09:28:41 crc kubenswrapper[4861]: I0309 09:28:41.855031 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b9b83355-ea40-4408-9b77-c717df91e1a9-pod-info\") pod \"b9b83355-ea40-4408-9b77-c717df91e1a9\" (UID: \"b9b83355-ea40-4408-9b77-c717df91e1a9\") " Mar 09 09:28:41 crc kubenswrapper[4861]: I0309 09:28:41.855066 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-r246d\" (UniqueName: \"kubernetes.io/projected/b9b83355-ea40-4408-9b77-c717df91e1a9-kube-api-access-r246d\") pod \"b9b83355-ea40-4408-9b77-c717df91e1a9\" (UID: \"b9b83355-ea40-4408-9b77-c717df91e1a9\") " Mar 09 09:28:41 crc kubenswrapper[4861]: I0309 09:28:41.855140 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b9b83355-ea40-4408-9b77-c717df91e1a9-plugins-conf\") pod \"b9b83355-ea40-4408-9b77-c717df91e1a9\" (UID: \"b9b83355-ea40-4408-9b77-c717df91e1a9\") " Mar 09 09:28:41 crc kubenswrapper[4861]: I0309 09:28:41.855189 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b9b83355-ea40-4408-9b77-c717df91e1a9-rabbitmq-tls\") pod \"b9b83355-ea40-4408-9b77-c717df91e1a9\" (UID: \"b9b83355-ea40-4408-9b77-c717df91e1a9\") " Mar 09 09:28:41 crc kubenswrapper[4861]: I0309 09:28:41.855217 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b9b83355-ea40-4408-9b77-c717df91e1a9-config-data\") pod \"b9b83355-ea40-4408-9b77-c717df91e1a9\" (UID: \"b9b83355-ea40-4408-9b77-c717df91e1a9\") " Mar 09 09:28:41 crc kubenswrapper[4861]: I0309 09:28:41.855235 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b9b83355-ea40-4408-9b77-c717df91e1a9-rabbitmq-confd\") pod \"b9b83355-ea40-4408-9b77-c717df91e1a9\" (UID: \"b9b83355-ea40-4408-9b77-c717df91e1a9\") " Mar 09 09:28:41 crc kubenswrapper[4861]: I0309 09:28:41.855285 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"b9b83355-ea40-4408-9b77-c717df91e1a9\" (UID: \"b9b83355-ea40-4408-9b77-c717df91e1a9\") " Mar 09 09:28:41 crc kubenswrapper[4861]: I0309 
09:28:41.855301 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b9b83355-ea40-4408-9b77-c717df91e1a9-erlang-cookie-secret\") pod \"b9b83355-ea40-4408-9b77-c717df91e1a9\" (UID: \"b9b83355-ea40-4408-9b77-c717df91e1a9\") " Mar 09 09:28:41 crc kubenswrapper[4861]: I0309 09:28:41.855344 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b9b83355-ea40-4408-9b77-c717df91e1a9-rabbitmq-erlang-cookie\") pod \"b9b83355-ea40-4408-9b77-c717df91e1a9\" (UID: \"b9b83355-ea40-4408-9b77-c717df91e1a9\") " Mar 09 09:28:41 crc kubenswrapper[4861]: I0309 09:28:41.857603 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b9b83355-ea40-4408-9b77-c717df91e1a9-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "b9b83355-ea40-4408-9b77-c717df91e1a9" (UID: "b9b83355-ea40-4408-9b77-c717df91e1a9"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:28:41 crc kubenswrapper[4861]: I0309 09:28:41.858910 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9b83355-ea40-4408-9b77-c717df91e1a9-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "b9b83355-ea40-4408-9b77-c717df91e1a9" (UID: "b9b83355-ea40-4408-9b77-c717df91e1a9"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:28:41 crc kubenswrapper[4861]: I0309 09:28:41.863182 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9b83355-ea40-4408-9b77-c717df91e1a9-kube-api-access-r246d" (OuterVolumeSpecName: "kube-api-access-r246d") pod "b9b83355-ea40-4408-9b77-c717df91e1a9" (UID: "b9b83355-ea40-4408-9b77-c717df91e1a9"). InnerVolumeSpecName "kube-api-access-r246d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:28:41 crc kubenswrapper[4861]: I0309 09:28:41.864184 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b9b83355-ea40-4408-9b77-c717df91e1a9-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "b9b83355-ea40-4408-9b77-c717df91e1a9" (UID: "b9b83355-ea40-4408-9b77-c717df91e1a9"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:28:41 crc kubenswrapper[4861]: I0309 09:28:41.865698 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/b9b83355-ea40-4408-9b77-c717df91e1a9-pod-info" (OuterVolumeSpecName: "pod-info") pod "b9b83355-ea40-4408-9b77-c717df91e1a9" (UID: "b9b83355-ea40-4408-9b77-c717df91e1a9"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 09 09:28:41 crc kubenswrapper[4861]: I0309 09:28:41.870915 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9b83355-ea40-4408-9b77-c717df91e1a9-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "b9b83355-ea40-4408-9b77-c717df91e1a9" (UID: "b9b83355-ea40-4408-9b77-c717df91e1a9"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:28:41 crc kubenswrapper[4861]: I0309 09:28:41.872300 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "persistence") pod "b9b83355-ea40-4408-9b77-c717df91e1a9" (UID: "b9b83355-ea40-4408-9b77-c717df91e1a9"). InnerVolumeSpecName "local-storage01-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Mar 09 09:28:41 crc kubenswrapper[4861]: I0309 09:28:41.887404 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9b83355-ea40-4408-9b77-c717df91e1a9-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "b9b83355-ea40-4408-9b77-c717df91e1a9" (UID: "b9b83355-ea40-4408-9b77-c717df91e1a9"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:28:41 crc kubenswrapper[4861]: I0309 09:28:41.889187 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9b83355-ea40-4408-9b77-c717df91e1a9-config-data" (OuterVolumeSpecName: "config-data") pod "b9b83355-ea40-4408-9b77-c717df91e1a9" (UID: "b9b83355-ea40-4408-9b77-c717df91e1a9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:28:41 crc kubenswrapper[4861]: I0309 09:28:41.929048 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9b83355-ea40-4408-9b77-c717df91e1a9-server-conf" (OuterVolumeSpecName: "server-conf") pod "b9b83355-ea40-4408-9b77-c717df91e1a9" (UID: "b9b83355-ea40-4408-9b77-c717df91e1a9"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:28:41 crc kubenswrapper[4861]: I0309 09:28:41.959189 4861 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b9b83355-ea40-4408-9b77-c717df91e1a9-rabbitmq-plugins\") on node \"crc\" DevicePath \"\""
Mar 09 09:28:41 crc kubenswrapper[4861]: I0309 09:28:41.959233 4861 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b9b83355-ea40-4408-9b77-c717df91e1a9-server-conf\") on node \"crc\" DevicePath \"\""
Mar 09 09:28:41 crc kubenswrapper[4861]: I0309 09:28:41.959247 4861 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b9b83355-ea40-4408-9b77-c717df91e1a9-pod-info\") on node \"crc\" DevicePath \"\""
Mar 09 09:28:41 crc kubenswrapper[4861]: I0309 09:28:41.959259 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r246d\" (UniqueName: \"kubernetes.io/projected/b9b83355-ea40-4408-9b77-c717df91e1a9-kube-api-access-r246d\") on node \"crc\" DevicePath \"\""
Mar 09 09:28:41 crc kubenswrapper[4861]: I0309 09:28:41.959273 4861 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b9b83355-ea40-4408-9b77-c717df91e1a9-plugins-conf\") on node \"crc\" DevicePath \"\""
Mar 09 09:28:41 crc kubenswrapper[4861]: I0309 09:28:41.959284 4861 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b9b83355-ea40-4408-9b77-c717df91e1a9-rabbitmq-tls\") on node \"crc\" DevicePath \"\""
Mar 09 09:28:41 crc kubenswrapper[4861]: I0309 09:28:41.959294 4861 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b9b83355-ea40-4408-9b77-c717df91e1a9-config-data\") on node \"crc\" DevicePath \"\""
Mar 09 09:28:41 crc kubenswrapper[4861]: I0309 09:28:41.959323 4861 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" "
Mar 09 09:28:41 crc kubenswrapper[4861]: I0309 09:28:41.959334 4861 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b9b83355-ea40-4408-9b77-c717df91e1a9-erlang-cookie-secret\") on node \"crc\" DevicePath \"\""
Mar 09 09:28:41 crc kubenswrapper[4861]: I0309 09:28:41.959345 4861 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b9b83355-ea40-4408-9b77-c717df91e1a9-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\""
Mar 09 09:28:41 crc kubenswrapper[4861]: I0309 09:28:41.986535 4861 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc"
Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.012754 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9b83355-ea40-4408-9b77-c717df91e1a9-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "b9b83355-ea40-4408-9b77-c717df91e1a9" (UID: "b9b83355-ea40-4408-9b77-c717df91e1a9"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.061704 4861 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b9b83355-ea40-4408-9b77-c717df91e1a9-rabbitmq-confd\") on node \"crc\" DevicePath \"\""
Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.061751 4861 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\""
Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.313437 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.366740 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/03452acf-c21f-4d68-a813-772c30604a60-server-conf\") pod \"03452acf-c21f-4d68-a813-772c30604a60\" (UID: \"03452acf-c21f-4d68-a813-772c30604a60\") "
Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.366790 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ttclt\" (UniqueName: \"kubernetes.io/projected/03452acf-c21f-4d68-a813-772c30604a60-kube-api-access-ttclt\") pod \"03452acf-c21f-4d68-a813-772c30604a60\" (UID: \"03452acf-c21f-4d68-a813-772c30604a60\") "
Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.366825 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/03452acf-c21f-4d68-a813-772c30604a60-rabbitmq-erlang-cookie\") pod \"03452acf-c21f-4d68-a813-772c30604a60\" (UID: \"03452acf-c21f-4d68-a813-772c30604a60\") "
Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.366850 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"03452acf-c21f-4d68-a813-772c30604a60\" (UID: \"03452acf-c21f-4d68-a813-772c30604a60\") "
Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.366897 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/03452acf-c21f-4d68-a813-772c30604a60-rabbitmq-confd\") pod \"03452acf-c21f-4d68-a813-772c30604a60\" (UID: \"03452acf-c21f-4d68-a813-772c30604a60\") "
Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.367019 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/03452acf-c21f-4d68-a813-772c30604a60-erlang-cookie-secret\") pod \"03452acf-c21f-4d68-a813-772c30604a60\" (UID: \"03452acf-c21f-4d68-a813-772c30604a60\") "
Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.367069 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/03452acf-c21f-4d68-a813-772c30604a60-pod-info\") pod \"03452acf-c21f-4d68-a813-772c30604a60\" (UID: \"03452acf-c21f-4d68-a813-772c30604a60\") "
Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.367116 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/03452acf-c21f-4d68-a813-772c30604a60-plugins-conf\") pod \"03452acf-c21f-4d68-a813-772c30604a60\" (UID: \"03452acf-c21f-4d68-a813-772c30604a60\") "
Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.367137 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/03452acf-c21f-4d68-a813-772c30604a60-rabbitmq-tls\") pod \"03452acf-c21f-4d68-a813-772c30604a60\" (UID: \"03452acf-c21f-4d68-a813-772c30604a60\") "
Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.367151 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/03452acf-c21f-4d68-a813-772c30604a60-config-data\") pod \"03452acf-c21f-4d68-a813-772c30604a60\" (UID: \"03452acf-c21f-4d68-a813-772c30604a60\") "
Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.367174 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/03452acf-c21f-4d68-a813-772c30604a60-rabbitmq-plugins\") pod \"03452acf-c21f-4d68-a813-772c30604a60\" (UID: \"03452acf-c21f-4d68-a813-772c30604a60\") "
Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.367932 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/03452acf-c21f-4d68-a813-772c30604a60-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "03452acf-c21f-4d68-a813-772c30604a60" (UID: "03452acf-c21f-4d68-a813-772c30604a60"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.369823 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/03452acf-c21f-4d68-a813-772c30604a60-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "03452acf-c21f-4d68-a813-772c30604a60" (UID: "03452acf-c21f-4d68-a813-772c30604a60"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.372137 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/03452acf-c21f-4d68-a813-772c30604a60-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "03452acf-c21f-4d68-a813-772c30604a60" (UID: "03452acf-c21f-4d68-a813-772c30604a60"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.373257 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03452acf-c21f-4d68-a813-772c30604a60-kube-api-access-ttclt" (OuterVolumeSpecName: "kube-api-access-ttclt") pod "03452acf-c21f-4d68-a813-772c30604a60" (UID: "03452acf-c21f-4d68-a813-772c30604a60"). InnerVolumeSpecName "kube-api-access-ttclt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.373590 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03452acf-c21f-4d68-a813-772c30604a60-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "03452acf-c21f-4d68-a813-772c30604a60" (UID: "03452acf-c21f-4d68-a813-772c30604a60"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.378826 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "persistence") pod "03452acf-c21f-4d68-a813-772c30604a60" (UID: "03452acf-c21f-4d68-a813-772c30604a60"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.382588 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03452acf-c21f-4d68-a813-772c30604a60-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "03452acf-c21f-4d68-a813-772c30604a60" (UID: "03452acf-c21f-4d68-a813-772c30604a60"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.383052 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/03452acf-c21f-4d68-a813-772c30604a60-pod-info" (OuterVolumeSpecName: "pod-info") pod "03452acf-c21f-4d68-a813-772c30604a60" (UID: "03452acf-c21f-4d68-a813-772c30604a60"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue ""
Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.397295 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/03452acf-c21f-4d68-a813-772c30604a60-config-data" (OuterVolumeSpecName: "config-data") pod "03452acf-c21f-4d68-a813-772c30604a60" (UID: "03452acf-c21f-4d68-a813-772c30604a60"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.458586 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"b9b83355-ea40-4408-9b77-c717df91e1a9","Type":"ContainerDied","Data":"aa8b234a3c997c263789e227094be28f7940c63c80e3a887eaa0d8442315f90d"}
Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.458676 4861 scope.go:117] "RemoveContainer" containerID="346826633ece68c2b0831d42c9df1b8a5515d5eb3dc4b95444c645750c0ab2c1"
Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.458633 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.466599 4861 generic.go:334] "Generic (PLEG): container finished" podID="03452acf-c21f-4d68-a813-772c30604a60" containerID="e064de58e735f17e2c137502c0da87ac3828eb3e26b02f3443abe788b0ffe732" exitCode=0
Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.466646 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"03452acf-c21f-4d68-a813-772c30604a60","Type":"ContainerDied","Data":"e064de58e735f17e2c137502c0da87ac3828eb3e26b02f3443abe788b0ffe732"}
Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.466678 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"03452acf-c21f-4d68-a813-772c30604a60","Type":"ContainerDied","Data":"55cc8a611264a1d0d865a6150bf4e505782ad035c68ffdad302f5ad95fd38caf"}
Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.467563 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.471162 4861 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/03452acf-c21f-4d68-a813-772c30604a60-erlang-cookie-secret\") on node \"crc\" DevicePath \"\""
Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.471195 4861 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/03452acf-c21f-4d68-a813-772c30604a60-pod-info\") on node \"crc\" DevicePath \"\""
Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.471208 4861 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/03452acf-c21f-4d68-a813-772c30604a60-plugins-conf\") on node \"crc\" DevicePath \"\""
Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.471259 4861 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/03452acf-c21f-4d68-a813-772c30604a60-rabbitmq-tls\") on node \"crc\" DevicePath \"\""
Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.471271 4861 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/03452acf-c21f-4d68-a813-772c30604a60-config-data\") on node \"crc\" DevicePath \"\""
Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.471281 4861 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/03452acf-c21f-4d68-a813-772c30604a60-rabbitmq-plugins\") on node \"crc\" DevicePath \"\""
Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.471292 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ttclt\" (UniqueName: \"kubernetes.io/projected/03452acf-c21f-4d68-a813-772c30604a60-kube-api-access-ttclt\") on node \"crc\" DevicePath \"\""
Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.471305 4861 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/03452acf-c21f-4d68-a813-772c30604a60-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\""
Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.471418 4861 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" "
Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.476304 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/03452acf-c21f-4d68-a813-772c30604a60-server-conf" (OuterVolumeSpecName: "server-conf") pod "03452acf-c21f-4d68-a813-772c30604a60" (UID: "03452acf-c21f-4d68-a813-772c30604a60"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.496326 4861 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc"
Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.552255 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03452acf-c21f-4d68-a813-772c30604a60-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "03452acf-c21f-4d68-a813-772c30604a60" (UID: "03452acf-c21f-4d68-a813-772c30604a60"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.573957 4861 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/03452acf-c21f-4d68-a813-772c30604a60-server-conf\") on node \"crc\" DevicePath \"\""
Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.573996 4861 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\""
Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.574008 4861 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/03452acf-c21f-4d68-a813-772c30604a60-rabbitmq-confd\") on node \"crc\" DevicePath \"\""
Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.583787 4861 scope.go:117] "RemoveContainer" containerID="31586b8681c909460b5f41e700f0e45d5675654a1c0fc223423f5e764a90eb87"
Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.587519 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.597692 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.621346 4861 scope.go:117] "RemoveContainer" containerID="e064de58e735f17e2c137502c0da87ac3828eb3e26b02f3443abe788b0ffe732"
Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.622752 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 09 09:28:42 crc kubenswrapper[4861]: E0309 09:28:42.623174 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9b83355-ea40-4408-9b77-c717df91e1a9" containerName="rabbitmq"
Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.623198 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9b83355-ea40-4408-9b77-c717df91e1a9" containerName="rabbitmq"
Mar 09 09:28:42 crc kubenswrapper[4861]: E0309 09:28:42.623216 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03452acf-c21f-4d68-a813-772c30604a60" containerName="setup-container"
Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.623224 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="03452acf-c21f-4d68-a813-772c30604a60" containerName="setup-container"
Mar 09 09:28:42 crc kubenswrapper[4861]: E0309 09:28:42.623250 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03452acf-c21f-4d68-a813-772c30604a60" containerName="rabbitmq"
Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.623258 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="03452acf-c21f-4d68-a813-772c30604a60" containerName="rabbitmq"
Mar 09 09:28:42 crc kubenswrapper[4861]: E0309 09:28:42.623287 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9b83355-ea40-4408-9b77-c717df91e1a9" containerName="setup-container"
Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.623296 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9b83355-ea40-4408-9b77-c717df91e1a9" containerName="setup-container"
Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.623596 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9b83355-ea40-4408-9b77-c717df91e1a9" containerName="rabbitmq"
Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.623621 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="03452acf-c21f-4d68-a813-772c30604a60" containerName="rabbitmq"
Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.625016 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.627906 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user"
Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.630627 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie"
Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.630911 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data"
Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.631025 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf"
Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.631237 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf"
Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.631422 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc"
Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.631554 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-zmgq2"
Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.636388 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.675736 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2c3f8770-f9a3-49ae-81e0-caad7b40ac46-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"2c3f8770-f9a3-49ae-81e0-caad7b40ac46\") " pod="openstack/rabbitmq-server-0"
Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.675792 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2c3f8770-f9a3-49ae-81e0-caad7b40ac46-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"2c3f8770-f9a3-49ae-81e0-caad7b40ac46\") " pod="openstack/rabbitmq-server-0"
Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.675825 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2c3f8770-f9a3-49ae-81e0-caad7b40ac46-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"2c3f8770-f9a3-49ae-81e0-caad7b40ac46\") " pod="openstack/rabbitmq-server-0"
Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.675878 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2c3f8770-f9a3-49ae-81e0-caad7b40ac46-pod-info\") pod \"rabbitmq-server-0\" (UID: \"2c3f8770-f9a3-49ae-81e0-caad7b40ac46\") " pod="openstack/rabbitmq-server-0"
Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.675896 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2c3f8770-f9a3-49ae-81e0-caad7b40ac46-config-data\") pod \"rabbitmq-server-0\" (UID: \"2c3f8770-f9a3-49ae-81e0-caad7b40ac46\") " pod="openstack/rabbitmq-server-0"
Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.675913 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bpxz\" (UniqueName: \"kubernetes.io/projected/2c3f8770-f9a3-49ae-81e0-caad7b40ac46-kube-api-access-6bpxz\") pod \"rabbitmq-server-0\" (UID: \"2c3f8770-f9a3-49ae-81e0-caad7b40ac46\") " pod="openstack/rabbitmq-server-0"
Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.675958 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2c3f8770-f9a3-49ae-81e0-caad7b40ac46-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"2c3f8770-f9a3-49ae-81e0-caad7b40ac46\") " pod="openstack/rabbitmq-server-0"
Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.675991 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"2c3f8770-f9a3-49ae-81e0-caad7b40ac46\") " pod="openstack/rabbitmq-server-0"
Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.676027 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2c3f8770-f9a3-49ae-81e0-caad7b40ac46-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"2c3f8770-f9a3-49ae-81e0-caad7b40ac46\") " pod="openstack/rabbitmq-server-0"
Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.676099 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2c3f8770-f9a3-49ae-81e0-caad7b40ac46-server-conf\") pod \"rabbitmq-server-0\" (UID: \"2c3f8770-f9a3-49ae-81e0-caad7b40ac46\") " pod="openstack/rabbitmq-server-0"
Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.676125 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2c3f8770-f9a3-49ae-81e0-caad7b40ac46-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"2c3f8770-f9a3-49ae-81e0-caad7b40ac46\") " pod="openstack/rabbitmq-server-0"
Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.680125 4861 scope.go:117] "RemoveContainer" containerID="a9f20eea0867df7623d564d9ee258fb8c480222ebfae15bc43c9f672e31b8bfc"
Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.737433 4861 scope.go:117] "RemoveContainer" containerID="e064de58e735f17e2c137502c0da87ac3828eb3e26b02f3443abe788b0ffe732"
Mar 09 09:28:42 crc kubenswrapper[4861]: E0309 09:28:42.738445 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e064de58e735f17e2c137502c0da87ac3828eb3e26b02f3443abe788b0ffe732\": container with ID starting with e064de58e735f17e2c137502c0da87ac3828eb3e26b02f3443abe788b0ffe732 not found: ID does not exist" containerID="e064de58e735f17e2c137502c0da87ac3828eb3e26b02f3443abe788b0ffe732"
Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.738482 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e064de58e735f17e2c137502c0da87ac3828eb3e26b02f3443abe788b0ffe732"} err="failed to get container status \"e064de58e735f17e2c137502c0da87ac3828eb3e26b02f3443abe788b0ffe732\": rpc error: code = NotFound desc = could not find container \"e064de58e735f17e2c137502c0da87ac3828eb3e26b02f3443abe788b0ffe732\": container with ID starting with e064de58e735f17e2c137502c0da87ac3828eb3e26b02f3443abe788b0ffe732 not found: ID does not exist"
Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.738526 4861 scope.go:117] "RemoveContainer" containerID="a9f20eea0867df7623d564d9ee258fb8c480222ebfae15bc43c9f672e31b8bfc"
Mar 09 09:28:42 crc kubenswrapper[4861]: E0309 09:28:42.738958 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9f20eea0867df7623d564d9ee258fb8c480222ebfae15bc43c9f672e31b8bfc\": container with ID starting with a9f20eea0867df7623d564d9ee258fb8c480222ebfae15bc43c9f672e31b8bfc not found: ID does not exist" containerID="a9f20eea0867df7623d564d9ee258fb8c480222ebfae15bc43c9f672e31b8bfc"
Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.738999 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9f20eea0867df7623d564d9ee258fb8c480222ebfae15bc43c9f672e31b8bfc"} err="failed to get container status \"a9f20eea0867df7623d564d9ee258fb8c480222ebfae15bc43c9f672e31b8bfc\": rpc error: code = NotFound desc = could not find container \"a9f20eea0867df7623d564d9ee258fb8c480222ebfae15bc43c9f672e31b8bfc\": container with ID starting with a9f20eea0867df7623d564d9ee258fb8c480222ebfae15bc43c9f672e31b8bfc not found: ID does not exist"
Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.778304 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"2c3f8770-f9a3-49ae-81e0-caad7b40ac46\") " pod="openstack/rabbitmq-server-0"
Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.778353 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2c3f8770-f9a3-49ae-81e0-caad7b40ac46-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"2c3f8770-f9a3-49ae-81e0-caad7b40ac46\") " pod="openstack/rabbitmq-server-0"
Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.778494 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2c3f8770-f9a3-49ae-81e0-caad7b40ac46-server-conf\") pod \"rabbitmq-server-0\" (UID: \"2c3f8770-f9a3-49ae-81e0-caad7b40ac46\") " pod="openstack/rabbitmq-server-0"
Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.778533 4861 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"2c3f8770-f9a3-49ae-81e0-caad7b40ac46\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-server-0"
Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.778958 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2c3f8770-f9a3-49ae-81e0-caad7b40ac46-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"2c3f8770-f9a3-49ae-81e0-caad7b40ac46\") " pod="openstack/rabbitmq-server-0"
Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.778540 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2c3f8770-f9a3-49ae-81e0-caad7b40ac46-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"2c3f8770-f9a3-49ae-81e0-caad7b40ac46\") " pod="openstack/rabbitmq-server-0"
Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.779305 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2c3f8770-f9a3-49ae-81e0-caad7b40ac46-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"2c3f8770-f9a3-49ae-81e0-caad7b40ac46\") " pod="openstack/rabbitmq-server-0"
Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.779362 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2c3f8770-f9a3-49ae-81e0-caad7b40ac46-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"2c3f8770-f9a3-49ae-81e0-caad7b40ac46\") " pod="openstack/rabbitmq-server-0"
Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.779474 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2c3f8770-f9a3-49ae-81e0-caad7b40ac46-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"2c3f8770-f9a3-49ae-81e0-caad7b40ac46\") " pod="openstack/rabbitmq-server-0"
Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.779609 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2c3f8770-f9a3-49ae-81e0-caad7b40ac46-pod-info\") pod \"rabbitmq-server-0\" (UID: \"2c3f8770-f9a3-49ae-81e0-caad7b40ac46\") " pod="openstack/rabbitmq-server-0"
Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.779645 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2c3f8770-f9a3-49ae-81e0-caad7b40ac46-config-data\") pod \"rabbitmq-server-0\" (UID: \"2c3f8770-f9a3-49ae-81e0-caad7b40ac46\") " pod="openstack/rabbitmq-server-0"
Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.779675 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6bpxz\" (UniqueName: \"kubernetes.io/projected/2c3f8770-f9a3-49ae-81e0-caad7b40ac46-kube-api-access-6bpxz\") pod \"rabbitmq-server-0\" (UID: \"2c3f8770-f9a3-49ae-81e0-caad7b40ac46\") " pod="openstack/rabbitmq-server-0"
Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.779724 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2c3f8770-f9a3-49ae-81e0-caad7b40ac46-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"2c3f8770-f9a3-49ae-81e0-caad7b40ac46\") " pod="openstack/rabbitmq-server-0"
Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.779785 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2c3f8770-f9a3-49ae-81e0-caad7b40ac46-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"2c3f8770-f9a3-49ae-81e0-caad7b40ac46\") " pod="openstack/rabbitmq-server-0"
Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.779927 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2c3f8770-f9a3-49ae-81e0-caad7b40ac46-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"2c3f8770-f9a3-49ae-81e0-caad7b40ac46\") " pod="openstack/rabbitmq-server-0"
Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.780058 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2c3f8770-f9a3-49ae-81e0-caad7b40ac46-server-conf\") pod \"rabbitmq-server-0\" (UID: \"2c3f8770-f9a3-49ae-81e0-caad7b40ac46\") " pod="openstack/rabbitmq-server-0"
Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.780613 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2c3f8770-f9a3-49ae-81e0-caad7b40ac46-config-data\") pod \"rabbitmq-server-0\" (UID: \"2c3f8770-f9a3-49ae-81e0-caad7b40ac46\") " pod="openstack/rabbitmq-server-0"
Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.786007 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2c3f8770-f9a3-49ae-81e0-caad7b40ac46-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"2c3f8770-f9a3-49ae-81e0-caad7b40ac46\") " pod="openstack/rabbitmq-server-0"
Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.786328 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2c3f8770-f9a3-49ae-81e0-caad7b40ac46-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"2c3f8770-f9a3-49ae-81e0-caad7b40ac46\") " pod="openstack/rabbitmq-server-0"
Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.786356 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2c3f8770-f9a3-49ae-81e0-caad7b40ac46-pod-info\") pod \"rabbitmq-server-0\" (UID: \"2c3f8770-f9a3-49ae-81e0-caad7b40ac46\") " pod="openstack/rabbitmq-server-0"
Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.798023 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2c3f8770-f9a3-49ae-81e0-caad7b40ac46-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"2c3f8770-f9a3-49ae-81e0-caad7b40ac46\") " pod="openstack/rabbitmq-server-0"
Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.812103 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bpxz\" (UniqueName: \"kubernetes.io/projected/2c3f8770-f9a3-49ae-81e0-caad7b40ac46-kube-api-access-6bpxz\") pod \"rabbitmq-server-0\" (UID: \"2c3f8770-f9a3-49ae-81e0-caad7b40ac46\") " pod="openstack/rabbitmq-server-0"
Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.818046 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.827540 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.836154 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.837464 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"2c3f8770-f9a3-49ae-81e0-caad7b40ac46\") " pod="openstack/rabbitmq-server-0"
Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.838051 4861 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.840302 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.840549 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-lv2dg" Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.841648 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.841925 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.842063 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.843022 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.846009 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.890620 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.978778 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.982591 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/36ab59d0-e730-43a5-a7f1-99f136e5f9d3-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"36ab59d0-e730-43a5-a7f1-99f136e5f9d3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.982647 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/36ab59d0-e730-43a5-a7f1-99f136e5f9d3-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"36ab59d0-e730-43a5-a7f1-99f136e5f9d3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.982694 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/36ab59d0-e730-43a5-a7f1-99f136e5f9d3-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"36ab59d0-e730-43a5-a7f1-99f136e5f9d3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.982712 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/36ab59d0-e730-43a5-a7f1-99f136e5f9d3-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"36ab59d0-e730-43a5-a7f1-99f136e5f9d3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.982756 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"36ab59d0-e730-43a5-a7f1-99f136e5f9d3\") " 
pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.982775 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xf2h5\" (UniqueName: \"kubernetes.io/projected/36ab59d0-e730-43a5-a7f1-99f136e5f9d3-kube-api-access-xf2h5\") pod \"rabbitmq-cell1-server-0\" (UID: \"36ab59d0-e730-43a5-a7f1-99f136e5f9d3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.982806 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/36ab59d0-e730-43a5-a7f1-99f136e5f9d3-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"36ab59d0-e730-43a5-a7f1-99f136e5f9d3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.982821 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/36ab59d0-e730-43a5-a7f1-99f136e5f9d3-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"36ab59d0-e730-43a5-a7f1-99f136e5f9d3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.982858 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/36ab59d0-e730-43a5-a7f1-99f136e5f9d3-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"36ab59d0-e730-43a5-a7f1-99f136e5f9d3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.982875 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/36ab59d0-e730-43a5-a7f1-99f136e5f9d3-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"36ab59d0-e730-43a5-a7f1-99f136e5f9d3\") " 
pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:28:42 crc kubenswrapper[4861]: I0309 09:28:42.982906 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/36ab59d0-e730-43a5-a7f1-99f136e5f9d3-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"36ab59d0-e730-43a5-a7f1-99f136e5f9d3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:28:43 crc kubenswrapper[4861]: I0309 09:28:43.086274 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/36ab59d0-e730-43a5-a7f1-99f136e5f9d3-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"36ab59d0-e730-43a5-a7f1-99f136e5f9d3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:28:43 crc kubenswrapper[4861]: I0309 09:28:43.086534 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/36ab59d0-e730-43a5-a7f1-99f136e5f9d3-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"36ab59d0-e730-43a5-a7f1-99f136e5f9d3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:28:43 crc kubenswrapper[4861]: I0309 09:28:43.086594 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"36ab59d0-e730-43a5-a7f1-99f136e5f9d3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:28:43 crc kubenswrapper[4861]: I0309 09:28:43.086620 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xf2h5\" (UniqueName: \"kubernetes.io/projected/36ab59d0-e730-43a5-a7f1-99f136e5f9d3-kube-api-access-xf2h5\") pod \"rabbitmq-cell1-server-0\" (UID: \"36ab59d0-e730-43a5-a7f1-99f136e5f9d3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:28:43 crc kubenswrapper[4861]: 
I0309 09:28:43.086654 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/36ab59d0-e730-43a5-a7f1-99f136e5f9d3-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"36ab59d0-e730-43a5-a7f1-99f136e5f9d3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:28:43 crc kubenswrapper[4861]: I0309 09:28:43.086672 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/36ab59d0-e730-43a5-a7f1-99f136e5f9d3-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"36ab59d0-e730-43a5-a7f1-99f136e5f9d3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:28:43 crc kubenswrapper[4861]: I0309 09:28:43.086710 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/36ab59d0-e730-43a5-a7f1-99f136e5f9d3-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"36ab59d0-e730-43a5-a7f1-99f136e5f9d3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:28:43 crc kubenswrapper[4861]: I0309 09:28:43.086731 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/36ab59d0-e730-43a5-a7f1-99f136e5f9d3-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"36ab59d0-e730-43a5-a7f1-99f136e5f9d3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:28:43 crc kubenswrapper[4861]: I0309 09:28:43.086765 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/36ab59d0-e730-43a5-a7f1-99f136e5f9d3-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"36ab59d0-e730-43a5-a7f1-99f136e5f9d3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:28:43 crc kubenswrapper[4861]: I0309 09:28:43.086803 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/36ab59d0-e730-43a5-a7f1-99f136e5f9d3-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"36ab59d0-e730-43a5-a7f1-99f136e5f9d3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:28:43 crc kubenswrapper[4861]: I0309 09:28:43.086825 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/36ab59d0-e730-43a5-a7f1-99f136e5f9d3-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"36ab59d0-e730-43a5-a7f1-99f136e5f9d3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:28:43 crc kubenswrapper[4861]: I0309 09:28:43.087652 4861 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"36ab59d0-e730-43a5-a7f1-99f136e5f9d3\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:28:43 crc kubenswrapper[4861]: I0309 09:28:43.091799 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/36ab59d0-e730-43a5-a7f1-99f136e5f9d3-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"36ab59d0-e730-43a5-a7f1-99f136e5f9d3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:28:43 crc kubenswrapper[4861]: I0309 09:28:43.093135 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/36ab59d0-e730-43a5-a7f1-99f136e5f9d3-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"36ab59d0-e730-43a5-a7f1-99f136e5f9d3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:28:43 crc kubenswrapper[4861]: I0309 09:28:43.093884 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/36ab59d0-e730-43a5-a7f1-99f136e5f9d3-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"36ab59d0-e730-43a5-a7f1-99f136e5f9d3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:28:43 crc kubenswrapper[4861]: I0309 09:28:43.094542 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/36ab59d0-e730-43a5-a7f1-99f136e5f9d3-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"36ab59d0-e730-43a5-a7f1-99f136e5f9d3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:28:43 crc kubenswrapper[4861]: I0309 09:28:43.094794 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/36ab59d0-e730-43a5-a7f1-99f136e5f9d3-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"36ab59d0-e730-43a5-a7f1-99f136e5f9d3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:28:43 crc kubenswrapper[4861]: I0309 09:28:43.094933 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/36ab59d0-e730-43a5-a7f1-99f136e5f9d3-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"36ab59d0-e730-43a5-a7f1-99f136e5f9d3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:28:43 crc kubenswrapper[4861]: I0309 09:28:43.106014 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/36ab59d0-e730-43a5-a7f1-99f136e5f9d3-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"36ab59d0-e730-43a5-a7f1-99f136e5f9d3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:28:43 crc kubenswrapper[4861]: I0309 09:28:43.109282 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/36ab59d0-e730-43a5-a7f1-99f136e5f9d3-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"36ab59d0-e730-43a5-a7f1-99f136e5f9d3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:28:43 crc kubenswrapper[4861]: I0309 09:28:43.113037 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/36ab59d0-e730-43a5-a7f1-99f136e5f9d3-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"36ab59d0-e730-43a5-a7f1-99f136e5f9d3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:28:43 crc kubenswrapper[4861]: I0309 09:28:43.113391 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xf2h5\" (UniqueName: \"kubernetes.io/projected/36ab59d0-e730-43a5-a7f1-99f136e5f9d3-kube-api-access-xf2h5\") pod \"rabbitmq-cell1-server-0\" (UID: \"36ab59d0-e730-43a5-a7f1-99f136e5f9d3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:28:43 crc kubenswrapper[4861]: I0309 09:28:43.133465 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"36ab59d0-e730-43a5-a7f1-99f136e5f9d3\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:28:43 crc kubenswrapper[4861]: I0309 09:28:43.159017 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:28:43 crc kubenswrapper[4861]: W0309 09:28:43.445113 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c3f8770_f9a3_49ae_81e0_caad7b40ac46.slice/crio-044c33c2d162896532f8b9f614f2ac9c540f423b33c2608fbf0c5c3ac286c484 WatchSource:0}: Error finding container 044c33c2d162896532f8b9f614f2ac9c540f423b33c2608fbf0c5c3ac286c484: Status 404 returned error can't find the container with id 044c33c2d162896532f8b9f614f2ac9c540f423b33c2608fbf0c5c3ac286c484 Mar 09 09:28:43 crc kubenswrapper[4861]: I0309 09:28:43.449127 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 09 09:28:43 crc kubenswrapper[4861]: I0309 09:28:43.476690 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"2c3f8770-f9a3-49ae-81e0-caad7b40ac46","Type":"ContainerStarted","Data":"044c33c2d162896532f8b9f614f2ac9c540f423b33c2608fbf0c5c3ac286c484"} Mar 09 09:28:43 crc kubenswrapper[4861]: I0309 09:28:43.588111 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 09 09:28:43 crc kubenswrapper[4861]: W0309 09:28:43.596529 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod36ab59d0_e730_43a5_a7f1_99f136e5f9d3.slice/crio-a42ebec9e9e389176553d4ec7f88d7e9ebdc4d34744f67a1de9e86ec1f7913a1 WatchSource:0}: Error finding container a42ebec9e9e389176553d4ec7f88d7e9ebdc4d34744f67a1de9e86ec1f7913a1: Status 404 returned error can't find the container with id a42ebec9e9e389176553d4ec7f88d7e9ebdc4d34744f67a1de9e86ec1f7913a1 Mar 09 09:28:43 crc kubenswrapper[4861]: I0309 09:28:43.670408 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03452acf-c21f-4d68-a813-772c30604a60" 
path="/var/lib/kubelet/pods/03452acf-c21f-4d68-a813-772c30604a60/volumes" Mar 09 09:28:43 crc kubenswrapper[4861]: I0309 09:28:43.671529 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9b83355-ea40-4408-9b77-c717df91e1a9" path="/var/lib/kubelet/pods/b9b83355-ea40-4408-9b77-c717df91e1a9/volumes" Mar 09 09:28:44 crc kubenswrapper[4861]: I0309 09:28:44.519070 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"36ab59d0-e730-43a5-a7f1-99f136e5f9d3","Type":"ContainerStarted","Data":"a42ebec9e9e389176553d4ec7f88d7e9ebdc4d34744f67a1de9e86ec1f7913a1"} Mar 09 09:28:45 crc kubenswrapper[4861]: I0309 09:28:45.530135 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"36ab59d0-e730-43a5-a7f1-99f136e5f9d3","Type":"ContainerStarted","Data":"e9d3981ac87148061872c32306fba139c5910bb559f093d164f9796913a623dd"} Mar 09 09:28:45 crc kubenswrapper[4861]: I0309 09:28:45.532541 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"2c3f8770-f9a3-49ae-81e0-caad7b40ac46","Type":"ContainerStarted","Data":"fdfa8739d3ec136badc411ee9b64de65893baab74b8c993e9d438bc06c96f073"} Mar 09 09:28:46 crc kubenswrapper[4861]: I0309 09:28:46.960263 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-bfb45b47-l8dxz"] Mar 09 09:28:46 crc kubenswrapper[4861]: I0309 09:28:46.962821 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bfb45b47-l8dxz" Mar 09 09:28:46 crc kubenswrapper[4861]: I0309 09:28:46.965857 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Mar 09 09:28:46 crc kubenswrapper[4861]: I0309 09:28:46.977384 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bfb45b47-l8dxz"] Mar 09 09:28:47 crc kubenswrapper[4861]: I0309 09:28:47.075243 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/62c19310-bd29-41b1-a8af-7abd3d9a955a-openstack-edpm-ipam\") pod \"dnsmasq-dns-bfb45b47-l8dxz\" (UID: \"62c19310-bd29-41b1-a8af-7abd3d9a955a\") " pod="openstack/dnsmasq-dns-bfb45b47-l8dxz" Mar 09 09:28:47 crc kubenswrapper[4861]: I0309 09:28:47.075339 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/62c19310-bd29-41b1-a8af-7abd3d9a955a-ovsdbserver-nb\") pod \"dnsmasq-dns-bfb45b47-l8dxz\" (UID: \"62c19310-bd29-41b1-a8af-7abd3d9a955a\") " pod="openstack/dnsmasq-dns-bfb45b47-l8dxz" Mar 09 09:28:47 crc kubenswrapper[4861]: I0309 09:28:47.075431 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62c19310-bd29-41b1-a8af-7abd3d9a955a-config\") pod \"dnsmasq-dns-bfb45b47-l8dxz\" (UID: \"62c19310-bd29-41b1-a8af-7abd3d9a955a\") " pod="openstack/dnsmasq-dns-bfb45b47-l8dxz" Mar 09 09:28:47 crc kubenswrapper[4861]: I0309 09:28:47.075464 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/62c19310-bd29-41b1-a8af-7abd3d9a955a-ovsdbserver-sb\") pod \"dnsmasq-dns-bfb45b47-l8dxz\" (UID: \"62c19310-bd29-41b1-a8af-7abd3d9a955a\") " 
pod="openstack/dnsmasq-dns-bfb45b47-l8dxz" Mar 09 09:28:47 crc kubenswrapper[4861]: I0309 09:28:47.075485 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/62c19310-bd29-41b1-a8af-7abd3d9a955a-dns-swift-storage-0\") pod \"dnsmasq-dns-bfb45b47-l8dxz\" (UID: \"62c19310-bd29-41b1-a8af-7abd3d9a955a\") " pod="openstack/dnsmasq-dns-bfb45b47-l8dxz" Mar 09 09:28:47 crc kubenswrapper[4861]: I0309 09:28:47.075507 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/62c19310-bd29-41b1-a8af-7abd3d9a955a-dns-svc\") pod \"dnsmasq-dns-bfb45b47-l8dxz\" (UID: \"62c19310-bd29-41b1-a8af-7abd3d9a955a\") " pod="openstack/dnsmasq-dns-bfb45b47-l8dxz" Mar 09 09:28:47 crc kubenswrapper[4861]: I0309 09:28:47.075538 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sd74b\" (UniqueName: \"kubernetes.io/projected/62c19310-bd29-41b1-a8af-7abd3d9a955a-kube-api-access-sd74b\") pod \"dnsmasq-dns-bfb45b47-l8dxz\" (UID: \"62c19310-bd29-41b1-a8af-7abd3d9a955a\") " pod="openstack/dnsmasq-dns-bfb45b47-l8dxz" Mar 09 09:28:47 crc kubenswrapper[4861]: I0309 09:28:47.177526 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/62c19310-bd29-41b1-a8af-7abd3d9a955a-openstack-edpm-ipam\") pod \"dnsmasq-dns-bfb45b47-l8dxz\" (UID: \"62c19310-bd29-41b1-a8af-7abd3d9a955a\") " pod="openstack/dnsmasq-dns-bfb45b47-l8dxz" Mar 09 09:28:47 crc kubenswrapper[4861]: I0309 09:28:47.177911 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/62c19310-bd29-41b1-a8af-7abd3d9a955a-ovsdbserver-nb\") pod \"dnsmasq-dns-bfb45b47-l8dxz\" (UID: \"62c19310-bd29-41b1-a8af-7abd3d9a955a\") 
" pod="openstack/dnsmasq-dns-bfb45b47-l8dxz" Mar 09 09:28:47 crc kubenswrapper[4861]: I0309 09:28:47.177967 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62c19310-bd29-41b1-a8af-7abd3d9a955a-config\") pod \"dnsmasq-dns-bfb45b47-l8dxz\" (UID: \"62c19310-bd29-41b1-a8af-7abd3d9a955a\") " pod="openstack/dnsmasq-dns-bfb45b47-l8dxz" Mar 09 09:28:47 crc kubenswrapper[4861]: I0309 09:28:47.177993 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/62c19310-bd29-41b1-a8af-7abd3d9a955a-ovsdbserver-sb\") pod \"dnsmasq-dns-bfb45b47-l8dxz\" (UID: \"62c19310-bd29-41b1-a8af-7abd3d9a955a\") " pod="openstack/dnsmasq-dns-bfb45b47-l8dxz" Mar 09 09:28:47 crc kubenswrapper[4861]: I0309 09:28:47.178012 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/62c19310-bd29-41b1-a8af-7abd3d9a955a-dns-swift-storage-0\") pod \"dnsmasq-dns-bfb45b47-l8dxz\" (UID: \"62c19310-bd29-41b1-a8af-7abd3d9a955a\") " pod="openstack/dnsmasq-dns-bfb45b47-l8dxz" Mar 09 09:28:47 crc kubenswrapper[4861]: I0309 09:28:47.178034 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/62c19310-bd29-41b1-a8af-7abd3d9a955a-dns-svc\") pod \"dnsmasq-dns-bfb45b47-l8dxz\" (UID: \"62c19310-bd29-41b1-a8af-7abd3d9a955a\") " pod="openstack/dnsmasq-dns-bfb45b47-l8dxz" Mar 09 09:28:47 crc kubenswrapper[4861]: I0309 09:28:47.178058 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sd74b\" (UniqueName: \"kubernetes.io/projected/62c19310-bd29-41b1-a8af-7abd3d9a955a-kube-api-access-sd74b\") pod \"dnsmasq-dns-bfb45b47-l8dxz\" (UID: \"62c19310-bd29-41b1-a8af-7abd3d9a955a\") " pod="openstack/dnsmasq-dns-bfb45b47-l8dxz" Mar 09 09:28:47 crc 
kubenswrapper[4861]: I0309 09:28:47.179058 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/62c19310-bd29-41b1-a8af-7abd3d9a955a-openstack-edpm-ipam\") pod \"dnsmasq-dns-bfb45b47-l8dxz\" (UID: \"62c19310-bd29-41b1-a8af-7abd3d9a955a\") " pod="openstack/dnsmasq-dns-bfb45b47-l8dxz" Mar 09 09:28:47 crc kubenswrapper[4861]: I0309 09:28:47.179402 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/62c19310-bd29-41b1-a8af-7abd3d9a955a-dns-swift-storage-0\") pod \"dnsmasq-dns-bfb45b47-l8dxz\" (UID: \"62c19310-bd29-41b1-a8af-7abd3d9a955a\") " pod="openstack/dnsmasq-dns-bfb45b47-l8dxz" Mar 09 09:28:47 crc kubenswrapper[4861]: I0309 09:28:47.180035 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62c19310-bd29-41b1-a8af-7abd3d9a955a-config\") pod \"dnsmasq-dns-bfb45b47-l8dxz\" (UID: \"62c19310-bd29-41b1-a8af-7abd3d9a955a\") " pod="openstack/dnsmasq-dns-bfb45b47-l8dxz" Mar 09 09:28:47 crc kubenswrapper[4861]: I0309 09:28:47.180114 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/62c19310-bd29-41b1-a8af-7abd3d9a955a-ovsdbserver-sb\") pod \"dnsmasq-dns-bfb45b47-l8dxz\" (UID: \"62c19310-bd29-41b1-a8af-7abd3d9a955a\") " pod="openstack/dnsmasq-dns-bfb45b47-l8dxz" Mar 09 09:28:47 crc kubenswrapper[4861]: I0309 09:28:47.180517 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/62c19310-bd29-41b1-a8af-7abd3d9a955a-dns-svc\") pod \"dnsmasq-dns-bfb45b47-l8dxz\" (UID: \"62c19310-bd29-41b1-a8af-7abd3d9a955a\") " pod="openstack/dnsmasq-dns-bfb45b47-l8dxz" Mar 09 09:28:47 crc kubenswrapper[4861]: I0309 09:28:47.180700 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/62c19310-bd29-41b1-a8af-7abd3d9a955a-ovsdbserver-nb\") pod \"dnsmasq-dns-bfb45b47-l8dxz\" (UID: \"62c19310-bd29-41b1-a8af-7abd3d9a955a\") " pod="openstack/dnsmasq-dns-bfb45b47-l8dxz" Mar 09 09:28:47 crc kubenswrapper[4861]: I0309 09:28:47.200070 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sd74b\" (UniqueName: \"kubernetes.io/projected/62c19310-bd29-41b1-a8af-7abd3d9a955a-kube-api-access-sd74b\") pod \"dnsmasq-dns-bfb45b47-l8dxz\" (UID: \"62c19310-bd29-41b1-a8af-7abd3d9a955a\") " pod="openstack/dnsmasq-dns-bfb45b47-l8dxz" Mar 09 09:28:47 crc kubenswrapper[4861]: I0309 09:28:47.286497 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bfb45b47-l8dxz" Mar 09 09:28:47 crc kubenswrapper[4861]: I0309 09:28:47.786227 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bfb45b47-l8dxz"] Mar 09 09:28:47 crc kubenswrapper[4861]: W0309 09:28:47.786838 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod62c19310_bd29_41b1_a8af_7abd3d9a955a.slice/crio-7972bf990c882d375d18658851e3f1e639424f0e7dd80bd108bddc33880c5d7f WatchSource:0}: Error finding container 7972bf990c882d375d18658851e3f1e639424f0e7dd80bd108bddc33880c5d7f: Status 404 returned error can't find the container with id 7972bf990c882d375d18658851e3f1e639424f0e7dd80bd108bddc33880c5d7f Mar 09 09:28:47 crc kubenswrapper[4861]: I0309 09:28:47.886821 4861 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-mmdfr" podUID="379c2378-7f4c-402c-af79-d6b0996dcac3" containerName="registry-server" probeResult="failure" output=< Mar 09 09:28:47 crc kubenswrapper[4861]: timeout: failed to connect service ":50051" within 1s Mar 09 09:28:47 crc kubenswrapper[4861]: > Mar 09 09:28:48 crc kubenswrapper[4861]: I0309 09:28:48.572075 
4861 generic.go:334] "Generic (PLEG): container finished" podID="62c19310-bd29-41b1-a8af-7abd3d9a955a" containerID="94d193ce751d030a272f0a73fe4cfcb4680cf49525dcb7786f25552e381d8f7c" exitCode=0 Mar 09 09:28:48 crc kubenswrapper[4861]: I0309 09:28:48.572464 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bfb45b47-l8dxz" event={"ID":"62c19310-bd29-41b1-a8af-7abd3d9a955a","Type":"ContainerDied","Data":"94d193ce751d030a272f0a73fe4cfcb4680cf49525dcb7786f25552e381d8f7c"} Mar 09 09:28:48 crc kubenswrapper[4861]: I0309 09:28:48.572517 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bfb45b47-l8dxz" event={"ID":"62c19310-bd29-41b1-a8af-7abd3d9a955a","Type":"ContainerStarted","Data":"7972bf990c882d375d18658851e3f1e639424f0e7dd80bd108bddc33880c5d7f"} Mar 09 09:28:49 crc kubenswrapper[4861]: I0309 09:28:49.582153 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bfb45b47-l8dxz" event={"ID":"62c19310-bd29-41b1-a8af-7abd3d9a955a","Type":"ContainerStarted","Data":"5cfa9d18a24201e5896c8400091d3e1685ecdb90f44f5270a15636af8d4e4491"} Mar 09 09:28:49 crc kubenswrapper[4861]: I0309 09:28:49.583281 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-bfb45b47-l8dxz" Mar 09 09:28:49 crc kubenswrapper[4861]: I0309 09:28:49.611321 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-bfb45b47-l8dxz" podStartSLOduration=3.611297681 podStartE2EDuration="3.611297681s" podCreationTimestamp="2026-03-09 09:28:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:28:49.602655957 +0000 UTC m=+1372.687695378" watchObservedRunningTime="2026-03-09 09:28:49.611297681 +0000 UTC m=+1372.696337082" Mar 09 09:28:50 crc kubenswrapper[4861]: E0309 09:28:50.761736 4861 cadvisor_stats_provider.go:516] "Partial failure issuing 
cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode81aeea2_beec_4987_b527_db644692cb14.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode81aeea2_beec_4987_b527_db644692cb14.slice/crio-7aa755ba2b534ec33b73942a0dcad7c6918ecf98e43652615d68efcdc114b08e\": RecentStats: unable to find data in memory cache]" Mar 09 09:28:54 crc kubenswrapper[4861]: I0309 09:28:54.606413 4861 patch_prober.go:28] interesting pod/machine-config-daemon-5g7gc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 09:28:54 crc kubenswrapper[4861]: I0309 09:28:54.606941 4861 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 09:28:54 crc kubenswrapper[4861]: I0309 09:28:54.606983 4861 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" Mar 09 09:28:54 crc kubenswrapper[4861]: I0309 09:28:54.607770 4861 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7fdd7f3e15e67ed60e9bdb64538958917f435e5fb6449fedcf993ae2c627b46e"} pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 09 09:28:54 crc kubenswrapper[4861]: I0309 09:28:54.607836 4861 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" containerName="machine-config-daemon" containerID="cri-o://7fdd7f3e15e67ed60e9bdb64538958917f435e5fb6449fedcf993ae2c627b46e" gracePeriod=600 Mar 09 09:28:55 crc kubenswrapper[4861]: I0309 09:28:55.650239 4861 generic.go:334] "Generic (PLEG): container finished" podID="6f7875e3-174f-4c67-8675-d878de74aa4f" containerID="7fdd7f3e15e67ed60e9bdb64538958917f435e5fb6449fedcf993ae2c627b46e" exitCode=0 Mar 09 09:28:55 crc kubenswrapper[4861]: I0309 09:28:55.650276 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" event={"ID":"6f7875e3-174f-4c67-8675-d878de74aa4f","Type":"ContainerDied","Data":"7fdd7f3e15e67ed60e9bdb64538958917f435e5fb6449fedcf993ae2c627b46e"} Mar 09 09:28:55 crc kubenswrapper[4861]: I0309 09:28:55.650854 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" event={"ID":"6f7875e3-174f-4c67-8675-d878de74aa4f","Type":"ContainerStarted","Data":"c0726d3ac822004eacb4f8d12bb4cbaf2815fc9d29aaa8ba7db9d4fae1717ee1"} Mar 09 09:28:55 crc kubenswrapper[4861]: I0309 09:28:55.650876 4861 scope.go:117] "RemoveContainer" containerID="925e707d587470ad152a1a9ef2490c9fccb36de6da22acc63f3054b647081cf1" Mar 09 09:28:57 crc kubenswrapper[4861]: I0309 09:28:57.288557 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-bfb45b47-l8dxz" Mar 09 09:28:57 crc kubenswrapper[4861]: I0309 09:28:57.349108 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7749c44969-c57rg"] Mar 09 09:28:57 crc kubenswrapper[4861]: I0309 09:28:57.349754 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7749c44969-c57rg" podUID="18b03e1a-4375-4f3e-ab0d-17b8498c0146" containerName="dnsmasq-dns" 
containerID="cri-o://fc2a8891cd531d8b1ecf94dd1508de6fdce3bec1719a0a4fdeb47b8730d816f4" gracePeriod=10 Mar 09 09:28:57 crc kubenswrapper[4861]: I0309 09:28:57.496630 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-79fcc958f9-jb5bq"] Mar 09 09:28:57 crc kubenswrapper[4861]: I0309 09:28:57.502586 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79fcc958f9-jb5bq" Mar 09 09:28:57 crc kubenswrapper[4861]: I0309 09:28:57.520142 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79fcc958f9-jb5bq"] Mar 09 09:28:57 crc kubenswrapper[4861]: I0309 09:28:57.581492 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twvwr\" (UniqueName: \"kubernetes.io/projected/780ba45c-97cb-4382-9d7a-268051c773d1-kube-api-access-twvwr\") pod \"dnsmasq-dns-79fcc958f9-jb5bq\" (UID: \"780ba45c-97cb-4382-9d7a-268051c773d1\") " pod="openstack/dnsmasq-dns-79fcc958f9-jb5bq" Mar 09 09:28:57 crc kubenswrapper[4861]: I0309 09:28:57.581561 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/780ba45c-97cb-4382-9d7a-268051c773d1-ovsdbserver-sb\") pod \"dnsmasq-dns-79fcc958f9-jb5bq\" (UID: \"780ba45c-97cb-4382-9d7a-268051c773d1\") " pod="openstack/dnsmasq-dns-79fcc958f9-jb5bq" Mar 09 09:28:57 crc kubenswrapper[4861]: I0309 09:28:57.581592 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/780ba45c-97cb-4382-9d7a-268051c773d1-openstack-edpm-ipam\") pod \"dnsmasq-dns-79fcc958f9-jb5bq\" (UID: \"780ba45c-97cb-4382-9d7a-268051c773d1\") " pod="openstack/dnsmasq-dns-79fcc958f9-jb5bq" Mar 09 09:28:57 crc kubenswrapper[4861]: I0309 09:28:57.581650 4861 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/780ba45c-97cb-4382-9d7a-268051c773d1-dns-swift-storage-0\") pod \"dnsmasq-dns-79fcc958f9-jb5bq\" (UID: \"780ba45c-97cb-4382-9d7a-268051c773d1\") " pod="openstack/dnsmasq-dns-79fcc958f9-jb5bq" Mar 09 09:28:57 crc kubenswrapper[4861]: I0309 09:28:57.582610 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/780ba45c-97cb-4382-9d7a-268051c773d1-config\") pod \"dnsmasq-dns-79fcc958f9-jb5bq\" (UID: \"780ba45c-97cb-4382-9d7a-268051c773d1\") " pod="openstack/dnsmasq-dns-79fcc958f9-jb5bq" Mar 09 09:28:57 crc kubenswrapper[4861]: I0309 09:28:57.582852 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/780ba45c-97cb-4382-9d7a-268051c773d1-dns-svc\") pod \"dnsmasq-dns-79fcc958f9-jb5bq\" (UID: \"780ba45c-97cb-4382-9d7a-268051c773d1\") " pod="openstack/dnsmasq-dns-79fcc958f9-jb5bq" Mar 09 09:28:57 crc kubenswrapper[4861]: I0309 09:28:57.582994 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/780ba45c-97cb-4382-9d7a-268051c773d1-ovsdbserver-nb\") pod \"dnsmasq-dns-79fcc958f9-jb5bq\" (UID: \"780ba45c-97cb-4382-9d7a-268051c773d1\") " pod="openstack/dnsmasq-dns-79fcc958f9-jb5bq" Mar 09 09:28:57 crc kubenswrapper[4861]: I0309 09:28:57.683512 4861 generic.go:334] "Generic (PLEG): container finished" podID="18b03e1a-4375-4f3e-ab0d-17b8498c0146" containerID="fc2a8891cd531d8b1ecf94dd1508de6fdce3bec1719a0a4fdeb47b8730d816f4" exitCode=0 Mar 09 09:28:57 crc kubenswrapper[4861]: I0309 09:28:57.683564 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7749c44969-c57rg" 
event={"ID":"18b03e1a-4375-4f3e-ab0d-17b8498c0146","Type":"ContainerDied","Data":"fc2a8891cd531d8b1ecf94dd1508de6fdce3bec1719a0a4fdeb47b8730d816f4"} Mar 09 09:28:57 crc kubenswrapper[4861]: I0309 09:28:57.684626 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/780ba45c-97cb-4382-9d7a-268051c773d1-config\") pod \"dnsmasq-dns-79fcc958f9-jb5bq\" (UID: \"780ba45c-97cb-4382-9d7a-268051c773d1\") " pod="openstack/dnsmasq-dns-79fcc958f9-jb5bq" Mar 09 09:28:57 crc kubenswrapper[4861]: I0309 09:28:57.684716 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/780ba45c-97cb-4382-9d7a-268051c773d1-dns-svc\") pod \"dnsmasq-dns-79fcc958f9-jb5bq\" (UID: \"780ba45c-97cb-4382-9d7a-268051c773d1\") " pod="openstack/dnsmasq-dns-79fcc958f9-jb5bq" Mar 09 09:28:57 crc kubenswrapper[4861]: I0309 09:28:57.684790 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/780ba45c-97cb-4382-9d7a-268051c773d1-ovsdbserver-nb\") pod \"dnsmasq-dns-79fcc958f9-jb5bq\" (UID: \"780ba45c-97cb-4382-9d7a-268051c773d1\") " pod="openstack/dnsmasq-dns-79fcc958f9-jb5bq" Mar 09 09:28:57 crc kubenswrapper[4861]: I0309 09:28:57.684819 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twvwr\" (UniqueName: \"kubernetes.io/projected/780ba45c-97cb-4382-9d7a-268051c773d1-kube-api-access-twvwr\") pod \"dnsmasq-dns-79fcc958f9-jb5bq\" (UID: \"780ba45c-97cb-4382-9d7a-268051c773d1\") " pod="openstack/dnsmasq-dns-79fcc958f9-jb5bq" Mar 09 09:28:57 crc kubenswrapper[4861]: I0309 09:28:57.684843 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/780ba45c-97cb-4382-9d7a-268051c773d1-ovsdbserver-sb\") pod \"dnsmasq-dns-79fcc958f9-jb5bq\" (UID: 
\"780ba45c-97cb-4382-9d7a-268051c773d1\") " pod="openstack/dnsmasq-dns-79fcc958f9-jb5bq" Mar 09 09:28:57 crc kubenswrapper[4861]: I0309 09:28:57.684872 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/780ba45c-97cb-4382-9d7a-268051c773d1-openstack-edpm-ipam\") pod \"dnsmasq-dns-79fcc958f9-jb5bq\" (UID: \"780ba45c-97cb-4382-9d7a-268051c773d1\") " pod="openstack/dnsmasq-dns-79fcc958f9-jb5bq" Mar 09 09:28:57 crc kubenswrapper[4861]: I0309 09:28:57.684942 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/780ba45c-97cb-4382-9d7a-268051c773d1-dns-swift-storage-0\") pod \"dnsmasq-dns-79fcc958f9-jb5bq\" (UID: \"780ba45c-97cb-4382-9d7a-268051c773d1\") " pod="openstack/dnsmasq-dns-79fcc958f9-jb5bq" Mar 09 09:28:57 crc kubenswrapper[4861]: I0309 09:28:57.686184 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/780ba45c-97cb-4382-9d7a-268051c773d1-dns-swift-storage-0\") pod \"dnsmasq-dns-79fcc958f9-jb5bq\" (UID: \"780ba45c-97cb-4382-9d7a-268051c773d1\") " pod="openstack/dnsmasq-dns-79fcc958f9-jb5bq" Mar 09 09:28:57 crc kubenswrapper[4861]: I0309 09:28:57.686543 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/780ba45c-97cb-4382-9d7a-268051c773d1-ovsdbserver-nb\") pod \"dnsmasq-dns-79fcc958f9-jb5bq\" (UID: \"780ba45c-97cb-4382-9d7a-268051c773d1\") " pod="openstack/dnsmasq-dns-79fcc958f9-jb5bq" Mar 09 09:28:57 crc kubenswrapper[4861]: I0309 09:28:57.686780 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/780ba45c-97cb-4382-9d7a-268051c773d1-ovsdbserver-sb\") pod \"dnsmasq-dns-79fcc958f9-jb5bq\" (UID: \"780ba45c-97cb-4382-9d7a-268051c773d1\") " 
pod="openstack/dnsmasq-dns-79fcc958f9-jb5bq" Mar 09 09:28:57 crc kubenswrapper[4861]: I0309 09:28:57.686927 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/780ba45c-97cb-4382-9d7a-268051c773d1-config\") pod \"dnsmasq-dns-79fcc958f9-jb5bq\" (UID: \"780ba45c-97cb-4382-9d7a-268051c773d1\") " pod="openstack/dnsmasq-dns-79fcc958f9-jb5bq" Mar 09 09:28:57 crc kubenswrapper[4861]: I0309 09:28:57.688156 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/780ba45c-97cb-4382-9d7a-268051c773d1-dns-svc\") pod \"dnsmasq-dns-79fcc958f9-jb5bq\" (UID: \"780ba45c-97cb-4382-9d7a-268051c773d1\") " pod="openstack/dnsmasq-dns-79fcc958f9-jb5bq" Mar 09 09:28:57 crc kubenswrapper[4861]: I0309 09:28:57.689201 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/780ba45c-97cb-4382-9d7a-268051c773d1-openstack-edpm-ipam\") pod \"dnsmasq-dns-79fcc958f9-jb5bq\" (UID: \"780ba45c-97cb-4382-9d7a-268051c773d1\") " pod="openstack/dnsmasq-dns-79fcc958f9-jb5bq" Mar 09 09:28:57 crc kubenswrapper[4861]: I0309 09:28:57.711707 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twvwr\" (UniqueName: \"kubernetes.io/projected/780ba45c-97cb-4382-9d7a-268051c773d1-kube-api-access-twvwr\") pod \"dnsmasq-dns-79fcc958f9-jb5bq\" (UID: \"780ba45c-97cb-4382-9d7a-268051c773d1\") " pod="openstack/dnsmasq-dns-79fcc958f9-jb5bq" Mar 09 09:28:57 crc kubenswrapper[4861]: I0309 09:28:57.838489 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79fcc958f9-jb5bq" Mar 09 09:28:57 crc kubenswrapper[4861]: I0309 09:28:57.894204 4861 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-mmdfr" podUID="379c2378-7f4c-402c-af79-d6b0996dcac3" containerName="registry-server" probeResult="failure" output=< Mar 09 09:28:57 crc kubenswrapper[4861]: timeout: failed to connect service ":50051" within 1s Mar 09 09:28:57 crc kubenswrapper[4861]: > Mar 09 09:28:57 crc kubenswrapper[4861]: I0309 09:28:57.986121 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7749c44969-c57rg" Mar 09 09:28:58 crc kubenswrapper[4861]: I0309 09:28:58.094906 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/18b03e1a-4375-4f3e-ab0d-17b8498c0146-ovsdbserver-nb\") pod \"18b03e1a-4375-4f3e-ab0d-17b8498c0146\" (UID: \"18b03e1a-4375-4f3e-ab0d-17b8498c0146\") " Mar 09 09:28:58 crc kubenswrapper[4861]: I0309 09:28:58.095230 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/18b03e1a-4375-4f3e-ab0d-17b8498c0146-dns-svc\") pod \"18b03e1a-4375-4f3e-ab0d-17b8498c0146\" (UID: \"18b03e1a-4375-4f3e-ab0d-17b8498c0146\") " Mar 09 09:28:58 crc kubenswrapper[4861]: I0309 09:28:58.095316 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/18b03e1a-4375-4f3e-ab0d-17b8498c0146-ovsdbserver-sb\") pod \"18b03e1a-4375-4f3e-ab0d-17b8498c0146\" (UID: \"18b03e1a-4375-4f3e-ab0d-17b8498c0146\") " Mar 09 09:28:58 crc kubenswrapper[4861]: I0309 09:28:58.095351 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lfhxw\" (UniqueName: \"kubernetes.io/projected/18b03e1a-4375-4f3e-ab0d-17b8498c0146-kube-api-access-lfhxw\") 
pod \"18b03e1a-4375-4f3e-ab0d-17b8498c0146\" (UID: \"18b03e1a-4375-4f3e-ab0d-17b8498c0146\") " Mar 09 09:28:58 crc kubenswrapper[4861]: I0309 09:28:58.095435 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18b03e1a-4375-4f3e-ab0d-17b8498c0146-config\") pod \"18b03e1a-4375-4f3e-ab0d-17b8498c0146\" (UID: \"18b03e1a-4375-4f3e-ab0d-17b8498c0146\") " Mar 09 09:28:58 crc kubenswrapper[4861]: I0309 09:28:58.095465 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/18b03e1a-4375-4f3e-ab0d-17b8498c0146-dns-swift-storage-0\") pod \"18b03e1a-4375-4f3e-ab0d-17b8498c0146\" (UID: \"18b03e1a-4375-4f3e-ab0d-17b8498c0146\") " Mar 09 09:28:58 crc kubenswrapper[4861]: I0309 09:28:58.107873 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18b03e1a-4375-4f3e-ab0d-17b8498c0146-kube-api-access-lfhxw" (OuterVolumeSpecName: "kube-api-access-lfhxw") pod "18b03e1a-4375-4f3e-ab0d-17b8498c0146" (UID: "18b03e1a-4375-4f3e-ab0d-17b8498c0146"). InnerVolumeSpecName "kube-api-access-lfhxw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:28:58 crc kubenswrapper[4861]: I0309 09:28:58.149155 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18b03e1a-4375-4f3e-ab0d-17b8498c0146-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "18b03e1a-4375-4f3e-ab0d-17b8498c0146" (UID: "18b03e1a-4375-4f3e-ab0d-17b8498c0146"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:28:58 crc kubenswrapper[4861]: I0309 09:28:58.150179 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18b03e1a-4375-4f3e-ab0d-17b8498c0146-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "18b03e1a-4375-4f3e-ab0d-17b8498c0146" (UID: "18b03e1a-4375-4f3e-ab0d-17b8498c0146"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:28:58 crc kubenswrapper[4861]: I0309 09:28:58.152624 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18b03e1a-4375-4f3e-ab0d-17b8498c0146-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "18b03e1a-4375-4f3e-ab0d-17b8498c0146" (UID: "18b03e1a-4375-4f3e-ab0d-17b8498c0146"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:28:58 crc kubenswrapper[4861]: I0309 09:28:58.154129 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18b03e1a-4375-4f3e-ab0d-17b8498c0146-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "18b03e1a-4375-4f3e-ab0d-17b8498c0146" (UID: "18b03e1a-4375-4f3e-ab0d-17b8498c0146"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:28:58 crc kubenswrapper[4861]: I0309 09:28:58.156188 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18b03e1a-4375-4f3e-ab0d-17b8498c0146-config" (OuterVolumeSpecName: "config") pod "18b03e1a-4375-4f3e-ab0d-17b8498c0146" (UID: "18b03e1a-4375-4f3e-ab0d-17b8498c0146"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:28:58 crc kubenswrapper[4861]: I0309 09:28:58.197450 4861 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/18b03e1a-4375-4f3e-ab0d-17b8498c0146-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 09 09:28:58 crc kubenswrapper[4861]: I0309 09:28:58.197491 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lfhxw\" (UniqueName: \"kubernetes.io/projected/18b03e1a-4375-4f3e-ab0d-17b8498c0146-kube-api-access-lfhxw\") on node \"crc\" DevicePath \"\"" Mar 09 09:28:58 crc kubenswrapper[4861]: I0309 09:28:58.197505 4861 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18b03e1a-4375-4f3e-ab0d-17b8498c0146-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:28:58 crc kubenswrapper[4861]: I0309 09:28:58.197514 4861 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/18b03e1a-4375-4f3e-ab0d-17b8498c0146-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 09 09:28:58 crc kubenswrapper[4861]: I0309 09:28:58.197522 4861 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/18b03e1a-4375-4f3e-ab0d-17b8498c0146-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 09 09:28:58 crc kubenswrapper[4861]: I0309 09:28:58.197529 4861 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/18b03e1a-4375-4f3e-ab0d-17b8498c0146-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 09 09:28:58 crc kubenswrapper[4861]: W0309 09:28:58.320951 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod780ba45c_97cb_4382_9d7a_268051c773d1.slice/crio-75ada6b83717fc0ea12ed28e3a606e0669daa2179eb8c1811a0ebf4602e57474 WatchSource:0}: Error finding 
container 75ada6b83717fc0ea12ed28e3a606e0669daa2179eb8c1811a0ebf4602e57474: Status 404 returned error can't find the container with id 75ada6b83717fc0ea12ed28e3a606e0669daa2179eb8c1811a0ebf4602e57474 Mar 09 09:28:58 crc kubenswrapper[4861]: I0309 09:28:58.323573 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79fcc958f9-jb5bq"] Mar 09 09:28:58 crc kubenswrapper[4861]: I0309 09:28:58.696385 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7749c44969-c57rg" event={"ID":"18b03e1a-4375-4f3e-ab0d-17b8498c0146","Type":"ContainerDied","Data":"da607d03e9a0b78997a0f1f883081a617b3123991eda13760c2fc678683fb87c"} Mar 09 09:28:58 crc kubenswrapper[4861]: I0309 09:28:58.696738 4861 scope.go:117] "RemoveContainer" containerID="fc2a8891cd531d8b1ecf94dd1508de6fdce3bec1719a0a4fdeb47b8730d816f4" Mar 09 09:28:58 crc kubenswrapper[4861]: I0309 09:28:58.696402 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7749c44969-c57rg" Mar 09 09:28:58 crc kubenswrapper[4861]: I0309 09:28:58.698956 4861 generic.go:334] "Generic (PLEG): container finished" podID="780ba45c-97cb-4382-9d7a-268051c773d1" containerID="e2cd6904b6c9791d98dbb41d7feedf3b630e529d564a36da1bd523fb5bad4ac2" exitCode=0 Mar 09 09:28:58 crc kubenswrapper[4861]: I0309 09:28:58.698994 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79fcc958f9-jb5bq" event={"ID":"780ba45c-97cb-4382-9d7a-268051c773d1","Type":"ContainerDied","Data":"e2cd6904b6c9791d98dbb41d7feedf3b630e529d564a36da1bd523fb5bad4ac2"} Mar 09 09:28:58 crc kubenswrapper[4861]: I0309 09:28:58.699035 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79fcc958f9-jb5bq" event={"ID":"780ba45c-97cb-4382-9d7a-268051c773d1","Type":"ContainerStarted","Data":"75ada6b83717fc0ea12ed28e3a606e0669daa2179eb8c1811a0ebf4602e57474"} Mar 09 09:28:58 crc kubenswrapper[4861]: I0309 09:28:58.732249 4861 
scope.go:117] "RemoveContainer" containerID="e140a5deae214509e351f908f68fc1d89ef3f4da51f96fa3c7b2a1a6d756b530" Mar 09 09:28:58 crc kubenswrapper[4861]: I0309 09:28:58.924023 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7749c44969-c57rg"] Mar 09 09:28:58 crc kubenswrapper[4861]: I0309 09:28:58.934537 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7749c44969-c57rg"] Mar 09 09:28:59 crc kubenswrapper[4861]: I0309 09:28:59.671988 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18b03e1a-4375-4f3e-ab0d-17b8498c0146" path="/var/lib/kubelet/pods/18b03e1a-4375-4f3e-ab0d-17b8498c0146/volumes" Mar 09 09:28:59 crc kubenswrapper[4861]: I0309 09:28:59.710836 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79fcc958f9-jb5bq" event={"ID":"780ba45c-97cb-4382-9d7a-268051c773d1","Type":"ContainerStarted","Data":"f248c4f116f7bec28b96d752a3b21273d95bf54bb64eee693ca308042da58d1a"} Mar 09 09:28:59 crc kubenswrapper[4861]: I0309 09:28:59.711286 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-79fcc958f9-jb5bq" Mar 09 09:28:59 crc kubenswrapper[4861]: I0309 09:28:59.733614 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-79fcc958f9-jb5bq" podStartSLOduration=2.733210619 podStartE2EDuration="2.733210619s" podCreationTimestamp="2026-03-09 09:28:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:28:59.72688721 +0000 UTC m=+1382.811926621" watchObservedRunningTime="2026-03-09 09:28:59.733210619 +0000 UTC m=+1382.818250020" Mar 09 09:29:04 crc kubenswrapper[4861]: I0309 09:29:04.073074 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-zmd8c"] Mar 09 09:29:04 crc kubenswrapper[4861]: E0309 09:29:04.075598 4861 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18b03e1a-4375-4f3e-ab0d-17b8498c0146" containerName="init" Mar 09 09:29:04 crc kubenswrapper[4861]: I0309 09:29:04.075620 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="18b03e1a-4375-4f3e-ab0d-17b8498c0146" containerName="init" Mar 09 09:29:04 crc kubenswrapper[4861]: E0309 09:29:04.075649 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18b03e1a-4375-4f3e-ab0d-17b8498c0146" containerName="dnsmasq-dns" Mar 09 09:29:04 crc kubenswrapper[4861]: I0309 09:29:04.075657 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="18b03e1a-4375-4f3e-ab0d-17b8498c0146" containerName="dnsmasq-dns" Mar 09 09:29:04 crc kubenswrapper[4861]: I0309 09:29:04.075928 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="18b03e1a-4375-4f3e-ab0d-17b8498c0146" containerName="dnsmasq-dns" Mar 09 09:29:04 crc kubenswrapper[4861]: I0309 09:29:04.077751 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zmd8c" Mar 09 09:29:04 crc kubenswrapper[4861]: I0309 09:29:04.084616 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zmd8c"] Mar 09 09:29:04 crc kubenswrapper[4861]: I0309 09:29:04.215981 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8895c3e-9b4a-424d-965d-756bf35edf5e-utilities\") pod \"community-operators-zmd8c\" (UID: \"c8895c3e-9b4a-424d-965d-756bf35edf5e\") " pod="openshift-marketplace/community-operators-zmd8c" Mar 09 09:29:04 crc kubenswrapper[4861]: I0309 09:29:04.216259 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8sgr\" (UniqueName: \"kubernetes.io/projected/c8895c3e-9b4a-424d-965d-756bf35edf5e-kube-api-access-t8sgr\") pod \"community-operators-zmd8c\" (UID: \"c8895c3e-9b4a-424d-965d-756bf35edf5e\") " pod="openshift-marketplace/community-operators-zmd8c" Mar 09 09:29:04 crc kubenswrapper[4861]: I0309 09:29:04.216463 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8895c3e-9b4a-424d-965d-756bf35edf5e-catalog-content\") pod \"community-operators-zmd8c\" (UID: \"c8895c3e-9b4a-424d-965d-756bf35edf5e\") " pod="openshift-marketplace/community-operators-zmd8c" Mar 09 09:29:04 crc kubenswrapper[4861]: I0309 09:29:04.318785 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8sgr\" (UniqueName: \"kubernetes.io/projected/c8895c3e-9b4a-424d-965d-756bf35edf5e-kube-api-access-t8sgr\") pod \"community-operators-zmd8c\" (UID: \"c8895c3e-9b4a-424d-965d-756bf35edf5e\") " pod="openshift-marketplace/community-operators-zmd8c" Mar 09 09:29:04 crc kubenswrapper[4861]: I0309 09:29:04.319193 4861 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8895c3e-9b4a-424d-965d-756bf35edf5e-catalog-content\") pod \"community-operators-zmd8c\" (UID: \"c8895c3e-9b4a-424d-965d-756bf35edf5e\") " pod="openshift-marketplace/community-operators-zmd8c" Mar 09 09:29:04 crc kubenswrapper[4861]: I0309 09:29:04.319263 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8895c3e-9b4a-424d-965d-756bf35edf5e-utilities\") pod \"community-operators-zmd8c\" (UID: \"c8895c3e-9b4a-424d-965d-756bf35edf5e\") " pod="openshift-marketplace/community-operators-zmd8c" Mar 09 09:29:04 crc kubenswrapper[4861]: I0309 09:29:04.319757 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8895c3e-9b4a-424d-965d-756bf35edf5e-catalog-content\") pod \"community-operators-zmd8c\" (UID: \"c8895c3e-9b4a-424d-965d-756bf35edf5e\") " pod="openshift-marketplace/community-operators-zmd8c" Mar 09 09:29:04 crc kubenswrapper[4861]: I0309 09:29:04.319831 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8895c3e-9b4a-424d-965d-756bf35edf5e-utilities\") pod \"community-operators-zmd8c\" (UID: \"c8895c3e-9b4a-424d-965d-756bf35edf5e\") " pod="openshift-marketplace/community-operators-zmd8c" Mar 09 09:29:04 crc kubenswrapper[4861]: I0309 09:29:04.350800 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8sgr\" (UniqueName: \"kubernetes.io/projected/c8895c3e-9b4a-424d-965d-756bf35edf5e-kube-api-access-t8sgr\") pod \"community-operators-zmd8c\" (UID: \"c8895c3e-9b4a-424d-965d-756bf35edf5e\") " pod="openshift-marketplace/community-operators-zmd8c" Mar 09 09:29:04 crc kubenswrapper[4861]: I0309 09:29:04.411366 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zmd8c" Mar 09 09:29:05 crc kubenswrapper[4861]: W0309 09:29:05.034985 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc8895c3e_9b4a_424d_965d_756bf35edf5e.slice/crio-ddfd72c386cbd06b9e3feb27096651a3074645ddc8a2252a358300037ff38c6f WatchSource:0}: Error finding container ddfd72c386cbd06b9e3feb27096651a3074645ddc8a2252a358300037ff38c6f: Status 404 returned error can't find the container with id ddfd72c386cbd06b9e3feb27096651a3074645ddc8a2252a358300037ff38c6f Mar 09 09:29:05 crc kubenswrapper[4861]: I0309 09:29:05.038768 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zmd8c"] Mar 09 09:29:05 crc kubenswrapper[4861]: I0309 09:29:05.768808 4861 generic.go:334] "Generic (PLEG): container finished" podID="c8895c3e-9b4a-424d-965d-756bf35edf5e" containerID="50dbe922246ee43cb58f53ad06b47ab03fdc45005d2c048b07975a98e5a5b3ad" exitCode=0 Mar 09 09:29:05 crc kubenswrapper[4861]: I0309 09:29:05.768981 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zmd8c" event={"ID":"c8895c3e-9b4a-424d-965d-756bf35edf5e","Type":"ContainerDied","Data":"50dbe922246ee43cb58f53ad06b47ab03fdc45005d2c048b07975a98e5a5b3ad"} Mar 09 09:29:05 crc kubenswrapper[4861]: I0309 09:29:05.769127 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zmd8c" event={"ID":"c8895c3e-9b4a-424d-965d-756bf35edf5e","Type":"ContainerStarted","Data":"ddfd72c386cbd06b9e3feb27096651a3074645ddc8a2252a358300037ff38c6f"} Mar 09 09:29:06 crc kubenswrapper[4861]: I0309 09:29:06.882539 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-mmdfr" Mar 09 09:29:06 crc kubenswrapper[4861]: I0309 09:29:06.929025 4861 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-mmdfr" Mar 09 09:29:07 crc kubenswrapper[4861]: I0309 09:29:07.841069 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-79fcc958f9-jb5bq" Mar 09 09:29:07 crc kubenswrapper[4861]: I0309 09:29:07.917861 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bfb45b47-l8dxz"] Mar 09 09:29:07 crc kubenswrapper[4861]: I0309 09:29:07.918147 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-bfb45b47-l8dxz" podUID="62c19310-bd29-41b1-a8af-7abd3d9a955a" containerName="dnsmasq-dns" containerID="cri-o://5cfa9d18a24201e5896c8400091d3e1685ecdb90f44f5270a15636af8d4e4491" gracePeriod=10 Mar 09 09:29:08 crc kubenswrapper[4861]: I0309 09:29:08.421093 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bfb45b47-l8dxz" Mar 09 09:29:08 crc kubenswrapper[4861]: I0309 09:29:08.451696 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mmdfr"] Mar 09 09:29:08 crc kubenswrapper[4861]: I0309 09:29:08.548024 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62c19310-bd29-41b1-a8af-7abd3d9a955a-config\") pod \"62c19310-bd29-41b1-a8af-7abd3d9a955a\" (UID: \"62c19310-bd29-41b1-a8af-7abd3d9a955a\") " Mar 09 09:29:08 crc kubenswrapper[4861]: I0309 09:29:08.548142 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/62c19310-bd29-41b1-a8af-7abd3d9a955a-openstack-edpm-ipam\") pod \"62c19310-bd29-41b1-a8af-7abd3d9a955a\" (UID: \"62c19310-bd29-41b1-a8af-7abd3d9a955a\") " Mar 09 09:29:08 crc kubenswrapper[4861]: I0309 09:29:08.548803 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-sd74b\" (UniqueName: \"kubernetes.io/projected/62c19310-bd29-41b1-a8af-7abd3d9a955a-kube-api-access-sd74b\") pod \"62c19310-bd29-41b1-a8af-7abd3d9a955a\" (UID: \"62c19310-bd29-41b1-a8af-7abd3d9a955a\") " Mar 09 09:29:08 crc kubenswrapper[4861]: I0309 09:29:08.548866 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/62c19310-bd29-41b1-a8af-7abd3d9a955a-dns-svc\") pod \"62c19310-bd29-41b1-a8af-7abd3d9a955a\" (UID: \"62c19310-bd29-41b1-a8af-7abd3d9a955a\") " Mar 09 09:29:08 crc kubenswrapper[4861]: I0309 09:29:08.548908 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/62c19310-bd29-41b1-a8af-7abd3d9a955a-ovsdbserver-sb\") pod \"62c19310-bd29-41b1-a8af-7abd3d9a955a\" (UID: \"62c19310-bd29-41b1-a8af-7abd3d9a955a\") " Mar 09 09:29:08 crc kubenswrapper[4861]: I0309 09:29:08.548940 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/62c19310-bd29-41b1-a8af-7abd3d9a955a-dns-swift-storage-0\") pod \"62c19310-bd29-41b1-a8af-7abd3d9a955a\" (UID: \"62c19310-bd29-41b1-a8af-7abd3d9a955a\") " Mar 09 09:29:08 crc kubenswrapper[4861]: I0309 09:29:08.549015 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/62c19310-bd29-41b1-a8af-7abd3d9a955a-ovsdbserver-nb\") pod \"62c19310-bd29-41b1-a8af-7abd3d9a955a\" (UID: \"62c19310-bd29-41b1-a8af-7abd3d9a955a\") " Mar 09 09:29:08 crc kubenswrapper[4861]: I0309 09:29:08.555684 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62c19310-bd29-41b1-a8af-7abd3d9a955a-kube-api-access-sd74b" (OuterVolumeSpecName: "kube-api-access-sd74b") pod "62c19310-bd29-41b1-a8af-7abd3d9a955a" (UID: "62c19310-bd29-41b1-a8af-7abd3d9a955a"). 
InnerVolumeSpecName "kube-api-access-sd74b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:29:08 crc kubenswrapper[4861]: I0309 09:29:08.615051 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62c19310-bd29-41b1-a8af-7abd3d9a955a-config" (OuterVolumeSpecName: "config") pod "62c19310-bd29-41b1-a8af-7abd3d9a955a" (UID: "62c19310-bd29-41b1-a8af-7abd3d9a955a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:29:08 crc kubenswrapper[4861]: I0309 09:29:08.621860 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62c19310-bd29-41b1-a8af-7abd3d9a955a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "62c19310-bd29-41b1-a8af-7abd3d9a955a" (UID: "62c19310-bd29-41b1-a8af-7abd3d9a955a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:29:08 crc kubenswrapper[4861]: I0309 09:29:08.622510 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62c19310-bd29-41b1-a8af-7abd3d9a955a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "62c19310-bd29-41b1-a8af-7abd3d9a955a" (UID: "62c19310-bd29-41b1-a8af-7abd3d9a955a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:29:08 crc kubenswrapper[4861]: I0309 09:29:08.622519 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62c19310-bd29-41b1-a8af-7abd3d9a955a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "62c19310-bd29-41b1-a8af-7abd3d9a955a" (UID: "62c19310-bd29-41b1-a8af-7abd3d9a955a"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:29:08 crc kubenswrapper[4861]: I0309 09:29:08.622510 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62c19310-bd29-41b1-a8af-7abd3d9a955a-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "62c19310-bd29-41b1-a8af-7abd3d9a955a" (UID: "62c19310-bd29-41b1-a8af-7abd3d9a955a"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:29:08 crc kubenswrapper[4861]: I0309 09:29:08.623570 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62c19310-bd29-41b1-a8af-7abd3d9a955a-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "62c19310-bd29-41b1-a8af-7abd3d9a955a" (UID: "62c19310-bd29-41b1-a8af-7abd3d9a955a"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:29:08 crc kubenswrapper[4861]: I0309 09:29:08.651358 4861 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/62c19310-bd29-41b1-a8af-7abd3d9a955a-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 09 09:29:08 crc kubenswrapper[4861]: I0309 09:29:08.651707 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sd74b\" (UniqueName: \"kubernetes.io/projected/62c19310-bd29-41b1-a8af-7abd3d9a955a-kube-api-access-sd74b\") on node \"crc\" DevicePath \"\"" Mar 09 09:29:08 crc kubenswrapper[4861]: I0309 09:29:08.651720 4861 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/62c19310-bd29-41b1-a8af-7abd3d9a955a-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 09 09:29:08 crc kubenswrapper[4861]: I0309 09:29:08.651737 4861 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/62c19310-bd29-41b1-a8af-7abd3d9a955a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 09 09:29:08 crc kubenswrapper[4861]: I0309 09:29:08.651749 4861 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/62c19310-bd29-41b1-a8af-7abd3d9a955a-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 09 09:29:08 crc kubenswrapper[4861]: I0309 09:29:08.651761 4861 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/62c19310-bd29-41b1-a8af-7abd3d9a955a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 09 09:29:08 crc kubenswrapper[4861]: I0309 09:29:08.651773 4861 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62c19310-bd29-41b1-a8af-7abd3d9a955a-config\") on node \"crc\" DevicePath \"\"" Mar 09 09:29:08 crc kubenswrapper[4861]: I0309 09:29:08.801225 4861 generic.go:334] "Generic (PLEG): container finished" podID="62c19310-bd29-41b1-a8af-7abd3d9a955a" containerID="5cfa9d18a24201e5896c8400091d3e1685ecdb90f44f5270a15636af8d4e4491" exitCode=0 Mar 09 09:29:08 crc kubenswrapper[4861]: I0309 09:29:08.801460 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-mmdfr" podUID="379c2378-7f4c-402c-af79-d6b0996dcac3" containerName="registry-server" containerID="cri-o://2b418ccea885e78071dee4874397a4fea350a5143e95eb4b5022f394cf1a158f" gracePeriod=2 Mar 09 09:29:08 crc kubenswrapper[4861]: I0309 09:29:08.801763 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bfb45b47-l8dxz" Mar 09 09:29:08 crc kubenswrapper[4861]: I0309 09:29:08.803743 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bfb45b47-l8dxz" event={"ID":"62c19310-bd29-41b1-a8af-7abd3d9a955a","Type":"ContainerDied","Data":"5cfa9d18a24201e5896c8400091d3e1685ecdb90f44f5270a15636af8d4e4491"} Mar 09 09:29:08 crc kubenswrapper[4861]: I0309 09:29:08.803786 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bfb45b47-l8dxz" event={"ID":"62c19310-bd29-41b1-a8af-7abd3d9a955a","Type":"ContainerDied","Data":"7972bf990c882d375d18658851e3f1e639424f0e7dd80bd108bddc33880c5d7f"} Mar 09 09:29:08 crc kubenswrapper[4861]: I0309 09:29:08.803804 4861 scope.go:117] "RemoveContainer" containerID="5cfa9d18a24201e5896c8400091d3e1685ecdb90f44f5270a15636af8d4e4491" Mar 09 09:29:08 crc kubenswrapper[4861]: I0309 09:29:08.839229 4861 scope.go:117] "RemoveContainer" containerID="94d193ce751d030a272f0a73fe4cfcb4680cf49525dcb7786f25552e381d8f7c" Mar 09 09:29:08 crc kubenswrapper[4861]: I0309 09:29:08.854257 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bfb45b47-l8dxz"] Mar 09 09:29:08 crc kubenswrapper[4861]: I0309 09:29:08.869653 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-bfb45b47-l8dxz"] Mar 09 09:29:08 crc kubenswrapper[4861]: I0309 09:29:08.877675 4861 scope.go:117] "RemoveContainer" containerID="5cfa9d18a24201e5896c8400091d3e1685ecdb90f44f5270a15636af8d4e4491" Mar 09 09:29:08 crc kubenswrapper[4861]: E0309 09:29:08.878279 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5cfa9d18a24201e5896c8400091d3e1685ecdb90f44f5270a15636af8d4e4491\": container with ID starting with 5cfa9d18a24201e5896c8400091d3e1685ecdb90f44f5270a15636af8d4e4491 not found: ID does not exist" 
containerID="5cfa9d18a24201e5896c8400091d3e1685ecdb90f44f5270a15636af8d4e4491" Mar 09 09:29:08 crc kubenswrapper[4861]: I0309 09:29:08.878334 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5cfa9d18a24201e5896c8400091d3e1685ecdb90f44f5270a15636af8d4e4491"} err="failed to get container status \"5cfa9d18a24201e5896c8400091d3e1685ecdb90f44f5270a15636af8d4e4491\": rpc error: code = NotFound desc = could not find container \"5cfa9d18a24201e5896c8400091d3e1685ecdb90f44f5270a15636af8d4e4491\": container with ID starting with 5cfa9d18a24201e5896c8400091d3e1685ecdb90f44f5270a15636af8d4e4491 not found: ID does not exist" Mar 09 09:29:08 crc kubenswrapper[4861]: I0309 09:29:08.878362 4861 scope.go:117] "RemoveContainer" containerID="94d193ce751d030a272f0a73fe4cfcb4680cf49525dcb7786f25552e381d8f7c" Mar 09 09:29:08 crc kubenswrapper[4861]: E0309 09:29:08.878723 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94d193ce751d030a272f0a73fe4cfcb4680cf49525dcb7786f25552e381d8f7c\": container with ID starting with 94d193ce751d030a272f0a73fe4cfcb4680cf49525dcb7786f25552e381d8f7c not found: ID does not exist" containerID="94d193ce751d030a272f0a73fe4cfcb4680cf49525dcb7786f25552e381d8f7c" Mar 09 09:29:08 crc kubenswrapper[4861]: I0309 09:29:08.878754 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94d193ce751d030a272f0a73fe4cfcb4680cf49525dcb7786f25552e381d8f7c"} err="failed to get container status \"94d193ce751d030a272f0a73fe4cfcb4680cf49525dcb7786f25552e381d8f7c\": rpc error: code = NotFound desc = could not find container \"94d193ce751d030a272f0a73fe4cfcb4680cf49525dcb7786f25552e381d8f7c\": container with ID starting with 94d193ce751d030a272f0a73fe4cfcb4680cf49525dcb7786f25552e381d8f7c not found: ID does not exist" Mar 09 09:29:09 crc kubenswrapper[4861]: I0309 09:29:09.283061 4861 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mmdfr" Mar 09 09:29:09 crc kubenswrapper[4861]: I0309 09:29:09.368777 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-phjt2\" (UniqueName: \"kubernetes.io/projected/379c2378-7f4c-402c-af79-d6b0996dcac3-kube-api-access-phjt2\") pod \"379c2378-7f4c-402c-af79-d6b0996dcac3\" (UID: \"379c2378-7f4c-402c-af79-d6b0996dcac3\") " Mar 09 09:29:09 crc kubenswrapper[4861]: I0309 09:29:09.369009 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/379c2378-7f4c-402c-af79-d6b0996dcac3-utilities\") pod \"379c2378-7f4c-402c-af79-d6b0996dcac3\" (UID: \"379c2378-7f4c-402c-af79-d6b0996dcac3\") " Mar 09 09:29:09 crc kubenswrapper[4861]: I0309 09:29:09.369034 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/379c2378-7f4c-402c-af79-d6b0996dcac3-catalog-content\") pod \"379c2378-7f4c-402c-af79-d6b0996dcac3\" (UID: \"379c2378-7f4c-402c-af79-d6b0996dcac3\") " Mar 09 09:29:09 crc kubenswrapper[4861]: I0309 09:29:09.370346 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/379c2378-7f4c-402c-af79-d6b0996dcac3-utilities" (OuterVolumeSpecName: "utilities") pod "379c2378-7f4c-402c-af79-d6b0996dcac3" (UID: "379c2378-7f4c-402c-af79-d6b0996dcac3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:29:09 crc kubenswrapper[4861]: I0309 09:29:09.376667 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/379c2378-7f4c-402c-af79-d6b0996dcac3-kube-api-access-phjt2" (OuterVolumeSpecName: "kube-api-access-phjt2") pod "379c2378-7f4c-402c-af79-d6b0996dcac3" (UID: "379c2378-7f4c-402c-af79-d6b0996dcac3"). 
InnerVolumeSpecName "kube-api-access-phjt2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:29:09 crc kubenswrapper[4861]: I0309 09:29:09.471875 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-phjt2\" (UniqueName: \"kubernetes.io/projected/379c2378-7f4c-402c-af79-d6b0996dcac3-kube-api-access-phjt2\") on node \"crc\" DevicePath \"\"" Mar 09 09:29:09 crc kubenswrapper[4861]: I0309 09:29:09.471905 4861 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/379c2378-7f4c-402c-af79-d6b0996dcac3-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 09:29:09 crc kubenswrapper[4861]: I0309 09:29:09.518118 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/379c2378-7f4c-402c-af79-d6b0996dcac3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "379c2378-7f4c-402c-af79-d6b0996dcac3" (UID: "379c2378-7f4c-402c-af79-d6b0996dcac3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:29:09 crc kubenswrapper[4861]: I0309 09:29:09.573456 4861 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/379c2378-7f4c-402c-af79-d6b0996dcac3-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 09:29:09 crc kubenswrapper[4861]: I0309 09:29:09.670394 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62c19310-bd29-41b1-a8af-7abd3d9a955a" path="/var/lib/kubelet/pods/62c19310-bd29-41b1-a8af-7abd3d9a955a/volumes" Mar 09 09:29:09 crc kubenswrapper[4861]: I0309 09:29:09.815107 4861 generic.go:334] "Generic (PLEG): container finished" podID="379c2378-7f4c-402c-af79-d6b0996dcac3" containerID="2b418ccea885e78071dee4874397a4fea350a5143e95eb4b5022f394cf1a158f" exitCode=0 Mar 09 09:29:09 crc kubenswrapper[4861]: I0309 09:29:09.815175 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mmdfr" Mar 09 09:29:09 crc kubenswrapper[4861]: I0309 09:29:09.815196 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mmdfr" event={"ID":"379c2378-7f4c-402c-af79-d6b0996dcac3","Type":"ContainerDied","Data":"2b418ccea885e78071dee4874397a4fea350a5143e95eb4b5022f394cf1a158f"} Mar 09 09:29:09 crc kubenswrapper[4861]: I0309 09:29:09.815749 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mmdfr" event={"ID":"379c2378-7f4c-402c-af79-d6b0996dcac3","Type":"ContainerDied","Data":"5b2b63dc5b67d4b04c7935f828d1d60b0be0193468c0b77b46692f0b917744be"} Mar 09 09:29:09 crc kubenswrapper[4861]: I0309 09:29:09.815771 4861 scope.go:117] "RemoveContainer" containerID="2b418ccea885e78071dee4874397a4fea350a5143e95eb4b5022f394cf1a158f" Mar 09 09:29:09 crc kubenswrapper[4861]: I0309 09:29:09.820618 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zmd8c" event={"ID":"c8895c3e-9b4a-424d-965d-756bf35edf5e","Type":"ContainerStarted","Data":"e662f0f8719bc2347484eddc9d31815f5df3dd6967f8690b0eba57bd2343e80e"} Mar 09 09:29:09 crc kubenswrapper[4861]: I0309 09:29:09.876201 4861 scope.go:117] "RemoveContainer" containerID="48f906d13752529add49e639f53b4060e19162ca9ea057695f8cda449df4cd1a" Mar 09 09:29:09 crc kubenswrapper[4861]: I0309 09:29:09.884499 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mmdfr"] Mar 09 09:29:09 crc kubenswrapper[4861]: I0309 09:29:09.894028 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-mmdfr"] Mar 09 09:29:09 crc kubenswrapper[4861]: I0309 09:29:09.903499 4861 scope.go:117] "RemoveContainer" containerID="c18a92d677bd7dbfbc4f09e60597631bcfad90a45fef48e18855957c7eadd709" Mar 09 09:29:09 crc kubenswrapper[4861]: I0309 09:29:09.926970 4861 scope.go:117] 
"RemoveContainer" containerID="2b418ccea885e78071dee4874397a4fea350a5143e95eb4b5022f394cf1a158f" Mar 09 09:29:09 crc kubenswrapper[4861]: E0309 09:29:09.927504 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b418ccea885e78071dee4874397a4fea350a5143e95eb4b5022f394cf1a158f\": container with ID starting with 2b418ccea885e78071dee4874397a4fea350a5143e95eb4b5022f394cf1a158f not found: ID does not exist" containerID="2b418ccea885e78071dee4874397a4fea350a5143e95eb4b5022f394cf1a158f" Mar 09 09:29:09 crc kubenswrapper[4861]: I0309 09:29:09.927545 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b418ccea885e78071dee4874397a4fea350a5143e95eb4b5022f394cf1a158f"} err="failed to get container status \"2b418ccea885e78071dee4874397a4fea350a5143e95eb4b5022f394cf1a158f\": rpc error: code = NotFound desc = could not find container \"2b418ccea885e78071dee4874397a4fea350a5143e95eb4b5022f394cf1a158f\": container with ID starting with 2b418ccea885e78071dee4874397a4fea350a5143e95eb4b5022f394cf1a158f not found: ID does not exist" Mar 09 09:29:09 crc kubenswrapper[4861]: I0309 09:29:09.927571 4861 scope.go:117] "RemoveContainer" containerID="48f906d13752529add49e639f53b4060e19162ca9ea057695f8cda449df4cd1a" Mar 09 09:29:09 crc kubenswrapper[4861]: E0309 09:29:09.927916 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"48f906d13752529add49e639f53b4060e19162ca9ea057695f8cda449df4cd1a\": container with ID starting with 48f906d13752529add49e639f53b4060e19162ca9ea057695f8cda449df4cd1a not found: ID does not exist" containerID="48f906d13752529add49e639f53b4060e19162ca9ea057695f8cda449df4cd1a" Mar 09 09:29:09 crc kubenswrapper[4861]: I0309 09:29:09.927953 4861 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"48f906d13752529add49e639f53b4060e19162ca9ea057695f8cda449df4cd1a"} err="failed to get container status \"48f906d13752529add49e639f53b4060e19162ca9ea057695f8cda449df4cd1a\": rpc error: code = NotFound desc = could not find container \"48f906d13752529add49e639f53b4060e19162ca9ea057695f8cda449df4cd1a\": container with ID starting with 48f906d13752529add49e639f53b4060e19162ca9ea057695f8cda449df4cd1a not found: ID does not exist" Mar 09 09:29:09 crc kubenswrapper[4861]: I0309 09:29:09.927979 4861 scope.go:117] "RemoveContainer" containerID="c18a92d677bd7dbfbc4f09e60597631bcfad90a45fef48e18855957c7eadd709" Mar 09 09:29:09 crc kubenswrapper[4861]: E0309 09:29:09.928514 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c18a92d677bd7dbfbc4f09e60597631bcfad90a45fef48e18855957c7eadd709\": container with ID starting with c18a92d677bd7dbfbc4f09e60597631bcfad90a45fef48e18855957c7eadd709 not found: ID does not exist" containerID="c18a92d677bd7dbfbc4f09e60597631bcfad90a45fef48e18855957c7eadd709" Mar 09 09:29:09 crc kubenswrapper[4861]: I0309 09:29:09.928545 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c18a92d677bd7dbfbc4f09e60597631bcfad90a45fef48e18855957c7eadd709"} err="failed to get container status \"c18a92d677bd7dbfbc4f09e60597631bcfad90a45fef48e18855957c7eadd709\": rpc error: code = NotFound desc = could not find container \"c18a92d677bd7dbfbc4f09e60597631bcfad90a45fef48e18855957c7eadd709\": container with ID starting with c18a92d677bd7dbfbc4f09e60597631bcfad90a45fef48e18855957c7eadd709 not found: ID does not exist" Mar 09 09:29:10 crc kubenswrapper[4861]: I0309 09:29:10.843865 4861 generic.go:334] "Generic (PLEG): container finished" podID="c8895c3e-9b4a-424d-965d-756bf35edf5e" containerID="e662f0f8719bc2347484eddc9d31815f5df3dd6967f8690b0eba57bd2343e80e" exitCode=0 Mar 09 09:29:10 crc kubenswrapper[4861]: 
I0309 09:29:10.843920 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zmd8c" event={"ID":"c8895c3e-9b4a-424d-965d-756bf35edf5e","Type":"ContainerDied","Data":"e662f0f8719bc2347484eddc9d31815f5df3dd6967f8690b0eba57bd2343e80e"} Mar 09 09:29:10 crc kubenswrapper[4861]: I0309 09:29:10.848138 4861 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 09 09:29:11 crc kubenswrapper[4861]: I0309 09:29:11.670078 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="379c2378-7f4c-402c-af79-d6b0996dcac3" path="/var/lib/kubelet/pods/379c2378-7f4c-402c-af79-d6b0996dcac3/volumes" Mar 09 09:29:11 crc kubenswrapper[4861]: I0309 09:29:11.855190 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zmd8c" event={"ID":"c8895c3e-9b4a-424d-965d-756bf35edf5e","Type":"ContainerStarted","Data":"b65e835d295ca03e181ca77ac501140b8cf3a2b0464aacaa697247d85eb2d305"} Mar 09 09:29:11 crc kubenswrapper[4861]: I0309 09:29:11.882195 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-zmd8c" podStartSLOduration=2.180611425 podStartE2EDuration="7.882168606s" podCreationTimestamp="2026-03-09 09:29:04 +0000 UTC" firstStartedPulling="2026-03-09 09:29:05.77051706 +0000 UTC m=+1388.855556461" lastFinishedPulling="2026-03-09 09:29:11.472074221 +0000 UTC m=+1394.557113642" observedRunningTime="2026-03-09 09:29:11.874517769 +0000 UTC m=+1394.959557200" watchObservedRunningTime="2026-03-09 09:29:11.882168606 +0000 UTC m=+1394.967208007" Mar 09 09:29:14 crc kubenswrapper[4861]: I0309 09:29:14.412389 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-zmd8c" Mar 09 09:29:14 crc kubenswrapper[4861]: I0309 09:29:14.412938 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/community-operators-zmd8c" Mar 09 09:29:14 crc kubenswrapper[4861]: I0309 09:29:14.457480 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-zmd8c" Mar 09 09:29:17 crc kubenswrapper[4861]: I0309 09:29:17.914618 4861 generic.go:334] "Generic (PLEG): container finished" podID="2c3f8770-f9a3-49ae-81e0-caad7b40ac46" containerID="fdfa8739d3ec136badc411ee9b64de65893baab74b8c993e9d438bc06c96f073" exitCode=0 Mar 09 09:29:17 crc kubenswrapper[4861]: I0309 09:29:17.914719 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"2c3f8770-f9a3-49ae-81e0-caad7b40ac46","Type":"ContainerDied","Data":"fdfa8739d3ec136badc411ee9b64de65893baab74b8c993e9d438bc06c96f073"} Mar 09 09:29:17 crc kubenswrapper[4861]: I0309 09:29:17.917875 4861 generic.go:334] "Generic (PLEG): container finished" podID="36ab59d0-e730-43a5-a7f1-99f136e5f9d3" containerID="e9d3981ac87148061872c32306fba139c5910bb559f093d164f9796913a623dd" exitCode=0 Mar 09 09:29:17 crc kubenswrapper[4861]: I0309 09:29:17.917931 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"36ab59d0-e730-43a5-a7f1-99f136e5f9d3","Type":"ContainerDied","Data":"e9d3981ac87148061872c32306fba139c5910bb559f093d164f9796913a623dd"} Mar 09 09:29:18 crc kubenswrapper[4861]: I0309 09:29:18.928736 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"2c3f8770-f9a3-49ae-81e0-caad7b40ac46","Type":"ContainerStarted","Data":"98d933e34116ca3349506611e3da58be348128e4598630faaf15d65e488ce03a"} Mar 09 09:29:18 crc kubenswrapper[4861]: I0309 09:29:18.930345 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 09 09:29:18 crc kubenswrapper[4861]: I0309 09:29:18.932938 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"36ab59d0-e730-43a5-a7f1-99f136e5f9d3","Type":"ContainerStarted","Data":"2dbfd7338409e36c5246f69f0930c853267cc1b950cca7d86cbf71c8c485e795"} Mar 09 09:29:18 crc kubenswrapper[4861]: I0309 09:29:18.934855 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 09 09:29:18 crc kubenswrapper[4861]: I0309 09:29:18.975902 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=36.975876569 podStartE2EDuration="36.975876569s" podCreationTimestamp="2026-03-09 09:28:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:29:18.961102492 +0000 UTC m=+1402.046141913" watchObservedRunningTime="2026-03-09 09:29:18.975876569 +0000 UTC m=+1402.060915970" Mar 09 09:29:18 crc kubenswrapper[4861]: I0309 09:29:18.984756 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=36.984738301 podStartE2EDuration="36.984738301s" podCreationTimestamp="2026-03-09 09:28:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 09:29:18.984478253 +0000 UTC m=+1402.069517684" watchObservedRunningTime="2026-03-09 09:29:18.984738301 +0000 UTC m=+1402.069777702" Mar 09 09:29:20 crc kubenswrapper[4861]: I0309 09:29:20.514659 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-2pv7c"] Mar 09 09:29:20 crc kubenswrapper[4861]: E0309 09:29:20.515495 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="379c2378-7f4c-402c-af79-d6b0996dcac3" containerName="extract-utilities" Mar 09 09:29:20 crc kubenswrapper[4861]: I0309 09:29:20.515512 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="379c2378-7f4c-402c-af79-d6b0996dcac3" 
containerName="extract-utilities" Mar 09 09:29:20 crc kubenswrapper[4861]: E0309 09:29:20.515528 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62c19310-bd29-41b1-a8af-7abd3d9a955a" containerName="dnsmasq-dns" Mar 09 09:29:20 crc kubenswrapper[4861]: I0309 09:29:20.515535 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="62c19310-bd29-41b1-a8af-7abd3d9a955a" containerName="dnsmasq-dns" Mar 09 09:29:20 crc kubenswrapper[4861]: E0309 09:29:20.515550 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62c19310-bd29-41b1-a8af-7abd3d9a955a" containerName="init" Mar 09 09:29:20 crc kubenswrapper[4861]: I0309 09:29:20.515559 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="62c19310-bd29-41b1-a8af-7abd3d9a955a" containerName="init" Mar 09 09:29:20 crc kubenswrapper[4861]: E0309 09:29:20.515583 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="379c2378-7f4c-402c-af79-d6b0996dcac3" containerName="extract-content" Mar 09 09:29:20 crc kubenswrapper[4861]: I0309 09:29:20.515592 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="379c2378-7f4c-402c-af79-d6b0996dcac3" containerName="extract-content" Mar 09 09:29:20 crc kubenswrapper[4861]: E0309 09:29:20.515604 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="379c2378-7f4c-402c-af79-d6b0996dcac3" containerName="registry-server" Mar 09 09:29:20 crc kubenswrapper[4861]: I0309 09:29:20.515611 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="379c2378-7f4c-402c-af79-d6b0996dcac3" containerName="registry-server" Mar 09 09:29:20 crc kubenswrapper[4861]: I0309 09:29:20.515840 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="62c19310-bd29-41b1-a8af-7abd3d9a955a" containerName="dnsmasq-dns" Mar 09 09:29:20 crc kubenswrapper[4861]: I0309 09:29:20.515857 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="379c2378-7f4c-402c-af79-d6b0996dcac3" containerName="registry-server" Mar 
09 09:29:20 crc kubenswrapper[4861]: I0309 09:29:20.519755 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2pv7c" Mar 09 09:29:20 crc kubenswrapper[4861]: I0309 09:29:20.556991 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2pv7c"] Mar 09 09:29:20 crc kubenswrapper[4861]: I0309 09:29:20.602800 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6xmh\" (UniqueName: \"kubernetes.io/projected/9cc96eed-6d35-4ea5-a5a4-5ca80a1952df-kube-api-access-k6xmh\") pod \"redhat-marketplace-2pv7c\" (UID: \"9cc96eed-6d35-4ea5-a5a4-5ca80a1952df\") " pod="openshift-marketplace/redhat-marketplace-2pv7c" Mar 09 09:29:20 crc kubenswrapper[4861]: I0309 09:29:20.602938 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9cc96eed-6d35-4ea5-a5a4-5ca80a1952df-utilities\") pod \"redhat-marketplace-2pv7c\" (UID: \"9cc96eed-6d35-4ea5-a5a4-5ca80a1952df\") " pod="openshift-marketplace/redhat-marketplace-2pv7c" Mar 09 09:29:20 crc kubenswrapper[4861]: I0309 09:29:20.602970 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9cc96eed-6d35-4ea5-a5a4-5ca80a1952df-catalog-content\") pod \"redhat-marketplace-2pv7c\" (UID: \"9cc96eed-6d35-4ea5-a5a4-5ca80a1952df\") " pod="openshift-marketplace/redhat-marketplace-2pv7c" Mar 09 09:29:20 crc kubenswrapper[4861]: I0309 09:29:20.704821 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9cc96eed-6d35-4ea5-a5a4-5ca80a1952df-utilities\") pod \"redhat-marketplace-2pv7c\" (UID: \"9cc96eed-6d35-4ea5-a5a4-5ca80a1952df\") " pod="openshift-marketplace/redhat-marketplace-2pv7c" Mar 09 
09:29:20 crc kubenswrapper[4861]: I0309 09:29:20.704894 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9cc96eed-6d35-4ea5-a5a4-5ca80a1952df-catalog-content\") pod \"redhat-marketplace-2pv7c\" (UID: \"9cc96eed-6d35-4ea5-a5a4-5ca80a1952df\") " pod="openshift-marketplace/redhat-marketplace-2pv7c" Mar 09 09:29:20 crc kubenswrapper[4861]: I0309 09:29:20.704999 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k6xmh\" (UniqueName: \"kubernetes.io/projected/9cc96eed-6d35-4ea5-a5a4-5ca80a1952df-kube-api-access-k6xmh\") pod \"redhat-marketplace-2pv7c\" (UID: \"9cc96eed-6d35-4ea5-a5a4-5ca80a1952df\") " pod="openshift-marketplace/redhat-marketplace-2pv7c" Mar 09 09:29:20 crc kubenswrapper[4861]: I0309 09:29:20.705610 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9cc96eed-6d35-4ea5-a5a4-5ca80a1952df-utilities\") pod \"redhat-marketplace-2pv7c\" (UID: \"9cc96eed-6d35-4ea5-a5a4-5ca80a1952df\") " pod="openshift-marketplace/redhat-marketplace-2pv7c" Mar 09 09:29:20 crc kubenswrapper[4861]: I0309 09:29:20.705998 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9cc96eed-6d35-4ea5-a5a4-5ca80a1952df-catalog-content\") pod \"redhat-marketplace-2pv7c\" (UID: \"9cc96eed-6d35-4ea5-a5a4-5ca80a1952df\") " pod="openshift-marketplace/redhat-marketplace-2pv7c" Mar 09 09:29:20 crc kubenswrapper[4861]: I0309 09:29:20.729647 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6xmh\" (UniqueName: \"kubernetes.io/projected/9cc96eed-6d35-4ea5-a5a4-5ca80a1952df-kube-api-access-k6xmh\") pod \"redhat-marketplace-2pv7c\" (UID: \"9cc96eed-6d35-4ea5-a5a4-5ca80a1952df\") " pod="openshift-marketplace/redhat-marketplace-2pv7c" Mar 09 09:29:20 crc kubenswrapper[4861]: 
I0309 09:29:20.861414 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2pv7c" Mar 09 09:29:21 crc kubenswrapper[4861]: I0309 09:29:21.003641 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-kfh6m"] Mar 09 09:29:21 crc kubenswrapper[4861]: I0309 09:29:21.005539 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-kfh6m" Mar 09 09:29:21 crc kubenswrapper[4861]: I0309 09:29:21.009446 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lkd5q" Mar 09 09:29:21 crc kubenswrapper[4861]: I0309 09:29:21.011600 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 09 09:29:21 crc kubenswrapper[4861]: I0309 09:29:21.011677 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 09 09:29:21 crc kubenswrapper[4861]: I0309 09:29:21.011953 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 09 09:29:21 crc kubenswrapper[4861]: I0309 09:29:21.019977 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-kfh6m"] Mar 09 09:29:21 crc kubenswrapper[4861]: I0309 09:29:21.114944 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/438a18ff-fdc3-44f3-9c51-df15a691c389-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-kfh6m\" (UID: \"438a18ff-fdc3-44f3-9c51-df15a691c389\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-kfh6m" Mar 09 09:29:21 crc kubenswrapper[4861]: I0309 09:29:21.115681 4861 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/438a18ff-fdc3-44f3-9c51-df15a691c389-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-kfh6m\" (UID: \"438a18ff-fdc3-44f3-9c51-df15a691c389\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-kfh6m" Mar 09 09:29:21 crc kubenswrapper[4861]: I0309 09:29:21.115817 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdg7p\" (UniqueName: \"kubernetes.io/projected/438a18ff-fdc3-44f3-9c51-df15a691c389-kube-api-access-tdg7p\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-kfh6m\" (UID: \"438a18ff-fdc3-44f3-9c51-df15a691c389\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-kfh6m" Mar 09 09:29:21 crc kubenswrapper[4861]: I0309 09:29:21.116060 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/438a18ff-fdc3-44f3-9c51-df15a691c389-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-kfh6m\" (UID: \"438a18ff-fdc3-44f3-9c51-df15a691c389\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-kfh6m" Mar 09 09:29:21 crc kubenswrapper[4861]: I0309 09:29:21.217919 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/438a18ff-fdc3-44f3-9c51-df15a691c389-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-kfh6m\" (UID: \"438a18ff-fdc3-44f3-9c51-df15a691c389\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-kfh6m" Mar 09 09:29:21 crc kubenswrapper[4861]: I0309 09:29:21.218011 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/438a18ff-fdc3-44f3-9c51-df15a691c389-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-kfh6m\" (UID: \"438a18ff-fdc3-44f3-9c51-df15a691c389\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-kfh6m" Mar 09 09:29:21 crc kubenswrapper[4861]: I0309 09:29:21.218053 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdg7p\" (UniqueName: \"kubernetes.io/projected/438a18ff-fdc3-44f3-9c51-df15a691c389-kube-api-access-tdg7p\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-kfh6m\" (UID: \"438a18ff-fdc3-44f3-9c51-df15a691c389\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-kfh6m" Mar 09 09:29:21 crc kubenswrapper[4861]: I0309 09:29:21.218136 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/438a18ff-fdc3-44f3-9c51-df15a691c389-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-kfh6m\" (UID: \"438a18ff-fdc3-44f3-9c51-df15a691c389\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-kfh6m" Mar 09 09:29:21 crc kubenswrapper[4861]: I0309 09:29:21.225192 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/438a18ff-fdc3-44f3-9c51-df15a691c389-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-kfh6m\" (UID: \"438a18ff-fdc3-44f3-9c51-df15a691c389\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-kfh6m" Mar 09 09:29:21 crc kubenswrapper[4861]: I0309 09:29:21.227248 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/438a18ff-fdc3-44f3-9c51-df15a691c389-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-kfh6m\" (UID: \"438a18ff-fdc3-44f3-9c51-df15a691c389\") " 
pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-kfh6m" Mar 09 09:29:21 crc kubenswrapper[4861]: I0309 09:29:21.235333 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/438a18ff-fdc3-44f3-9c51-df15a691c389-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-kfh6m\" (UID: \"438a18ff-fdc3-44f3-9c51-df15a691c389\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-kfh6m" Mar 09 09:29:21 crc kubenswrapper[4861]: I0309 09:29:21.239242 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdg7p\" (UniqueName: \"kubernetes.io/projected/438a18ff-fdc3-44f3-9c51-df15a691c389-kube-api-access-tdg7p\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-kfh6m\" (UID: \"438a18ff-fdc3-44f3-9c51-df15a691c389\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-kfh6m" Mar 09 09:29:21 crc kubenswrapper[4861]: I0309 09:29:21.363185 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-kfh6m" Mar 09 09:29:21 crc kubenswrapper[4861]: I0309 09:29:21.457662 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2pv7c"] Mar 09 09:29:21 crc kubenswrapper[4861]: W0309 09:29:21.486868 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9cc96eed_6d35_4ea5_a5a4_5ca80a1952df.slice/crio-cbd59fdc460c29b01f6e43a91771a418e06e35afc53a60c26db5185d34a14ea0 WatchSource:0}: Error finding container cbd59fdc460c29b01f6e43a91771a418e06e35afc53a60c26db5185d34a14ea0: Status 404 returned error can't find the container with id cbd59fdc460c29b01f6e43a91771a418e06e35afc53a60c26db5185d34a14ea0 Mar 09 09:29:21 crc kubenswrapper[4861]: I0309 09:29:21.962923 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-kfh6m"] Mar 09 09:29:21 crc kubenswrapper[4861]: I0309 09:29:21.970600 4861 generic.go:334] "Generic (PLEG): container finished" podID="9cc96eed-6d35-4ea5-a5a4-5ca80a1952df" containerID="89b19c05237812d79a0c7639b4e880405de7977335281eb035159a4a2184e91f" exitCode=0 Mar 09 09:29:21 crc kubenswrapper[4861]: I0309 09:29:21.970645 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2pv7c" event={"ID":"9cc96eed-6d35-4ea5-a5a4-5ca80a1952df","Type":"ContainerDied","Data":"89b19c05237812d79a0c7639b4e880405de7977335281eb035159a4a2184e91f"} Mar 09 09:29:21 crc kubenswrapper[4861]: I0309 09:29:21.970669 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2pv7c" event={"ID":"9cc96eed-6d35-4ea5-a5a4-5ca80a1952df","Type":"ContainerStarted","Data":"cbd59fdc460c29b01f6e43a91771a418e06e35afc53a60c26db5185d34a14ea0"} Mar 09 09:29:22 crc kubenswrapper[4861]: I0309 09:29:22.723627 4861 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-marketplace/certified-operators-25rz6"] Mar 09 09:29:22 crc kubenswrapper[4861]: I0309 09:29:22.728174 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-25rz6" Mar 09 09:29:22 crc kubenswrapper[4861]: I0309 09:29:22.755451 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-25rz6"] Mar 09 09:29:22 crc kubenswrapper[4861]: I0309 09:29:22.756629 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8l6p\" (UniqueName: \"kubernetes.io/projected/df80ee46-58e5-462b-a44f-d8d3e3a4a7b4-kube-api-access-q8l6p\") pod \"certified-operators-25rz6\" (UID: \"df80ee46-58e5-462b-a44f-d8d3e3a4a7b4\") " pod="openshift-marketplace/certified-operators-25rz6" Mar 09 09:29:22 crc kubenswrapper[4861]: I0309 09:29:22.756701 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df80ee46-58e5-462b-a44f-d8d3e3a4a7b4-catalog-content\") pod \"certified-operators-25rz6\" (UID: \"df80ee46-58e5-462b-a44f-d8d3e3a4a7b4\") " pod="openshift-marketplace/certified-operators-25rz6" Mar 09 09:29:22 crc kubenswrapper[4861]: I0309 09:29:22.756833 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df80ee46-58e5-462b-a44f-d8d3e3a4a7b4-utilities\") pod \"certified-operators-25rz6\" (UID: \"df80ee46-58e5-462b-a44f-d8d3e3a4a7b4\") " pod="openshift-marketplace/certified-operators-25rz6" Mar 09 09:29:22 crc kubenswrapper[4861]: I0309 09:29:22.858555 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df80ee46-58e5-462b-a44f-d8d3e3a4a7b4-utilities\") pod \"certified-operators-25rz6\" (UID: 
\"df80ee46-58e5-462b-a44f-d8d3e3a4a7b4\") " pod="openshift-marketplace/certified-operators-25rz6" Mar 09 09:29:22 crc kubenswrapper[4861]: I0309 09:29:22.858610 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8l6p\" (UniqueName: \"kubernetes.io/projected/df80ee46-58e5-462b-a44f-d8d3e3a4a7b4-kube-api-access-q8l6p\") pod \"certified-operators-25rz6\" (UID: \"df80ee46-58e5-462b-a44f-d8d3e3a4a7b4\") " pod="openshift-marketplace/certified-operators-25rz6" Mar 09 09:29:22 crc kubenswrapper[4861]: I0309 09:29:22.858670 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df80ee46-58e5-462b-a44f-d8d3e3a4a7b4-catalog-content\") pod \"certified-operators-25rz6\" (UID: \"df80ee46-58e5-462b-a44f-d8d3e3a4a7b4\") " pod="openshift-marketplace/certified-operators-25rz6" Mar 09 09:29:22 crc kubenswrapper[4861]: I0309 09:29:22.859146 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df80ee46-58e5-462b-a44f-d8d3e3a4a7b4-utilities\") pod \"certified-operators-25rz6\" (UID: \"df80ee46-58e5-462b-a44f-d8d3e3a4a7b4\") " pod="openshift-marketplace/certified-operators-25rz6" Mar 09 09:29:22 crc kubenswrapper[4861]: I0309 09:29:22.859364 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df80ee46-58e5-462b-a44f-d8d3e3a4a7b4-catalog-content\") pod \"certified-operators-25rz6\" (UID: \"df80ee46-58e5-462b-a44f-d8d3e3a4a7b4\") " pod="openshift-marketplace/certified-operators-25rz6" Mar 09 09:29:22 crc kubenswrapper[4861]: I0309 09:29:22.881625 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8l6p\" (UniqueName: \"kubernetes.io/projected/df80ee46-58e5-462b-a44f-d8d3e3a4a7b4-kube-api-access-q8l6p\") pod \"certified-operators-25rz6\" (UID: 
\"df80ee46-58e5-462b-a44f-d8d3e3a4a7b4\") " pod="openshift-marketplace/certified-operators-25rz6" Mar 09 09:29:22 crc kubenswrapper[4861]: I0309 09:29:22.985163 4861 generic.go:334] "Generic (PLEG): container finished" podID="9cc96eed-6d35-4ea5-a5a4-5ca80a1952df" containerID="e2931393291b3fefcae9839447bdcd0d5e32ed9f490b24928b42e493186645af" exitCode=0 Mar 09 09:29:22 crc kubenswrapper[4861]: I0309 09:29:22.985248 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2pv7c" event={"ID":"9cc96eed-6d35-4ea5-a5a4-5ca80a1952df","Type":"ContainerDied","Data":"e2931393291b3fefcae9839447bdcd0d5e32ed9f490b24928b42e493186645af"} Mar 09 09:29:22 crc kubenswrapper[4861]: I0309 09:29:22.990923 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-kfh6m" event={"ID":"438a18ff-fdc3-44f3-9c51-df15a691c389","Type":"ContainerStarted","Data":"cad885fe85510965ad0015886ad312a6bc18324569e55625d267ee8a3051e352"} Mar 09 09:29:23 crc kubenswrapper[4861]: I0309 09:29:23.074683 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-25rz6" Mar 09 09:29:23 crc kubenswrapper[4861]: I0309 09:29:23.647578 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-25rz6"] Mar 09 09:29:23 crc kubenswrapper[4861]: W0309 09:29:23.652943 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddf80ee46_58e5_462b_a44f_d8d3e3a4a7b4.slice/crio-5923acf02e37ea7eeca840b15a6e2bd201edab98fad672979b123886238246cc WatchSource:0}: Error finding container 5923acf02e37ea7eeca840b15a6e2bd201edab98fad672979b123886238246cc: Status 404 returned error can't find the container with id 5923acf02e37ea7eeca840b15a6e2bd201edab98fad672979b123886238246cc Mar 09 09:29:24 crc kubenswrapper[4861]: I0309 09:29:24.002304 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-25rz6" event={"ID":"df80ee46-58e5-462b-a44f-d8d3e3a4a7b4","Type":"ContainerStarted","Data":"912cf97c9b671c317eeba747d3f4ae15d1110a8a88695767569f41487a419264"} Mar 09 09:29:24 crc kubenswrapper[4861]: I0309 09:29:24.002670 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-25rz6" event={"ID":"df80ee46-58e5-462b-a44f-d8d3e3a4a7b4","Type":"ContainerStarted","Data":"5923acf02e37ea7eeca840b15a6e2bd201edab98fad672979b123886238246cc"} Mar 09 09:29:24 crc kubenswrapper[4861]: I0309 09:29:24.487065 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-zmd8c" Mar 09 09:29:25 crc kubenswrapper[4861]: I0309 09:29:25.020357 4861 generic.go:334] "Generic (PLEG): container finished" podID="df80ee46-58e5-462b-a44f-d8d3e3a4a7b4" containerID="912cf97c9b671c317eeba747d3f4ae15d1110a8a88695767569f41487a419264" exitCode=0 Mar 09 09:29:25 crc kubenswrapper[4861]: I0309 09:29:25.020518 4861 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/certified-operators-25rz6" event={"ID":"df80ee46-58e5-462b-a44f-d8d3e3a4a7b4","Type":"ContainerDied","Data":"912cf97c9b671c317eeba747d3f4ae15d1110a8a88695767569f41487a419264"} Mar 09 09:29:25 crc kubenswrapper[4861]: I0309 09:29:25.024297 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2pv7c" event={"ID":"9cc96eed-6d35-4ea5-a5a4-5ca80a1952df","Type":"ContainerStarted","Data":"4891083edfb95fc1b472af63915adcd2c67ebd11e711cebe3cd53a4709c9530a"} Mar 09 09:29:25 crc kubenswrapper[4861]: I0309 09:29:25.063343 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-2pv7c" podStartSLOduration=3.297173865 podStartE2EDuration="5.06332654s" podCreationTimestamp="2026-03-09 09:29:20 +0000 UTC" firstStartedPulling="2026-03-09 09:29:21.978960765 +0000 UTC m=+1405.064000166" lastFinishedPulling="2026-03-09 09:29:23.74511344 +0000 UTC m=+1406.830152841" observedRunningTime="2026-03-09 09:29:25.060438048 +0000 UTC m=+1408.145477449" watchObservedRunningTime="2026-03-09 09:29:25.06332654 +0000 UTC m=+1408.148365941" Mar 09 09:29:25 crc kubenswrapper[4861]: I0309 09:29:25.110817 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zmd8c"] Mar 09 09:29:25 crc kubenswrapper[4861]: I0309 09:29:25.111079 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-zmd8c" podUID="c8895c3e-9b4a-424d-965d-756bf35edf5e" containerName="registry-server" containerID="cri-o://b65e835d295ca03e181ca77ac501140b8cf3a2b0464aacaa697247d85eb2d305" gracePeriod=2 Mar 09 09:29:25 crc kubenswrapper[4861]: I0309 09:29:25.618537 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zmd8c" Mar 09 09:29:25 crc kubenswrapper[4861]: I0309 09:29:25.719460 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8895c3e-9b4a-424d-965d-756bf35edf5e-utilities\") pod \"c8895c3e-9b4a-424d-965d-756bf35edf5e\" (UID: \"c8895c3e-9b4a-424d-965d-756bf35edf5e\") " Mar 09 09:29:25 crc kubenswrapper[4861]: I0309 09:29:25.719544 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8895c3e-9b4a-424d-965d-756bf35edf5e-catalog-content\") pod \"c8895c3e-9b4a-424d-965d-756bf35edf5e\" (UID: \"c8895c3e-9b4a-424d-965d-756bf35edf5e\") " Mar 09 09:29:25 crc kubenswrapper[4861]: I0309 09:29:25.720348 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8895c3e-9b4a-424d-965d-756bf35edf5e-utilities" (OuterVolumeSpecName: "utilities") pod "c8895c3e-9b4a-424d-965d-756bf35edf5e" (UID: "c8895c3e-9b4a-424d-965d-756bf35edf5e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:29:25 crc kubenswrapper[4861]: I0309 09:29:25.726438 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8895c3e-9b4a-424d-965d-756bf35edf5e-kube-api-access-t8sgr" (OuterVolumeSpecName: "kube-api-access-t8sgr") pod "c8895c3e-9b4a-424d-965d-756bf35edf5e" (UID: "c8895c3e-9b4a-424d-965d-756bf35edf5e"). InnerVolumeSpecName "kube-api-access-t8sgr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:29:25 crc kubenswrapper[4861]: I0309 09:29:25.719605 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t8sgr\" (UniqueName: \"kubernetes.io/projected/c8895c3e-9b4a-424d-965d-756bf35edf5e-kube-api-access-t8sgr\") pod \"c8895c3e-9b4a-424d-965d-756bf35edf5e\" (UID: \"c8895c3e-9b4a-424d-965d-756bf35edf5e\") " Mar 09 09:29:25 crc kubenswrapper[4861]: I0309 09:29:25.741719 4861 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8895c3e-9b4a-424d-965d-756bf35edf5e-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 09:29:25 crc kubenswrapper[4861]: I0309 09:29:25.741754 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t8sgr\" (UniqueName: \"kubernetes.io/projected/c8895c3e-9b4a-424d-965d-756bf35edf5e-kube-api-access-t8sgr\") on node \"crc\" DevicePath \"\"" Mar 09 09:29:25 crc kubenswrapper[4861]: I0309 09:29:25.777055 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8895c3e-9b4a-424d-965d-756bf35edf5e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c8895c3e-9b4a-424d-965d-756bf35edf5e" (UID: "c8895c3e-9b4a-424d-965d-756bf35edf5e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:29:25 crc kubenswrapper[4861]: I0309 09:29:25.844318 4861 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8895c3e-9b4a-424d-965d-756bf35edf5e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 09:29:26 crc kubenswrapper[4861]: I0309 09:29:26.042896 4861 generic.go:334] "Generic (PLEG): container finished" podID="c8895c3e-9b4a-424d-965d-756bf35edf5e" containerID="b65e835d295ca03e181ca77ac501140b8cf3a2b0464aacaa697247d85eb2d305" exitCode=0 Mar 09 09:29:26 crc kubenswrapper[4861]: I0309 09:29:26.044318 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zmd8c" Mar 09 09:29:26 crc kubenswrapper[4861]: I0309 09:29:26.044455 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zmd8c" event={"ID":"c8895c3e-9b4a-424d-965d-756bf35edf5e","Type":"ContainerDied","Data":"b65e835d295ca03e181ca77ac501140b8cf3a2b0464aacaa697247d85eb2d305"} Mar 09 09:29:26 crc kubenswrapper[4861]: I0309 09:29:26.044499 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zmd8c" event={"ID":"c8895c3e-9b4a-424d-965d-756bf35edf5e","Type":"ContainerDied","Data":"ddfd72c386cbd06b9e3feb27096651a3074645ddc8a2252a358300037ff38c6f"} Mar 09 09:29:26 crc kubenswrapper[4861]: I0309 09:29:26.044523 4861 scope.go:117] "RemoveContainer" containerID="b65e835d295ca03e181ca77ac501140b8cf3a2b0464aacaa697247d85eb2d305" Mar 09 09:29:26 crc kubenswrapper[4861]: I0309 09:29:26.099892 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zmd8c"] Mar 09 09:29:26 crc kubenswrapper[4861]: I0309 09:29:26.102198 4861 scope.go:117] "RemoveContainer" containerID="e662f0f8719bc2347484eddc9d31815f5df3dd6967f8690b0eba57bd2343e80e" Mar 09 09:29:26 crc kubenswrapper[4861]: 
I0309 09:29:26.110869 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-zmd8c"] Mar 09 09:29:26 crc kubenswrapper[4861]: I0309 09:29:26.147119 4861 scope.go:117] "RemoveContainer" containerID="50dbe922246ee43cb58f53ad06b47ab03fdc45005d2c048b07975a98e5a5b3ad" Mar 09 09:29:26 crc kubenswrapper[4861]: I0309 09:29:26.182066 4861 scope.go:117] "RemoveContainer" containerID="b65e835d295ca03e181ca77ac501140b8cf3a2b0464aacaa697247d85eb2d305" Mar 09 09:29:26 crc kubenswrapper[4861]: E0309 09:29:26.182649 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b65e835d295ca03e181ca77ac501140b8cf3a2b0464aacaa697247d85eb2d305\": container with ID starting with b65e835d295ca03e181ca77ac501140b8cf3a2b0464aacaa697247d85eb2d305 not found: ID does not exist" containerID="b65e835d295ca03e181ca77ac501140b8cf3a2b0464aacaa697247d85eb2d305" Mar 09 09:29:26 crc kubenswrapper[4861]: I0309 09:29:26.182693 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b65e835d295ca03e181ca77ac501140b8cf3a2b0464aacaa697247d85eb2d305"} err="failed to get container status \"b65e835d295ca03e181ca77ac501140b8cf3a2b0464aacaa697247d85eb2d305\": rpc error: code = NotFound desc = could not find container \"b65e835d295ca03e181ca77ac501140b8cf3a2b0464aacaa697247d85eb2d305\": container with ID starting with b65e835d295ca03e181ca77ac501140b8cf3a2b0464aacaa697247d85eb2d305 not found: ID does not exist" Mar 09 09:29:26 crc kubenswrapper[4861]: I0309 09:29:26.182716 4861 scope.go:117] "RemoveContainer" containerID="e662f0f8719bc2347484eddc9d31815f5df3dd6967f8690b0eba57bd2343e80e" Mar 09 09:29:26 crc kubenswrapper[4861]: E0309 09:29:26.183055 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e662f0f8719bc2347484eddc9d31815f5df3dd6967f8690b0eba57bd2343e80e\": container 
with ID starting with e662f0f8719bc2347484eddc9d31815f5df3dd6967f8690b0eba57bd2343e80e not found: ID does not exist" containerID="e662f0f8719bc2347484eddc9d31815f5df3dd6967f8690b0eba57bd2343e80e" Mar 09 09:29:26 crc kubenswrapper[4861]: I0309 09:29:26.183076 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e662f0f8719bc2347484eddc9d31815f5df3dd6967f8690b0eba57bd2343e80e"} err="failed to get container status \"e662f0f8719bc2347484eddc9d31815f5df3dd6967f8690b0eba57bd2343e80e\": rpc error: code = NotFound desc = could not find container \"e662f0f8719bc2347484eddc9d31815f5df3dd6967f8690b0eba57bd2343e80e\": container with ID starting with e662f0f8719bc2347484eddc9d31815f5df3dd6967f8690b0eba57bd2343e80e not found: ID does not exist" Mar 09 09:29:26 crc kubenswrapper[4861]: I0309 09:29:26.183092 4861 scope.go:117] "RemoveContainer" containerID="50dbe922246ee43cb58f53ad06b47ab03fdc45005d2c048b07975a98e5a5b3ad" Mar 09 09:29:26 crc kubenswrapper[4861]: E0309 09:29:26.183592 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50dbe922246ee43cb58f53ad06b47ab03fdc45005d2c048b07975a98e5a5b3ad\": container with ID starting with 50dbe922246ee43cb58f53ad06b47ab03fdc45005d2c048b07975a98e5a5b3ad not found: ID does not exist" containerID="50dbe922246ee43cb58f53ad06b47ab03fdc45005d2c048b07975a98e5a5b3ad" Mar 09 09:29:26 crc kubenswrapper[4861]: I0309 09:29:26.183619 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50dbe922246ee43cb58f53ad06b47ab03fdc45005d2c048b07975a98e5a5b3ad"} err="failed to get container status \"50dbe922246ee43cb58f53ad06b47ab03fdc45005d2c048b07975a98e5a5b3ad\": rpc error: code = NotFound desc = could not find container \"50dbe922246ee43cb58f53ad06b47ab03fdc45005d2c048b07975a98e5a5b3ad\": container with ID starting with 50dbe922246ee43cb58f53ad06b47ab03fdc45005d2c048b07975a98e5a5b3ad not 
found: ID does not exist" Mar 09 09:29:27 crc kubenswrapper[4861]: I0309 09:29:27.054272 4861 generic.go:334] "Generic (PLEG): container finished" podID="df80ee46-58e5-462b-a44f-d8d3e3a4a7b4" containerID="44f1e66fb4c471987f7a9eba9e10ff9e451e829b9d3e49e6c18a7250f02feeb2" exitCode=0 Mar 09 09:29:27 crc kubenswrapper[4861]: I0309 09:29:27.054342 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-25rz6" event={"ID":"df80ee46-58e5-462b-a44f-d8d3e3a4a7b4","Type":"ContainerDied","Data":"44f1e66fb4c471987f7a9eba9e10ff9e451e829b9d3e49e6c18a7250f02feeb2"} Mar 09 09:29:27 crc kubenswrapper[4861]: I0309 09:29:27.680350 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8895c3e-9b4a-424d-965d-756bf35edf5e" path="/var/lib/kubelet/pods/c8895c3e-9b4a-424d-965d-756bf35edf5e/volumes" Mar 09 09:29:30 crc kubenswrapper[4861]: I0309 09:29:30.861641 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-2pv7c" Mar 09 09:29:30 crc kubenswrapper[4861]: I0309 09:29:30.862261 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-2pv7c" Mar 09 09:29:31 crc kubenswrapper[4861]: I0309 09:29:31.921911 4861 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-2pv7c" podUID="9cc96eed-6d35-4ea5-a5a4-5ca80a1952df" containerName="registry-server" probeResult="failure" output=< Mar 09 09:29:31 crc kubenswrapper[4861]: timeout: failed to connect service ":50051" within 1s Mar 09 09:29:31 crc kubenswrapper[4861]: > Mar 09 09:29:32 crc kubenswrapper[4861]: I0309 09:29:32.988693 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 09 09:29:33 crc kubenswrapper[4861]: I0309 09:29:33.162825 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 
09 09:29:35 crc kubenswrapper[4861]: I0309 09:29:35.162900 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-kfh6m" event={"ID":"438a18ff-fdc3-44f3-9c51-df15a691c389","Type":"ContainerStarted","Data":"af2fc03b0505f4abd78742d61269755db5aebeeb6d9fa0e17c1559ef5618e2ec"} Mar 09 09:29:35 crc kubenswrapper[4861]: I0309 09:29:35.166720 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-25rz6" event={"ID":"df80ee46-58e5-462b-a44f-d8d3e3a4a7b4","Type":"ContainerStarted","Data":"1699f26bfa35e206b44d08aca1f29bbfaf39a8d03978a3ec7f4407a801b1444b"} Mar 09 09:29:35 crc kubenswrapper[4861]: I0309 09:29:35.180549 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-kfh6m" podStartSLOduration=3.044559527 podStartE2EDuration="15.180506425s" podCreationTimestamp="2026-03-09 09:29:20 +0000 UTC" firstStartedPulling="2026-03-09 09:29:21.9777302 +0000 UTC m=+1405.062769601" lastFinishedPulling="2026-03-09 09:29:34.113677098 +0000 UTC m=+1417.198716499" observedRunningTime="2026-03-09 09:29:35.179344002 +0000 UTC m=+1418.264383413" watchObservedRunningTime="2026-03-09 09:29:35.180506425 +0000 UTC m=+1418.265545826" Mar 09 09:29:35 crc kubenswrapper[4861]: I0309 09:29:35.205888 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-25rz6" podStartSLOduration=3.48458812 podStartE2EDuration="13.205865272s" podCreationTimestamp="2026-03-09 09:29:22 +0000 UTC" firstStartedPulling="2026-03-09 09:29:25.023417391 +0000 UTC m=+1408.108456792" lastFinishedPulling="2026-03-09 09:29:34.744694543 +0000 UTC m=+1417.829733944" observedRunningTime="2026-03-09 09:29:35.197494495 +0000 UTC m=+1418.282533896" watchObservedRunningTime="2026-03-09 09:29:35.205865272 +0000 UTC m=+1418.290904673" Mar 09 09:29:41 crc kubenswrapper[4861]: I0309 
09:29:41.913231 4861 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-2pv7c" podUID="9cc96eed-6d35-4ea5-a5a4-5ca80a1952df" containerName="registry-server" probeResult="failure" output=< Mar 09 09:29:41 crc kubenswrapper[4861]: timeout: failed to connect service ":50051" within 1s Mar 09 09:29:41 crc kubenswrapper[4861]: > Mar 09 09:29:43 crc kubenswrapper[4861]: I0309 09:29:43.076025 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-25rz6" Mar 09 09:29:43 crc kubenswrapper[4861]: I0309 09:29:43.076087 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-25rz6" Mar 09 09:29:43 crc kubenswrapper[4861]: I0309 09:29:43.135984 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-25rz6" Mar 09 09:29:43 crc kubenswrapper[4861]: I0309 09:29:43.296114 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-25rz6" Mar 09 09:29:43 crc kubenswrapper[4861]: I0309 09:29:43.390973 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-25rz6"] Mar 09 09:29:45 crc kubenswrapper[4861]: I0309 09:29:45.268838 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-25rz6" podUID="df80ee46-58e5-462b-a44f-d8d3e3a4a7b4" containerName="registry-server" containerID="cri-o://1699f26bfa35e206b44d08aca1f29bbfaf39a8d03978a3ec7f4407a801b1444b" gracePeriod=2 Mar 09 09:29:45 crc kubenswrapper[4861]: I0309 09:29:45.735799 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-25rz6"
Mar 09 09:29:45 crc kubenswrapper[4861]: I0309 09:29:45.761909 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q8l6p\" (UniqueName: \"kubernetes.io/projected/df80ee46-58e5-462b-a44f-d8d3e3a4a7b4-kube-api-access-q8l6p\") pod \"df80ee46-58e5-462b-a44f-d8d3e3a4a7b4\" (UID: \"df80ee46-58e5-462b-a44f-d8d3e3a4a7b4\") "
Mar 09 09:29:45 crc kubenswrapper[4861]: I0309 09:29:45.761988 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df80ee46-58e5-462b-a44f-d8d3e3a4a7b4-utilities\") pod \"df80ee46-58e5-462b-a44f-d8d3e3a4a7b4\" (UID: \"df80ee46-58e5-462b-a44f-d8d3e3a4a7b4\") "
Mar 09 09:29:45 crc kubenswrapper[4861]: I0309 09:29:45.762159 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df80ee46-58e5-462b-a44f-d8d3e3a4a7b4-catalog-content\") pod \"df80ee46-58e5-462b-a44f-d8d3e3a4a7b4\" (UID: \"df80ee46-58e5-462b-a44f-d8d3e3a4a7b4\") "
Mar 09 09:29:45 crc kubenswrapper[4861]: I0309 09:29:45.763025 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df80ee46-58e5-462b-a44f-d8d3e3a4a7b4-utilities" (OuterVolumeSpecName: "utilities") pod "df80ee46-58e5-462b-a44f-d8d3e3a4a7b4" (UID: "df80ee46-58e5-462b-a44f-d8d3e3a4a7b4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 09:29:45 crc kubenswrapper[4861]: I0309 09:29:45.786168 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df80ee46-58e5-462b-a44f-d8d3e3a4a7b4-kube-api-access-q8l6p" (OuterVolumeSpecName: "kube-api-access-q8l6p") pod "df80ee46-58e5-462b-a44f-d8d3e3a4a7b4" (UID: "df80ee46-58e5-462b-a44f-d8d3e3a4a7b4"). InnerVolumeSpecName "kube-api-access-q8l6p".
PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:29:45 crc kubenswrapper[4861]: I0309 09:29:45.832691 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df80ee46-58e5-462b-a44f-d8d3e3a4a7b4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "df80ee46-58e5-462b-a44f-d8d3e3a4a7b4" (UID: "df80ee46-58e5-462b-a44f-d8d3e3a4a7b4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 09:29:45 crc kubenswrapper[4861]: I0309 09:29:45.864162 4861 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df80ee46-58e5-462b-a44f-d8d3e3a4a7b4-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 09 09:29:45 crc kubenswrapper[4861]: I0309 09:29:45.864199 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q8l6p\" (UniqueName: \"kubernetes.io/projected/df80ee46-58e5-462b-a44f-d8d3e3a4a7b4-kube-api-access-q8l6p\") on node \"crc\" DevicePath \"\""
Mar 09 09:29:45 crc kubenswrapper[4861]: I0309 09:29:45.864211 4861 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df80ee46-58e5-462b-a44f-d8d3e3a4a7b4-utilities\") on node \"crc\" DevicePath \"\""
Mar 09 09:29:46 crc kubenswrapper[4861]: I0309 09:29:46.283671 4861 generic.go:334] "Generic (PLEG): container finished" podID="df80ee46-58e5-462b-a44f-d8d3e3a4a7b4" containerID="1699f26bfa35e206b44d08aca1f29bbfaf39a8d03978a3ec7f4407a801b1444b" exitCode=0
Mar 09 09:29:46 crc kubenswrapper[4861]: I0309 09:29:46.283761 4861 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/certified-operators-25rz6"
Mar 09 09:29:46 crc kubenswrapper[4861]: I0309 09:29:46.283742 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-25rz6" event={"ID":"df80ee46-58e5-462b-a44f-d8d3e3a4a7b4","Type":"ContainerDied","Data":"1699f26bfa35e206b44d08aca1f29bbfaf39a8d03978a3ec7f4407a801b1444b"}
Mar 09 09:29:46 crc kubenswrapper[4861]: I0309 09:29:46.284084 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-25rz6" event={"ID":"df80ee46-58e5-462b-a44f-d8d3e3a4a7b4","Type":"ContainerDied","Data":"5923acf02e37ea7eeca840b15a6e2bd201edab98fad672979b123886238246cc"}
Mar 09 09:29:46 crc kubenswrapper[4861]: I0309 09:29:46.284143 4861 scope.go:117] "RemoveContainer" containerID="1699f26bfa35e206b44d08aca1f29bbfaf39a8d03978a3ec7f4407a801b1444b"
Mar 09 09:29:46 crc kubenswrapper[4861]: I0309 09:29:46.316802 4861 scope.go:117] "RemoveContainer" containerID="44f1e66fb4c471987f7a9eba9e10ff9e451e829b9d3e49e6c18a7250f02feeb2"
Mar 09 09:29:46 crc kubenswrapper[4861]: I0309 09:29:46.324953 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-25rz6"]
Mar 09 09:29:46 crc kubenswrapper[4861]: I0309 09:29:46.333185 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-25rz6"]
Mar 09 09:29:46 crc kubenswrapper[4861]: I0309 09:29:46.356255 4861 scope.go:117] "RemoveContainer" containerID="912cf97c9b671c317eeba747d3f4ae15d1110a8a88695767569f41487a419264"
Mar 09 09:29:46 crc kubenswrapper[4861]: I0309 09:29:46.390309 4861 scope.go:117] "RemoveContainer" containerID="1699f26bfa35e206b44d08aca1f29bbfaf39a8d03978a3ec7f4407a801b1444b"
Mar 09 09:29:46 crc kubenswrapper[4861]: E0309 09:29:46.391264 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container
\"1699f26bfa35e206b44d08aca1f29bbfaf39a8d03978a3ec7f4407a801b1444b\": container with ID starting with 1699f26bfa35e206b44d08aca1f29bbfaf39a8d03978a3ec7f4407a801b1444b not found: ID does not exist" containerID="1699f26bfa35e206b44d08aca1f29bbfaf39a8d03978a3ec7f4407a801b1444b"
Mar 09 09:29:46 crc kubenswrapper[4861]: I0309 09:29:46.391309 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1699f26bfa35e206b44d08aca1f29bbfaf39a8d03978a3ec7f4407a801b1444b"} err="failed to get container status \"1699f26bfa35e206b44d08aca1f29bbfaf39a8d03978a3ec7f4407a801b1444b\": rpc error: code = NotFound desc = could not find container \"1699f26bfa35e206b44d08aca1f29bbfaf39a8d03978a3ec7f4407a801b1444b\": container with ID starting with 1699f26bfa35e206b44d08aca1f29bbfaf39a8d03978a3ec7f4407a801b1444b not found: ID does not exist"
Mar 09 09:29:46 crc kubenswrapper[4861]: I0309 09:29:46.391386 4861 scope.go:117] "RemoveContainer" containerID="44f1e66fb4c471987f7a9eba9e10ff9e451e829b9d3e49e6c18a7250f02feeb2"
Mar 09 09:29:46 crc kubenswrapper[4861]: E0309 09:29:46.391821 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44f1e66fb4c471987f7a9eba9e10ff9e451e829b9d3e49e6c18a7250f02feeb2\": container with ID starting with 44f1e66fb4c471987f7a9eba9e10ff9e451e829b9d3e49e6c18a7250f02feeb2 not found: ID does not exist" containerID="44f1e66fb4c471987f7a9eba9e10ff9e451e829b9d3e49e6c18a7250f02feeb2"
Mar 09 09:29:46 crc kubenswrapper[4861]: I0309 09:29:46.391860 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44f1e66fb4c471987f7a9eba9e10ff9e451e829b9d3e49e6c18a7250f02feeb2"} err="failed to get container status \"44f1e66fb4c471987f7a9eba9e10ff9e451e829b9d3e49e6c18a7250f02feeb2\": rpc error: code = NotFound desc = could not find container \"44f1e66fb4c471987f7a9eba9e10ff9e451e829b9d3e49e6c18a7250f02feeb2\": container with ID
starting with 44f1e66fb4c471987f7a9eba9e10ff9e451e829b9d3e49e6c18a7250f02feeb2 not found: ID does not exist"
Mar 09 09:29:46 crc kubenswrapper[4861]: I0309 09:29:46.391888 4861 scope.go:117] "RemoveContainer" containerID="912cf97c9b671c317eeba747d3f4ae15d1110a8a88695767569f41487a419264"
Mar 09 09:29:46 crc kubenswrapper[4861]: E0309 09:29:46.392235 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"912cf97c9b671c317eeba747d3f4ae15d1110a8a88695767569f41487a419264\": container with ID starting with 912cf97c9b671c317eeba747d3f4ae15d1110a8a88695767569f41487a419264 not found: ID does not exist" containerID="912cf97c9b671c317eeba747d3f4ae15d1110a8a88695767569f41487a419264"
Mar 09 09:29:46 crc kubenswrapper[4861]: I0309 09:29:46.392275 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"912cf97c9b671c317eeba747d3f4ae15d1110a8a88695767569f41487a419264"} err="failed to get container status \"912cf97c9b671c317eeba747d3f4ae15d1110a8a88695767569f41487a419264\": rpc error: code = NotFound desc = could not find container \"912cf97c9b671c317eeba747d3f4ae15d1110a8a88695767569f41487a419264\": container with ID starting with 912cf97c9b671c317eeba747d3f4ae15d1110a8a88695767569f41487a419264 not found: ID does not exist"
Mar 09 09:29:47 crc kubenswrapper[4861]: I0309 09:29:47.297218 4861 generic.go:334] "Generic (PLEG): container finished" podID="438a18ff-fdc3-44f3-9c51-df15a691c389" containerID="af2fc03b0505f4abd78742d61269755db5aebeeb6d9fa0e17c1559ef5618e2ec" exitCode=0
Mar 09 09:29:47 crc kubenswrapper[4861]: I0309 09:29:47.297318 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-kfh6m" event={"ID":"438a18ff-fdc3-44f3-9c51-df15a691c389","Type":"ContainerDied","Data":"af2fc03b0505f4abd78742d61269755db5aebeeb6d9fa0e17c1559ef5618e2ec"}
Mar 09 09:29:47 crc kubenswrapper[4861]: I0309
09:29:47.672203 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df80ee46-58e5-462b-a44f-d8d3e3a4a7b4" path="/var/lib/kubelet/pods/df80ee46-58e5-462b-a44f-d8d3e3a4a7b4/volumes"
Mar 09 09:29:48 crc kubenswrapper[4861]: I0309 09:29:48.744210 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-kfh6m"
Mar 09 09:29:48 crc kubenswrapper[4861]: I0309 09:29:48.819510 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/438a18ff-fdc3-44f3-9c51-df15a691c389-ssh-key-openstack-edpm-ipam\") pod \"438a18ff-fdc3-44f3-9c51-df15a691c389\" (UID: \"438a18ff-fdc3-44f3-9c51-df15a691c389\") "
Mar 09 09:29:48 crc kubenswrapper[4861]: I0309 09:29:48.819733 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/438a18ff-fdc3-44f3-9c51-df15a691c389-repo-setup-combined-ca-bundle\") pod \"438a18ff-fdc3-44f3-9c51-df15a691c389\" (UID: \"438a18ff-fdc3-44f3-9c51-df15a691c389\") "
Mar 09 09:29:48 crc kubenswrapper[4861]: I0309 09:29:48.819837 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/438a18ff-fdc3-44f3-9c51-df15a691c389-inventory\") pod \"438a18ff-fdc3-44f3-9c51-df15a691c389\" (UID: \"438a18ff-fdc3-44f3-9c51-df15a691c389\") "
Mar 09 09:29:48 crc kubenswrapper[4861]: I0309 09:29:48.819889 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tdg7p\" (UniqueName: \"kubernetes.io/projected/438a18ff-fdc3-44f3-9c51-df15a691c389-kube-api-access-tdg7p\") pod \"438a18ff-fdc3-44f3-9c51-df15a691c389\" (UID: \"438a18ff-fdc3-44f3-9c51-df15a691c389\") "
Mar 09 09:29:48 crc kubenswrapper[4861]: I0309 09:29:48.827003 4861 operation_generator.go:803]
UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/438a18ff-fdc3-44f3-9c51-df15a691c389-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "438a18ff-fdc3-44f3-9c51-df15a691c389" (UID: "438a18ff-fdc3-44f3-9c51-df15a691c389"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:29:48 crc kubenswrapper[4861]: I0309 09:29:48.834310 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/438a18ff-fdc3-44f3-9c51-df15a691c389-kube-api-access-tdg7p" (OuterVolumeSpecName: "kube-api-access-tdg7p") pod "438a18ff-fdc3-44f3-9c51-df15a691c389" (UID: "438a18ff-fdc3-44f3-9c51-df15a691c389"). InnerVolumeSpecName "kube-api-access-tdg7p". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:29:48 crc kubenswrapper[4861]: I0309 09:29:48.851984 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/438a18ff-fdc3-44f3-9c51-df15a691c389-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "438a18ff-fdc3-44f3-9c51-df15a691c389" (UID: "438a18ff-fdc3-44f3-9c51-df15a691c389"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:29:48 crc kubenswrapper[4861]: I0309 09:29:48.853159 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/438a18ff-fdc3-44f3-9c51-df15a691c389-inventory" (OuterVolumeSpecName: "inventory") pod "438a18ff-fdc3-44f3-9c51-df15a691c389" (UID: "438a18ff-fdc3-44f3-9c51-df15a691c389"). InnerVolumeSpecName "inventory".
PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:29:48 crc kubenswrapper[4861]: I0309 09:29:48.922018 4861 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/438a18ff-fdc3-44f3-9c51-df15a691c389-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 09 09:29:48 crc kubenswrapper[4861]: I0309 09:29:48.922074 4861 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/438a18ff-fdc3-44f3-9c51-df15a691c389-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 09 09:29:48 crc kubenswrapper[4861]: I0309 09:29:48.922098 4861 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/438a18ff-fdc3-44f3-9c51-df15a691c389-inventory\") on node \"crc\" DevicePath \"\""
Mar 09 09:29:48 crc kubenswrapper[4861]: I0309 09:29:48.922115 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tdg7p\" (UniqueName: \"kubernetes.io/projected/438a18ff-fdc3-44f3-9c51-df15a691c389-kube-api-access-tdg7p\") on node \"crc\" DevicePath \"\""
Mar 09 09:29:49 crc kubenswrapper[4861]: I0309 09:29:49.320939 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-kfh6m" event={"ID":"438a18ff-fdc3-44f3-9c51-df15a691c389","Type":"ContainerDied","Data":"cad885fe85510965ad0015886ad312a6bc18324569e55625d267ee8a3051e352"}
Mar 09 09:29:49 crc kubenswrapper[4861]: I0309 09:29:49.321218 4861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cad885fe85510965ad0015886ad312a6bc18324569e55625d267ee8a3051e352"
Mar 09 09:29:49 crc kubenswrapper[4861]: I0309 09:29:49.321064 4861 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-kfh6m"
Mar 09 09:29:49 crc kubenswrapper[4861]: I0309 09:29:49.397410 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-gvccp"]
Mar 09 09:29:49 crc kubenswrapper[4861]: E0309 09:29:49.397808 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8895c3e-9b4a-424d-965d-756bf35edf5e" containerName="extract-utilities"
Mar 09 09:29:49 crc kubenswrapper[4861]: I0309 09:29:49.397822 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8895c3e-9b4a-424d-965d-756bf35edf5e" containerName="extract-utilities"
Mar 09 09:29:49 crc kubenswrapper[4861]: E0309 09:29:49.397838 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="438a18ff-fdc3-44f3-9c51-df15a691c389" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam"
Mar 09 09:29:49 crc kubenswrapper[4861]: I0309 09:29:49.397847 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="438a18ff-fdc3-44f3-9c51-df15a691c389" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam"
Mar 09 09:29:49 crc kubenswrapper[4861]: E0309 09:29:49.397862 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8895c3e-9b4a-424d-965d-756bf35edf5e" containerName="registry-server"
Mar 09 09:29:49 crc kubenswrapper[4861]: I0309 09:29:49.397870 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8895c3e-9b4a-424d-965d-756bf35edf5e" containerName="registry-server"
Mar 09 09:29:49 crc kubenswrapper[4861]: E0309 09:29:49.397884 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8895c3e-9b4a-424d-965d-756bf35edf5e" containerName="extract-content"
Mar 09 09:29:49 crc kubenswrapper[4861]: I0309 09:29:49.397893 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8895c3e-9b4a-424d-965d-756bf35edf5e" containerName="extract-content"
Mar 09 09:29:49 crc kubenswrapper[4861]: E0309 09:29:49.397908 4861
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df80ee46-58e5-462b-a44f-d8d3e3a4a7b4" containerName="extract-utilities"
Mar 09 09:29:49 crc kubenswrapper[4861]: I0309 09:29:49.397915 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="df80ee46-58e5-462b-a44f-d8d3e3a4a7b4" containerName="extract-utilities"
Mar 09 09:29:49 crc kubenswrapper[4861]: E0309 09:29:49.397926 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df80ee46-58e5-462b-a44f-d8d3e3a4a7b4" containerName="registry-server"
Mar 09 09:29:49 crc kubenswrapper[4861]: I0309 09:29:49.397935 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="df80ee46-58e5-462b-a44f-d8d3e3a4a7b4" containerName="registry-server"
Mar 09 09:29:49 crc kubenswrapper[4861]: E0309 09:29:49.397962 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df80ee46-58e5-462b-a44f-d8d3e3a4a7b4" containerName="extract-content"
Mar 09 09:29:49 crc kubenswrapper[4861]: I0309 09:29:49.397971 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="df80ee46-58e5-462b-a44f-d8d3e3a4a7b4" containerName="extract-content"
Mar 09 09:29:49 crc kubenswrapper[4861]: I0309 09:29:49.398171 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8895c3e-9b4a-424d-965d-756bf35edf5e" containerName="registry-server"
Mar 09 09:29:49 crc kubenswrapper[4861]: I0309 09:29:49.398186 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="df80ee46-58e5-462b-a44f-d8d3e3a4a7b4" containerName="registry-server"
Mar 09 09:29:49 crc kubenswrapper[4861]: I0309 09:29:49.398198 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="438a18ff-fdc3-44f3-9c51-df15a691c389" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam"
Mar 09 09:29:49 crc kubenswrapper[4861]: I0309 09:29:49.398872 4861 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-gvccp"
Mar 09 09:29:49 crc kubenswrapper[4861]: I0309 09:29:49.402933 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lkd5q"
Mar 09 09:29:49 crc kubenswrapper[4861]: I0309 09:29:49.403615 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 09 09:29:49 crc kubenswrapper[4861]: I0309 09:29:49.403630 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 09 09:29:49 crc kubenswrapper[4861]: I0309 09:29:49.403779 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 09 09:29:49 crc kubenswrapper[4861]: I0309 09:29:49.415357 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-gvccp"]
Mar 09 09:29:49 crc kubenswrapper[4861]: I0309 09:29:49.429952 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6t5v\" (UniqueName: \"kubernetes.io/projected/cd678163-1379-40da-be83-c4ace8b0cf0d-kube-api-access-s6t5v\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-gvccp\" (UID: \"cd678163-1379-40da-be83-c4ace8b0cf0d\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-gvccp"
Mar 09 09:29:49 crc kubenswrapper[4861]: I0309 09:29:49.430105 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cd678163-1379-40da-be83-c4ace8b0cf0d-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-gvccp\" (UID: \"cd678163-1379-40da-be83-c4ace8b0cf0d\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-gvccp"
Mar 09 09:29:49 crc kubenswrapper[4861]: I0309 09:29:49.430141 4861
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cd678163-1379-40da-be83-c4ace8b0cf0d-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-gvccp\" (UID: \"cd678163-1379-40da-be83-c4ace8b0cf0d\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-gvccp"
Mar 09 09:29:49 crc kubenswrapper[4861]: I0309 09:29:49.532162 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s6t5v\" (UniqueName: \"kubernetes.io/projected/cd678163-1379-40da-be83-c4ace8b0cf0d-kube-api-access-s6t5v\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-gvccp\" (UID: \"cd678163-1379-40da-be83-c4ace8b0cf0d\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-gvccp"
Mar 09 09:29:49 crc kubenswrapper[4861]: I0309 09:29:49.532269 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cd678163-1379-40da-be83-c4ace8b0cf0d-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-gvccp\" (UID: \"cd678163-1379-40da-be83-c4ace8b0cf0d\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-gvccp"
Mar 09 09:29:49 crc kubenswrapper[4861]: I0309 09:29:49.532300 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cd678163-1379-40da-be83-c4ace8b0cf0d-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-gvccp\" (UID: \"cd678163-1379-40da-be83-c4ace8b0cf0d\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-gvccp"
Mar 09 09:29:49 crc kubenswrapper[4861]: I0309 09:29:49.538159 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cd678163-1379-40da-be83-c4ace8b0cf0d-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-gvccp\" (UID:
\"cd678163-1379-40da-be83-c4ace8b0cf0d\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-gvccp"
Mar 09 09:29:49 crc kubenswrapper[4861]: I0309 09:29:49.541748 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cd678163-1379-40da-be83-c4ace8b0cf0d-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-gvccp\" (UID: \"cd678163-1379-40da-be83-c4ace8b0cf0d\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-gvccp"
Mar 09 09:29:49 crc kubenswrapper[4861]: I0309 09:29:49.555899 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6t5v\" (UniqueName: \"kubernetes.io/projected/cd678163-1379-40da-be83-c4ace8b0cf0d-kube-api-access-s6t5v\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-gvccp\" (UID: \"cd678163-1379-40da-be83-c4ace8b0cf0d\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-gvccp"
Mar 09 09:29:49 crc kubenswrapper[4861]: I0309 09:29:49.717839 4861 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-gvccp"
Mar 09 09:29:50 crc kubenswrapper[4861]: I0309 09:29:50.216286 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-gvccp"]
Mar 09 09:29:50 crc kubenswrapper[4861]: I0309 09:29:50.331396 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-gvccp" event={"ID":"cd678163-1379-40da-be83-c4ace8b0cf0d","Type":"ContainerStarted","Data":"d6181ef6f055e42febd54ecd2efb69768202f5fd29cc3293229313634d5b8073"}
Mar 09 09:29:50 crc kubenswrapper[4861]: I0309 09:29:50.920737 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-2pv7c"
Mar 09 09:29:50 crc kubenswrapper[4861]: I0309 09:29:50.974783 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-2pv7c"
Mar 09 09:29:51 crc kubenswrapper[4861]: I0309 09:29:51.342622 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-gvccp" event={"ID":"cd678163-1379-40da-be83-c4ace8b0cf0d","Type":"ContainerStarted","Data":"cd2e37599c30614a6badd893c4756d5a333953e2f04f45409259769a0fda2349"}
Mar 09 09:29:51 crc kubenswrapper[4861]: I0309 09:29:51.374326 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-gvccp" podStartSLOduration=1.9233231929999999 podStartE2EDuration="2.374297613s" podCreationTimestamp="2026-03-09 09:29:49 +0000 UTC" firstStartedPulling="2026-03-09 09:29:50.225039764 +0000 UTC m=+1433.310079165" lastFinishedPulling="2026-03-09 09:29:50.676014184 +0000 UTC m=+1433.761053585" observedRunningTime="2026-03-09 09:29:51.356030146 +0000 UTC m=+1434.441069547" watchObservedRunningTime="2026-03-09 09:29:51.374297613 +0000 UTC m=+1434.459337014"
Mar 09 09:29:51 crc
kubenswrapper[4861]: I0309 09:29:51.712390 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2pv7c"]
Mar 09 09:29:52 crc kubenswrapper[4861]: I0309 09:29:52.350863 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-2pv7c" podUID="9cc96eed-6d35-4ea5-a5a4-5ca80a1952df" containerName="registry-server" containerID="cri-o://4891083edfb95fc1b472af63915adcd2c67ebd11e711cebe3cd53a4709c9530a" gracePeriod=2
Mar 09 09:29:52 crc kubenswrapper[4861]: I0309 09:29:52.801807 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2pv7c"
Mar 09 09:29:52 crc kubenswrapper[4861]: I0309 09:29:52.998938 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9cc96eed-6d35-4ea5-a5a4-5ca80a1952df-catalog-content\") pod \"9cc96eed-6d35-4ea5-a5a4-5ca80a1952df\" (UID: \"9cc96eed-6d35-4ea5-a5a4-5ca80a1952df\") "
Mar 09 09:29:52 crc kubenswrapper[4861]: I0309 09:29:52.999149 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9cc96eed-6d35-4ea5-a5a4-5ca80a1952df-utilities\") pod \"9cc96eed-6d35-4ea5-a5a4-5ca80a1952df\" (UID: \"9cc96eed-6d35-4ea5-a5a4-5ca80a1952df\") "
Mar 09 09:29:52 crc kubenswrapper[4861]: I0309 09:29:52.999247 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k6xmh\" (UniqueName: \"kubernetes.io/projected/9cc96eed-6d35-4ea5-a5a4-5ca80a1952df-kube-api-access-k6xmh\") pod \"9cc96eed-6d35-4ea5-a5a4-5ca80a1952df\" (UID: \"9cc96eed-6d35-4ea5-a5a4-5ca80a1952df\") "
Mar 09 09:29:53 crc kubenswrapper[4861]: I0309 09:29:52.999960 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume
"kubernetes.io/empty-dir/9cc96eed-6d35-4ea5-a5a4-5ca80a1952df-utilities" (OuterVolumeSpecName: "utilities") pod "9cc96eed-6d35-4ea5-a5a4-5ca80a1952df" (UID: "9cc96eed-6d35-4ea5-a5a4-5ca80a1952df"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 09:29:53 crc kubenswrapper[4861]: I0309 09:29:53.006994 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9cc96eed-6d35-4ea5-a5a4-5ca80a1952df-kube-api-access-k6xmh" (OuterVolumeSpecName: "kube-api-access-k6xmh") pod "9cc96eed-6d35-4ea5-a5a4-5ca80a1952df" (UID: "9cc96eed-6d35-4ea5-a5a4-5ca80a1952df"). InnerVolumeSpecName "kube-api-access-k6xmh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:29:53 crc kubenswrapper[4861]: I0309 09:29:53.026813 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9cc96eed-6d35-4ea5-a5a4-5ca80a1952df-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9cc96eed-6d35-4ea5-a5a4-5ca80a1952df" (UID: "9cc96eed-6d35-4ea5-a5a4-5ca80a1952df"). InnerVolumeSpecName "catalog-content".
PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 09:29:53 crc kubenswrapper[4861]: I0309 09:29:53.102703 4861 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9cc96eed-6d35-4ea5-a5a4-5ca80a1952df-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 09 09:29:53 crc kubenswrapper[4861]: I0309 09:29:53.102751 4861 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9cc96eed-6d35-4ea5-a5a4-5ca80a1952df-utilities\") on node \"crc\" DevicePath \"\""
Mar 09 09:29:53 crc kubenswrapper[4861]: I0309 09:29:53.102766 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k6xmh\" (UniqueName: \"kubernetes.io/projected/9cc96eed-6d35-4ea5-a5a4-5ca80a1952df-kube-api-access-k6xmh\") on node \"crc\" DevicePath \"\""
Mar 09 09:29:53 crc kubenswrapper[4861]: I0309 09:29:53.364285 4861 generic.go:334] "Generic (PLEG): container finished" podID="9cc96eed-6d35-4ea5-a5a4-5ca80a1952df" containerID="4891083edfb95fc1b472af63915adcd2c67ebd11e711cebe3cd53a4709c9530a" exitCode=0
Mar 09 09:29:53 crc kubenswrapper[4861]: I0309 09:29:53.364334 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2pv7c" event={"ID":"9cc96eed-6d35-4ea5-a5a4-5ca80a1952df","Type":"ContainerDied","Data":"4891083edfb95fc1b472af63915adcd2c67ebd11e711cebe3cd53a4709c9530a"}
Mar 09 09:29:53 crc kubenswrapper[4861]: I0309 09:29:53.364389 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2pv7c" event={"ID":"9cc96eed-6d35-4ea5-a5a4-5ca80a1952df","Type":"ContainerDied","Data":"cbd59fdc460c29b01f6e43a91771a418e06e35afc53a60c26db5185d34a14ea0"}
Mar 09 09:29:53 crc kubenswrapper[4861]: I0309 09:29:53.364407 4861 scope.go:117] "RemoveContainer" containerID="4891083edfb95fc1b472af63915adcd2c67ebd11e711cebe3cd53a4709c9530a"
Mar 09 09:29:53 crc kubenswrapper[4861]: I0309
09:29:53.364403 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2pv7c"
Mar 09 09:29:53 crc kubenswrapper[4861]: I0309 09:29:53.386438 4861 scope.go:117] "RemoveContainer" containerID="e2931393291b3fefcae9839447bdcd0d5e32ed9f490b24928b42e493186645af"
Mar 09 09:29:53 crc kubenswrapper[4861]: I0309 09:29:53.402415 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2pv7c"]
Mar 09 09:29:53 crc kubenswrapper[4861]: I0309 09:29:53.412018 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-2pv7c"]
Mar 09 09:29:53 crc kubenswrapper[4861]: I0309 09:29:53.434678 4861 scope.go:117] "RemoveContainer" containerID="89b19c05237812d79a0c7639b4e880405de7977335281eb035159a4a2184e91f"
Mar 09 09:29:53 crc kubenswrapper[4861]: I0309 09:29:53.463646 4861 scope.go:117] "RemoveContainer" containerID="4891083edfb95fc1b472af63915adcd2c67ebd11e711cebe3cd53a4709c9530a"
Mar 09 09:29:53 crc kubenswrapper[4861]: E0309 09:29:53.464176 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4891083edfb95fc1b472af63915adcd2c67ebd11e711cebe3cd53a4709c9530a\": container with ID starting with 4891083edfb95fc1b472af63915adcd2c67ebd11e711cebe3cd53a4709c9530a not found: ID does not exist" containerID="4891083edfb95fc1b472af63915adcd2c67ebd11e711cebe3cd53a4709c9530a"
Mar 09 09:29:53 crc kubenswrapper[4861]: I0309 09:29:53.464212 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4891083edfb95fc1b472af63915adcd2c67ebd11e711cebe3cd53a4709c9530a"} err="failed to get container status \"4891083edfb95fc1b472af63915adcd2c67ebd11e711cebe3cd53a4709c9530a\": rpc error: code = NotFound desc = could not find container \"4891083edfb95fc1b472af63915adcd2c67ebd11e711cebe3cd53a4709c9530a\": container with ID starting with
4891083edfb95fc1b472af63915adcd2c67ebd11e711cebe3cd53a4709c9530a not found: ID does not exist" Mar 09 09:29:53 crc kubenswrapper[4861]: I0309 09:29:53.464235 4861 scope.go:117] "RemoveContainer" containerID="e2931393291b3fefcae9839447bdcd0d5e32ed9f490b24928b42e493186645af" Mar 09 09:29:53 crc kubenswrapper[4861]: E0309 09:29:53.464627 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2931393291b3fefcae9839447bdcd0d5e32ed9f490b24928b42e493186645af\": container with ID starting with e2931393291b3fefcae9839447bdcd0d5e32ed9f490b24928b42e493186645af not found: ID does not exist" containerID="e2931393291b3fefcae9839447bdcd0d5e32ed9f490b24928b42e493186645af" Mar 09 09:29:53 crc kubenswrapper[4861]: I0309 09:29:53.464668 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2931393291b3fefcae9839447bdcd0d5e32ed9f490b24928b42e493186645af"} err="failed to get container status \"e2931393291b3fefcae9839447bdcd0d5e32ed9f490b24928b42e493186645af\": rpc error: code = NotFound desc = could not find container \"e2931393291b3fefcae9839447bdcd0d5e32ed9f490b24928b42e493186645af\": container with ID starting with e2931393291b3fefcae9839447bdcd0d5e32ed9f490b24928b42e493186645af not found: ID does not exist" Mar 09 09:29:53 crc kubenswrapper[4861]: I0309 09:29:53.464696 4861 scope.go:117] "RemoveContainer" containerID="89b19c05237812d79a0c7639b4e880405de7977335281eb035159a4a2184e91f" Mar 09 09:29:53 crc kubenswrapper[4861]: E0309 09:29:53.465096 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89b19c05237812d79a0c7639b4e880405de7977335281eb035159a4a2184e91f\": container with ID starting with 89b19c05237812d79a0c7639b4e880405de7977335281eb035159a4a2184e91f not found: ID does not exist" containerID="89b19c05237812d79a0c7639b4e880405de7977335281eb035159a4a2184e91f" Mar 09 09:29:53 crc 
kubenswrapper[4861]: I0309 09:29:53.465130 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89b19c05237812d79a0c7639b4e880405de7977335281eb035159a4a2184e91f"} err="failed to get container status \"89b19c05237812d79a0c7639b4e880405de7977335281eb035159a4a2184e91f\": rpc error: code = NotFound desc = could not find container \"89b19c05237812d79a0c7639b4e880405de7977335281eb035159a4a2184e91f\": container with ID starting with 89b19c05237812d79a0c7639b4e880405de7977335281eb035159a4a2184e91f not found: ID does not exist" Mar 09 09:29:53 crc kubenswrapper[4861]: I0309 09:29:53.685742 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9cc96eed-6d35-4ea5-a5a4-5ca80a1952df" path="/var/lib/kubelet/pods/9cc96eed-6d35-4ea5-a5a4-5ca80a1952df/volumes" Mar 09 09:29:54 crc kubenswrapper[4861]: I0309 09:29:54.376505 4861 generic.go:334] "Generic (PLEG): container finished" podID="cd678163-1379-40da-be83-c4ace8b0cf0d" containerID="cd2e37599c30614a6badd893c4756d5a333953e2f04f45409259769a0fda2349" exitCode=0 Mar 09 09:29:54 crc kubenswrapper[4861]: I0309 09:29:54.376606 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-gvccp" event={"ID":"cd678163-1379-40da-be83-c4ace8b0cf0d","Type":"ContainerDied","Data":"cd2e37599c30614a6badd893c4756d5a333953e2f04f45409259769a0fda2349"} Mar 09 09:29:55 crc kubenswrapper[4861]: I0309 09:29:55.771024 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-gvccp" Mar 09 09:29:55 crc kubenswrapper[4861]: I0309 09:29:55.959901 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cd678163-1379-40da-be83-c4ace8b0cf0d-inventory\") pod \"cd678163-1379-40da-be83-c4ace8b0cf0d\" (UID: \"cd678163-1379-40da-be83-c4ace8b0cf0d\") " Mar 09 09:29:55 crc kubenswrapper[4861]: I0309 09:29:55.960353 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cd678163-1379-40da-be83-c4ace8b0cf0d-ssh-key-openstack-edpm-ipam\") pod \"cd678163-1379-40da-be83-c4ace8b0cf0d\" (UID: \"cd678163-1379-40da-be83-c4ace8b0cf0d\") " Mar 09 09:29:55 crc kubenswrapper[4861]: I0309 09:29:55.960466 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s6t5v\" (UniqueName: \"kubernetes.io/projected/cd678163-1379-40da-be83-c4ace8b0cf0d-kube-api-access-s6t5v\") pod \"cd678163-1379-40da-be83-c4ace8b0cf0d\" (UID: \"cd678163-1379-40da-be83-c4ace8b0cf0d\") " Mar 09 09:29:55 crc kubenswrapper[4861]: I0309 09:29:55.967739 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd678163-1379-40da-be83-c4ace8b0cf0d-kube-api-access-s6t5v" (OuterVolumeSpecName: "kube-api-access-s6t5v") pod "cd678163-1379-40da-be83-c4ace8b0cf0d" (UID: "cd678163-1379-40da-be83-c4ace8b0cf0d"). InnerVolumeSpecName "kube-api-access-s6t5v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:29:55 crc kubenswrapper[4861]: I0309 09:29:55.986813 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd678163-1379-40da-be83-c4ace8b0cf0d-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "cd678163-1379-40da-be83-c4ace8b0cf0d" (UID: "cd678163-1379-40da-be83-c4ace8b0cf0d"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:29:55 crc kubenswrapper[4861]: I0309 09:29:55.988814 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd678163-1379-40da-be83-c4ace8b0cf0d-inventory" (OuterVolumeSpecName: "inventory") pod "cd678163-1379-40da-be83-c4ace8b0cf0d" (UID: "cd678163-1379-40da-be83-c4ace8b0cf0d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:29:56 crc kubenswrapper[4861]: I0309 09:29:56.062559 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s6t5v\" (UniqueName: \"kubernetes.io/projected/cd678163-1379-40da-be83-c4ace8b0cf0d-kube-api-access-s6t5v\") on node \"crc\" DevicePath \"\"" Mar 09 09:29:56 crc kubenswrapper[4861]: I0309 09:29:56.062593 4861 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cd678163-1379-40da-be83-c4ace8b0cf0d-inventory\") on node \"crc\" DevicePath \"\"" Mar 09 09:29:56 crc kubenswrapper[4861]: I0309 09:29:56.062602 4861 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cd678163-1379-40da-be83-c4ace8b0cf0d-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 09 09:29:56 crc kubenswrapper[4861]: I0309 09:29:56.399252 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-gvccp" 
event={"ID":"cd678163-1379-40da-be83-c4ace8b0cf0d","Type":"ContainerDied","Data":"d6181ef6f055e42febd54ecd2efb69768202f5fd29cc3293229313634d5b8073"} Mar 09 09:29:56 crc kubenswrapper[4861]: I0309 09:29:56.399296 4861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d6181ef6f055e42febd54ecd2efb69768202f5fd29cc3293229313634d5b8073" Mar 09 09:29:56 crc kubenswrapper[4861]: I0309 09:29:56.399389 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-gvccp" Mar 09 09:29:56 crc kubenswrapper[4861]: I0309 09:29:56.472566 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-6k9cp"] Mar 09 09:29:56 crc kubenswrapper[4861]: E0309 09:29:56.472990 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd678163-1379-40da-be83-c4ace8b0cf0d" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Mar 09 09:29:56 crc kubenswrapper[4861]: I0309 09:29:56.473008 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd678163-1379-40da-be83-c4ace8b0cf0d" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Mar 09 09:29:56 crc kubenswrapper[4861]: E0309 09:29:56.473050 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cc96eed-6d35-4ea5-a5a4-5ca80a1952df" containerName="registry-server" Mar 09 09:29:56 crc kubenswrapper[4861]: I0309 09:29:56.473056 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cc96eed-6d35-4ea5-a5a4-5ca80a1952df" containerName="registry-server" Mar 09 09:29:56 crc kubenswrapper[4861]: E0309 09:29:56.473066 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cc96eed-6d35-4ea5-a5a4-5ca80a1952df" containerName="extract-utilities" Mar 09 09:29:56 crc kubenswrapper[4861]: I0309 09:29:56.473073 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cc96eed-6d35-4ea5-a5a4-5ca80a1952df" 
containerName="extract-utilities" Mar 09 09:29:56 crc kubenswrapper[4861]: E0309 09:29:56.473082 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cc96eed-6d35-4ea5-a5a4-5ca80a1952df" containerName="extract-content" Mar 09 09:29:56 crc kubenswrapper[4861]: I0309 09:29:56.473088 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cc96eed-6d35-4ea5-a5a4-5ca80a1952df" containerName="extract-content" Mar 09 09:29:56 crc kubenswrapper[4861]: I0309 09:29:56.473283 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cc96eed-6d35-4ea5-a5a4-5ca80a1952df" containerName="registry-server" Mar 09 09:29:56 crc kubenswrapper[4861]: I0309 09:29:56.473303 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd678163-1379-40da-be83-c4ace8b0cf0d" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Mar 09 09:29:56 crc kubenswrapper[4861]: I0309 09:29:56.473991 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-6k9cp" Mar 09 09:29:56 crc kubenswrapper[4861]: I0309 09:29:56.476293 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 09 09:29:56 crc kubenswrapper[4861]: I0309 09:29:56.476393 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 09 09:29:56 crc kubenswrapper[4861]: I0309 09:29:56.476551 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 09 09:29:56 crc kubenswrapper[4861]: I0309 09:29:56.476938 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lkd5q" Mar 09 09:29:56 crc kubenswrapper[4861]: I0309 09:29:56.498143 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-6k9cp"] Mar 09 09:29:56 crc 
kubenswrapper[4861]: I0309 09:29:56.673633 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b1f90870-1ef5-46d6-b495-f41e2d14a888-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-6k9cp\" (UID: \"b1f90870-1ef5-46d6-b495-f41e2d14a888\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-6k9cp" Mar 09 09:29:56 crc kubenswrapper[4861]: I0309 09:29:56.673693 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6f6p8\" (UniqueName: \"kubernetes.io/projected/b1f90870-1ef5-46d6-b495-f41e2d14a888-kube-api-access-6f6p8\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-6k9cp\" (UID: \"b1f90870-1ef5-46d6-b495-f41e2d14a888\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-6k9cp" Mar 09 09:29:56 crc kubenswrapper[4861]: I0309 09:29:56.673805 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1f90870-1ef5-46d6-b495-f41e2d14a888-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-6k9cp\" (UID: \"b1f90870-1ef5-46d6-b495-f41e2d14a888\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-6k9cp" Mar 09 09:29:56 crc kubenswrapper[4861]: I0309 09:29:56.673842 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b1f90870-1ef5-46d6-b495-f41e2d14a888-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-6k9cp\" (UID: \"b1f90870-1ef5-46d6-b495-f41e2d14a888\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-6k9cp" Mar 09 09:29:56 crc kubenswrapper[4861]: I0309 09:29:56.775519 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1f90870-1ef5-46d6-b495-f41e2d14a888-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-6k9cp\" (UID: \"b1f90870-1ef5-46d6-b495-f41e2d14a888\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-6k9cp" Mar 09 09:29:56 crc kubenswrapper[4861]: I0309 09:29:56.775574 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b1f90870-1ef5-46d6-b495-f41e2d14a888-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-6k9cp\" (UID: \"b1f90870-1ef5-46d6-b495-f41e2d14a888\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-6k9cp" Mar 09 09:29:56 crc kubenswrapper[4861]: I0309 09:29:56.775670 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b1f90870-1ef5-46d6-b495-f41e2d14a888-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-6k9cp\" (UID: \"b1f90870-1ef5-46d6-b495-f41e2d14a888\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-6k9cp" Mar 09 09:29:56 crc kubenswrapper[4861]: I0309 09:29:56.775713 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6f6p8\" (UniqueName: \"kubernetes.io/projected/b1f90870-1ef5-46d6-b495-f41e2d14a888-kube-api-access-6f6p8\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-6k9cp\" (UID: \"b1f90870-1ef5-46d6-b495-f41e2d14a888\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-6k9cp" Mar 09 09:29:56 crc kubenswrapper[4861]: I0309 09:29:56.781476 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b1f90870-1ef5-46d6-b495-f41e2d14a888-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-6k9cp\" (UID: \"b1f90870-1ef5-46d6-b495-f41e2d14a888\") " 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-6k9cp" Mar 09 09:29:56 crc kubenswrapper[4861]: I0309 09:29:56.781591 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1f90870-1ef5-46d6-b495-f41e2d14a888-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-6k9cp\" (UID: \"b1f90870-1ef5-46d6-b495-f41e2d14a888\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-6k9cp" Mar 09 09:29:56 crc kubenswrapper[4861]: I0309 09:29:56.782453 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b1f90870-1ef5-46d6-b495-f41e2d14a888-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-6k9cp\" (UID: \"b1f90870-1ef5-46d6-b495-f41e2d14a888\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-6k9cp" Mar 09 09:29:56 crc kubenswrapper[4861]: I0309 09:29:56.802772 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6f6p8\" (UniqueName: \"kubernetes.io/projected/b1f90870-1ef5-46d6-b495-f41e2d14a888-kube-api-access-6f6p8\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-6k9cp\" (UID: \"b1f90870-1ef5-46d6-b495-f41e2d14a888\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-6k9cp" Mar 09 09:29:57 crc kubenswrapper[4861]: I0309 09:29:57.090344 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-6k9cp" Mar 09 09:29:57 crc kubenswrapper[4861]: I0309 09:29:57.604446 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-6k9cp"] Mar 09 09:29:58 crc kubenswrapper[4861]: I0309 09:29:58.417918 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-6k9cp" event={"ID":"b1f90870-1ef5-46d6-b495-f41e2d14a888","Type":"ContainerStarted","Data":"dbf4625ad834f2eb184b50c25caae7ad5c7a6b6b036c9841ae00f5d61a9025d2"} Mar 09 09:29:58 crc kubenswrapper[4861]: I0309 09:29:58.419238 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-6k9cp" event={"ID":"b1f90870-1ef5-46d6-b495-f41e2d14a888","Type":"ContainerStarted","Data":"fc650f0436f2687a317497a2e88f181477e42cbf8b168f09943a3035f3e8fd78"} Mar 09 09:29:58 crc kubenswrapper[4861]: I0309 09:29:58.436145 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-6k9cp" podStartSLOduration=2.058047665 podStartE2EDuration="2.436124853s" podCreationTimestamp="2026-03-09 09:29:56 +0000 UTC" firstStartedPulling="2026-03-09 09:29:57.605252574 +0000 UTC m=+1440.690291985" lastFinishedPulling="2026-03-09 09:29:57.983329772 +0000 UTC m=+1441.068369173" observedRunningTime="2026-03-09 09:29:58.432759769 +0000 UTC m=+1441.517799170" watchObservedRunningTime="2026-03-09 09:29:58.436124853 +0000 UTC m=+1441.521164254" Mar 09 09:30:00 crc kubenswrapper[4861]: I0309 09:30:00.137866 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550810-fzhll"] Mar 09 09:30:00 crc kubenswrapper[4861]: I0309 09:30:00.143164 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550810-fzhll" Mar 09 09:30:00 crc kubenswrapper[4861]: I0309 09:30:00.147426 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 09:30:00 crc kubenswrapper[4861]: I0309 09:30:00.147545 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 09:30:00 crc kubenswrapper[4861]: I0309 09:30:00.148550 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k86l8" Mar 09 09:30:00 crc kubenswrapper[4861]: I0309 09:30:00.159470 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550810-fzhll"] Mar 09 09:30:00 crc kubenswrapper[4861]: I0309 09:30:00.237264 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550810-s4vkz"] Mar 09 09:30:00 crc kubenswrapper[4861]: I0309 09:30:00.239756 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550810-s4vkz" Mar 09 09:30:00 crc kubenswrapper[4861]: I0309 09:30:00.241728 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 09 09:30:00 crc kubenswrapper[4861]: I0309 09:30:00.241860 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 09 09:30:00 crc kubenswrapper[4861]: I0309 09:30:00.242814 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6l2bw\" (UniqueName: \"kubernetes.io/projected/8e6fc23e-235c-405e-84ab-b09260e27aac-kube-api-access-6l2bw\") pod \"auto-csr-approver-29550810-fzhll\" (UID: \"8e6fc23e-235c-405e-84ab-b09260e27aac\") " pod="openshift-infra/auto-csr-approver-29550810-fzhll" Mar 09 09:30:00 crc kubenswrapper[4861]: I0309 09:30:00.248099 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550810-s4vkz"] Mar 09 09:30:00 crc kubenswrapper[4861]: I0309 09:30:00.345254 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/50a3028b-48fa-43a1-ac7d-12e409d83703-secret-volume\") pod \"collect-profiles-29550810-s4vkz\" (UID: \"50a3028b-48fa-43a1-ac7d-12e409d83703\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550810-s4vkz" Mar 09 09:30:00 crc kubenswrapper[4861]: I0309 09:30:00.345538 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6l2bw\" (UniqueName: \"kubernetes.io/projected/8e6fc23e-235c-405e-84ab-b09260e27aac-kube-api-access-6l2bw\") pod \"auto-csr-approver-29550810-fzhll\" (UID: \"8e6fc23e-235c-405e-84ab-b09260e27aac\") " pod="openshift-infra/auto-csr-approver-29550810-fzhll" Mar 09 
09:30:00 crc kubenswrapper[4861]: I0309 09:30:00.345737 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/50a3028b-48fa-43a1-ac7d-12e409d83703-config-volume\") pod \"collect-profiles-29550810-s4vkz\" (UID: \"50a3028b-48fa-43a1-ac7d-12e409d83703\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550810-s4vkz" Mar 09 09:30:00 crc kubenswrapper[4861]: I0309 09:30:00.345941 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7g2l\" (UniqueName: \"kubernetes.io/projected/50a3028b-48fa-43a1-ac7d-12e409d83703-kube-api-access-m7g2l\") pod \"collect-profiles-29550810-s4vkz\" (UID: \"50a3028b-48fa-43a1-ac7d-12e409d83703\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550810-s4vkz" Mar 09 09:30:00 crc kubenswrapper[4861]: I0309 09:30:00.364666 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6l2bw\" (UniqueName: \"kubernetes.io/projected/8e6fc23e-235c-405e-84ab-b09260e27aac-kube-api-access-6l2bw\") pod \"auto-csr-approver-29550810-fzhll\" (UID: \"8e6fc23e-235c-405e-84ab-b09260e27aac\") " pod="openshift-infra/auto-csr-approver-29550810-fzhll" Mar 09 09:30:00 crc kubenswrapper[4861]: I0309 09:30:00.447807 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/50a3028b-48fa-43a1-ac7d-12e409d83703-config-volume\") pod \"collect-profiles-29550810-s4vkz\" (UID: \"50a3028b-48fa-43a1-ac7d-12e409d83703\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550810-s4vkz" Mar 09 09:30:00 crc kubenswrapper[4861]: I0309 09:30:00.447944 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7g2l\" (UniqueName: \"kubernetes.io/projected/50a3028b-48fa-43a1-ac7d-12e409d83703-kube-api-access-m7g2l\") pod 
\"collect-profiles-29550810-s4vkz\" (UID: \"50a3028b-48fa-43a1-ac7d-12e409d83703\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550810-s4vkz" Mar 09 09:30:00 crc kubenswrapper[4861]: I0309 09:30:00.448023 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/50a3028b-48fa-43a1-ac7d-12e409d83703-secret-volume\") pod \"collect-profiles-29550810-s4vkz\" (UID: \"50a3028b-48fa-43a1-ac7d-12e409d83703\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550810-s4vkz" Mar 09 09:30:00 crc kubenswrapper[4861]: I0309 09:30:00.449607 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/50a3028b-48fa-43a1-ac7d-12e409d83703-config-volume\") pod \"collect-profiles-29550810-s4vkz\" (UID: \"50a3028b-48fa-43a1-ac7d-12e409d83703\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550810-s4vkz" Mar 09 09:30:00 crc kubenswrapper[4861]: I0309 09:30:00.458238 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/50a3028b-48fa-43a1-ac7d-12e409d83703-secret-volume\") pod \"collect-profiles-29550810-s4vkz\" (UID: \"50a3028b-48fa-43a1-ac7d-12e409d83703\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550810-s4vkz" Mar 09 09:30:00 crc kubenswrapper[4861]: I0309 09:30:00.465603 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550810-fzhll" Mar 09 09:30:00 crc kubenswrapper[4861]: I0309 09:30:00.467973 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7g2l\" (UniqueName: \"kubernetes.io/projected/50a3028b-48fa-43a1-ac7d-12e409d83703-kube-api-access-m7g2l\") pod \"collect-profiles-29550810-s4vkz\" (UID: \"50a3028b-48fa-43a1-ac7d-12e409d83703\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550810-s4vkz" Mar 09 09:30:00 crc kubenswrapper[4861]: I0309 09:30:00.573791 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550810-s4vkz" Mar 09 09:30:00 crc kubenswrapper[4861]: W0309 09:30:00.904213 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8e6fc23e_235c_405e_84ab_b09260e27aac.slice/crio-da60cbf4871e8f33d6631a32b0cd648a216317d4c0c4a03676cf235404b8a609 WatchSource:0}: Error finding container da60cbf4871e8f33d6631a32b0cd648a216317d4c0c4a03676cf235404b8a609: Status 404 returned error can't find the container with id da60cbf4871e8f33d6631a32b0cd648a216317d4c0c4a03676cf235404b8a609 Mar 09 09:30:00 crc kubenswrapper[4861]: I0309 09:30:00.904503 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550810-fzhll"] Mar 09 09:30:01 crc kubenswrapper[4861]: W0309 09:30:01.015974 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod50a3028b_48fa_43a1_ac7d_12e409d83703.slice/crio-8213cf8bad05b2850e6ed96f3aa99f7416819b668bc953950bbf787cd0bdbb4e WatchSource:0}: Error finding container 8213cf8bad05b2850e6ed96f3aa99f7416819b668bc953950bbf787cd0bdbb4e: Status 404 returned error can't find the container with id 8213cf8bad05b2850e6ed96f3aa99f7416819b668bc953950bbf787cd0bdbb4e Mar 09 09:30:01 crc 
kubenswrapper[4861]: I0309 09:30:01.018435 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550810-s4vkz"] Mar 09 09:30:01 crc kubenswrapper[4861]: I0309 09:30:01.448901 4861 generic.go:334] "Generic (PLEG): container finished" podID="50a3028b-48fa-43a1-ac7d-12e409d83703" containerID="c26847120e43444813aa069a4121a676c0450177f1ebc4fc00ce050eb622d965" exitCode=0 Mar 09 09:30:01 crc kubenswrapper[4861]: I0309 09:30:01.448977 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550810-s4vkz" event={"ID":"50a3028b-48fa-43a1-ac7d-12e409d83703","Type":"ContainerDied","Data":"c26847120e43444813aa069a4121a676c0450177f1ebc4fc00ce050eb622d965"} Mar 09 09:30:01 crc kubenswrapper[4861]: I0309 09:30:01.449007 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550810-s4vkz" event={"ID":"50a3028b-48fa-43a1-ac7d-12e409d83703","Type":"ContainerStarted","Data":"8213cf8bad05b2850e6ed96f3aa99f7416819b668bc953950bbf787cd0bdbb4e"} Mar 09 09:30:01 crc kubenswrapper[4861]: I0309 09:30:01.453015 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550810-fzhll" event={"ID":"8e6fc23e-235c-405e-84ab-b09260e27aac","Type":"ContainerStarted","Data":"da60cbf4871e8f33d6631a32b0cd648a216317d4c0c4a03676cf235404b8a609"} Mar 09 09:30:02 crc kubenswrapper[4861]: I0309 09:30:02.890933 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550810-s4vkz"
Mar 09 09:30:03 crc kubenswrapper[4861]: I0309 09:30:03.000457 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/50a3028b-48fa-43a1-ac7d-12e409d83703-secret-volume\") pod \"50a3028b-48fa-43a1-ac7d-12e409d83703\" (UID: \"50a3028b-48fa-43a1-ac7d-12e409d83703\") "
Mar 09 09:30:03 crc kubenswrapper[4861]: I0309 09:30:03.000513 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/50a3028b-48fa-43a1-ac7d-12e409d83703-config-volume\") pod \"50a3028b-48fa-43a1-ac7d-12e409d83703\" (UID: \"50a3028b-48fa-43a1-ac7d-12e409d83703\") "
Mar 09 09:30:03 crc kubenswrapper[4861]: I0309 09:30:03.000678 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m7g2l\" (UniqueName: \"kubernetes.io/projected/50a3028b-48fa-43a1-ac7d-12e409d83703-kube-api-access-m7g2l\") pod \"50a3028b-48fa-43a1-ac7d-12e409d83703\" (UID: \"50a3028b-48fa-43a1-ac7d-12e409d83703\") "
Mar 09 09:30:03 crc kubenswrapper[4861]: I0309 09:30:03.001334 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50a3028b-48fa-43a1-ac7d-12e409d83703-config-volume" (OuterVolumeSpecName: "config-volume") pod "50a3028b-48fa-43a1-ac7d-12e409d83703" (UID: "50a3028b-48fa-43a1-ac7d-12e409d83703"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 09:30:03 crc kubenswrapper[4861]: I0309 09:30:03.007309 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50a3028b-48fa-43a1-ac7d-12e409d83703-kube-api-access-m7g2l" (OuterVolumeSpecName: "kube-api-access-m7g2l") pod "50a3028b-48fa-43a1-ac7d-12e409d83703" (UID: "50a3028b-48fa-43a1-ac7d-12e409d83703"). InnerVolumeSpecName "kube-api-access-m7g2l". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:30:03 crc kubenswrapper[4861]: I0309 09:30:03.007585 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50a3028b-48fa-43a1-ac7d-12e409d83703-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "50a3028b-48fa-43a1-ac7d-12e409d83703" (UID: "50a3028b-48fa-43a1-ac7d-12e409d83703"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:30:03 crc kubenswrapper[4861]: I0309 09:30:03.103522 4861 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/50a3028b-48fa-43a1-ac7d-12e409d83703-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 09 09:30:03 crc kubenswrapper[4861]: I0309 09:30:03.103565 4861 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/50a3028b-48fa-43a1-ac7d-12e409d83703-config-volume\") on node \"crc\" DevicePath \"\""
Mar 09 09:30:03 crc kubenswrapper[4861]: I0309 09:30:03.103580 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m7g2l\" (UniqueName: \"kubernetes.io/projected/50a3028b-48fa-43a1-ac7d-12e409d83703-kube-api-access-m7g2l\") on node \"crc\" DevicePath \"\""
Mar 09 09:30:03 crc kubenswrapper[4861]: I0309 09:30:03.473478 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550810-s4vkz" event={"ID":"50a3028b-48fa-43a1-ac7d-12e409d83703","Type":"ContainerDied","Data":"8213cf8bad05b2850e6ed96f3aa99f7416819b668bc953950bbf787cd0bdbb4e"}
Mar 09 09:30:03 crc kubenswrapper[4861]: I0309 09:30:03.473513 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550810-s4vkz"
Mar 09 09:30:03 crc kubenswrapper[4861]: I0309 09:30:03.473521 4861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8213cf8bad05b2850e6ed96f3aa99f7416819b668bc953950bbf787cd0bdbb4e"
Mar 09 09:30:03 crc kubenswrapper[4861]: I0309 09:30:03.475478 4861 generic.go:334] "Generic (PLEG): container finished" podID="8e6fc23e-235c-405e-84ab-b09260e27aac" containerID="f98290531ebd1c593cfa0aacbe3b6f0e3032adc647ce41130ba5cc6ef8a30f66" exitCode=0
Mar 09 09:30:03 crc kubenswrapper[4861]: I0309 09:30:03.475511 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550810-fzhll" event={"ID":"8e6fc23e-235c-405e-84ab-b09260e27aac","Type":"ContainerDied","Data":"f98290531ebd1c593cfa0aacbe3b6f0e3032adc647ce41130ba5cc6ef8a30f66"}
Mar 09 09:30:04 crc kubenswrapper[4861]: I0309 09:30:04.822607 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550810-fzhll"
Mar 09 09:30:04 crc kubenswrapper[4861]: I0309 09:30:04.939449 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6l2bw\" (UniqueName: \"kubernetes.io/projected/8e6fc23e-235c-405e-84ab-b09260e27aac-kube-api-access-6l2bw\") pod \"8e6fc23e-235c-405e-84ab-b09260e27aac\" (UID: \"8e6fc23e-235c-405e-84ab-b09260e27aac\") "
Mar 09 09:30:04 crc kubenswrapper[4861]: I0309 09:30:04.945724 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e6fc23e-235c-405e-84ab-b09260e27aac-kube-api-access-6l2bw" (OuterVolumeSpecName: "kube-api-access-6l2bw") pod "8e6fc23e-235c-405e-84ab-b09260e27aac" (UID: "8e6fc23e-235c-405e-84ab-b09260e27aac"). InnerVolumeSpecName "kube-api-access-6l2bw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:30:05 crc kubenswrapper[4861]: I0309 09:30:05.041155 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6l2bw\" (UniqueName: \"kubernetes.io/projected/8e6fc23e-235c-405e-84ab-b09260e27aac-kube-api-access-6l2bw\") on node \"crc\" DevicePath \"\""
Mar 09 09:30:05 crc kubenswrapper[4861]: I0309 09:30:05.496613 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550810-fzhll" event={"ID":"8e6fc23e-235c-405e-84ab-b09260e27aac","Type":"ContainerDied","Data":"da60cbf4871e8f33d6631a32b0cd648a216317d4c0c4a03676cf235404b8a609"}
Mar 09 09:30:05 crc kubenswrapper[4861]: I0309 09:30:05.496659 4861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="da60cbf4871e8f33d6631a32b0cd648a216317d4c0c4a03676cf235404b8a609"
Mar 09 09:30:05 crc kubenswrapper[4861]: I0309 09:30:05.496667 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550810-fzhll"
Mar 09 09:30:05 crc kubenswrapper[4861]: I0309 09:30:05.894787 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550804-lfm7c"]
Mar 09 09:30:05 crc kubenswrapper[4861]: I0309 09:30:05.903297 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550804-lfm7c"]
Mar 09 09:30:07 crc kubenswrapper[4861]: I0309 09:30:07.670169 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4a629d1-4904-48e6-9a69-e436c68cfdbc" path="/var/lib/kubelet/pods/a4a629d1-4904-48e6-9a69-e436c68cfdbc/volumes"
Mar 09 09:30:10 crc kubenswrapper[4861]: I0309 09:30:10.844190 4861 scope.go:117] "RemoveContainer" containerID="48a24b34c684d5a93e2d113832b569fd6f40b813ce52a311f4c0a30652f38851"
Mar 09 09:30:10 crc kubenswrapper[4861]: I0309 09:30:10.883031 4861 scope.go:117] "RemoveContainer" containerID="88eb70e641682e41216df017500a8be0f5e835caaf804e626b55c86a4abe6876"
Mar 09 09:30:10 crc kubenswrapper[4861]: I0309 09:30:10.917922 4861 scope.go:117] "RemoveContainer" containerID="a7322d7bcc83d5419742604674ffba70d24ba7fbe97cb4ccc03de61b7d2abc47"
Mar 09 09:30:10 crc kubenswrapper[4861]: I0309 09:30:10.964089 4861 scope.go:117] "RemoveContainer" containerID="e117a32444d886a78c67cb565fdc8024e6468ba42d37d84957d2eaf74bef8159"
Mar 09 09:31:11 crc kubenswrapper[4861]: I0309 09:31:11.163727 4861 scope.go:117] "RemoveContainer" containerID="e72c1d9cb3d3ed802613c080c0934aa922e277a849d6ac09e873332c57084265"
Mar 09 09:31:24 crc kubenswrapper[4861]: I0309 09:31:24.605755 4861 patch_prober.go:28] interesting pod/machine-config-daemon-5g7gc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 09 09:31:24 crc kubenswrapper[4861]: I0309 09:31:24.606392 4861 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 09 09:31:54 crc kubenswrapper[4861]: I0309 09:31:54.605961 4861 patch_prober.go:28] interesting pod/machine-config-daemon-5g7gc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 09 09:31:54 crc kubenswrapper[4861]: I0309 09:31:54.607490 4861 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 09 09:32:00 crc kubenswrapper[4861]: I0309 09:32:00.147491 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550812-vzh87"]
Mar 09 09:32:00 crc kubenswrapper[4861]: E0309 09:32:00.148794 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50a3028b-48fa-43a1-ac7d-12e409d83703" containerName="collect-profiles"
Mar 09 09:32:00 crc kubenswrapper[4861]: I0309 09:32:00.148811 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="50a3028b-48fa-43a1-ac7d-12e409d83703" containerName="collect-profiles"
Mar 09 09:32:00 crc kubenswrapper[4861]: E0309 09:32:00.148833 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e6fc23e-235c-405e-84ab-b09260e27aac" containerName="oc"
Mar 09 09:32:00 crc kubenswrapper[4861]: I0309 09:32:00.148840 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e6fc23e-235c-405e-84ab-b09260e27aac" containerName="oc"
Mar 09 09:32:00 crc kubenswrapper[4861]: I0309 09:32:00.149090 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="50a3028b-48fa-43a1-ac7d-12e409d83703" containerName="collect-profiles"
Mar 09 09:32:00 crc kubenswrapper[4861]: I0309 09:32:00.149114 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e6fc23e-235c-405e-84ab-b09260e27aac" containerName="oc"
Mar 09 09:32:00 crc kubenswrapper[4861]: I0309 09:32:00.149868 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550812-vzh87"
Mar 09 09:32:00 crc kubenswrapper[4861]: I0309 09:32:00.152093 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 09 09:32:00 crc kubenswrapper[4861]: I0309 09:32:00.152093 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 09 09:32:00 crc kubenswrapper[4861]: I0309 09:32:00.158587 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k86l8"
Mar 09 09:32:00 crc kubenswrapper[4861]: I0309 09:32:00.158801 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550812-vzh87"]
Mar 09 09:32:00 crc kubenswrapper[4861]: I0309 09:32:00.317424 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lhbq\" (UniqueName: \"kubernetes.io/projected/03740edc-76c4-4be9-8871-29035d061880-kube-api-access-5lhbq\") pod \"auto-csr-approver-29550812-vzh87\" (UID: \"03740edc-76c4-4be9-8871-29035d061880\") " pod="openshift-infra/auto-csr-approver-29550812-vzh87"
Mar 09 09:32:00 crc kubenswrapper[4861]: I0309 09:32:00.420106 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5lhbq\" (UniqueName: \"kubernetes.io/projected/03740edc-76c4-4be9-8871-29035d061880-kube-api-access-5lhbq\") pod \"auto-csr-approver-29550812-vzh87\" (UID: \"03740edc-76c4-4be9-8871-29035d061880\") " pod="openshift-infra/auto-csr-approver-29550812-vzh87"
Mar 09 09:32:00 crc kubenswrapper[4861]: I0309 09:32:00.446478 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lhbq\" (UniqueName: \"kubernetes.io/projected/03740edc-76c4-4be9-8871-29035d061880-kube-api-access-5lhbq\") pod \"auto-csr-approver-29550812-vzh87\" (UID: \"03740edc-76c4-4be9-8871-29035d061880\") " pod="openshift-infra/auto-csr-approver-29550812-vzh87"
Mar 09 09:32:00 crc kubenswrapper[4861]: I0309 09:32:00.472358 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550812-vzh87"
Mar 09 09:32:00 crc kubenswrapper[4861]: I0309 09:32:00.957024 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550812-vzh87"]
Mar 09 09:32:01 crc kubenswrapper[4861]: I0309 09:32:01.640744 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550812-vzh87" event={"ID":"03740edc-76c4-4be9-8871-29035d061880","Type":"ContainerStarted","Data":"9be5984336edab0cfc1c11d0b85d0111ea6171a759c3a1a618f02adeeb3bb934"}
Mar 09 09:32:02 crc kubenswrapper[4861]: I0309 09:32:02.652667 4861 generic.go:334] "Generic (PLEG): container finished" podID="03740edc-76c4-4be9-8871-29035d061880" containerID="a329ea2a6dc542f07a16030075887e823b17c6ed8a244888d2f86c973035e7b4" exitCode=0
Mar 09 09:32:02 crc kubenswrapper[4861]: I0309 09:32:02.652731 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550812-vzh87" event={"ID":"03740edc-76c4-4be9-8871-29035d061880","Type":"ContainerDied","Data":"a329ea2a6dc542f07a16030075887e823b17c6ed8a244888d2f86c973035e7b4"}
Mar 09 09:32:03 crc kubenswrapper[4861]: I0309 09:32:03.966434 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550812-vzh87"
Mar 09 09:32:04 crc kubenswrapper[4861]: I0309 09:32:04.088623 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5lhbq\" (UniqueName: \"kubernetes.io/projected/03740edc-76c4-4be9-8871-29035d061880-kube-api-access-5lhbq\") pod \"03740edc-76c4-4be9-8871-29035d061880\" (UID: \"03740edc-76c4-4be9-8871-29035d061880\") "
Mar 09 09:32:04 crc kubenswrapper[4861]: I0309 09:32:04.095940 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03740edc-76c4-4be9-8871-29035d061880-kube-api-access-5lhbq" (OuterVolumeSpecName: "kube-api-access-5lhbq") pod "03740edc-76c4-4be9-8871-29035d061880" (UID: "03740edc-76c4-4be9-8871-29035d061880"). InnerVolumeSpecName "kube-api-access-5lhbq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:32:04 crc kubenswrapper[4861]: I0309 09:32:04.191596 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5lhbq\" (UniqueName: \"kubernetes.io/projected/03740edc-76c4-4be9-8871-29035d061880-kube-api-access-5lhbq\") on node \"crc\" DevicePath \"\""
Mar 09 09:32:04 crc kubenswrapper[4861]: I0309 09:32:04.679739 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550812-vzh87" event={"ID":"03740edc-76c4-4be9-8871-29035d061880","Type":"ContainerDied","Data":"9be5984336edab0cfc1c11d0b85d0111ea6171a759c3a1a618f02adeeb3bb934"}
Mar 09 09:32:04 crc kubenswrapper[4861]: I0309 09:32:04.679821 4861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9be5984336edab0cfc1c11d0b85d0111ea6171a759c3a1a618f02adeeb3bb934"
Mar 09 09:32:04 crc kubenswrapper[4861]: I0309 09:32:04.679917 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550812-vzh87"
Mar 09 09:32:05 crc kubenswrapper[4861]: I0309 09:32:05.045340 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550806-dlnz6"]
Mar 09 09:32:05 crc kubenswrapper[4861]: I0309 09:32:05.053119 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550806-dlnz6"]
Mar 09 09:32:05 crc kubenswrapper[4861]: I0309 09:32:05.671672 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5330b450-5c51-43e8-b0c0-c9b875dc49b5" path="/var/lib/kubelet/pods/5330b450-5c51-43e8-b0c0-c9b875dc49b5/volumes"
Mar 09 09:32:11 crc kubenswrapper[4861]: I0309 09:32:11.264920 4861 scope.go:117] "RemoveContainer" containerID="687d779e14aa31b5a5bfd8d961aa81863f1af619b3ce31c1804ce1b27e08e60d"
Mar 09 09:32:11 crc kubenswrapper[4861]: I0309 09:32:11.290567 4861 scope.go:117] "RemoveContainer" containerID="2c79b4b0ca27df0f16a48c63dea824fd62d21f4930a8301169db649d33109e66"
Mar 09 09:32:11 crc kubenswrapper[4861]: I0309 09:32:11.313051 4861 scope.go:117] "RemoveContainer" containerID="5fa6ba578ffae467793f3902a8751c10074b77e91f06af157cb008ec9397b5dc"
Mar 09 09:32:24 crc kubenswrapper[4861]: I0309 09:32:24.605694 4861 patch_prober.go:28] interesting pod/machine-config-daemon-5g7gc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 09 09:32:24 crc kubenswrapper[4861]: I0309 09:32:24.606259 4861 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 09 09:32:24 crc kubenswrapper[4861]: I0309 09:32:24.606297 4861 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc"
Mar 09 09:32:24 crc kubenswrapper[4861]: I0309 09:32:24.607077 4861 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c0726d3ac822004eacb4f8d12bb4cbaf2815fc9d29aaa8ba7db9d4fae1717ee1"} pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 09 09:32:24 crc kubenswrapper[4861]: I0309 09:32:24.607148 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" containerName="machine-config-daemon" containerID="cri-o://c0726d3ac822004eacb4f8d12bb4cbaf2815fc9d29aaa8ba7db9d4fae1717ee1" gracePeriod=600
Mar 09 09:32:24 crc kubenswrapper[4861]: E0309 09:32:24.729033 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5g7gc_openshift-machine-config-operator(6f7875e3-174f-4c67-8675-d878de74aa4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f"
Mar 09 09:32:24 crc kubenswrapper[4861]: I0309 09:32:24.990632 4861 generic.go:334] "Generic (PLEG): container finished" podID="6f7875e3-174f-4c67-8675-d878de74aa4f" containerID="c0726d3ac822004eacb4f8d12bb4cbaf2815fc9d29aaa8ba7db9d4fae1717ee1" exitCode=0
Mar 09 09:32:24 crc kubenswrapper[4861]: I0309 09:32:24.990673 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" event={"ID":"6f7875e3-174f-4c67-8675-d878de74aa4f","Type":"ContainerDied","Data":"c0726d3ac822004eacb4f8d12bb4cbaf2815fc9d29aaa8ba7db9d4fae1717ee1"}
Mar 09 09:32:24 crc kubenswrapper[4861]: I0309 09:32:24.990704 4861 scope.go:117] "RemoveContainer" containerID="7fdd7f3e15e67ed60e9bdb64538958917f435e5fb6449fedcf993ae2c627b46e"
Mar 09 09:32:24 crc kubenswrapper[4861]: I0309 09:32:24.991452 4861 scope.go:117] "RemoveContainer" containerID="c0726d3ac822004eacb4f8d12bb4cbaf2815fc9d29aaa8ba7db9d4fae1717ee1"
Mar 09 09:32:24 crc kubenswrapper[4861]: E0309 09:32:24.991896 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5g7gc_openshift-machine-config-operator(6f7875e3-174f-4c67-8675-d878de74aa4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f"
Mar 09 09:32:38 crc kubenswrapper[4861]: I0309 09:32:38.658724 4861 scope.go:117] "RemoveContainer" containerID="c0726d3ac822004eacb4f8d12bb4cbaf2815fc9d29aaa8ba7db9d4fae1717ee1"
Mar 09 09:32:38 crc kubenswrapper[4861]: E0309 09:32:38.659494 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5g7gc_openshift-machine-config-operator(6f7875e3-174f-4c67-8675-d878de74aa4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f"
Mar 09 09:32:53 crc kubenswrapper[4861]: I0309 09:32:53.658362 4861 scope.go:117] "RemoveContainer" containerID="c0726d3ac822004eacb4f8d12bb4cbaf2815fc9d29aaa8ba7db9d4fae1717ee1"
Mar 09 09:32:53 crc kubenswrapper[4861]: E0309 09:32:53.659199 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5g7gc_openshift-machine-config-operator(6f7875e3-174f-4c67-8675-d878de74aa4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f"
Mar 09 09:33:03 crc kubenswrapper[4861]: I0309 09:33:03.323744 4861 generic.go:334] "Generic (PLEG): container finished" podID="b1f90870-1ef5-46d6-b495-f41e2d14a888" containerID="dbf4625ad834f2eb184b50c25caae7ad5c7a6b6b036c9841ae00f5d61a9025d2" exitCode=0
Mar 09 09:33:03 crc kubenswrapper[4861]: I0309 09:33:03.324322 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-6k9cp" event={"ID":"b1f90870-1ef5-46d6-b495-f41e2d14a888","Type":"ContainerDied","Data":"dbf4625ad834f2eb184b50c25caae7ad5c7a6b6b036c9841ae00f5d61a9025d2"}
Mar 09 09:33:04 crc kubenswrapper[4861]: I0309 09:33:04.734835 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-6k9cp"
Mar 09 09:33:04 crc kubenswrapper[4861]: I0309 09:33:04.866556 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1f90870-1ef5-46d6-b495-f41e2d14a888-bootstrap-combined-ca-bundle\") pod \"b1f90870-1ef5-46d6-b495-f41e2d14a888\" (UID: \"b1f90870-1ef5-46d6-b495-f41e2d14a888\") "
Mar 09 09:33:04 crc kubenswrapper[4861]: I0309 09:33:04.866659 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b1f90870-1ef5-46d6-b495-f41e2d14a888-ssh-key-openstack-edpm-ipam\") pod \"b1f90870-1ef5-46d6-b495-f41e2d14a888\" (UID: \"b1f90870-1ef5-46d6-b495-f41e2d14a888\") "
Mar 09 09:33:04 crc kubenswrapper[4861]: I0309 09:33:04.866732 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6f6p8\" (UniqueName: \"kubernetes.io/projected/b1f90870-1ef5-46d6-b495-f41e2d14a888-kube-api-access-6f6p8\") pod \"b1f90870-1ef5-46d6-b495-f41e2d14a888\" (UID: \"b1f90870-1ef5-46d6-b495-f41e2d14a888\") "
Mar 09 09:33:04 crc kubenswrapper[4861]: I0309 09:33:04.866761 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b1f90870-1ef5-46d6-b495-f41e2d14a888-inventory\") pod \"b1f90870-1ef5-46d6-b495-f41e2d14a888\" (UID: \"b1f90870-1ef5-46d6-b495-f41e2d14a888\") "
Mar 09 09:33:04 crc kubenswrapper[4861]: I0309 09:33:04.874672 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1f90870-1ef5-46d6-b495-f41e2d14a888-kube-api-access-6f6p8" (OuterVolumeSpecName: "kube-api-access-6f6p8") pod "b1f90870-1ef5-46d6-b495-f41e2d14a888" (UID: "b1f90870-1ef5-46d6-b495-f41e2d14a888"). InnerVolumeSpecName "kube-api-access-6f6p8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:33:04 crc kubenswrapper[4861]: I0309 09:33:04.875239 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1f90870-1ef5-46d6-b495-f41e2d14a888-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "b1f90870-1ef5-46d6-b495-f41e2d14a888" (UID: "b1f90870-1ef5-46d6-b495-f41e2d14a888"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:33:04 crc kubenswrapper[4861]: I0309 09:33:04.897881 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1f90870-1ef5-46d6-b495-f41e2d14a888-inventory" (OuterVolumeSpecName: "inventory") pod "b1f90870-1ef5-46d6-b495-f41e2d14a888" (UID: "b1f90870-1ef5-46d6-b495-f41e2d14a888"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:33:04 crc kubenswrapper[4861]: I0309 09:33:04.899292 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1f90870-1ef5-46d6-b495-f41e2d14a888-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "b1f90870-1ef5-46d6-b495-f41e2d14a888" (UID: "b1f90870-1ef5-46d6-b495-f41e2d14a888"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:33:04 crc kubenswrapper[4861]: I0309 09:33:04.970567 4861 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b1f90870-1ef5-46d6-b495-f41e2d14a888-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 09 09:33:04 crc kubenswrapper[4861]: I0309 09:33:04.970624 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6f6p8\" (UniqueName: \"kubernetes.io/projected/b1f90870-1ef5-46d6-b495-f41e2d14a888-kube-api-access-6f6p8\") on node \"crc\" DevicePath \"\""
Mar 09 09:33:04 crc kubenswrapper[4861]: I0309 09:33:04.970644 4861 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b1f90870-1ef5-46d6-b495-f41e2d14a888-inventory\") on node \"crc\" DevicePath \"\""
Mar 09 09:33:04 crc kubenswrapper[4861]: I0309 09:33:04.970660 4861 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b1f90870-1ef5-46d6-b495-f41e2d14a888-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 09 09:33:05 crc kubenswrapper[4861]: I0309 09:33:05.341754 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-6k9cp" event={"ID":"b1f90870-1ef5-46d6-b495-f41e2d14a888","Type":"ContainerDied","Data":"fc650f0436f2687a317497a2e88f181477e42cbf8b168f09943a3035f3e8fd78"}
Mar 09 09:33:05 crc kubenswrapper[4861]: I0309 09:33:05.342056 4861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fc650f0436f2687a317497a2e88f181477e42cbf8b168f09943a3035f3e8fd78"
Mar 09 09:33:05 crc kubenswrapper[4861]: I0309 09:33:05.341825 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-6k9cp"
Mar 09 09:33:05 crc kubenswrapper[4861]: I0309 09:33:05.451654 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2gml2"]
Mar 09 09:33:05 crc kubenswrapper[4861]: E0309 09:33:05.452097 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1f90870-1ef5-46d6-b495-f41e2d14a888" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam"
Mar 09 09:33:05 crc kubenswrapper[4861]: I0309 09:33:05.452124 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1f90870-1ef5-46d6-b495-f41e2d14a888" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam"
Mar 09 09:33:05 crc kubenswrapper[4861]: E0309 09:33:05.452153 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03740edc-76c4-4be9-8871-29035d061880" containerName="oc"
Mar 09 09:33:05 crc kubenswrapper[4861]: I0309 09:33:05.452163 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="03740edc-76c4-4be9-8871-29035d061880" containerName="oc"
Mar 09 09:33:05 crc kubenswrapper[4861]: I0309 09:33:05.452350 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="03740edc-76c4-4be9-8871-29035d061880" containerName="oc"
Mar 09 09:33:05 crc kubenswrapper[4861]: I0309 09:33:05.452388 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1f90870-1ef5-46d6-b495-f41e2d14a888" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam"
Mar 09 09:33:05 crc kubenswrapper[4861]: I0309 09:33:05.452996 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2gml2"
Mar 09 09:33:05 crc kubenswrapper[4861]: I0309 09:33:05.455176 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lkd5q"
Mar 09 09:33:05 crc kubenswrapper[4861]: I0309 09:33:05.457005 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 09 09:33:05 crc kubenswrapper[4861]: I0309 09:33:05.457021 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 09 09:33:05 crc kubenswrapper[4861]: I0309 09:33:05.464456 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2gml2"]
Mar 09 09:33:05 crc kubenswrapper[4861]: I0309 09:33:05.487879 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 09 09:33:05 crc kubenswrapper[4861]: I0309 09:33:05.488836 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/28e43d3e-921e-4f6c-be2f-f37e5625374a-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-2gml2\" (UID: \"28e43d3e-921e-4f6c-be2f-f37e5625374a\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2gml2"
Mar 09 09:33:05 crc kubenswrapper[4861]: I0309 09:33:05.488910 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/28e43d3e-921e-4f6c-be2f-f37e5625374a-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-2gml2\" (UID: \"28e43d3e-921e-4f6c-be2f-f37e5625374a\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2gml2"
Mar 09 09:33:05 crc kubenswrapper[4861]: I0309 09:33:05.488985 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mqxg\" (UniqueName: \"kubernetes.io/projected/28e43d3e-921e-4f6c-be2f-f37e5625374a-kube-api-access-2mqxg\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-2gml2\" (UID: \"28e43d3e-921e-4f6c-be2f-f37e5625374a\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2gml2"
Mar 09 09:33:05 crc kubenswrapper[4861]: I0309 09:33:05.590741 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/28e43d3e-921e-4f6c-be2f-f37e5625374a-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-2gml2\" (UID: \"28e43d3e-921e-4f6c-be2f-f37e5625374a\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2gml2"
Mar 09 09:33:05 crc kubenswrapper[4861]: I0309 09:33:05.590878 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2mqxg\" (UniqueName: \"kubernetes.io/projected/28e43d3e-921e-4f6c-be2f-f37e5625374a-kube-api-access-2mqxg\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-2gml2\" (UID: \"28e43d3e-921e-4f6c-be2f-f37e5625374a\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2gml2"
Mar 09 09:33:05 crc kubenswrapper[4861]: I0309 09:33:05.591038 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/28e43d3e-921e-4f6c-be2f-f37e5625374a-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-2gml2\" (UID: \"28e43d3e-921e-4f6c-be2f-f37e5625374a\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2gml2"
Mar 09 09:33:05 crc kubenswrapper[4861]: I0309 09:33:05.598083 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/28e43d3e-921e-4f6c-be2f-f37e5625374a-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-2gml2\" (UID: \"28e43d3e-921e-4f6c-be2f-f37e5625374a\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2gml2"
Mar 09 09:33:05 crc kubenswrapper[4861]: I0309 09:33:05.598597 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/28e43d3e-921e-4f6c-be2f-f37e5625374a-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-2gml2\" (UID: \"28e43d3e-921e-4f6c-be2f-f37e5625374a\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2gml2"
Mar 09 09:33:05 crc kubenswrapper[4861]: I0309 09:33:05.609638 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mqxg\" (UniqueName: \"kubernetes.io/projected/28e43d3e-921e-4f6c-be2f-f37e5625374a-kube-api-access-2mqxg\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-2gml2\" (UID: \"28e43d3e-921e-4f6c-be2f-f37e5625374a\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2gml2"
Mar 09 09:33:05 crc kubenswrapper[4861]: I0309 09:33:05.659214 4861 scope.go:117] "RemoveContainer" containerID="c0726d3ac822004eacb4f8d12bb4cbaf2815fc9d29aaa8ba7db9d4fae1717ee1"
Mar 09 09:33:05 crc kubenswrapper[4861]: E0309 09:33:05.659796 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5g7gc_openshift-machine-config-operator(6f7875e3-174f-4c67-8675-d878de74aa4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f"
Mar 09 09:33:05 crc kubenswrapper[4861]: I0309 09:33:05.796079 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2gml2"
Mar 09 09:33:06 crc kubenswrapper[4861]: I0309 09:33:06.319684 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2gml2"]
Mar 09 09:33:06 crc kubenswrapper[4861]: I0309 09:33:06.355688 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2gml2" event={"ID":"28e43d3e-921e-4f6c-be2f-f37e5625374a","Type":"ContainerStarted","Data":"50e668f1ab1964dfbd13d29494fd8fdc71a59179f3906b9b8cd8eaddacf73c8b"}
Mar 09 09:33:07 crc kubenswrapper[4861]: I0309 09:33:07.367619 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2gml2" event={"ID":"28e43d3e-921e-4f6c-be2f-f37e5625374a","Type":"ContainerStarted","Data":"f04a6345e6b5cd85326d1c78c4f7c4009d7c329407df7fbedf0357a85dea21d5"}
Mar 09 09:33:07 crc kubenswrapper[4861]: I0309 09:33:07.380944 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2gml2" podStartSLOduration=1.8892015899999999 podStartE2EDuration="2.3809259s" podCreationTimestamp="2026-03-09 09:33:05 +0000 UTC" firstStartedPulling="2026-03-09 09:33:06.327381773 +0000 UTC m=+1629.412421184" lastFinishedPulling="2026-03-09 09:33:06.819106103 +0000 UTC m=+1629.904145494" observedRunningTime="2026-03-09 09:33:07.37951427 +0000 UTC m=+1630.464553671" watchObservedRunningTime="2026-03-09 09:33:07.3809259 +0000 UTC m=+1630.465965301"
Mar 09 09:33:11 crc kubenswrapper[4861]: I0309 09:33:11.397339 4861 scope.go:117] "RemoveContainer" containerID="7f801cb67a90ced05cbc62eaecaa06f1ab05d667be9d41dd2eb4d9888ebafdea"
Mar 09 09:33:11 crc kubenswrapper[4861]: I0309 09:33:11.418045 4861 scope.go:117] "RemoveContainer" containerID="e87f7e0e851f4944b2b3370f531360c735836531f36248b68167ac059e5e4495"
Mar 09 09:33:11 crc kubenswrapper[4861]: I0309 09:33:11.436228 4861 scope.go:117] "RemoveContainer" containerID="6360fe2180cbf6227ebf7a75a027983d6faae1823563d9854f335c4cb2aa39bc"
Mar 09 09:33:11 crc kubenswrapper[4861]: I0309 09:33:11.458608 4861 scope.go:117] "RemoveContainer" containerID="9a6976fb3b15c1a33bf9f11c55266c90e65d8bc0b256d7961d1f06a0053a3d1f"
Mar 09 09:33:17 crc kubenswrapper[4861]: I0309 09:33:17.665555 4861 scope.go:117] "RemoveContainer" containerID="c0726d3ac822004eacb4f8d12bb4cbaf2815fc9d29aaa8ba7db9d4fae1717ee1"
Mar 09 09:33:17 crc kubenswrapper[4861]: E0309 09:33:17.666385 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5g7gc_openshift-machine-config-operator(6f7875e3-174f-4c67-8675-d878de74aa4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f"
Mar 09 09:33:29 crc kubenswrapper[4861]: I0309 09:33:29.658184 4861 scope.go:117] "RemoveContainer" containerID="c0726d3ac822004eacb4f8d12bb4cbaf2815fc9d29aaa8ba7db9d4fae1717ee1"
Mar 09 09:33:29 crc kubenswrapper[4861]: E0309 09:33:29.659041 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5g7gc_openshift-machine-config-operator(6f7875e3-174f-4c67-8675-d878de74aa4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f"
Mar 09 09:33:41 crc kubenswrapper[4861]: I0309 09:33:41.657607 4861 scope.go:117] "RemoveContainer" containerID="c0726d3ac822004eacb4f8d12bb4cbaf2815fc9d29aaa8ba7db9d4fae1717ee1"
Mar 09 09:33:41 crc kubenswrapper[4861]: E0309 09:33:41.658321 4861 pod_workers.go:1301] "Error syncing pod, skipping"
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5g7gc_openshift-machine-config-operator(6f7875e3-174f-4c67-8675-d878de74aa4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" Mar 09 09:33:43 crc kubenswrapper[4861]: I0309 09:33:43.068518 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-4mxxh"] Mar 09 09:33:43 crc kubenswrapper[4861]: I0309 09:33:43.081892 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-4mxxh"] Mar 09 09:33:43 crc kubenswrapper[4861]: I0309 09:33:43.671633 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8499a71b-bdde-4468-9a1e-781db816f2f0" path="/var/lib/kubelet/pods/8499a71b-bdde-4468-9a1e-781db816f2f0/volumes" Mar 09 09:33:44 crc kubenswrapper[4861]: I0309 09:33:44.028171 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-a7e9-account-create-update-x5n4n"] Mar 09 09:33:44 crc kubenswrapper[4861]: I0309 09:33:44.038214 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-a7e9-account-create-update-x5n4n"] Mar 09 09:33:45 crc kubenswrapper[4861]: I0309 09:33:45.669210 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="591c3df3-47fb-4da5-9776-4a6ed3170472" path="/var/lib/kubelet/pods/591c3df3-47fb-4da5-9776-4a6ed3170472/volumes" Mar 09 09:33:47 crc kubenswrapper[4861]: I0309 09:33:47.032655 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-7k92z"] Mar 09 09:33:47 crc kubenswrapper[4861]: I0309 09:33:47.044242 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-51c1-account-create-update-2g9qr"] Mar 09 09:33:47 crc kubenswrapper[4861]: I0309 09:33:47.054435 4861 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack/glance-51c1-account-create-update-2g9qr"] Mar 09 09:33:47 crc kubenswrapper[4861]: I0309 09:33:47.062754 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-7k92z"] Mar 09 09:33:47 crc kubenswrapper[4861]: I0309 09:33:47.676232 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2de22eb1-f0a3-41f2-a06e-53ce14fedaf8" path="/var/lib/kubelet/pods/2de22eb1-f0a3-41f2-a06e-53ce14fedaf8/volumes" Mar 09 09:33:47 crc kubenswrapper[4861]: I0309 09:33:47.677780 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3468cc7-9c05-4934-bc2d-287e80b966a4" path="/var/lib/kubelet/pods/c3468cc7-9c05-4934-bc2d-287e80b966a4/volumes" Mar 09 09:33:54 crc kubenswrapper[4861]: I0309 09:33:54.034232 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-bc782"] Mar 09 09:33:54 crc kubenswrapper[4861]: I0309 09:33:54.045202 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-bc782"] Mar 09 09:33:55 crc kubenswrapper[4861]: I0309 09:33:55.031589 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-3c6b-account-create-update-g9zvd"] Mar 09 09:33:55 crc kubenswrapper[4861]: I0309 09:33:55.041587 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-3c6b-account-create-update-g9zvd"] Mar 09 09:33:55 crc kubenswrapper[4861]: I0309 09:33:55.657947 4861 scope.go:117] "RemoveContainer" containerID="c0726d3ac822004eacb4f8d12bb4cbaf2815fc9d29aaa8ba7db9d4fae1717ee1" Mar 09 09:33:55 crc kubenswrapper[4861]: E0309 09:33:55.658222 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5g7gc_openshift-machine-config-operator(6f7875e3-174f-4c67-8675-d878de74aa4f)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" Mar 09 09:33:55 crc kubenswrapper[4861]: I0309 09:33:55.668783 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a022a77f-33dc-449f-b20f-b91978014c94" path="/var/lib/kubelet/pods/a022a77f-33dc-449f-b20f-b91978014c94/volumes" Mar 09 09:33:55 crc kubenswrapper[4861]: I0309 09:33:55.669461 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e36bbce2-29bf-46ab-bc4f-a7afc2423059" path="/var/lib/kubelet/pods/e36bbce2-29bf-46ab-bc4f-a7afc2423059/volumes" Mar 09 09:34:00 crc kubenswrapper[4861]: I0309 09:34:00.157618 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550814-zp5g2"] Mar 09 09:34:00 crc kubenswrapper[4861]: I0309 09:34:00.159217 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550814-zp5g2" Mar 09 09:34:00 crc kubenswrapper[4861]: I0309 09:34:00.161871 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k86l8" Mar 09 09:34:00 crc kubenswrapper[4861]: I0309 09:34:00.162038 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 09:34:00 crc kubenswrapper[4861]: I0309 09:34:00.168119 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 09:34:00 crc kubenswrapper[4861]: I0309 09:34:00.168342 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550814-zp5g2"] Mar 09 09:34:00 crc kubenswrapper[4861]: I0309 09:34:00.313117 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8sq8r\" (UniqueName: \"kubernetes.io/projected/e2c0233c-02dd-4711-8633-a75b72a8fb19-kube-api-access-8sq8r\") pod 
\"auto-csr-approver-29550814-zp5g2\" (UID: \"e2c0233c-02dd-4711-8633-a75b72a8fb19\") " pod="openshift-infra/auto-csr-approver-29550814-zp5g2" Mar 09 09:34:00 crc kubenswrapper[4861]: I0309 09:34:00.416011 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8sq8r\" (UniqueName: \"kubernetes.io/projected/e2c0233c-02dd-4711-8633-a75b72a8fb19-kube-api-access-8sq8r\") pod \"auto-csr-approver-29550814-zp5g2\" (UID: \"e2c0233c-02dd-4711-8633-a75b72a8fb19\") " pod="openshift-infra/auto-csr-approver-29550814-zp5g2" Mar 09 09:34:00 crc kubenswrapper[4861]: I0309 09:34:00.436098 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8sq8r\" (UniqueName: \"kubernetes.io/projected/e2c0233c-02dd-4711-8633-a75b72a8fb19-kube-api-access-8sq8r\") pod \"auto-csr-approver-29550814-zp5g2\" (UID: \"e2c0233c-02dd-4711-8633-a75b72a8fb19\") " pod="openshift-infra/auto-csr-approver-29550814-zp5g2" Mar 09 09:34:00 crc kubenswrapper[4861]: I0309 09:34:00.478805 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550814-zp5g2" Mar 09 09:34:00 crc kubenswrapper[4861]: I0309 09:34:00.738276 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550814-zp5g2"] Mar 09 09:34:00 crc kubenswrapper[4861]: I0309 09:34:00.827250 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550814-zp5g2" event={"ID":"e2c0233c-02dd-4711-8633-a75b72a8fb19","Type":"ContainerStarted","Data":"587fc1e8f94f188d5451baf87b1a88506d778afbc33b7e0241c58c10a2b4b186"} Mar 09 09:34:02 crc kubenswrapper[4861]: I0309 09:34:02.851152 4861 generic.go:334] "Generic (PLEG): container finished" podID="e2c0233c-02dd-4711-8633-a75b72a8fb19" containerID="694cf3cfb68f6b432415aaab5d6235d47fd0f6ad980609da705a1c42dfab4c46" exitCode=0 Mar 09 09:34:02 crc kubenswrapper[4861]: I0309 09:34:02.851220 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550814-zp5g2" event={"ID":"e2c0233c-02dd-4711-8633-a75b72a8fb19","Type":"ContainerDied","Data":"694cf3cfb68f6b432415aaab5d6235d47fd0f6ad980609da705a1c42dfab4c46"} Mar 09 09:34:04 crc kubenswrapper[4861]: I0309 09:34:04.193556 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550814-zp5g2" Mar 09 09:34:04 crc kubenswrapper[4861]: I0309 09:34:04.290484 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8sq8r\" (UniqueName: \"kubernetes.io/projected/e2c0233c-02dd-4711-8633-a75b72a8fb19-kube-api-access-8sq8r\") pod \"e2c0233c-02dd-4711-8633-a75b72a8fb19\" (UID: \"e2c0233c-02dd-4711-8633-a75b72a8fb19\") " Mar 09 09:34:04 crc kubenswrapper[4861]: I0309 09:34:04.297562 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2c0233c-02dd-4711-8633-a75b72a8fb19-kube-api-access-8sq8r" (OuterVolumeSpecName: "kube-api-access-8sq8r") pod "e2c0233c-02dd-4711-8633-a75b72a8fb19" (UID: "e2c0233c-02dd-4711-8633-a75b72a8fb19"). InnerVolumeSpecName "kube-api-access-8sq8r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:34:04 crc kubenswrapper[4861]: I0309 09:34:04.393183 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8sq8r\" (UniqueName: \"kubernetes.io/projected/e2c0233c-02dd-4711-8633-a75b72a8fb19-kube-api-access-8sq8r\") on node \"crc\" DevicePath \"\"" Mar 09 09:34:04 crc kubenswrapper[4861]: I0309 09:34:04.868165 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550814-zp5g2" Mar 09 09:34:04 crc kubenswrapper[4861]: I0309 09:34:04.868124 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550814-zp5g2" event={"ID":"e2c0233c-02dd-4711-8633-a75b72a8fb19","Type":"ContainerDied","Data":"587fc1e8f94f188d5451baf87b1a88506d778afbc33b7e0241c58c10a2b4b186"} Mar 09 09:34:04 crc kubenswrapper[4861]: I0309 09:34:04.868509 4861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="587fc1e8f94f188d5451baf87b1a88506d778afbc33b7e0241c58c10a2b4b186" Mar 09 09:34:05 crc kubenswrapper[4861]: I0309 09:34:05.273795 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550808-4spcd"] Mar 09 09:34:05 crc kubenswrapper[4861]: I0309 09:34:05.282743 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550808-4spcd"] Mar 09 09:34:05 crc kubenswrapper[4861]: I0309 09:34:05.669527 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50e9c52c-8e13-491e-bf32-3daa6bf663bb" path="/var/lib/kubelet/pods/50e9c52c-8e13-491e-bf32-3daa6bf663bb/volumes" Mar 09 09:34:10 crc kubenswrapper[4861]: I0309 09:34:10.658540 4861 scope.go:117] "RemoveContainer" containerID="c0726d3ac822004eacb4f8d12bb4cbaf2815fc9d29aaa8ba7db9d4fae1717ee1" Mar 09 09:34:10 crc kubenswrapper[4861]: E0309 09:34:10.659545 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5g7gc_openshift-machine-config-operator(6f7875e3-174f-4c67-8675-d878de74aa4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" Mar 09 09:34:11 crc kubenswrapper[4861]: I0309 09:34:11.521158 4861 scope.go:117] "RemoveContainer" 
containerID="212a3f24478517e3b3c1bfc2375f9d399f9c8ddbc22b7dc8da2f5515d3547a9d" Mar 09 09:34:11 crc kubenswrapper[4861]: I0309 09:34:11.549783 4861 scope.go:117] "RemoveContainer" containerID="f501d035d0d6df4266406c5d6d4937c22a47640e8366864942666f5e6ab3438d" Mar 09 09:34:11 crc kubenswrapper[4861]: I0309 09:34:11.599232 4861 scope.go:117] "RemoveContainer" containerID="caadf85455a67668f15f9f86aa6e994c39806be3eb7d310b487e21ad7f875c90" Mar 09 09:34:11 crc kubenswrapper[4861]: I0309 09:34:11.657637 4861 scope.go:117] "RemoveContainer" containerID="def0dc52c7f8a566bd281da8497a5f4ce26474566e2eff6c5fbe140b4718b0af" Mar 09 09:34:11 crc kubenswrapper[4861]: I0309 09:34:11.703351 4861 scope.go:117] "RemoveContainer" containerID="a0300fc10e488ebd33207cec458035e8e06b9da87be58b19fafc10fcb6e4aeb7" Mar 09 09:34:11 crc kubenswrapper[4861]: I0309 09:34:11.745143 4861 scope.go:117] "RemoveContainer" containerID="fb9822da780ebde3553751d1ca305edfeb98f9507135fc567902c2d2d7719f0c" Mar 09 09:34:11 crc kubenswrapper[4861]: I0309 09:34:11.788418 4861 scope.go:117] "RemoveContainer" containerID="75910fe5cac8dcbedce2876d4f517bf5a25ee5904f405755b0e71469fa4c4c39" Mar 09 09:34:20 crc kubenswrapper[4861]: I0309 09:34:20.034128 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-7qv25"] Mar 09 09:34:20 crc kubenswrapper[4861]: I0309 09:34:20.044400 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-7qv25"] Mar 09 09:34:21 crc kubenswrapper[4861]: I0309 09:34:21.670821 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e71176a-f707-4517-83d9-09848f66e7bb" path="/var/lib/kubelet/pods/6e71176a-f707-4517-83d9-09848f66e7bb/volumes" Mar 09 09:34:23 crc kubenswrapper[4861]: I0309 09:34:23.658615 4861 scope.go:117] "RemoveContainer" containerID="c0726d3ac822004eacb4f8d12bb4cbaf2815fc9d29aaa8ba7db9d4fae1717ee1" Mar 09 09:34:23 crc kubenswrapper[4861]: E0309 09:34:23.659161 4861 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5g7gc_openshift-machine-config-operator(6f7875e3-174f-4c67-8675-d878de74aa4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" Mar 09 09:34:27 crc kubenswrapper[4861]: I0309 09:34:27.081245 4861 generic.go:334] "Generic (PLEG): container finished" podID="28e43d3e-921e-4f6c-be2f-f37e5625374a" containerID="f04a6345e6b5cd85326d1c78c4f7c4009d7c329407df7fbedf0357a85dea21d5" exitCode=0 Mar 09 09:34:27 crc kubenswrapper[4861]: I0309 09:34:27.081342 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2gml2" event={"ID":"28e43d3e-921e-4f6c-be2f-f37e5625374a","Type":"ContainerDied","Data":"f04a6345e6b5cd85326d1c78c4f7c4009d7c329407df7fbedf0357a85dea21d5"} Mar 09 09:34:28 crc kubenswrapper[4861]: I0309 09:34:28.506288 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2gml2" Mar 09 09:34:28 crc kubenswrapper[4861]: I0309 09:34:28.566850 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/28e43d3e-921e-4f6c-be2f-f37e5625374a-inventory\") pod \"28e43d3e-921e-4f6c-be2f-f37e5625374a\" (UID: \"28e43d3e-921e-4f6c-be2f-f37e5625374a\") " Mar 09 09:34:28 crc kubenswrapper[4861]: I0309 09:34:28.567019 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/28e43d3e-921e-4f6c-be2f-f37e5625374a-ssh-key-openstack-edpm-ipam\") pod \"28e43d3e-921e-4f6c-be2f-f37e5625374a\" (UID: \"28e43d3e-921e-4f6c-be2f-f37e5625374a\") " Mar 09 09:34:28 crc kubenswrapper[4861]: I0309 09:34:28.567171 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2mqxg\" (UniqueName: \"kubernetes.io/projected/28e43d3e-921e-4f6c-be2f-f37e5625374a-kube-api-access-2mqxg\") pod \"28e43d3e-921e-4f6c-be2f-f37e5625374a\" (UID: \"28e43d3e-921e-4f6c-be2f-f37e5625374a\") " Mar 09 09:34:28 crc kubenswrapper[4861]: I0309 09:34:28.572115 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28e43d3e-921e-4f6c-be2f-f37e5625374a-kube-api-access-2mqxg" (OuterVolumeSpecName: "kube-api-access-2mqxg") pod "28e43d3e-921e-4f6c-be2f-f37e5625374a" (UID: "28e43d3e-921e-4f6c-be2f-f37e5625374a"). InnerVolumeSpecName "kube-api-access-2mqxg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:34:28 crc kubenswrapper[4861]: I0309 09:34:28.592740 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28e43d3e-921e-4f6c-be2f-f37e5625374a-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "28e43d3e-921e-4f6c-be2f-f37e5625374a" (UID: "28e43d3e-921e-4f6c-be2f-f37e5625374a"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:34:28 crc kubenswrapper[4861]: I0309 09:34:28.593076 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28e43d3e-921e-4f6c-be2f-f37e5625374a-inventory" (OuterVolumeSpecName: "inventory") pod "28e43d3e-921e-4f6c-be2f-f37e5625374a" (UID: "28e43d3e-921e-4f6c-be2f-f37e5625374a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:34:28 crc kubenswrapper[4861]: I0309 09:34:28.669600 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2mqxg\" (UniqueName: \"kubernetes.io/projected/28e43d3e-921e-4f6c-be2f-f37e5625374a-kube-api-access-2mqxg\") on node \"crc\" DevicePath \"\"" Mar 09 09:34:28 crc kubenswrapper[4861]: I0309 09:34:28.669630 4861 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/28e43d3e-921e-4f6c-be2f-f37e5625374a-inventory\") on node \"crc\" DevicePath \"\"" Mar 09 09:34:28 crc kubenswrapper[4861]: I0309 09:34:28.669639 4861 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/28e43d3e-921e-4f6c-be2f-f37e5625374a-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 09 09:34:29 crc kubenswrapper[4861]: I0309 09:34:29.098519 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2gml2" 
event={"ID":"28e43d3e-921e-4f6c-be2f-f37e5625374a","Type":"ContainerDied","Data":"50e668f1ab1964dfbd13d29494fd8fdc71a59179f3906b9b8cd8eaddacf73c8b"} Mar 09 09:34:29 crc kubenswrapper[4861]: I0309 09:34:29.098835 4861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="50e668f1ab1964dfbd13d29494fd8fdc71a59179f3906b9b8cd8eaddacf73c8b" Mar 09 09:34:29 crc kubenswrapper[4861]: I0309 09:34:29.098804 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-2gml2" Mar 09 09:34:29 crc kubenswrapper[4861]: I0309 09:34:29.192509 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ww9c7"] Mar 09 09:34:29 crc kubenswrapper[4861]: E0309 09:34:29.192906 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28e43d3e-921e-4f6c-be2f-f37e5625374a" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Mar 09 09:34:29 crc kubenswrapper[4861]: I0309 09:34:29.192924 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="28e43d3e-921e-4f6c-be2f-f37e5625374a" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Mar 09 09:34:29 crc kubenswrapper[4861]: E0309 09:34:29.192949 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2c0233c-02dd-4711-8633-a75b72a8fb19" containerName="oc" Mar 09 09:34:29 crc kubenswrapper[4861]: I0309 09:34:29.192956 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2c0233c-02dd-4711-8633-a75b72a8fb19" containerName="oc" Mar 09 09:34:29 crc kubenswrapper[4861]: I0309 09:34:29.193182 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="28e43d3e-921e-4f6c-be2f-f37e5625374a" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Mar 09 09:34:29 crc kubenswrapper[4861]: I0309 09:34:29.193200 4861 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="e2c0233c-02dd-4711-8633-a75b72a8fb19" containerName="oc" Mar 09 09:34:29 crc kubenswrapper[4861]: I0309 09:34:29.193905 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ww9c7" Mar 09 09:34:29 crc kubenswrapper[4861]: I0309 09:34:29.196550 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 09 09:34:29 crc kubenswrapper[4861]: I0309 09:34:29.196898 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 09 09:34:29 crc kubenswrapper[4861]: I0309 09:34:29.196931 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lkd5q" Mar 09 09:34:29 crc kubenswrapper[4861]: I0309 09:34:29.197533 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 09 09:34:29 crc kubenswrapper[4861]: I0309 09:34:29.218948 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ww9c7"] Mar 09 09:34:29 crc kubenswrapper[4861]: I0309 09:34:29.279675 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmmbl\" (UniqueName: \"kubernetes.io/projected/b6f88b43-ae35-4f74-b14a-96332076ed1f-kube-api-access-bmmbl\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-ww9c7\" (UID: \"b6f88b43-ae35-4f74-b14a-96332076ed1f\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ww9c7" Mar 09 09:34:29 crc kubenswrapper[4861]: I0309 09:34:29.279783 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b6f88b43-ae35-4f74-b14a-96332076ed1f-ssh-key-openstack-edpm-ipam\") pod 
\"configure-network-edpm-deployment-openstack-edpm-ipam-ww9c7\" (UID: \"b6f88b43-ae35-4f74-b14a-96332076ed1f\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ww9c7" Mar 09 09:34:29 crc kubenswrapper[4861]: I0309 09:34:29.279811 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b6f88b43-ae35-4f74-b14a-96332076ed1f-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-ww9c7\" (UID: \"b6f88b43-ae35-4f74-b14a-96332076ed1f\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ww9c7" Mar 09 09:34:29 crc kubenswrapper[4861]: I0309 09:34:29.381242 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bmmbl\" (UniqueName: \"kubernetes.io/projected/b6f88b43-ae35-4f74-b14a-96332076ed1f-kube-api-access-bmmbl\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-ww9c7\" (UID: \"b6f88b43-ae35-4f74-b14a-96332076ed1f\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ww9c7" Mar 09 09:34:29 crc kubenswrapper[4861]: I0309 09:34:29.381317 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b6f88b43-ae35-4f74-b14a-96332076ed1f-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-ww9c7\" (UID: \"b6f88b43-ae35-4f74-b14a-96332076ed1f\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ww9c7" Mar 09 09:34:29 crc kubenswrapper[4861]: I0309 09:34:29.381347 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b6f88b43-ae35-4f74-b14a-96332076ed1f-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-ww9c7\" (UID: \"b6f88b43-ae35-4f74-b14a-96332076ed1f\") " 
pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ww9c7" Mar 09 09:34:29 crc kubenswrapper[4861]: I0309 09:34:29.393096 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b6f88b43-ae35-4f74-b14a-96332076ed1f-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-ww9c7\" (UID: \"b6f88b43-ae35-4f74-b14a-96332076ed1f\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ww9c7" Mar 09 09:34:29 crc kubenswrapper[4861]: I0309 09:34:29.393107 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b6f88b43-ae35-4f74-b14a-96332076ed1f-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-ww9c7\" (UID: \"b6f88b43-ae35-4f74-b14a-96332076ed1f\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ww9c7" Mar 09 09:34:29 crc kubenswrapper[4861]: I0309 09:34:29.398477 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmmbl\" (UniqueName: \"kubernetes.io/projected/b6f88b43-ae35-4f74-b14a-96332076ed1f-kube-api-access-bmmbl\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-ww9c7\" (UID: \"b6f88b43-ae35-4f74-b14a-96332076ed1f\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ww9c7" Mar 09 09:34:29 crc kubenswrapper[4861]: I0309 09:34:29.515309 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ww9c7" Mar 09 09:34:30 crc kubenswrapper[4861]: I0309 09:34:30.030091 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ww9c7"] Mar 09 09:34:30 crc kubenswrapper[4861]: I0309 09:34:30.033191 4861 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 09 09:34:30 crc kubenswrapper[4861]: I0309 09:34:30.109401 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ww9c7" event={"ID":"b6f88b43-ae35-4f74-b14a-96332076ed1f","Type":"ContainerStarted","Data":"08ec9b9932819b25a721be8ac4ef377ac05e008cc62300ba9afdf538eb91aa81"} Mar 09 09:34:31 crc kubenswrapper[4861]: I0309 09:34:31.033432 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-q252w"] Mar 09 09:34:31 crc kubenswrapper[4861]: I0309 09:34:31.051784 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-crjkx"] Mar 09 09:34:31 crc kubenswrapper[4861]: I0309 09:34:31.065688 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-q252w"] Mar 09 09:34:31 crc kubenswrapper[4861]: I0309 09:34:31.076186 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-crjkx"] Mar 09 09:34:31 crc kubenswrapper[4861]: I0309 09:34:31.119568 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ww9c7" event={"ID":"b6f88b43-ae35-4f74-b14a-96332076ed1f","Type":"ContainerStarted","Data":"c9caeeabaa1883c187b4f193aa2841b732e5ebd7772ee9322056330044557ae1"} Mar 09 09:34:31 crc kubenswrapper[4861]: I0309 09:34:31.148212 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ww9c7" 
podStartSLOduration=1.44335378 podStartE2EDuration="2.148179386s" podCreationTimestamp="2026-03-09 09:34:29 +0000 UTC" firstStartedPulling="2026-03-09 09:34:30.032996122 +0000 UTC m=+1713.118035523" lastFinishedPulling="2026-03-09 09:34:30.737821728 +0000 UTC m=+1713.822861129" observedRunningTime="2026-03-09 09:34:31.136584171 +0000 UTC m=+1714.221623582" watchObservedRunningTime="2026-03-09 09:34:31.148179386 +0000 UTC m=+1714.233218797" Mar 09 09:34:31 crc kubenswrapper[4861]: I0309 09:34:31.668349 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cdcc666-9e35-47b3-a84b-0cd31afdc84a" path="/var/lib/kubelet/pods/3cdcc666-9e35-47b3-a84b-0cd31afdc84a/volumes" Mar 09 09:34:31 crc kubenswrapper[4861]: I0309 09:34:31.669068 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="caf1f04f-5f46-4a52-9e17-58aa6a2b61e7" path="/var/lib/kubelet/pods/caf1f04f-5f46-4a52-9e17-58aa6a2b61e7/volumes" Mar 09 09:34:34 crc kubenswrapper[4861]: I0309 09:34:34.052043 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-5kg66"] Mar 09 09:34:34 crc kubenswrapper[4861]: I0309 09:34:34.063074 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-6pxsf"] Mar 09 09:34:34 crc kubenswrapper[4861]: I0309 09:34:34.077399 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-1884-account-create-update-fl8nw"] Mar 09 09:34:34 crc kubenswrapper[4861]: I0309 09:34:34.090289 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-69b0-account-create-update-5pw85"] Mar 09 09:34:34 crc kubenswrapper[4861]: I0309 09:34:34.100598 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-5kg66"] Mar 09 09:34:34 crc kubenswrapper[4861]: I0309 09:34:34.109356 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-f21b-account-create-update-cmptx"] Mar 09 09:34:34 crc kubenswrapper[4861]: I0309 
09:34:34.117177 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-6pxsf"] Mar 09 09:34:34 crc kubenswrapper[4861]: I0309 09:34:34.126858 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-f21b-account-create-update-cmptx"] Mar 09 09:34:34 crc kubenswrapper[4861]: I0309 09:34:34.135550 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-1884-account-create-update-fl8nw"] Mar 09 09:34:34 crc kubenswrapper[4861]: I0309 09:34:34.146725 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-69b0-account-create-update-5pw85"] Mar 09 09:34:34 crc kubenswrapper[4861]: I0309 09:34:34.658161 4861 scope.go:117] "RemoveContainer" containerID="c0726d3ac822004eacb4f8d12bb4cbaf2815fc9d29aaa8ba7db9d4fae1717ee1" Mar 09 09:34:34 crc kubenswrapper[4861]: E0309 09:34:34.658448 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5g7gc_openshift-machine-config-operator(6f7875e3-174f-4c67-8675-d878de74aa4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" Mar 09 09:34:35 crc kubenswrapper[4861]: I0309 09:34:35.668599 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0413fd17-3445-4719-9d24-42d8a9e41905" path="/var/lib/kubelet/pods/0413fd17-3445-4719-9d24-42d8a9e41905/volumes" Mar 09 09:34:35 crc kubenswrapper[4861]: I0309 09:34:35.669211 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40076fb0-0798-4425-a7ec-2638a66ee6f5" path="/var/lib/kubelet/pods/40076fb0-0798-4425-a7ec-2638a66ee6f5/volumes" Mar 09 09:34:35 crc kubenswrapper[4861]: I0309 09:34:35.669839 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4263ee95-7df1-4358-80b0-c3516f030ff6" 
path="/var/lib/kubelet/pods/4263ee95-7df1-4358-80b0-c3516f030ff6/volumes" Mar 09 09:34:35 crc kubenswrapper[4861]: I0309 09:34:35.670533 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="634dd56c-c726-49cc-9a71-ef57a7d0a984" path="/var/lib/kubelet/pods/634dd56c-c726-49cc-9a71-ef57a7d0a984/volumes" Mar 09 09:34:35 crc kubenswrapper[4861]: I0309 09:34:35.671800 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98aa826b-02b5-4db3-b496-10eb34917427" path="/var/lib/kubelet/pods/98aa826b-02b5-4db3-b496-10eb34917427/volumes" Mar 09 09:34:38 crc kubenswrapper[4861]: I0309 09:34:38.032964 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-n79q7"] Mar 09 09:34:38 crc kubenswrapper[4861]: I0309 09:34:38.041175 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-n79q7"] Mar 09 09:34:39 crc kubenswrapper[4861]: I0309 09:34:39.675395 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24508739-f77b-4cb1-8b0e-bc18a292c6f0" path="/var/lib/kubelet/pods/24508739-f77b-4cb1-8b0e-bc18a292c6f0/volumes" Mar 09 09:34:47 crc kubenswrapper[4861]: I0309 09:34:47.663615 4861 scope.go:117] "RemoveContainer" containerID="c0726d3ac822004eacb4f8d12bb4cbaf2815fc9d29aaa8ba7db9d4fae1717ee1" Mar 09 09:34:47 crc kubenswrapper[4861]: E0309 09:34:47.664500 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5g7gc_openshift-machine-config-operator(6f7875e3-174f-4c67-8675-d878de74aa4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" Mar 09 09:35:01 crc kubenswrapper[4861]: I0309 09:35:01.658819 4861 scope.go:117] "RemoveContainer" containerID="c0726d3ac822004eacb4f8d12bb4cbaf2815fc9d29aaa8ba7db9d4fae1717ee1" Mar 09 
09:35:01 crc kubenswrapper[4861]: E0309 09:35:01.659632 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5g7gc_openshift-machine-config-operator(6f7875e3-174f-4c67-8675-d878de74aa4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" Mar 09 09:35:08 crc kubenswrapper[4861]: I0309 09:35:08.057075 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-ntvbd"] Mar 09 09:35:08 crc kubenswrapper[4861]: I0309 09:35:08.065912 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-ntvbd"] Mar 09 09:35:09 crc kubenswrapper[4861]: I0309 09:35:09.669622 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e5455e9-fa02-46bd-8786-1888543b55cc" path="/var/lib/kubelet/pods/4e5455e9-fa02-46bd-8786-1888543b55cc/volumes" Mar 09 09:35:11 crc kubenswrapper[4861]: I0309 09:35:11.931325 4861 scope.go:117] "RemoveContainer" containerID="22d03a47e72066afbb0d4bf06ca8e439442024c3d3063ba6f9eddfa5d222ea60" Mar 09 09:35:11 crc kubenswrapper[4861]: I0309 09:35:11.964270 4861 scope.go:117] "RemoveContainer" containerID="50c06f6ec486776fed27be621771092d458b92984683634750db4b2fa7ca73f7" Mar 09 09:35:12 crc kubenswrapper[4861]: I0309 09:35:12.022275 4861 scope.go:117] "RemoveContainer" containerID="a66850429ce2fd114f33f4f8a53c612097eff74b5c4b688fc3a6dd8e084522ab" Mar 09 09:35:12 crc kubenswrapper[4861]: I0309 09:35:12.053506 4861 scope.go:117] "RemoveContainer" containerID="b1957f5b86896c192575f201cafc98ec5e171cde10b6c66681c4a6d641e37a8f" Mar 09 09:35:12 crc kubenswrapper[4861]: I0309 09:35:12.081742 4861 scope.go:117] "RemoveContainer" containerID="d43a5bb9b44c57a2c41149c014eb0b93c247b7214b4694b9ba5f4555d49bcb4f" Mar 09 09:35:12 crc kubenswrapper[4861]: I0309 
09:35:12.139852 4861 scope.go:117] "RemoveContainer" containerID="4b60ef0028e183b2bf765de01466d8955fe335836880888abeb310abea8d615a" Mar 09 09:35:12 crc kubenswrapper[4861]: I0309 09:35:12.169178 4861 scope.go:117] "RemoveContainer" containerID="84ca4e525422b3807ce2160168de070c09c37ae2e82c11ecf1efe5cc31bf82e0" Mar 09 09:35:12 crc kubenswrapper[4861]: I0309 09:35:12.191634 4861 scope.go:117] "RemoveContainer" containerID="40d0a28810d1725fd9d9998dfc3cd8acc2dc4f3e07ec8b89cae28d4bf9d16b10" Mar 09 09:35:12 crc kubenswrapper[4861]: I0309 09:35:12.213349 4861 scope.go:117] "RemoveContainer" containerID="d2d5b3f66c142b0dee4cee1521dfa5ac2cec1368e09d96d9f53efdc2af1208fa" Mar 09 09:35:12 crc kubenswrapper[4861]: I0309 09:35:12.239065 4861 scope.go:117] "RemoveContainer" containerID="1fab9aa85d1803e1d3b8653b33579a54369de0af09e2a9c46f1d7ab50349d561" Mar 09 09:35:12 crc kubenswrapper[4861]: I0309 09:35:12.658509 4861 scope.go:117] "RemoveContainer" containerID="c0726d3ac822004eacb4f8d12bb4cbaf2815fc9d29aaa8ba7db9d4fae1717ee1" Mar 09 09:35:12 crc kubenswrapper[4861]: E0309 09:35:12.659013 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5g7gc_openshift-machine-config-operator(6f7875e3-174f-4c67-8675-d878de74aa4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" Mar 09 09:35:21 crc kubenswrapper[4861]: I0309 09:35:21.037397 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-bcd5c"] Mar 09 09:35:21 crc kubenswrapper[4861]: I0309 09:35:21.049250 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-bcd5c"] Mar 09 09:35:21 crc kubenswrapper[4861]: I0309 09:35:21.668698 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="bbd4cfda-d65e-4915-be68-f207820fe15b" path="/var/lib/kubelet/pods/bbd4cfda-d65e-4915-be68-f207820fe15b/volumes" Mar 09 09:35:24 crc kubenswrapper[4861]: I0309 09:35:24.031986 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-k46zw"] Mar 09 09:35:24 crc kubenswrapper[4861]: I0309 09:35:24.041324 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-k46zw"] Mar 09 09:35:25 crc kubenswrapper[4861]: I0309 09:35:25.672992 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9820d89-3a89-4982-8520-f23dd0d099ad" path="/var/lib/kubelet/pods/c9820d89-3a89-4982-8520-f23dd0d099ad/volumes" Mar 09 09:35:27 crc kubenswrapper[4861]: I0309 09:35:27.664787 4861 scope.go:117] "RemoveContainer" containerID="c0726d3ac822004eacb4f8d12bb4cbaf2815fc9d29aaa8ba7db9d4fae1717ee1" Mar 09 09:35:27 crc kubenswrapper[4861]: E0309 09:35:27.665417 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5g7gc_openshift-machine-config-operator(6f7875e3-174f-4c67-8675-d878de74aa4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" Mar 09 09:35:34 crc kubenswrapper[4861]: I0309 09:35:34.048871 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-sr27s"] Mar 09 09:35:34 crc kubenswrapper[4861]: I0309 09:35:34.058858 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-8wrdd"] Mar 09 09:35:34 crc kubenswrapper[4861]: I0309 09:35:34.070445 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-8wrdd"] Mar 09 09:35:34 crc kubenswrapper[4861]: I0309 09:35:34.080652 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-sr27s"] Mar 09 
09:35:35 crc kubenswrapper[4861]: I0309 09:35:35.672052 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44c35b48-50b9-4dd8-846a-99714c14d3ab" path="/var/lib/kubelet/pods/44c35b48-50b9-4dd8-846a-99714c14d3ab/volumes" Mar 09 09:35:35 crc kubenswrapper[4861]: I0309 09:35:35.673310 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="deb8e24b-1a6f-4173-9a5f-62974b0331a5" path="/var/lib/kubelet/pods/deb8e24b-1a6f-4173-9a5f-62974b0331a5/volumes" Mar 09 09:35:36 crc kubenswrapper[4861]: I0309 09:35:36.698423 4861 generic.go:334] "Generic (PLEG): container finished" podID="b6f88b43-ae35-4f74-b14a-96332076ed1f" containerID="c9caeeabaa1883c187b4f193aa2841b732e5ebd7772ee9322056330044557ae1" exitCode=0 Mar 09 09:35:36 crc kubenswrapper[4861]: I0309 09:35:36.698484 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ww9c7" event={"ID":"b6f88b43-ae35-4f74-b14a-96332076ed1f","Type":"ContainerDied","Data":"c9caeeabaa1883c187b4f193aa2841b732e5ebd7772ee9322056330044557ae1"} Mar 09 09:35:38 crc kubenswrapper[4861]: I0309 09:35:38.126060 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ww9c7" Mar 09 09:35:38 crc kubenswrapper[4861]: I0309 09:35:38.268989 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b6f88b43-ae35-4f74-b14a-96332076ed1f-inventory\") pod \"b6f88b43-ae35-4f74-b14a-96332076ed1f\" (UID: \"b6f88b43-ae35-4f74-b14a-96332076ed1f\") " Mar 09 09:35:38 crc kubenswrapper[4861]: I0309 09:35:38.269148 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bmmbl\" (UniqueName: \"kubernetes.io/projected/b6f88b43-ae35-4f74-b14a-96332076ed1f-kube-api-access-bmmbl\") pod \"b6f88b43-ae35-4f74-b14a-96332076ed1f\" (UID: \"b6f88b43-ae35-4f74-b14a-96332076ed1f\") " Mar 09 09:35:38 crc kubenswrapper[4861]: I0309 09:35:38.269204 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b6f88b43-ae35-4f74-b14a-96332076ed1f-ssh-key-openstack-edpm-ipam\") pod \"b6f88b43-ae35-4f74-b14a-96332076ed1f\" (UID: \"b6f88b43-ae35-4f74-b14a-96332076ed1f\") " Mar 09 09:35:38 crc kubenswrapper[4861]: I0309 09:35:38.279740 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6f88b43-ae35-4f74-b14a-96332076ed1f-kube-api-access-bmmbl" (OuterVolumeSpecName: "kube-api-access-bmmbl") pod "b6f88b43-ae35-4f74-b14a-96332076ed1f" (UID: "b6f88b43-ae35-4f74-b14a-96332076ed1f"). InnerVolumeSpecName "kube-api-access-bmmbl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:35:38 crc kubenswrapper[4861]: I0309 09:35:38.301014 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6f88b43-ae35-4f74-b14a-96332076ed1f-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "b6f88b43-ae35-4f74-b14a-96332076ed1f" (UID: "b6f88b43-ae35-4f74-b14a-96332076ed1f"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:35:38 crc kubenswrapper[4861]: I0309 09:35:38.310942 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6f88b43-ae35-4f74-b14a-96332076ed1f-inventory" (OuterVolumeSpecName: "inventory") pod "b6f88b43-ae35-4f74-b14a-96332076ed1f" (UID: "b6f88b43-ae35-4f74-b14a-96332076ed1f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:35:38 crc kubenswrapper[4861]: I0309 09:35:38.371662 4861 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b6f88b43-ae35-4f74-b14a-96332076ed1f-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 09 09:35:38 crc kubenswrapper[4861]: I0309 09:35:38.371704 4861 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b6f88b43-ae35-4f74-b14a-96332076ed1f-inventory\") on node \"crc\" DevicePath \"\"" Mar 09 09:35:38 crc kubenswrapper[4861]: I0309 09:35:38.371713 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bmmbl\" (UniqueName: \"kubernetes.io/projected/b6f88b43-ae35-4f74-b14a-96332076ed1f-kube-api-access-bmmbl\") on node \"crc\" DevicePath \"\"" Mar 09 09:35:38 crc kubenswrapper[4861]: I0309 09:35:38.716267 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ww9c7" 
event={"ID":"b6f88b43-ae35-4f74-b14a-96332076ed1f","Type":"ContainerDied","Data":"08ec9b9932819b25a721be8ac4ef377ac05e008cc62300ba9afdf538eb91aa81"} Mar 09 09:35:38 crc kubenswrapper[4861]: I0309 09:35:38.716314 4861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="08ec9b9932819b25a721be8ac4ef377ac05e008cc62300ba9afdf538eb91aa81" Mar 09 09:35:38 crc kubenswrapper[4861]: I0309 09:35:38.716392 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ww9c7" Mar 09 09:35:38 crc kubenswrapper[4861]: I0309 09:35:38.810534 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7n4l5"] Mar 09 09:35:38 crc kubenswrapper[4861]: E0309 09:35:38.811070 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6f88b43-ae35-4f74-b14a-96332076ed1f" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 09 09:35:38 crc kubenswrapper[4861]: I0309 09:35:38.811100 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6f88b43-ae35-4f74-b14a-96332076ed1f" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 09 09:35:38 crc kubenswrapper[4861]: I0309 09:35:38.811552 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6f88b43-ae35-4f74-b14a-96332076ed1f" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 09 09:35:38 crc kubenswrapper[4861]: I0309 09:35:38.812340 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7n4l5" Mar 09 09:35:38 crc kubenswrapper[4861]: I0309 09:35:38.815730 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 09 09:35:38 crc kubenswrapper[4861]: I0309 09:35:38.815771 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lkd5q" Mar 09 09:35:38 crc kubenswrapper[4861]: I0309 09:35:38.815924 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 09 09:35:38 crc kubenswrapper[4861]: I0309 09:35:38.815995 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 09 09:35:38 crc kubenswrapper[4861]: I0309 09:35:38.824600 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7n4l5"] Mar 09 09:35:38 crc kubenswrapper[4861]: I0309 09:35:38.881423 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ec37a9fa-7555-4e81-af4a-dad48b85942c-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-7n4l5\" (UID: \"ec37a9fa-7555-4e81-af4a-dad48b85942c\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7n4l5" Mar 09 09:35:38 crc kubenswrapper[4861]: I0309 09:35:38.881747 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ec37a9fa-7555-4e81-af4a-dad48b85942c-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-7n4l5\" (UID: \"ec37a9fa-7555-4e81-af4a-dad48b85942c\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7n4l5" Mar 09 09:35:38 crc kubenswrapper[4861]: 
I0309 09:35:38.881865 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5xxk\" (UniqueName: \"kubernetes.io/projected/ec37a9fa-7555-4e81-af4a-dad48b85942c-kube-api-access-r5xxk\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-7n4l5\" (UID: \"ec37a9fa-7555-4e81-af4a-dad48b85942c\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7n4l5" Mar 09 09:35:38 crc kubenswrapper[4861]: I0309 09:35:38.983230 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ec37a9fa-7555-4e81-af4a-dad48b85942c-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-7n4l5\" (UID: \"ec37a9fa-7555-4e81-af4a-dad48b85942c\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7n4l5" Mar 09 09:35:38 crc kubenswrapper[4861]: I0309 09:35:38.983623 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5xxk\" (UniqueName: \"kubernetes.io/projected/ec37a9fa-7555-4e81-af4a-dad48b85942c-kube-api-access-r5xxk\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-7n4l5\" (UID: \"ec37a9fa-7555-4e81-af4a-dad48b85942c\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7n4l5" Mar 09 09:35:38 crc kubenswrapper[4861]: I0309 09:35:38.983688 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ec37a9fa-7555-4e81-af4a-dad48b85942c-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-7n4l5\" (UID: \"ec37a9fa-7555-4e81-af4a-dad48b85942c\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7n4l5" Mar 09 09:35:38 crc kubenswrapper[4861]: I0309 09:35:38.988765 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/ec37a9fa-7555-4e81-af4a-dad48b85942c-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-7n4l5\" (UID: \"ec37a9fa-7555-4e81-af4a-dad48b85942c\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7n4l5" Mar 09 09:35:38 crc kubenswrapper[4861]: I0309 09:35:38.989050 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ec37a9fa-7555-4e81-af4a-dad48b85942c-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-7n4l5\" (UID: \"ec37a9fa-7555-4e81-af4a-dad48b85942c\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7n4l5" Mar 09 09:35:39 crc kubenswrapper[4861]: I0309 09:35:39.000128 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5xxk\" (UniqueName: \"kubernetes.io/projected/ec37a9fa-7555-4e81-af4a-dad48b85942c-kube-api-access-r5xxk\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-7n4l5\" (UID: \"ec37a9fa-7555-4e81-af4a-dad48b85942c\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7n4l5" Mar 09 09:35:39 crc kubenswrapper[4861]: I0309 09:35:39.131395 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7n4l5" Mar 09 09:35:39 crc kubenswrapper[4861]: I0309 09:35:39.680170 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7n4l5"] Mar 09 09:35:39 crc kubenswrapper[4861]: I0309 09:35:39.725281 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7n4l5" event={"ID":"ec37a9fa-7555-4e81-af4a-dad48b85942c","Type":"ContainerStarted","Data":"a83476cf4f194c870063b977e7c854a6b9b772045e704e7cbcc6d18db3d7453b"} Mar 09 09:35:40 crc kubenswrapper[4861]: I0309 09:35:40.657430 4861 scope.go:117] "RemoveContainer" containerID="c0726d3ac822004eacb4f8d12bb4cbaf2815fc9d29aaa8ba7db9d4fae1717ee1" Mar 09 09:35:40 crc kubenswrapper[4861]: E0309 09:35:40.658244 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5g7gc_openshift-machine-config-operator(6f7875e3-174f-4c67-8675-d878de74aa4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" Mar 09 09:35:40 crc kubenswrapper[4861]: I0309 09:35:40.737509 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7n4l5" event={"ID":"ec37a9fa-7555-4e81-af4a-dad48b85942c","Type":"ContainerStarted","Data":"9f3ce1d7a0770279daaf412375702b2710e71c95cf5770326b81785d46e4811b"} Mar 09 09:35:40 crc kubenswrapper[4861]: I0309 09:35:40.765713 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7n4l5" podStartSLOduration=2.136626965 podStartE2EDuration="2.765689437s" podCreationTimestamp="2026-03-09 09:35:38 +0000 UTC" 
firstStartedPulling="2026-03-09 09:35:39.683180229 +0000 UTC m=+1782.768219630" lastFinishedPulling="2026-03-09 09:35:40.312242661 +0000 UTC m=+1783.397282102" observedRunningTime="2026-03-09 09:35:40.759859174 +0000 UTC m=+1783.844898605" watchObservedRunningTime="2026-03-09 09:35:40.765689437 +0000 UTC m=+1783.850728848" Mar 09 09:35:45 crc kubenswrapper[4861]: I0309 09:35:45.778542 4861 generic.go:334] "Generic (PLEG): container finished" podID="ec37a9fa-7555-4e81-af4a-dad48b85942c" containerID="9f3ce1d7a0770279daaf412375702b2710e71c95cf5770326b81785d46e4811b" exitCode=0 Mar 09 09:35:45 crc kubenswrapper[4861]: I0309 09:35:45.778610 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7n4l5" event={"ID":"ec37a9fa-7555-4e81-af4a-dad48b85942c","Type":"ContainerDied","Data":"9f3ce1d7a0770279daaf412375702b2710e71c95cf5770326b81785d46e4811b"} Mar 09 09:35:47 crc kubenswrapper[4861]: I0309 09:35:47.225868 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7n4l5" Mar 09 09:35:47 crc kubenswrapper[4861]: I0309 09:35:47.404078 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ec37a9fa-7555-4e81-af4a-dad48b85942c-ssh-key-openstack-edpm-ipam\") pod \"ec37a9fa-7555-4e81-af4a-dad48b85942c\" (UID: \"ec37a9fa-7555-4e81-af4a-dad48b85942c\") " Mar 09 09:35:47 crc kubenswrapper[4861]: I0309 09:35:47.404550 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r5xxk\" (UniqueName: \"kubernetes.io/projected/ec37a9fa-7555-4e81-af4a-dad48b85942c-kube-api-access-r5xxk\") pod \"ec37a9fa-7555-4e81-af4a-dad48b85942c\" (UID: \"ec37a9fa-7555-4e81-af4a-dad48b85942c\") " Mar 09 09:35:47 crc kubenswrapper[4861]: I0309 09:35:47.404614 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ec37a9fa-7555-4e81-af4a-dad48b85942c-inventory\") pod \"ec37a9fa-7555-4e81-af4a-dad48b85942c\" (UID: \"ec37a9fa-7555-4e81-af4a-dad48b85942c\") " Mar 09 09:35:47 crc kubenswrapper[4861]: I0309 09:35:47.409809 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec37a9fa-7555-4e81-af4a-dad48b85942c-kube-api-access-r5xxk" (OuterVolumeSpecName: "kube-api-access-r5xxk") pod "ec37a9fa-7555-4e81-af4a-dad48b85942c" (UID: "ec37a9fa-7555-4e81-af4a-dad48b85942c"). InnerVolumeSpecName "kube-api-access-r5xxk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:35:47 crc kubenswrapper[4861]: I0309 09:35:47.430978 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec37a9fa-7555-4e81-af4a-dad48b85942c-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "ec37a9fa-7555-4e81-af4a-dad48b85942c" (UID: "ec37a9fa-7555-4e81-af4a-dad48b85942c"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:35:47 crc kubenswrapper[4861]: I0309 09:35:47.432248 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec37a9fa-7555-4e81-af4a-dad48b85942c-inventory" (OuterVolumeSpecName: "inventory") pod "ec37a9fa-7555-4e81-af4a-dad48b85942c" (UID: "ec37a9fa-7555-4e81-af4a-dad48b85942c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:35:47 crc kubenswrapper[4861]: I0309 09:35:47.507462 4861 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ec37a9fa-7555-4e81-af4a-dad48b85942c-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 09 09:35:47 crc kubenswrapper[4861]: I0309 09:35:47.507747 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r5xxk\" (UniqueName: \"kubernetes.io/projected/ec37a9fa-7555-4e81-af4a-dad48b85942c-kube-api-access-r5xxk\") on node \"crc\" DevicePath \"\"" Mar 09 09:35:47 crc kubenswrapper[4861]: I0309 09:35:47.507814 4861 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ec37a9fa-7555-4e81-af4a-dad48b85942c-inventory\") on node \"crc\" DevicePath \"\"" Mar 09 09:35:47 crc kubenswrapper[4861]: I0309 09:35:47.801026 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7n4l5" 
event={"ID":"ec37a9fa-7555-4e81-af4a-dad48b85942c","Type":"ContainerDied","Data":"a83476cf4f194c870063b977e7c854a6b9b772045e704e7cbcc6d18db3d7453b"} Mar 09 09:35:47 crc kubenswrapper[4861]: I0309 09:35:47.801115 4861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a83476cf4f194c870063b977e7c854a6b9b772045e704e7cbcc6d18db3d7453b" Mar 09 09:35:47 crc kubenswrapper[4861]: I0309 09:35:47.801441 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-7n4l5" Mar 09 09:35:47 crc kubenswrapper[4861]: I0309 09:35:47.875071 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-j8d2h"] Mar 09 09:35:47 crc kubenswrapper[4861]: E0309 09:35:47.875582 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec37a9fa-7555-4e81-af4a-dad48b85942c" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 09 09:35:47 crc kubenswrapper[4861]: I0309 09:35:47.875609 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec37a9fa-7555-4e81-af4a-dad48b85942c" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 09 09:35:47 crc kubenswrapper[4861]: I0309 09:35:47.875819 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec37a9fa-7555-4e81-af4a-dad48b85942c" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 09 09:35:47 crc kubenswrapper[4861]: I0309 09:35:47.876578 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-j8d2h"
Mar 09 09:35:47 crc kubenswrapper[4861]: I0309 09:35:47.879005 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 09 09:35:47 crc kubenswrapper[4861]: I0309 09:35:47.879021 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 09 09:35:47 crc kubenswrapper[4861]: I0309 09:35:47.879849 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 09 09:35:47 crc kubenswrapper[4861]: I0309 09:35:47.882015 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lkd5q"
Mar 09 09:35:47 crc kubenswrapper[4861]: I0309 09:35:47.890107 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-j8d2h"]
Mar 09 09:35:47 crc kubenswrapper[4861]: I0309 09:35:47.918944 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0bb63c32-5f67-4912-b238-893dc92107b9-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-j8d2h\" (UID: \"0bb63c32-5f67-4912-b238-893dc92107b9\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-j8d2h"
Mar 09 09:35:47 crc kubenswrapper[4861]: I0309 09:35:47.919092 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bcc9x\" (UniqueName: \"kubernetes.io/projected/0bb63c32-5f67-4912-b238-893dc92107b9-kube-api-access-bcc9x\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-j8d2h\" (UID: \"0bb63c32-5f67-4912-b238-893dc92107b9\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-j8d2h"
Mar 09 09:35:47 crc kubenswrapper[4861]: I0309 09:35:47.919138 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0bb63c32-5f67-4912-b238-893dc92107b9-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-j8d2h\" (UID: \"0bb63c32-5f67-4912-b238-893dc92107b9\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-j8d2h"
Mar 09 09:35:48 crc kubenswrapper[4861]: I0309 09:35:48.020614 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bcc9x\" (UniqueName: \"kubernetes.io/projected/0bb63c32-5f67-4912-b238-893dc92107b9-kube-api-access-bcc9x\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-j8d2h\" (UID: \"0bb63c32-5f67-4912-b238-893dc92107b9\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-j8d2h"
Mar 09 09:35:48 crc kubenswrapper[4861]: I0309 09:35:48.020669 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0bb63c32-5f67-4912-b238-893dc92107b9-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-j8d2h\" (UID: \"0bb63c32-5f67-4912-b238-893dc92107b9\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-j8d2h"
Mar 09 09:35:48 crc kubenswrapper[4861]: I0309 09:35:48.020734 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0bb63c32-5f67-4912-b238-893dc92107b9-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-j8d2h\" (UID: \"0bb63c32-5f67-4912-b238-893dc92107b9\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-j8d2h"
Mar 09 09:35:48 crc kubenswrapper[4861]: I0309 09:35:48.024469 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0bb63c32-5f67-4912-b238-893dc92107b9-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-j8d2h\" (UID: \"0bb63c32-5f67-4912-b238-893dc92107b9\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-j8d2h"
Mar 09 09:35:48 crc kubenswrapper[4861]: I0309 09:35:48.024821 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0bb63c32-5f67-4912-b238-893dc92107b9-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-j8d2h\" (UID: \"0bb63c32-5f67-4912-b238-893dc92107b9\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-j8d2h"
Mar 09 09:35:48 crc kubenswrapper[4861]: I0309 09:35:48.037441 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bcc9x\" (UniqueName: \"kubernetes.io/projected/0bb63c32-5f67-4912-b238-893dc92107b9-kube-api-access-bcc9x\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-j8d2h\" (UID: \"0bb63c32-5f67-4912-b238-893dc92107b9\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-j8d2h"
Mar 09 09:35:48 crc kubenswrapper[4861]: I0309 09:35:48.194770 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-j8d2h"
Mar 09 09:35:48 crc kubenswrapper[4861]: I0309 09:35:48.721627 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-j8d2h"]
Mar 09 09:35:48 crc kubenswrapper[4861]: I0309 09:35:48.813916 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-j8d2h" event={"ID":"0bb63c32-5f67-4912-b238-893dc92107b9","Type":"ContainerStarted","Data":"9d61d4800e52cb2c62a187ec2d46bb9461e29276a9f9d67dcc9abdffe3b2a646"}
Mar 09 09:35:49 crc kubenswrapper[4861]: I0309 09:35:49.828763 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-j8d2h" event={"ID":"0bb63c32-5f67-4912-b238-893dc92107b9","Type":"ContainerStarted","Data":"dc444131c5fdc034cd1044bdb1477b14eeecf1c448e923f5f940b030f90ee8db"}
Mar 09 09:35:49 crc kubenswrapper[4861]: I0309 09:35:49.861837 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-j8d2h" podStartSLOduration=2.384637621 podStartE2EDuration="2.861805533s" podCreationTimestamp="2026-03-09 09:35:47 +0000 UTC" firstStartedPulling="2026-03-09 09:35:48.725598209 +0000 UTC m=+1791.810637610" lastFinishedPulling="2026-03-09 09:35:49.202766121 +0000 UTC m=+1792.287805522" observedRunningTime="2026-03-09 09:35:49.847112272 +0000 UTC m=+1792.932151673" watchObservedRunningTime="2026-03-09 09:35:49.861805533 +0000 UTC m=+1792.946844934"
Mar 09 09:35:54 crc kubenswrapper[4861]: I0309 09:35:54.658434 4861 scope.go:117] "RemoveContainer" containerID="c0726d3ac822004eacb4f8d12bb4cbaf2815fc9d29aaa8ba7db9d4fae1717ee1"
Mar 09 09:35:54 crc kubenswrapper[4861]: E0309 09:35:54.658888 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5g7gc_openshift-machine-config-operator(6f7875e3-174f-4c67-8675-d878de74aa4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f"
Mar 09 09:36:00 crc kubenswrapper[4861]: I0309 09:36:00.137487 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550816-kr9kp"]
Mar 09 09:36:00 crc kubenswrapper[4861]: I0309 09:36:00.139063 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550816-kr9kp"
Mar 09 09:36:00 crc kubenswrapper[4861]: I0309 09:36:00.143942 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 09 09:36:00 crc kubenswrapper[4861]: I0309 09:36:00.145117 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 09 09:36:00 crc kubenswrapper[4861]: I0309 09:36:00.145504 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k86l8"
Mar 09 09:36:00 crc kubenswrapper[4861]: I0309 09:36:00.153959 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550816-kr9kp"]
Mar 09 09:36:00 crc kubenswrapper[4861]: I0309 09:36:00.262136 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pt6vg\" (UniqueName: \"kubernetes.io/projected/ad9cdaac-b9a0-401a-8095-8094dee9ce05-kube-api-access-pt6vg\") pod \"auto-csr-approver-29550816-kr9kp\" (UID: \"ad9cdaac-b9a0-401a-8095-8094dee9ce05\") " pod="openshift-infra/auto-csr-approver-29550816-kr9kp"
Mar 09 09:36:00 crc kubenswrapper[4861]: I0309 09:36:00.364088 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pt6vg\" (UniqueName: \"kubernetes.io/projected/ad9cdaac-b9a0-401a-8095-8094dee9ce05-kube-api-access-pt6vg\") pod \"auto-csr-approver-29550816-kr9kp\" (UID: \"ad9cdaac-b9a0-401a-8095-8094dee9ce05\") " pod="openshift-infra/auto-csr-approver-29550816-kr9kp"
Mar 09 09:36:00 crc kubenswrapper[4861]: I0309 09:36:00.385737 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pt6vg\" (UniqueName: \"kubernetes.io/projected/ad9cdaac-b9a0-401a-8095-8094dee9ce05-kube-api-access-pt6vg\") pod \"auto-csr-approver-29550816-kr9kp\" (UID: \"ad9cdaac-b9a0-401a-8095-8094dee9ce05\") " pod="openshift-infra/auto-csr-approver-29550816-kr9kp"
Mar 09 09:36:00 crc kubenswrapper[4861]: I0309 09:36:00.459151 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550816-kr9kp"
Mar 09 09:36:00 crc kubenswrapper[4861]: I0309 09:36:00.897279 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550816-kr9kp"]
Mar 09 09:36:00 crc kubenswrapper[4861]: W0309 09:36:00.899273 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podad9cdaac_b9a0_401a_8095_8094dee9ce05.slice/crio-80c97446df0c2573d4372f61e1e7280f3ea9d5b6cb2289b8f4bfa561ddd71a3d WatchSource:0}: Error finding container 80c97446df0c2573d4372f61e1e7280f3ea9d5b6cb2289b8f4bfa561ddd71a3d: Status 404 returned error can't find the container with id 80c97446df0c2573d4372f61e1e7280f3ea9d5b6cb2289b8f4bfa561ddd71a3d
Mar 09 09:36:00 crc kubenswrapper[4861]: I0309 09:36:00.926034 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550816-kr9kp" event={"ID":"ad9cdaac-b9a0-401a-8095-8094dee9ce05","Type":"ContainerStarted","Data":"80c97446df0c2573d4372f61e1e7280f3ea9d5b6cb2289b8f4bfa561ddd71a3d"}
Mar 09 09:36:02 crc kubenswrapper[4861]: I0309 09:36:02.944973 4861 generic.go:334] "Generic (PLEG): container finished" podID="ad9cdaac-b9a0-401a-8095-8094dee9ce05" containerID="f77a4307056f9ad78d18c7e95129f670ac38d9210893427691e9fcf63cfda0d0" exitCode=0
Mar 09 09:36:02 crc kubenswrapper[4861]: I0309 09:36:02.945050 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550816-kr9kp" event={"ID":"ad9cdaac-b9a0-401a-8095-8094dee9ce05","Type":"ContainerDied","Data":"f77a4307056f9ad78d18c7e95129f670ac38d9210893427691e9fcf63cfda0d0"}
Mar 09 09:36:04 crc kubenswrapper[4861]: I0309 09:36:04.332689 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550816-kr9kp"
Mar 09 09:36:04 crc kubenswrapper[4861]: I0309 09:36:04.440768 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pt6vg\" (UniqueName: \"kubernetes.io/projected/ad9cdaac-b9a0-401a-8095-8094dee9ce05-kube-api-access-pt6vg\") pod \"ad9cdaac-b9a0-401a-8095-8094dee9ce05\" (UID: \"ad9cdaac-b9a0-401a-8095-8094dee9ce05\") "
Mar 09 09:36:04 crc kubenswrapper[4861]: I0309 09:36:04.447635 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad9cdaac-b9a0-401a-8095-8094dee9ce05-kube-api-access-pt6vg" (OuterVolumeSpecName: "kube-api-access-pt6vg") pod "ad9cdaac-b9a0-401a-8095-8094dee9ce05" (UID: "ad9cdaac-b9a0-401a-8095-8094dee9ce05"). InnerVolumeSpecName "kube-api-access-pt6vg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:36:04 crc kubenswrapper[4861]: I0309 09:36:04.542725 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pt6vg\" (UniqueName: \"kubernetes.io/projected/ad9cdaac-b9a0-401a-8095-8094dee9ce05-kube-api-access-pt6vg\") on node \"crc\" DevicePath \"\""
Mar 09 09:36:04 crc kubenswrapper[4861]: I0309 09:36:04.965938 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550816-kr9kp" event={"ID":"ad9cdaac-b9a0-401a-8095-8094dee9ce05","Type":"ContainerDied","Data":"80c97446df0c2573d4372f61e1e7280f3ea9d5b6cb2289b8f4bfa561ddd71a3d"}
Mar 09 09:36:04 crc kubenswrapper[4861]: I0309 09:36:04.965981 4861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="80c97446df0c2573d4372f61e1e7280f3ea9d5b6cb2289b8f4bfa561ddd71a3d"
Mar 09 09:36:04 crc kubenswrapper[4861]: I0309 09:36:04.966315 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550816-kr9kp"
Mar 09 09:36:05 crc kubenswrapper[4861]: I0309 09:36:05.442150 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550810-fzhll"]
Mar 09 09:36:05 crc kubenswrapper[4861]: I0309 09:36:05.449819 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550810-fzhll"]
Mar 09 09:36:05 crc kubenswrapper[4861]: I0309 09:36:05.669123 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e6fc23e-235c-405e-84ab-b09260e27aac" path="/var/lib/kubelet/pods/8e6fc23e-235c-405e-84ab-b09260e27aac/volumes"
Mar 09 09:36:09 crc kubenswrapper[4861]: I0309 09:36:09.657895 4861 scope.go:117] "RemoveContainer" containerID="c0726d3ac822004eacb4f8d12bb4cbaf2815fc9d29aaa8ba7db9d4fae1717ee1"
Mar 09 09:36:09 crc kubenswrapper[4861]: E0309 09:36:09.659342 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5g7gc_openshift-machine-config-operator(6f7875e3-174f-4c67-8675-d878de74aa4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f"
Mar 09 09:36:12 crc kubenswrapper[4861]: I0309 09:36:12.401199 4861 scope.go:117] "RemoveContainer" containerID="f98290531ebd1c593cfa0aacbe3b6f0e3032adc647ce41130ba5cc6ef8a30f66"
Mar 09 09:36:12 crc kubenswrapper[4861]: I0309 09:36:12.449650 4861 scope.go:117] "RemoveContainer" containerID="5c1402523e38b6a059b942534e04454c0efa21ed13ded8af6b33ce288f6f6dbc"
Mar 09 09:36:12 crc kubenswrapper[4861]: I0309 09:36:12.498663 4861 scope.go:117] "RemoveContainer" containerID="29b2b6555a7395967ad28cf5a8555c4db17c2735decaf0c6d5b435f52f5c0ac1"
Mar 09 09:36:12 crc kubenswrapper[4861]: I0309 09:36:12.538048 4861 scope.go:117] "RemoveContainer" containerID="c6030e0696081dc572663a630ff4a6ab639c92c512e4f4ee1d80fd3780338ec9"
Mar 09 09:36:12 crc kubenswrapper[4861]: I0309 09:36:12.590565 4861 scope.go:117] "RemoveContainer" containerID="a1b48cb736714ed720e92ad5dd691e482e8fb6a283685c7dcd91aabfe74ed413"
Mar 09 09:36:22 crc kubenswrapper[4861]: I0309 09:36:22.138813 4861 generic.go:334] "Generic (PLEG): container finished" podID="0bb63c32-5f67-4912-b238-893dc92107b9" containerID="dc444131c5fdc034cd1044bdb1477b14eeecf1c448e923f5f940b030f90ee8db" exitCode=0
Mar 09 09:36:22 crc kubenswrapper[4861]: I0309 09:36:22.138955 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-j8d2h" event={"ID":"0bb63c32-5f67-4912-b238-893dc92107b9","Type":"ContainerDied","Data":"dc444131c5fdc034cd1044bdb1477b14eeecf1c448e923f5f940b030f90ee8db"}
Mar 09 09:36:23 crc kubenswrapper[4861]: I0309 09:36:23.557297 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-j8d2h"
Mar 09 09:36:23 crc kubenswrapper[4861]: I0309 09:36:23.588301 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0bb63c32-5f67-4912-b238-893dc92107b9-inventory\") pod \"0bb63c32-5f67-4912-b238-893dc92107b9\" (UID: \"0bb63c32-5f67-4912-b238-893dc92107b9\") "
Mar 09 09:36:23 crc kubenswrapper[4861]: I0309 09:36:23.588837 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bcc9x\" (UniqueName: \"kubernetes.io/projected/0bb63c32-5f67-4912-b238-893dc92107b9-kube-api-access-bcc9x\") pod \"0bb63c32-5f67-4912-b238-893dc92107b9\" (UID: \"0bb63c32-5f67-4912-b238-893dc92107b9\") "
Mar 09 09:36:23 crc kubenswrapper[4861]: I0309 09:36:23.588994 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0bb63c32-5f67-4912-b238-893dc92107b9-ssh-key-openstack-edpm-ipam\") pod \"0bb63c32-5f67-4912-b238-893dc92107b9\" (UID: \"0bb63c32-5f67-4912-b238-893dc92107b9\") "
Mar 09 09:36:23 crc kubenswrapper[4861]: I0309 09:36:23.601274 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0bb63c32-5f67-4912-b238-893dc92107b9-kube-api-access-bcc9x" (OuterVolumeSpecName: "kube-api-access-bcc9x") pod "0bb63c32-5f67-4912-b238-893dc92107b9" (UID: "0bb63c32-5f67-4912-b238-893dc92107b9"). InnerVolumeSpecName "kube-api-access-bcc9x". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:36:23 crc kubenswrapper[4861]: I0309 09:36:23.617749 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0bb63c32-5f67-4912-b238-893dc92107b9-inventory" (OuterVolumeSpecName: "inventory") pod "0bb63c32-5f67-4912-b238-893dc92107b9" (UID: "0bb63c32-5f67-4912-b238-893dc92107b9"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:36:23 crc kubenswrapper[4861]: I0309 09:36:23.621381 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0bb63c32-5f67-4912-b238-893dc92107b9-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "0bb63c32-5f67-4912-b238-893dc92107b9" (UID: "0bb63c32-5f67-4912-b238-893dc92107b9"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:36:23 crc kubenswrapper[4861]: I0309 09:36:23.658153 4861 scope.go:117] "RemoveContainer" containerID="c0726d3ac822004eacb4f8d12bb4cbaf2815fc9d29aaa8ba7db9d4fae1717ee1"
Mar 09 09:36:23 crc kubenswrapper[4861]: E0309 09:36:23.658532 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5g7gc_openshift-machine-config-operator(6f7875e3-174f-4c67-8675-d878de74aa4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f"
Mar 09 09:36:23 crc kubenswrapper[4861]: I0309 09:36:23.691816 4861 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0bb63c32-5f67-4912-b238-893dc92107b9-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 09 09:36:23 crc kubenswrapper[4861]: I0309 09:36:23.691853 4861 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0bb63c32-5f67-4912-b238-893dc92107b9-inventory\") on node \"crc\" DevicePath \"\""
Mar 09 09:36:23 crc kubenswrapper[4861]: I0309 09:36:23.691865 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bcc9x\" (UniqueName: \"kubernetes.io/projected/0bb63c32-5f67-4912-b238-893dc92107b9-kube-api-access-bcc9x\") on node \"crc\" DevicePath \"\""
Mar 09 09:36:24 crc kubenswrapper[4861]: I0309 09:36:24.158824 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-j8d2h" event={"ID":"0bb63c32-5f67-4912-b238-893dc92107b9","Type":"ContainerDied","Data":"9d61d4800e52cb2c62a187ec2d46bb9461e29276a9f9d67dcc9abdffe3b2a646"}
Mar 09 09:36:24 crc kubenswrapper[4861]: I0309 09:36:24.159155 4861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9d61d4800e52cb2c62a187ec2d46bb9461e29276a9f9d67dcc9abdffe3b2a646"
Mar 09 09:36:24 crc kubenswrapper[4861]: I0309 09:36:24.158882 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-j8d2h"
Mar 09 09:36:24 crc kubenswrapper[4861]: I0309 09:36:24.249137 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dgmz7"]
Mar 09 09:36:24 crc kubenswrapper[4861]: E0309 09:36:24.249611 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bb63c32-5f67-4912-b238-893dc92107b9" containerName="install-os-edpm-deployment-openstack-edpm-ipam"
Mar 09 09:36:24 crc kubenswrapper[4861]: I0309 09:36:24.249638 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bb63c32-5f67-4912-b238-893dc92107b9" containerName="install-os-edpm-deployment-openstack-edpm-ipam"
Mar 09 09:36:24 crc kubenswrapper[4861]: E0309 09:36:24.249653 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad9cdaac-b9a0-401a-8095-8094dee9ce05" containerName="oc"
Mar 09 09:36:24 crc kubenswrapper[4861]: I0309 09:36:24.249661 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad9cdaac-b9a0-401a-8095-8094dee9ce05" containerName="oc"
Mar 09 09:36:24 crc kubenswrapper[4861]: I0309 09:36:24.249929 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad9cdaac-b9a0-401a-8095-8094dee9ce05" containerName="oc"
Mar 09 09:36:24 crc kubenswrapper[4861]: I0309 09:36:24.249949 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="0bb63c32-5f67-4912-b238-893dc92107b9" containerName="install-os-edpm-deployment-openstack-edpm-ipam"
Mar 09 09:36:24 crc kubenswrapper[4861]: I0309 09:36:24.250813 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dgmz7"
Mar 09 09:36:24 crc kubenswrapper[4861]: I0309 09:36:24.254484 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 09 09:36:24 crc kubenswrapper[4861]: I0309 09:36:24.254636 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 09 09:36:24 crc kubenswrapper[4861]: I0309 09:36:24.254686 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 09 09:36:24 crc kubenswrapper[4861]: I0309 09:36:24.254817 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lkd5q"
Mar 09 09:36:24 crc kubenswrapper[4861]: I0309 09:36:24.261913 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dgmz7"]
Mar 09 09:36:24 crc kubenswrapper[4861]: I0309 09:36:24.305895 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7bbe9e42-4b9b-42e7-bfed-ff93ff905164-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-dgmz7\" (UID: \"7bbe9e42-4b9b-42e7-bfed-ff93ff905164\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dgmz7"
Mar 09 09:36:24 crc kubenswrapper[4861]: I0309 09:36:24.306144 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7bbe9e42-4b9b-42e7-bfed-ff93ff905164-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-dgmz7\" (UID: \"7bbe9e42-4b9b-42e7-bfed-ff93ff905164\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dgmz7"
Mar 09 09:36:24 crc kubenswrapper[4861]: I0309 09:36:24.306445 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrhnb\" (UniqueName: \"kubernetes.io/projected/7bbe9e42-4b9b-42e7-bfed-ff93ff905164-kube-api-access-jrhnb\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-dgmz7\" (UID: \"7bbe9e42-4b9b-42e7-bfed-ff93ff905164\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dgmz7"
Mar 09 09:36:24 crc kubenswrapper[4861]: I0309 09:36:24.408636 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jrhnb\" (UniqueName: \"kubernetes.io/projected/7bbe9e42-4b9b-42e7-bfed-ff93ff905164-kube-api-access-jrhnb\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-dgmz7\" (UID: \"7bbe9e42-4b9b-42e7-bfed-ff93ff905164\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dgmz7"
Mar 09 09:36:24 crc kubenswrapper[4861]: I0309 09:36:24.409542 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7bbe9e42-4b9b-42e7-bfed-ff93ff905164-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-dgmz7\" (UID: \"7bbe9e42-4b9b-42e7-bfed-ff93ff905164\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dgmz7"
Mar 09 09:36:24 crc kubenswrapper[4861]: I0309 09:36:24.410233 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7bbe9e42-4b9b-42e7-bfed-ff93ff905164-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-dgmz7\" (UID: \"7bbe9e42-4b9b-42e7-bfed-ff93ff905164\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dgmz7"
Mar 09 09:36:24 crc kubenswrapper[4861]: I0309 09:36:24.416327 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7bbe9e42-4b9b-42e7-bfed-ff93ff905164-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-dgmz7\" (UID: \"7bbe9e42-4b9b-42e7-bfed-ff93ff905164\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dgmz7"
Mar 09 09:36:24 crc kubenswrapper[4861]: I0309 09:36:24.416741 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7bbe9e42-4b9b-42e7-bfed-ff93ff905164-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-dgmz7\" (UID: \"7bbe9e42-4b9b-42e7-bfed-ff93ff905164\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dgmz7"
Mar 09 09:36:24 crc kubenswrapper[4861]: I0309 09:36:24.427198 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrhnb\" (UniqueName: \"kubernetes.io/projected/7bbe9e42-4b9b-42e7-bfed-ff93ff905164-kube-api-access-jrhnb\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-dgmz7\" (UID: \"7bbe9e42-4b9b-42e7-bfed-ff93ff905164\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dgmz7"
Mar 09 09:36:24 crc kubenswrapper[4861]: I0309 09:36:24.568038 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dgmz7"
Mar 09 09:36:25 crc kubenswrapper[4861]: I0309 09:36:25.097520 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dgmz7"]
Mar 09 09:36:25 crc kubenswrapper[4861]: I0309 09:36:25.171792 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dgmz7" event={"ID":"7bbe9e42-4b9b-42e7-bfed-ff93ff905164","Type":"ContainerStarted","Data":"a9422c37712d5e4baf952665da7ae7a5d3333087dbcec45e1e24711693f45dd1"}
Mar 09 09:36:26 crc kubenswrapper[4861]: I0309 09:36:26.026187 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-4f2ds"]
Mar 09 09:36:26 crc kubenswrapper[4861]: I0309 09:36:26.034759 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-4f2ds"]
Mar 09 09:36:26 crc kubenswrapper[4861]: I0309 09:36:26.181069 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dgmz7" event={"ID":"7bbe9e42-4b9b-42e7-bfed-ff93ff905164","Type":"ContainerStarted","Data":"17fb9ac94b7210d826fd3cedc26a8e5bb09bc5bdca5dd059bf1c6c5c0abd5f69"}
Mar 09 09:36:26 crc kubenswrapper[4861]: I0309 09:36:26.197112 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dgmz7" podStartSLOduration=1.6769470640000002 podStartE2EDuration="2.197090631s" podCreationTimestamp="2026-03-09 09:36:24 +0000 UTC" firstStartedPulling="2026-03-09 09:36:25.091061194 +0000 UTC m=+1828.176100595" lastFinishedPulling="2026-03-09 09:36:25.611204761 +0000 UTC m=+1828.696244162" observedRunningTime="2026-03-09 09:36:26.194770717 +0000 UTC m=+1829.279810128" watchObservedRunningTime="2026-03-09 09:36:26.197090631 +0000 UTC m=+1829.282130032"
Mar 09 09:36:27 crc kubenswrapper[4861]: I0309 09:36:27.029481 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-z4xzx"]
Mar 09 09:36:27 crc kubenswrapper[4861]: I0309 09:36:27.036929 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-ct722"]
Mar 09 09:36:27 crc kubenswrapper[4861]: I0309 09:36:27.044360 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-4ee7-account-create-update-fhsvd"]
Mar 09 09:36:27 crc kubenswrapper[4861]: I0309 09:36:27.053017 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-ddfc-account-create-update-s2nh2"]
Mar 09 09:36:27 crc kubenswrapper[4861]: I0309 09:36:27.062311 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-c9be-account-create-update-jzpgr"]
Mar 09 09:36:27 crc kubenswrapper[4861]: I0309 09:36:27.071148 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-z4xzx"]
Mar 09 09:36:27 crc kubenswrapper[4861]: I0309 09:36:27.081066 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-ct722"]
Mar 09 09:36:27 crc kubenswrapper[4861]: I0309 09:36:27.090202 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-ddfc-account-create-update-s2nh2"]
Mar 09 09:36:27 crc kubenswrapper[4861]: I0309 09:36:27.098075 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-c9be-account-create-update-jzpgr"]
Mar 09 09:36:27 crc kubenswrapper[4861]: I0309 09:36:27.107947 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-4ee7-account-create-update-fhsvd"]
Mar 09 09:36:27 crc kubenswrapper[4861]: I0309 09:36:27.668459 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19523fad-3ad4-4c4a-b329-90f8136ce34c" path="/var/lib/kubelet/pods/19523fad-3ad4-4c4a-b329-90f8136ce34c/volumes"
Mar 09 09:36:27 crc kubenswrapper[4861]: I0309 09:36:27.669041 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53c84cc0-8dc4-44c5-89d7-bb118f6b4fe2" path="/var/lib/kubelet/pods/53c84cc0-8dc4-44c5-89d7-bb118f6b4fe2/volumes"
Mar 09 09:36:27 crc kubenswrapper[4861]: I0309 09:36:27.669612 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bde3ec7b-bd8f-4936-91b4-c8a7063628c8" path="/var/lib/kubelet/pods/bde3ec7b-bd8f-4936-91b4-c8a7063628c8/volumes"
Mar 09 09:36:27 crc kubenswrapper[4861]: I0309 09:36:27.670146 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf5b6714-95ce-4470-9536-10e1d281b52e" path="/var/lib/kubelet/pods/bf5b6714-95ce-4470-9536-10e1d281b52e/volumes"
Mar 09 09:36:27 crc kubenswrapper[4861]: I0309 09:36:27.671180 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1271b64-efb8-425e-8718-1e28003a5722" path="/var/lib/kubelet/pods/c1271b64-efb8-425e-8718-1e28003a5722/volumes"
Mar 09 09:36:27 crc kubenswrapper[4861]: I0309 09:36:27.671749 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e727b52d-d582-4d07-a852-edd0a15e1ba7" path="/var/lib/kubelet/pods/e727b52d-d582-4d07-a852-edd0a15e1ba7/volumes"
Mar 09 09:36:37 crc kubenswrapper[4861]: I0309 09:36:37.663311 4861 scope.go:117] "RemoveContainer" containerID="c0726d3ac822004eacb4f8d12bb4cbaf2815fc9d29aaa8ba7db9d4fae1717ee1"
Mar 09 09:36:37 crc kubenswrapper[4861]: E0309 09:36:37.665329 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5g7gc_openshift-machine-config-operator(6f7875e3-174f-4c67-8675-d878de74aa4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f"
Mar 09 09:36:52 crc kubenswrapper[4861]: I0309 09:36:52.039765 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-6zv4b"]
Mar 09 09:36:52 crc kubenswrapper[4861]: I0309 09:36:52.050253 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-6zv4b"]
Mar 09 09:36:52 crc kubenswrapper[4861]: I0309 09:36:52.657488 4861 scope.go:117] "RemoveContainer" containerID="c0726d3ac822004eacb4f8d12bb4cbaf2815fc9d29aaa8ba7db9d4fae1717ee1"
Mar 09 09:36:52 crc kubenswrapper[4861]: E0309 09:36:52.657822 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5g7gc_openshift-machine-config-operator(6f7875e3-174f-4c67-8675-d878de74aa4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f"
Mar 09 09:36:53 crc kubenswrapper[4861]: I0309 09:36:53.669004 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4599154b-2118-461d-9999-d07931415f9c" path="/var/lib/kubelet/pods/4599154b-2118-461d-9999-d07931415f9c/volumes"
Mar 09 09:37:04 crc kubenswrapper[4861]: I0309 09:37:04.658405 4861 scope.go:117] "RemoveContainer" containerID="c0726d3ac822004eacb4f8d12bb4cbaf2815fc9d29aaa8ba7db9d4fae1717ee1"
Mar 09 09:37:04 crc kubenswrapper[4861]: E0309 09:37:04.659172 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5g7gc_openshift-machine-config-operator(6f7875e3-174f-4c67-8675-d878de74aa4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f"
Mar 09 09:37:10 crc kubenswrapper[4861]: I0309 09:37:10.054343 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-sbjdd"]
Mar 09 09:37:10 crc kubenswrapper[4861]: I0309 09:37:10.069561 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-sbjdd"]
Mar 09 09:37:10 crc kubenswrapper[4861]: I0309 09:37:10.578271 4861 generic.go:334] "Generic (PLEG): container finished" podID="7bbe9e42-4b9b-42e7-bfed-ff93ff905164" containerID="17fb9ac94b7210d826fd3cedc26a8e5bb09bc5bdca5dd059bf1c6c5c0abd5f69" exitCode=0
Mar 09 09:37:10 crc kubenswrapper[4861]: I0309 09:37:10.578313 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dgmz7" event={"ID":"7bbe9e42-4b9b-42e7-bfed-ff93ff905164","Type":"ContainerDied","Data":"17fb9ac94b7210d826fd3cedc26a8e5bb09bc5bdca5dd059bf1c6c5c0abd5f69"}
Mar 09 09:37:11 crc kubenswrapper[4861]: I0309 09:37:11.669642 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17803e89-5e7e-4d37-b96f-53e26da13fc2" path="/var/lib/kubelet/pods/17803e89-5e7e-4d37-b96f-53e26da13fc2/volumes"
Mar 09 09:37:11 crc kubenswrapper[4861]: I0309 09:37:11.935265 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dgmz7"
Mar 09 09:37:12 crc kubenswrapper[4861]: I0309 09:37:12.098125 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7bbe9e42-4b9b-42e7-bfed-ff93ff905164-ssh-key-openstack-edpm-ipam\") pod \"7bbe9e42-4b9b-42e7-bfed-ff93ff905164\" (UID: \"7bbe9e42-4b9b-42e7-bfed-ff93ff905164\") "
Mar 09 09:37:12 crc kubenswrapper[4861]: I0309 09:37:12.098217 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7bbe9e42-4b9b-42e7-bfed-ff93ff905164-inventory\") pod \"7bbe9e42-4b9b-42e7-bfed-ff93ff905164\" (UID: \"7bbe9e42-4b9b-42e7-bfed-ff93ff905164\") "
Mar 09 09:37:12 crc kubenswrapper[4861]: I0309 09:37:12.098318 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jrhnb\" (UniqueName: \"kubernetes.io/projected/7bbe9e42-4b9b-42e7-bfed-ff93ff905164-kube-api-access-jrhnb\") pod \"7bbe9e42-4b9b-42e7-bfed-ff93ff905164\" (UID: \"7bbe9e42-4b9b-42e7-bfed-ff93ff905164\") "
Mar 09 09:37:12 crc kubenswrapper[4861]: I0309 09:37:12.110718 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bbe9e42-4b9b-42e7-bfed-ff93ff905164-kube-api-access-jrhnb" (OuterVolumeSpecName: "kube-api-access-jrhnb") pod "7bbe9e42-4b9b-42e7-bfed-ff93ff905164" (UID: "7bbe9e42-4b9b-42e7-bfed-ff93ff905164"). InnerVolumeSpecName "kube-api-access-jrhnb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:37:12 crc kubenswrapper[4861]: I0309 09:37:12.137153 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7bbe9e42-4b9b-42e7-bfed-ff93ff905164-inventory" (OuterVolumeSpecName: "inventory") pod "7bbe9e42-4b9b-42e7-bfed-ff93ff905164" (UID: "7bbe9e42-4b9b-42e7-bfed-ff93ff905164").
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:37:12 crc kubenswrapper[4861]: I0309 09:37:12.155994 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7bbe9e42-4b9b-42e7-bfed-ff93ff905164-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "7bbe9e42-4b9b-42e7-bfed-ff93ff905164" (UID: "7bbe9e42-4b9b-42e7-bfed-ff93ff905164"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:37:12 crc kubenswrapper[4861]: I0309 09:37:12.200536 4861 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7bbe9e42-4b9b-42e7-bfed-ff93ff905164-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 09 09:37:12 crc kubenswrapper[4861]: I0309 09:37:12.200573 4861 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7bbe9e42-4b9b-42e7-bfed-ff93ff905164-inventory\") on node \"crc\" DevicePath \"\"" Mar 09 09:37:12 crc kubenswrapper[4861]: I0309 09:37:12.200583 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jrhnb\" (UniqueName: \"kubernetes.io/projected/7bbe9e42-4b9b-42e7-bfed-ff93ff905164-kube-api-access-jrhnb\") on node \"crc\" DevicePath \"\"" Mar 09 09:37:12 crc kubenswrapper[4861]: I0309 09:37:12.597285 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dgmz7" event={"ID":"7bbe9e42-4b9b-42e7-bfed-ff93ff905164","Type":"ContainerDied","Data":"a9422c37712d5e4baf952665da7ae7a5d3333087dbcec45e1e24711693f45dd1"} Mar 09 09:37:12 crc kubenswrapper[4861]: I0309 09:37:12.597324 4861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a9422c37712d5e4baf952665da7ae7a5d3333087dbcec45e1e24711693f45dd1" Mar 09 09:37:12 crc kubenswrapper[4861]: I0309 
09:37:12.597296 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dgmz7" Mar 09 09:37:12 crc kubenswrapper[4861]: I0309 09:37:12.723815 4861 scope.go:117] "RemoveContainer" containerID="8dba9ea5cbb350757abc3c6c107260c1845bbb08db4d254372d925dd47300f02" Mar 09 09:37:12 crc kubenswrapper[4861]: I0309 09:37:12.763103 4861 scope.go:117] "RemoveContainer" containerID="bb043b3f327d1cfc30bfb89247439e71ac9e61ef07686e06bb68810faf01e8c7" Mar 09 09:37:12 crc kubenswrapper[4861]: I0309 09:37:12.769487 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-xq9tq"] Mar 09 09:37:12 crc kubenswrapper[4861]: E0309 09:37:12.770087 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bbe9e42-4b9b-42e7-bfed-ff93ff905164" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 09 09:37:12 crc kubenswrapper[4861]: I0309 09:37:12.771065 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bbe9e42-4b9b-42e7-bfed-ff93ff905164" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 09 09:37:12 crc kubenswrapper[4861]: I0309 09:37:12.771491 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="7bbe9e42-4b9b-42e7-bfed-ff93ff905164" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 09 09:37:12 crc kubenswrapper[4861]: I0309 09:37:12.772362 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-xq9tq" Mar 09 09:37:12 crc kubenswrapper[4861]: I0309 09:37:12.778588 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 09 09:37:12 crc kubenswrapper[4861]: I0309 09:37:12.778967 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 09 09:37:12 crc kubenswrapper[4861]: I0309 09:37:12.779269 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lkd5q" Mar 09 09:37:12 crc kubenswrapper[4861]: I0309 09:37:12.779617 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 09 09:37:12 crc kubenswrapper[4861]: I0309 09:37:12.782353 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-xq9tq"] Mar 09 09:37:12 crc kubenswrapper[4861]: E0309 09:37:12.800795 4861 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7bbe9e42_4b9b_42e7_bfed_ff93ff905164.slice/crio-a9422c37712d5e4baf952665da7ae7a5d3333087dbcec45e1e24711693f45dd1\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7bbe9e42_4b9b_42e7_bfed_ff93ff905164.slice\": RecentStats: unable to find data in memory cache]" Mar 09 09:37:12 crc kubenswrapper[4861]: I0309 09:37:12.804817 4861 scope.go:117] "RemoveContainer" containerID="be077af5cfad01301f03211be785b590aa18994c7c2f198e213cf684d795b731" Mar 09 09:37:12 crc kubenswrapper[4861]: I0309 09:37:12.812232 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/e8f222a3-04ea-475e-aab7-97cf0ba5021c-inventory-0\") pod 
\"ssh-known-hosts-edpm-deployment-xq9tq\" (UID: \"e8f222a3-04ea-475e-aab7-97cf0ba5021c\") " pod="openstack/ssh-known-hosts-edpm-deployment-xq9tq" Mar 09 09:37:12 crc kubenswrapper[4861]: I0309 09:37:12.812321 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e8f222a3-04ea-475e-aab7-97cf0ba5021c-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-xq9tq\" (UID: \"e8f222a3-04ea-475e-aab7-97cf0ba5021c\") " pod="openstack/ssh-known-hosts-edpm-deployment-xq9tq" Mar 09 09:37:12 crc kubenswrapper[4861]: I0309 09:37:12.812355 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k82jr\" (UniqueName: \"kubernetes.io/projected/e8f222a3-04ea-475e-aab7-97cf0ba5021c-kube-api-access-k82jr\") pod \"ssh-known-hosts-edpm-deployment-xq9tq\" (UID: \"e8f222a3-04ea-475e-aab7-97cf0ba5021c\") " pod="openstack/ssh-known-hosts-edpm-deployment-xq9tq" Mar 09 09:37:12 crc kubenswrapper[4861]: I0309 09:37:12.847317 4861 scope.go:117] "RemoveContainer" containerID="f93ea925aa9afcb72e0a19f87a27a62a1977b3537b78c1b0133b9e09d8c7df4b" Mar 09 09:37:12 crc kubenswrapper[4861]: I0309 09:37:12.902732 4861 scope.go:117] "RemoveContainer" containerID="b6a18808b1a1ab9736d1bfd5c639a8629172104f526890dce6521c53541fa906" Mar 09 09:37:12 crc kubenswrapper[4861]: I0309 09:37:12.914128 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e8f222a3-04ea-475e-aab7-97cf0ba5021c-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-xq9tq\" (UID: \"e8f222a3-04ea-475e-aab7-97cf0ba5021c\") " pod="openstack/ssh-known-hosts-edpm-deployment-xq9tq" Mar 09 09:37:12 crc kubenswrapper[4861]: I0309 09:37:12.914211 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k82jr\" 
(UniqueName: \"kubernetes.io/projected/e8f222a3-04ea-475e-aab7-97cf0ba5021c-kube-api-access-k82jr\") pod \"ssh-known-hosts-edpm-deployment-xq9tq\" (UID: \"e8f222a3-04ea-475e-aab7-97cf0ba5021c\") " pod="openstack/ssh-known-hosts-edpm-deployment-xq9tq" Mar 09 09:37:12 crc kubenswrapper[4861]: I0309 09:37:12.914345 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/e8f222a3-04ea-475e-aab7-97cf0ba5021c-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-xq9tq\" (UID: \"e8f222a3-04ea-475e-aab7-97cf0ba5021c\") " pod="openstack/ssh-known-hosts-edpm-deployment-xq9tq" Mar 09 09:37:12 crc kubenswrapper[4861]: I0309 09:37:12.919304 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e8f222a3-04ea-475e-aab7-97cf0ba5021c-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-xq9tq\" (UID: \"e8f222a3-04ea-475e-aab7-97cf0ba5021c\") " pod="openstack/ssh-known-hosts-edpm-deployment-xq9tq" Mar 09 09:37:12 crc kubenswrapper[4861]: I0309 09:37:12.919317 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/e8f222a3-04ea-475e-aab7-97cf0ba5021c-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-xq9tq\" (UID: \"e8f222a3-04ea-475e-aab7-97cf0ba5021c\") " pod="openstack/ssh-known-hosts-edpm-deployment-xq9tq" Mar 09 09:37:12 crc kubenswrapper[4861]: I0309 09:37:12.934234 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k82jr\" (UniqueName: \"kubernetes.io/projected/e8f222a3-04ea-475e-aab7-97cf0ba5021c-kube-api-access-k82jr\") pod \"ssh-known-hosts-edpm-deployment-xq9tq\" (UID: \"e8f222a3-04ea-475e-aab7-97cf0ba5021c\") " pod="openstack/ssh-known-hosts-edpm-deployment-xq9tq" Mar 09 09:37:13 crc kubenswrapper[4861]: I0309 09:37:13.036431 4861 scope.go:117] "RemoveContainer" 
containerID="152f30f73b4b29dceb1ed0035e331f3a29771538542812988eeffc9e3474a5f9" Mar 09 09:37:13 crc kubenswrapper[4861]: I0309 09:37:13.054355 4861 scope.go:117] "RemoveContainer" containerID="7d9b3cf1b6dbf0b577036f782d8c13f846201986b4f1aaba042eaed9a0fec4c8" Mar 09 09:37:13 crc kubenswrapper[4861]: I0309 09:37:13.072959 4861 scope.go:117] "RemoveContainer" containerID="dcb93a23b30b98b25417e3ee5e031f7a57d13ca01b3b417824757cfd480d919b" Mar 09 09:37:13 crc kubenswrapper[4861]: I0309 09:37:13.094936 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-xq9tq" Mar 09 09:37:13 crc kubenswrapper[4861]: I0309 09:37:13.620854 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-xq9tq"] Mar 09 09:37:14 crc kubenswrapper[4861]: I0309 09:37:14.624081 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-xq9tq" event={"ID":"e8f222a3-04ea-475e-aab7-97cf0ba5021c","Type":"ContainerStarted","Data":"0b5f1a7bbd679f2da38d9f47b7562247ce60c2ec70507ea5e378f8069750dc18"} Mar 09 09:37:14 crc kubenswrapper[4861]: I0309 09:37:14.624392 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-xq9tq" event={"ID":"e8f222a3-04ea-475e-aab7-97cf0ba5021c","Type":"ContainerStarted","Data":"3c4473705dc46d02763d36aab935b610082159f9f4f7c1569895bfb689839b33"} Mar 09 09:37:14 crc kubenswrapper[4861]: I0309 09:37:14.655448 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-xq9tq" podStartSLOduration=1.909392654 podStartE2EDuration="2.655430307s" podCreationTimestamp="2026-03-09 09:37:12 +0000 UTC" firstStartedPulling="2026-03-09 09:37:13.626400568 +0000 UTC m=+1876.711439969" lastFinishedPulling="2026-03-09 09:37:14.372438221 +0000 UTC m=+1877.457477622" observedRunningTime="2026-03-09 09:37:14.644760347 +0000 UTC 
m=+1877.729799748" watchObservedRunningTime="2026-03-09 09:37:14.655430307 +0000 UTC m=+1877.740469708" Mar 09 09:37:15 crc kubenswrapper[4861]: I0309 09:37:15.032513 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-sf294"] Mar 09 09:37:15 crc kubenswrapper[4861]: I0309 09:37:15.042884 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-sf294"] Mar 09 09:37:15 crc kubenswrapper[4861]: I0309 09:37:15.658436 4861 scope.go:117] "RemoveContainer" containerID="c0726d3ac822004eacb4f8d12bb4cbaf2815fc9d29aaa8ba7db9d4fae1717ee1" Mar 09 09:37:15 crc kubenswrapper[4861]: E0309 09:37:15.658678 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5g7gc_openshift-machine-config-operator(6f7875e3-174f-4c67-8675-d878de74aa4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" Mar 09 09:37:15 crc kubenswrapper[4861]: I0309 09:37:15.669325 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3c0062b-4a2c-451a-b683-eeeea965a54e" path="/var/lib/kubelet/pods/b3c0062b-4a2c-451a-b683-eeeea965a54e/volumes" Mar 09 09:37:21 crc kubenswrapper[4861]: I0309 09:37:21.687166 4861 generic.go:334] "Generic (PLEG): container finished" podID="e8f222a3-04ea-475e-aab7-97cf0ba5021c" containerID="0b5f1a7bbd679f2da38d9f47b7562247ce60c2ec70507ea5e378f8069750dc18" exitCode=0 Mar 09 09:37:21 crc kubenswrapper[4861]: I0309 09:37:21.687283 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-xq9tq" event={"ID":"e8f222a3-04ea-475e-aab7-97cf0ba5021c","Type":"ContainerDied","Data":"0b5f1a7bbd679f2da38d9f47b7562247ce60c2ec70507ea5e378f8069750dc18"} Mar 09 09:37:23 crc kubenswrapper[4861]: I0309 
09:37:23.104965 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-xq9tq" Mar 09 09:37:23 crc kubenswrapper[4861]: I0309 09:37:23.204463 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/e8f222a3-04ea-475e-aab7-97cf0ba5021c-inventory-0\") pod \"e8f222a3-04ea-475e-aab7-97cf0ba5021c\" (UID: \"e8f222a3-04ea-475e-aab7-97cf0ba5021c\") " Mar 09 09:37:23 crc kubenswrapper[4861]: I0309 09:37:23.204601 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e8f222a3-04ea-475e-aab7-97cf0ba5021c-ssh-key-openstack-edpm-ipam\") pod \"e8f222a3-04ea-475e-aab7-97cf0ba5021c\" (UID: \"e8f222a3-04ea-475e-aab7-97cf0ba5021c\") " Mar 09 09:37:23 crc kubenswrapper[4861]: I0309 09:37:23.204647 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k82jr\" (UniqueName: \"kubernetes.io/projected/e8f222a3-04ea-475e-aab7-97cf0ba5021c-kube-api-access-k82jr\") pod \"e8f222a3-04ea-475e-aab7-97cf0ba5021c\" (UID: \"e8f222a3-04ea-475e-aab7-97cf0ba5021c\") " Mar 09 09:37:23 crc kubenswrapper[4861]: I0309 09:37:23.209738 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8f222a3-04ea-475e-aab7-97cf0ba5021c-kube-api-access-k82jr" (OuterVolumeSpecName: "kube-api-access-k82jr") pod "e8f222a3-04ea-475e-aab7-97cf0ba5021c" (UID: "e8f222a3-04ea-475e-aab7-97cf0ba5021c"). InnerVolumeSpecName "kube-api-access-k82jr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:37:23 crc kubenswrapper[4861]: I0309 09:37:23.235799 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8f222a3-04ea-475e-aab7-97cf0ba5021c-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "e8f222a3-04ea-475e-aab7-97cf0ba5021c" (UID: "e8f222a3-04ea-475e-aab7-97cf0ba5021c"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:37:23 crc kubenswrapper[4861]: I0309 09:37:23.235916 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8f222a3-04ea-475e-aab7-97cf0ba5021c-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "e8f222a3-04ea-475e-aab7-97cf0ba5021c" (UID: "e8f222a3-04ea-475e-aab7-97cf0ba5021c"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:37:23 crc kubenswrapper[4861]: I0309 09:37:23.307398 4861 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/e8f222a3-04ea-475e-aab7-97cf0ba5021c-inventory-0\") on node \"crc\" DevicePath \"\"" Mar 09 09:37:23 crc kubenswrapper[4861]: I0309 09:37:23.307442 4861 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e8f222a3-04ea-475e-aab7-97cf0ba5021c-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 09 09:37:23 crc kubenswrapper[4861]: I0309 09:37:23.307453 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k82jr\" (UniqueName: \"kubernetes.io/projected/e8f222a3-04ea-475e-aab7-97cf0ba5021c-kube-api-access-k82jr\") on node \"crc\" DevicePath \"\"" Mar 09 09:37:23 crc kubenswrapper[4861]: I0309 09:37:23.707901 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-xq9tq" 
event={"ID":"e8f222a3-04ea-475e-aab7-97cf0ba5021c","Type":"ContainerDied","Data":"3c4473705dc46d02763d36aab935b610082159f9f4f7c1569895bfb689839b33"} Mar 09 09:37:23 crc kubenswrapper[4861]: I0309 09:37:23.707960 4861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3c4473705dc46d02763d36aab935b610082159f9f4f7c1569895bfb689839b33" Mar 09 09:37:23 crc kubenswrapper[4861]: I0309 09:37:23.707994 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-xq9tq" Mar 09 09:37:23 crc kubenswrapper[4861]: I0309 09:37:23.782103 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-5mlpf"] Mar 09 09:37:23 crc kubenswrapper[4861]: E0309 09:37:23.782466 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8f222a3-04ea-475e-aab7-97cf0ba5021c" containerName="ssh-known-hosts-edpm-deployment" Mar 09 09:37:23 crc kubenswrapper[4861]: I0309 09:37:23.782482 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8f222a3-04ea-475e-aab7-97cf0ba5021c" containerName="ssh-known-hosts-edpm-deployment" Mar 09 09:37:23 crc kubenswrapper[4861]: I0309 09:37:23.782720 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8f222a3-04ea-475e-aab7-97cf0ba5021c" containerName="ssh-known-hosts-edpm-deployment" Mar 09 09:37:23 crc kubenswrapper[4861]: I0309 09:37:23.784335 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5mlpf" Mar 09 09:37:23 crc kubenswrapper[4861]: I0309 09:37:23.788785 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 09 09:37:23 crc kubenswrapper[4861]: I0309 09:37:23.788866 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 09 09:37:23 crc kubenswrapper[4861]: I0309 09:37:23.788908 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lkd5q" Mar 09 09:37:23 crc kubenswrapper[4861]: I0309 09:37:23.788975 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 09 09:37:23 crc kubenswrapper[4861]: I0309 09:37:23.793622 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-5mlpf"] Mar 09 09:37:23 crc kubenswrapper[4861]: I0309 09:37:23.919660 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmk7j\" (UniqueName: \"kubernetes.io/projected/df0bad47-fa01-426d-af7b-e09057048052-kube-api-access-kmk7j\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-5mlpf\" (UID: \"df0bad47-fa01-426d-af7b-e09057048052\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5mlpf" Mar 09 09:37:23 crc kubenswrapper[4861]: I0309 09:37:23.919860 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/df0bad47-fa01-426d-af7b-e09057048052-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-5mlpf\" (UID: \"df0bad47-fa01-426d-af7b-e09057048052\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5mlpf" Mar 09 09:37:23 crc kubenswrapper[4861]: I0309 09:37:23.919942 4861 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/df0bad47-fa01-426d-af7b-e09057048052-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-5mlpf\" (UID: \"df0bad47-fa01-426d-af7b-e09057048052\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5mlpf" Mar 09 09:37:24 crc kubenswrapper[4861]: I0309 09:37:24.022028 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/df0bad47-fa01-426d-af7b-e09057048052-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-5mlpf\" (UID: \"df0bad47-fa01-426d-af7b-e09057048052\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5mlpf" Mar 09 09:37:24 crc kubenswrapper[4861]: I0309 09:37:24.022137 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/df0bad47-fa01-426d-af7b-e09057048052-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-5mlpf\" (UID: \"df0bad47-fa01-426d-af7b-e09057048052\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5mlpf" Mar 09 09:37:24 crc kubenswrapper[4861]: I0309 09:37:24.022203 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmk7j\" (UniqueName: \"kubernetes.io/projected/df0bad47-fa01-426d-af7b-e09057048052-kube-api-access-kmk7j\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-5mlpf\" (UID: \"df0bad47-fa01-426d-af7b-e09057048052\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5mlpf" Mar 09 09:37:24 crc kubenswrapper[4861]: I0309 09:37:24.027348 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/df0bad47-fa01-426d-af7b-e09057048052-ssh-key-openstack-edpm-ipam\") pod 
\"run-os-edpm-deployment-openstack-edpm-ipam-5mlpf\" (UID: \"df0bad47-fa01-426d-af7b-e09057048052\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5mlpf" Mar 09 09:37:24 crc kubenswrapper[4861]: I0309 09:37:24.028960 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/df0bad47-fa01-426d-af7b-e09057048052-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-5mlpf\" (UID: \"df0bad47-fa01-426d-af7b-e09057048052\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5mlpf" Mar 09 09:37:24 crc kubenswrapper[4861]: I0309 09:37:24.039919 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmk7j\" (UniqueName: \"kubernetes.io/projected/df0bad47-fa01-426d-af7b-e09057048052-kube-api-access-kmk7j\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-5mlpf\" (UID: \"df0bad47-fa01-426d-af7b-e09057048052\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5mlpf" Mar 09 09:37:24 crc kubenswrapper[4861]: I0309 09:37:24.113338 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5mlpf" Mar 09 09:37:24 crc kubenswrapper[4861]: I0309 09:37:24.680704 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-5mlpf"] Mar 09 09:37:24 crc kubenswrapper[4861]: I0309 09:37:24.718678 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5mlpf" event={"ID":"df0bad47-fa01-426d-af7b-e09057048052","Type":"ContainerStarted","Data":"693d621198051a507a8af023c27cc7be4a541c2360dd5587e292898c0b47c020"} Mar 09 09:37:25 crc kubenswrapper[4861]: I0309 09:37:25.729352 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5mlpf" event={"ID":"df0bad47-fa01-426d-af7b-e09057048052","Type":"ContainerStarted","Data":"f76d968e35efbcdc32f9ca97db494c967ad6e0fdb578a1a22a5730364756fdd6"} Mar 09 09:37:25 crc kubenswrapper[4861]: I0309 09:37:25.750735 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5mlpf" podStartSLOduration=2.271248256 podStartE2EDuration="2.750714579s" podCreationTimestamp="2026-03-09 09:37:23 +0000 UTC" firstStartedPulling="2026-03-09 09:37:24.684485786 +0000 UTC m=+1887.769525197" lastFinishedPulling="2026-03-09 09:37:25.163952109 +0000 UTC m=+1888.248991520" observedRunningTime="2026-03-09 09:37:25.749661331 +0000 UTC m=+1888.834700772" watchObservedRunningTime="2026-03-09 09:37:25.750714579 +0000 UTC m=+1888.835753990" Mar 09 09:37:27 crc kubenswrapper[4861]: I0309 09:37:27.664331 4861 scope.go:117] "RemoveContainer" containerID="c0726d3ac822004eacb4f8d12bb4cbaf2815fc9d29aaa8ba7db9d4fae1717ee1" Mar 09 09:37:28 crc kubenswrapper[4861]: I0309 09:37:28.755344 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" 
event={"ID":"6f7875e3-174f-4c67-8675-d878de74aa4f","Type":"ContainerStarted","Data":"2cf1bc55664e082ce607bad22b0e501635384de314ffbc4f1270ecbbe7d97b60"} Mar 09 09:37:34 crc kubenswrapper[4861]: I0309 09:37:34.816981 4861 generic.go:334] "Generic (PLEG): container finished" podID="df0bad47-fa01-426d-af7b-e09057048052" containerID="f76d968e35efbcdc32f9ca97db494c967ad6e0fdb578a1a22a5730364756fdd6" exitCode=0 Mar 09 09:37:34 crc kubenswrapper[4861]: I0309 09:37:34.817059 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5mlpf" event={"ID":"df0bad47-fa01-426d-af7b-e09057048052","Type":"ContainerDied","Data":"f76d968e35efbcdc32f9ca97db494c967ad6e0fdb578a1a22a5730364756fdd6"} Mar 09 09:37:36 crc kubenswrapper[4861]: I0309 09:37:36.205753 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5mlpf" Mar 09 09:37:36 crc kubenswrapper[4861]: I0309 09:37:36.284852 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/df0bad47-fa01-426d-af7b-e09057048052-ssh-key-openstack-edpm-ipam\") pod \"df0bad47-fa01-426d-af7b-e09057048052\" (UID: \"df0bad47-fa01-426d-af7b-e09057048052\") " Mar 09 09:37:36 crc kubenswrapper[4861]: I0309 09:37:36.284929 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/df0bad47-fa01-426d-af7b-e09057048052-inventory\") pod \"df0bad47-fa01-426d-af7b-e09057048052\" (UID: \"df0bad47-fa01-426d-af7b-e09057048052\") " Mar 09 09:37:36 crc kubenswrapper[4861]: I0309 09:37:36.285008 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kmk7j\" (UniqueName: \"kubernetes.io/projected/df0bad47-fa01-426d-af7b-e09057048052-kube-api-access-kmk7j\") pod \"df0bad47-fa01-426d-af7b-e09057048052\" (UID: 
\"df0bad47-fa01-426d-af7b-e09057048052\") " Mar 09 09:37:36 crc kubenswrapper[4861]: I0309 09:37:36.290416 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df0bad47-fa01-426d-af7b-e09057048052-kube-api-access-kmk7j" (OuterVolumeSpecName: "kube-api-access-kmk7j") pod "df0bad47-fa01-426d-af7b-e09057048052" (UID: "df0bad47-fa01-426d-af7b-e09057048052"). InnerVolumeSpecName "kube-api-access-kmk7j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:37:36 crc kubenswrapper[4861]: I0309 09:37:36.309847 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df0bad47-fa01-426d-af7b-e09057048052-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "df0bad47-fa01-426d-af7b-e09057048052" (UID: "df0bad47-fa01-426d-af7b-e09057048052"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:37:36 crc kubenswrapper[4861]: I0309 09:37:36.318018 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df0bad47-fa01-426d-af7b-e09057048052-inventory" (OuterVolumeSpecName: "inventory") pod "df0bad47-fa01-426d-af7b-e09057048052" (UID: "df0bad47-fa01-426d-af7b-e09057048052"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:37:36 crc kubenswrapper[4861]: I0309 09:37:36.387677 4861 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/df0bad47-fa01-426d-af7b-e09057048052-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 09 09:37:36 crc kubenswrapper[4861]: I0309 09:37:36.387714 4861 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/df0bad47-fa01-426d-af7b-e09057048052-inventory\") on node \"crc\" DevicePath \"\"" Mar 09 09:37:36 crc kubenswrapper[4861]: I0309 09:37:36.387725 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kmk7j\" (UniqueName: \"kubernetes.io/projected/df0bad47-fa01-426d-af7b-e09057048052-kube-api-access-kmk7j\") on node \"crc\" DevicePath \"\"" Mar 09 09:37:36 crc kubenswrapper[4861]: I0309 09:37:36.851304 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5mlpf" event={"ID":"df0bad47-fa01-426d-af7b-e09057048052","Type":"ContainerDied","Data":"693d621198051a507a8af023c27cc7be4a541c2360dd5587e292898c0b47c020"} Mar 09 09:37:36 crc kubenswrapper[4861]: I0309 09:37:36.851726 4861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="693d621198051a507a8af023c27cc7be4a541c2360dd5587e292898c0b47c020" Mar 09 09:37:36 crc kubenswrapper[4861]: I0309 09:37:36.851932 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-5mlpf" Mar 09 09:37:36 crc kubenswrapper[4861]: I0309 09:37:36.923190 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-62xxn"] Mar 09 09:37:36 crc kubenswrapper[4861]: E0309 09:37:36.923573 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df0bad47-fa01-426d-af7b-e09057048052" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 09 09:37:36 crc kubenswrapper[4861]: I0309 09:37:36.923594 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="df0bad47-fa01-426d-af7b-e09057048052" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 09 09:37:36 crc kubenswrapper[4861]: I0309 09:37:36.923793 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="df0bad47-fa01-426d-af7b-e09057048052" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 09 09:37:36 crc kubenswrapper[4861]: I0309 09:37:36.924389 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-62xxn" Mar 09 09:37:36 crc kubenswrapper[4861]: I0309 09:37:36.928501 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 09 09:37:36 crc kubenswrapper[4861]: I0309 09:37:36.928692 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lkd5q" Mar 09 09:37:36 crc kubenswrapper[4861]: I0309 09:37:36.928818 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 09 09:37:36 crc kubenswrapper[4861]: I0309 09:37:36.928945 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 09 09:37:36 crc kubenswrapper[4861]: I0309 09:37:36.946557 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-62xxn"] Mar 09 09:37:37 crc kubenswrapper[4861]: I0309 09:37:37.001474 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e3b0d4f8-537e-4894-bcf9-0cfa00a145ec-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-62xxn\" (UID: \"e3b0d4f8-537e-4894-bcf9-0cfa00a145ec\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-62xxn" Mar 09 09:37:37 crc kubenswrapper[4861]: I0309 09:37:37.001704 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e3b0d4f8-537e-4894-bcf9-0cfa00a145ec-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-62xxn\" (UID: \"e3b0d4f8-537e-4894-bcf9-0cfa00a145ec\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-62xxn" Mar 09 09:37:37 crc kubenswrapper[4861]: I0309 09:37:37.001824 4861 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbz58\" (UniqueName: \"kubernetes.io/projected/e3b0d4f8-537e-4894-bcf9-0cfa00a145ec-kube-api-access-hbz58\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-62xxn\" (UID: \"e3b0d4f8-537e-4894-bcf9-0cfa00a145ec\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-62xxn" Mar 09 09:37:37 crc kubenswrapper[4861]: I0309 09:37:37.111116 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e3b0d4f8-537e-4894-bcf9-0cfa00a145ec-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-62xxn\" (UID: \"e3b0d4f8-537e-4894-bcf9-0cfa00a145ec\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-62xxn" Mar 09 09:37:37 crc kubenswrapper[4861]: I0309 09:37:37.111473 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e3b0d4f8-537e-4894-bcf9-0cfa00a145ec-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-62xxn\" (UID: \"e3b0d4f8-537e-4894-bcf9-0cfa00a145ec\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-62xxn" Mar 09 09:37:37 crc kubenswrapper[4861]: I0309 09:37:37.111515 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hbz58\" (UniqueName: \"kubernetes.io/projected/e3b0d4f8-537e-4894-bcf9-0cfa00a145ec-kube-api-access-hbz58\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-62xxn\" (UID: \"e3b0d4f8-537e-4894-bcf9-0cfa00a145ec\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-62xxn" Mar 09 09:37:37 crc kubenswrapper[4861]: I0309 09:37:37.117419 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e3b0d4f8-537e-4894-bcf9-0cfa00a145ec-inventory\") pod 
\"reboot-os-edpm-deployment-openstack-edpm-ipam-62xxn\" (UID: \"e3b0d4f8-537e-4894-bcf9-0cfa00a145ec\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-62xxn" Mar 09 09:37:37 crc kubenswrapper[4861]: I0309 09:37:37.121951 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e3b0d4f8-537e-4894-bcf9-0cfa00a145ec-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-62xxn\" (UID: \"e3b0d4f8-537e-4894-bcf9-0cfa00a145ec\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-62xxn" Mar 09 09:37:37 crc kubenswrapper[4861]: I0309 09:37:37.135966 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbz58\" (UniqueName: \"kubernetes.io/projected/e3b0d4f8-537e-4894-bcf9-0cfa00a145ec-kube-api-access-hbz58\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-62xxn\" (UID: \"e3b0d4f8-537e-4894-bcf9-0cfa00a145ec\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-62xxn" Mar 09 09:37:37 crc kubenswrapper[4861]: I0309 09:37:37.257187 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-62xxn" Mar 09 09:37:37 crc kubenswrapper[4861]: I0309 09:37:37.772816 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-62xxn"] Mar 09 09:37:37 crc kubenswrapper[4861]: I0309 09:37:37.861433 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-62xxn" event={"ID":"e3b0d4f8-537e-4894-bcf9-0cfa00a145ec","Type":"ContainerStarted","Data":"da640e43c0822f54143e57c044dabe8de3cfd1a113ff5ece2a75760da6d95a82"} Mar 09 09:37:38 crc kubenswrapper[4861]: I0309 09:37:38.870025 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-62xxn" event={"ID":"e3b0d4f8-537e-4894-bcf9-0cfa00a145ec","Type":"ContainerStarted","Data":"ef243e4272c391d6768ca895f0fb999ad386623b343cbdd6d216b1d070b6d923"} Mar 09 09:37:38 crc kubenswrapper[4861]: I0309 09:37:38.887072 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-62xxn" podStartSLOduration=2.4437159729999998 podStartE2EDuration="2.887049358s" podCreationTimestamp="2026-03-09 09:37:36 +0000 UTC" firstStartedPulling="2026-03-09 09:37:37.77468464 +0000 UTC m=+1900.859724041" lastFinishedPulling="2026-03-09 09:37:38.218017985 +0000 UTC m=+1901.303057426" observedRunningTime="2026-03-09 09:37:38.883045068 +0000 UTC m=+1901.968084469" watchObservedRunningTime="2026-03-09 09:37:38.887049358 +0000 UTC m=+1901.972088759" Mar 09 09:37:47 crc kubenswrapper[4861]: I0309 09:37:47.962073 4861 generic.go:334] "Generic (PLEG): container finished" podID="e3b0d4f8-537e-4894-bcf9-0cfa00a145ec" containerID="ef243e4272c391d6768ca895f0fb999ad386623b343cbdd6d216b1d070b6d923" exitCode=0 Mar 09 09:37:47 crc kubenswrapper[4861]: I0309 09:37:47.962175 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-62xxn" event={"ID":"e3b0d4f8-537e-4894-bcf9-0cfa00a145ec","Type":"ContainerDied","Data":"ef243e4272c391d6768ca895f0fb999ad386623b343cbdd6d216b1d070b6d923"} Mar 09 09:37:49 crc kubenswrapper[4861]: I0309 09:37:49.358848 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-62xxn" Mar 09 09:37:49 crc kubenswrapper[4861]: I0309 09:37:49.537035 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e3b0d4f8-537e-4894-bcf9-0cfa00a145ec-inventory\") pod \"e3b0d4f8-537e-4894-bcf9-0cfa00a145ec\" (UID: \"e3b0d4f8-537e-4894-bcf9-0cfa00a145ec\") " Mar 09 09:37:49 crc kubenswrapper[4861]: I0309 09:37:49.537093 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hbz58\" (UniqueName: \"kubernetes.io/projected/e3b0d4f8-537e-4894-bcf9-0cfa00a145ec-kube-api-access-hbz58\") pod \"e3b0d4f8-537e-4894-bcf9-0cfa00a145ec\" (UID: \"e3b0d4f8-537e-4894-bcf9-0cfa00a145ec\") " Mar 09 09:37:49 crc kubenswrapper[4861]: I0309 09:37:49.537173 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e3b0d4f8-537e-4894-bcf9-0cfa00a145ec-ssh-key-openstack-edpm-ipam\") pod \"e3b0d4f8-537e-4894-bcf9-0cfa00a145ec\" (UID: \"e3b0d4f8-537e-4894-bcf9-0cfa00a145ec\") " Mar 09 09:37:49 crc kubenswrapper[4861]: I0309 09:37:49.543345 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3b0d4f8-537e-4894-bcf9-0cfa00a145ec-kube-api-access-hbz58" (OuterVolumeSpecName: "kube-api-access-hbz58") pod "e3b0d4f8-537e-4894-bcf9-0cfa00a145ec" (UID: "e3b0d4f8-537e-4894-bcf9-0cfa00a145ec"). InnerVolumeSpecName "kube-api-access-hbz58". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:37:49 crc kubenswrapper[4861]: I0309 09:37:49.564770 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3b0d4f8-537e-4894-bcf9-0cfa00a145ec-inventory" (OuterVolumeSpecName: "inventory") pod "e3b0d4f8-537e-4894-bcf9-0cfa00a145ec" (UID: "e3b0d4f8-537e-4894-bcf9-0cfa00a145ec"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:37:49 crc kubenswrapper[4861]: I0309 09:37:49.583779 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3b0d4f8-537e-4894-bcf9-0cfa00a145ec-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "e3b0d4f8-537e-4894-bcf9-0cfa00a145ec" (UID: "e3b0d4f8-537e-4894-bcf9-0cfa00a145ec"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:37:49 crc kubenswrapper[4861]: I0309 09:37:49.639577 4861 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e3b0d4f8-537e-4894-bcf9-0cfa00a145ec-inventory\") on node \"crc\" DevicePath \"\"" Mar 09 09:37:49 crc kubenswrapper[4861]: I0309 09:37:49.639613 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hbz58\" (UniqueName: \"kubernetes.io/projected/e3b0d4f8-537e-4894-bcf9-0cfa00a145ec-kube-api-access-hbz58\") on node \"crc\" DevicePath \"\"" Mar 09 09:37:49 crc kubenswrapper[4861]: I0309 09:37:49.639626 4861 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e3b0d4f8-537e-4894-bcf9-0cfa00a145ec-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 09 09:37:49 crc kubenswrapper[4861]: I0309 09:37:49.981496 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-62xxn" 
event={"ID":"e3b0d4f8-537e-4894-bcf9-0cfa00a145ec","Type":"ContainerDied","Data":"da640e43c0822f54143e57c044dabe8de3cfd1a113ff5ece2a75760da6d95a82"} Mar 09 09:37:49 crc kubenswrapper[4861]: I0309 09:37:49.981544 4861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="da640e43c0822f54143e57c044dabe8de3cfd1a113ff5ece2a75760da6d95a82" Mar 09 09:37:49 crc kubenswrapper[4861]: I0309 09:37:49.981617 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-62xxn" Mar 09 09:37:50 crc kubenswrapper[4861]: I0309 09:37:50.063140 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m4ds9"] Mar 09 09:37:50 crc kubenswrapper[4861]: E0309 09:37:50.063538 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3b0d4f8-537e-4894-bcf9-0cfa00a145ec" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 09 09:37:50 crc kubenswrapper[4861]: I0309 09:37:50.063557 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3b0d4f8-537e-4894-bcf9-0cfa00a145ec" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 09 09:37:50 crc kubenswrapper[4861]: I0309 09:37:50.063726 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3b0d4f8-537e-4894-bcf9-0cfa00a145ec" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 09 09:37:50 crc kubenswrapper[4861]: I0309 09:37:50.064318 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m4ds9" Mar 09 09:37:50 crc kubenswrapper[4861]: I0309 09:37:50.067010 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Mar 09 09:37:50 crc kubenswrapper[4861]: I0309 09:37:50.067217 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 09 09:37:50 crc kubenswrapper[4861]: I0309 09:37:50.068177 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Mar 09 09:37:50 crc kubenswrapper[4861]: I0309 09:37:50.072263 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 09 09:37:50 crc kubenswrapper[4861]: I0309 09:37:50.072327 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Mar 09 09:37:50 crc kubenswrapper[4861]: I0309 09:37:50.072329 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 09 09:37:50 crc kubenswrapper[4861]: I0309 09:37:50.072482 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lkd5q" Mar 09 09:37:50 crc kubenswrapper[4861]: I0309 09:37:50.072631 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Mar 09 09:37:50 crc kubenswrapper[4861]: I0309 09:37:50.084582 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m4ds9"] Mar 09 09:37:50 crc kubenswrapper[4861]: I0309 09:37:50.156987 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/2420e9c4-faed-48f0-857d-4aba72c5cab2-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-m4ds9\" (UID: \"2420e9c4-faed-48f0-857d-4aba72c5cab2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m4ds9" Mar 09 09:37:50 crc kubenswrapper[4861]: I0309 09:37:50.157390 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/2420e9c4-faed-48f0-857d-4aba72c5cab2-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-m4ds9\" (UID: \"2420e9c4-faed-48f0-857d-4aba72c5cab2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m4ds9" Mar 09 09:37:50 crc kubenswrapper[4861]: I0309 09:37:50.157426 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/2420e9c4-faed-48f0-857d-4aba72c5cab2-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-m4ds9\" (UID: \"2420e9c4-faed-48f0-857d-4aba72c5cab2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m4ds9" Mar 09 09:37:50 crc kubenswrapper[4861]: I0309 09:37:50.157475 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2420e9c4-faed-48f0-857d-4aba72c5cab2-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-m4ds9\" (UID: \"2420e9c4-faed-48f0-857d-4aba72c5cab2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m4ds9" Mar 09 09:37:50 crc kubenswrapper[4861]: I0309 09:37:50.157521 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/2420e9c4-faed-48f0-857d-4aba72c5cab2-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-m4ds9\" (UID: \"2420e9c4-faed-48f0-857d-4aba72c5cab2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m4ds9" Mar 09 09:37:50 crc kubenswrapper[4861]: I0309 09:37:50.157560 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9fkd\" (UniqueName: \"kubernetes.io/projected/2420e9c4-faed-48f0-857d-4aba72c5cab2-kube-api-access-g9fkd\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-m4ds9\" (UID: \"2420e9c4-faed-48f0-857d-4aba72c5cab2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m4ds9" Mar 09 09:37:50 crc kubenswrapper[4861]: I0309 09:37:50.157615 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2420e9c4-faed-48f0-857d-4aba72c5cab2-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-m4ds9\" (UID: \"2420e9c4-faed-48f0-857d-4aba72c5cab2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m4ds9" Mar 09 09:37:50 crc kubenswrapper[4861]: I0309 09:37:50.157663 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2420e9c4-faed-48f0-857d-4aba72c5cab2-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-m4ds9\" (UID: \"2420e9c4-faed-48f0-857d-4aba72c5cab2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m4ds9" Mar 09 09:37:50 crc kubenswrapper[4861]: I0309 09:37:50.157719 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/2420e9c4-faed-48f0-857d-4aba72c5cab2-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-m4ds9\" (UID: \"2420e9c4-faed-48f0-857d-4aba72c5cab2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m4ds9" Mar 09 09:37:50 crc kubenswrapper[4861]: I0309 09:37:50.157752 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2420e9c4-faed-48f0-857d-4aba72c5cab2-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-m4ds9\" (UID: \"2420e9c4-faed-48f0-857d-4aba72c5cab2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m4ds9" Mar 09 09:37:50 crc kubenswrapper[4861]: I0309 09:37:50.157793 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2420e9c4-faed-48f0-857d-4aba72c5cab2-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-m4ds9\" (UID: \"2420e9c4-faed-48f0-857d-4aba72c5cab2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m4ds9" Mar 09 09:37:50 crc kubenswrapper[4861]: I0309 09:37:50.157980 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2420e9c4-faed-48f0-857d-4aba72c5cab2-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-m4ds9\" (UID: \"2420e9c4-faed-48f0-857d-4aba72c5cab2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m4ds9" Mar 09 09:37:50 crc kubenswrapper[4861]: I0309 09:37:50.158056 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2420e9c4-faed-48f0-857d-4aba72c5cab2-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-m4ds9\" (UID: \"2420e9c4-faed-48f0-857d-4aba72c5cab2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m4ds9" Mar 09 09:37:50 crc kubenswrapper[4861]: I0309 09:37:50.158088 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2420e9c4-faed-48f0-857d-4aba72c5cab2-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-m4ds9\" (UID: \"2420e9c4-faed-48f0-857d-4aba72c5cab2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m4ds9" Mar 09 09:37:50 crc kubenswrapper[4861]: I0309 09:37:50.260413 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2420e9c4-faed-48f0-857d-4aba72c5cab2-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-m4ds9\" (UID: \"2420e9c4-faed-48f0-857d-4aba72c5cab2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m4ds9" Mar 09 09:37:50 crc kubenswrapper[4861]: I0309 09:37:50.260546 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2420e9c4-faed-48f0-857d-4aba72c5cab2-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-m4ds9\" (UID: \"2420e9c4-faed-48f0-857d-4aba72c5cab2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m4ds9" Mar 09 09:37:50 crc kubenswrapper[4861]: I0309 09:37:50.260580 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2420e9c4-faed-48f0-857d-4aba72c5cab2-telemetry-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-m4ds9\" (UID: \"2420e9c4-faed-48f0-857d-4aba72c5cab2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m4ds9" Mar 09 09:37:50 crc kubenswrapper[4861]: I0309 09:37:50.261311 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2420e9c4-faed-48f0-857d-4aba72c5cab2-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-m4ds9\" (UID: \"2420e9c4-faed-48f0-857d-4aba72c5cab2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m4ds9" Mar 09 09:37:50 crc kubenswrapper[4861]: I0309 09:37:50.261401 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/2420e9c4-faed-48f0-857d-4aba72c5cab2-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-m4ds9\" (UID: \"2420e9c4-faed-48f0-857d-4aba72c5cab2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m4ds9" Mar 09 09:37:50 crc kubenswrapper[4861]: I0309 09:37:50.261429 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/2420e9c4-faed-48f0-857d-4aba72c5cab2-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-m4ds9\" (UID: \"2420e9c4-faed-48f0-857d-4aba72c5cab2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m4ds9" Mar 09 09:37:50 crc kubenswrapper[4861]: I0309 09:37:50.261453 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/2420e9c4-faed-48f0-857d-4aba72c5cab2-openstack-edpm-ipam-telemetry-default-certs-0\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-m4ds9\" (UID: \"2420e9c4-faed-48f0-857d-4aba72c5cab2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m4ds9" Mar 09 09:37:50 crc kubenswrapper[4861]: I0309 09:37:50.261505 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2420e9c4-faed-48f0-857d-4aba72c5cab2-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-m4ds9\" (UID: \"2420e9c4-faed-48f0-857d-4aba72c5cab2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m4ds9" Mar 09 09:37:50 crc kubenswrapper[4861]: I0309 09:37:50.261549 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/2420e9c4-faed-48f0-857d-4aba72c5cab2-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-m4ds9\" (UID: \"2420e9c4-faed-48f0-857d-4aba72c5cab2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m4ds9" Mar 09 09:37:50 crc kubenswrapper[4861]: I0309 09:37:50.261596 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9fkd\" (UniqueName: \"kubernetes.io/projected/2420e9c4-faed-48f0-857d-4aba72c5cab2-kube-api-access-g9fkd\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-m4ds9\" (UID: \"2420e9c4-faed-48f0-857d-4aba72c5cab2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m4ds9" Mar 09 09:37:50 crc kubenswrapper[4861]: I0309 09:37:50.261672 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2420e9c4-faed-48f0-857d-4aba72c5cab2-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-m4ds9\" (UID: \"2420e9c4-faed-48f0-857d-4aba72c5cab2\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m4ds9" Mar 09 09:37:50 crc kubenswrapper[4861]: I0309 09:37:50.261724 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2420e9c4-faed-48f0-857d-4aba72c5cab2-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-m4ds9\" (UID: \"2420e9c4-faed-48f0-857d-4aba72c5cab2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m4ds9" Mar 09 09:37:50 crc kubenswrapper[4861]: I0309 09:37:50.261780 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2420e9c4-faed-48f0-857d-4aba72c5cab2-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-m4ds9\" (UID: \"2420e9c4-faed-48f0-857d-4aba72c5cab2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m4ds9" Mar 09 09:37:50 crc kubenswrapper[4861]: I0309 09:37:50.261829 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2420e9c4-faed-48f0-857d-4aba72c5cab2-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-m4ds9\" (UID: \"2420e9c4-faed-48f0-857d-4aba72c5cab2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m4ds9" Mar 09 09:37:50 crc kubenswrapper[4861]: I0309 09:37:50.279218 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2420e9c4-faed-48f0-857d-4aba72c5cab2-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-m4ds9\" (UID: \"2420e9c4-faed-48f0-857d-4aba72c5cab2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m4ds9" Mar 09 09:37:50 crc kubenswrapper[4861]: I0309 09:37:50.279271 4861 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/2420e9c4-faed-48f0-857d-4aba72c5cab2-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-m4ds9\" (UID: \"2420e9c4-faed-48f0-857d-4aba72c5cab2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m4ds9" Mar 09 09:37:50 crc kubenswrapper[4861]: I0309 09:37:50.280761 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/2420e9c4-faed-48f0-857d-4aba72c5cab2-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-m4ds9\" (UID: \"2420e9c4-faed-48f0-857d-4aba72c5cab2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m4ds9" Mar 09 09:37:50 crc kubenswrapper[4861]: I0309 09:37:50.284191 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2420e9c4-faed-48f0-857d-4aba72c5cab2-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-m4ds9\" (UID: \"2420e9c4-faed-48f0-857d-4aba72c5cab2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m4ds9" Mar 09 09:37:50 crc kubenswrapper[4861]: I0309 09:37:50.285168 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2420e9c4-faed-48f0-857d-4aba72c5cab2-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-m4ds9\" (UID: \"2420e9c4-faed-48f0-857d-4aba72c5cab2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m4ds9" Mar 09 09:37:50 crc kubenswrapper[4861]: I0309 09:37:50.285219 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2420e9c4-faed-48f0-857d-4aba72c5cab2-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-m4ds9\" (UID: \"2420e9c4-faed-48f0-857d-4aba72c5cab2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m4ds9" Mar 09 09:37:50 crc kubenswrapper[4861]: I0309 09:37:50.285687 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2420e9c4-faed-48f0-857d-4aba72c5cab2-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-m4ds9\" (UID: \"2420e9c4-faed-48f0-857d-4aba72c5cab2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m4ds9" Mar 09 09:37:50 crc kubenswrapper[4861]: I0309 09:37:50.285762 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2420e9c4-faed-48f0-857d-4aba72c5cab2-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-m4ds9\" (UID: \"2420e9c4-faed-48f0-857d-4aba72c5cab2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m4ds9" Mar 09 09:37:50 crc kubenswrapper[4861]: I0309 09:37:50.285790 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2420e9c4-faed-48f0-857d-4aba72c5cab2-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-m4ds9\" (UID: \"2420e9c4-faed-48f0-857d-4aba72c5cab2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m4ds9" Mar 09 09:37:50 crc kubenswrapper[4861]: I0309 09:37:50.288147 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/2420e9c4-faed-48f0-857d-4aba72c5cab2-openstack-edpm-ipam-libvirt-default-certs-0\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-m4ds9\" (UID: \"2420e9c4-faed-48f0-857d-4aba72c5cab2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m4ds9" Mar 09 09:37:50 crc kubenswrapper[4861]: I0309 09:37:50.308949 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9fkd\" (UniqueName: \"kubernetes.io/projected/2420e9c4-faed-48f0-857d-4aba72c5cab2-kube-api-access-g9fkd\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-m4ds9\" (UID: \"2420e9c4-faed-48f0-857d-4aba72c5cab2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m4ds9" Mar 09 09:37:50 crc kubenswrapper[4861]: I0309 09:37:50.309023 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2420e9c4-faed-48f0-857d-4aba72c5cab2-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-m4ds9\" (UID: \"2420e9c4-faed-48f0-857d-4aba72c5cab2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m4ds9" Mar 09 09:37:50 crc kubenswrapper[4861]: I0309 09:37:50.309411 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2420e9c4-faed-48f0-857d-4aba72c5cab2-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-m4ds9\" (UID: \"2420e9c4-faed-48f0-857d-4aba72c5cab2\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m4ds9" Mar 09 09:37:50 crc kubenswrapper[4861]: I0309 09:37:50.309536 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/2420e9c4-faed-48f0-857d-4aba72c5cab2-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-m4ds9\" (UID: \"2420e9c4-faed-48f0-857d-4aba72c5cab2\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m4ds9" Mar 09 09:37:50 crc kubenswrapper[4861]: I0309 09:37:50.390208 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m4ds9" Mar 09 09:37:50 crc kubenswrapper[4861]: I0309 09:37:50.910164 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m4ds9"] Mar 09 09:37:50 crc kubenswrapper[4861]: I0309 09:37:50.990454 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m4ds9" event={"ID":"2420e9c4-faed-48f0-857d-4aba72c5cab2","Type":"ContainerStarted","Data":"11e8a983af6ea10a419499dcf14109f5a91cc19917f550f88b30848fa71cc589"} Mar 09 09:37:52 crc kubenswrapper[4861]: I0309 09:37:52.000941 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m4ds9" event={"ID":"2420e9c4-faed-48f0-857d-4aba72c5cab2","Type":"ContainerStarted","Data":"230d296a8930811c5e831cc5b4c3d213c640acafc079c7c05ca0aee6c5d61879"} Mar 09 09:37:52 crc kubenswrapper[4861]: I0309 09:37:52.025272 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m4ds9" podStartSLOduration=1.591303524 podStartE2EDuration="2.025250319s" podCreationTimestamp="2026-03-09 09:37:50 +0000 UTC" firstStartedPulling="2026-03-09 09:37:50.918001941 +0000 UTC m=+1914.003041342" lastFinishedPulling="2026-03-09 09:37:51.351948726 +0000 UTC m=+1914.436988137" observedRunningTime="2026-03-09 09:37:52.017301099 +0000 UTC m=+1915.102340510" watchObservedRunningTime="2026-03-09 09:37:52.025250319 +0000 UTC m=+1915.110289710" Mar 09 09:37:57 crc kubenswrapper[4861]: I0309 09:37:57.046843 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-84rm5"] Mar 09 09:37:57 crc 
kubenswrapper[4861]: I0309 09:37:57.056928 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-84rm5"] Mar 09 09:37:57 crc kubenswrapper[4861]: I0309 09:37:57.669271 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e81aeea2-beec-4987-b527-db644692cb14" path="/var/lib/kubelet/pods/e81aeea2-beec-4987-b527-db644692cb14/volumes" Mar 09 09:38:00 crc kubenswrapper[4861]: I0309 09:38:00.130801 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550818-9ttd2"] Mar 09 09:38:00 crc kubenswrapper[4861]: I0309 09:38:00.132453 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550818-9ttd2" Mar 09 09:38:00 crc kubenswrapper[4861]: I0309 09:38:00.136341 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 09:38:00 crc kubenswrapper[4861]: I0309 09:38:00.136638 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k86l8" Mar 09 09:38:00 crc kubenswrapper[4861]: I0309 09:38:00.136821 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 09:38:00 crc kubenswrapper[4861]: I0309 09:38:00.147408 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2579\" (UniqueName: \"kubernetes.io/projected/eaaefb0a-991f-42b0-9474-43af65c61889-kube-api-access-d2579\") pod \"auto-csr-approver-29550818-9ttd2\" (UID: \"eaaefb0a-991f-42b0-9474-43af65c61889\") " pod="openshift-infra/auto-csr-approver-29550818-9ttd2" Mar 09 09:38:00 crc kubenswrapper[4861]: I0309 09:38:00.148075 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550818-9ttd2"] Mar 09 09:38:00 crc kubenswrapper[4861]: I0309 09:38:00.249632 4861 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-d2579\" (UniqueName: \"kubernetes.io/projected/eaaefb0a-991f-42b0-9474-43af65c61889-kube-api-access-d2579\") pod \"auto-csr-approver-29550818-9ttd2\" (UID: \"eaaefb0a-991f-42b0-9474-43af65c61889\") " pod="openshift-infra/auto-csr-approver-29550818-9ttd2" Mar 09 09:38:00 crc kubenswrapper[4861]: I0309 09:38:00.274284 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2579\" (UniqueName: \"kubernetes.io/projected/eaaefb0a-991f-42b0-9474-43af65c61889-kube-api-access-d2579\") pod \"auto-csr-approver-29550818-9ttd2\" (UID: \"eaaefb0a-991f-42b0-9474-43af65c61889\") " pod="openshift-infra/auto-csr-approver-29550818-9ttd2" Mar 09 09:38:00 crc kubenswrapper[4861]: I0309 09:38:00.456763 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550818-9ttd2" Mar 09 09:38:00 crc kubenswrapper[4861]: I0309 09:38:00.918415 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550818-9ttd2"] Mar 09 09:38:01 crc kubenswrapper[4861]: I0309 09:38:01.081774 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550818-9ttd2" event={"ID":"eaaefb0a-991f-42b0-9474-43af65c61889","Type":"ContainerStarted","Data":"ad76a58cdb53a2e1d160b96abe6067f0720e1f4b75162361796eba10965fe53b"} Mar 09 09:38:02 crc kubenswrapper[4861]: I0309 09:38:02.095981 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550818-9ttd2" event={"ID":"eaaefb0a-991f-42b0-9474-43af65c61889","Type":"ContainerStarted","Data":"cbcf8d3ddb6b7e7c6231812579d51b9e4a1075aba3a1d47372344e8c62d3a1dd"} Mar 09 09:38:02 crc kubenswrapper[4861]: I0309 09:38:02.127437 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29550818-9ttd2" podStartSLOduration=1.328738625 
podStartE2EDuration="2.127406983s" podCreationTimestamp="2026-03-09 09:38:00 +0000 UTC" firstStartedPulling="2026-03-09 09:38:00.921784476 +0000 UTC m=+1924.006823877" lastFinishedPulling="2026-03-09 09:38:01.720452814 +0000 UTC m=+1924.805492235" observedRunningTime="2026-03-09 09:38:02.108070799 +0000 UTC m=+1925.193110200" watchObservedRunningTime="2026-03-09 09:38:02.127406983 +0000 UTC m=+1925.212446424" Mar 09 09:38:03 crc kubenswrapper[4861]: I0309 09:38:03.107052 4861 generic.go:334] "Generic (PLEG): container finished" podID="eaaefb0a-991f-42b0-9474-43af65c61889" containerID="cbcf8d3ddb6b7e7c6231812579d51b9e4a1075aba3a1d47372344e8c62d3a1dd" exitCode=0 Mar 09 09:38:03 crc kubenswrapper[4861]: I0309 09:38:03.107090 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550818-9ttd2" event={"ID":"eaaefb0a-991f-42b0-9474-43af65c61889","Type":"ContainerDied","Data":"cbcf8d3ddb6b7e7c6231812579d51b9e4a1075aba3a1d47372344e8c62d3a1dd"} Mar 09 09:38:04 crc kubenswrapper[4861]: I0309 09:38:04.465587 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550818-9ttd2" Mar 09 09:38:04 crc kubenswrapper[4861]: I0309 09:38:04.629048 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d2579\" (UniqueName: \"kubernetes.io/projected/eaaefb0a-991f-42b0-9474-43af65c61889-kube-api-access-d2579\") pod \"eaaefb0a-991f-42b0-9474-43af65c61889\" (UID: \"eaaefb0a-991f-42b0-9474-43af65c61889\") " Mar 09 09:38:04 crc kubenswrapper[4861]: I0309 09:38:04.780951 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eaaefb0a-991f-42b0-9474-43af65c61889-kube-api-access-d2579" (OuterVolumeSpecName: "kube-api-access-d2579") pod "eaaefb0a-991f-42b0-9474-43af65c61889" (UID: "eaaefb0a-991f-42b0-9474-43af65c61889"). InnerVolumeSpecName "kube-api-access-d2579". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:38:04 crc kubenswrapper[4861]: I0309 09:38:04.834787 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d2579\" (UniqueName: \"kubernetes.io/projected/eaaefb0a-991f-42b0-9474-43af65c61889-kube-api-access-d2579\") on node \"crc\" DevicePath \"\"" Mar 09 09:38:05 crc kubenswrapper[4861]: I0309 09:38:05.123300 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550818-9ttd2" event={"ID":"eaaefb0a-991f-42b0-9474-43af65c61889","Type":"ContainerDied","Data":"ad76a58cdb53a2e1d160b96abe6067f0720e1f4b75162361796eba10965fe53b"} Mar 09 09:38:05 crc kubenswrapper[4861]: I0309 09:38:05.123339 4861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ad76a58cdb53a2e1d160b96abe6067f0720e1f4b75162361796eba10965fe53b" Mar 09 09:38:05 crc kubenswrapper[4861]: I0309 09:38:05.123437 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550818-9ttd2" Mar 09 09:38:05 crc kubenswrapper[4861]: I0309 09:38:05.194135 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550812-vzh87"] Mar 09 09:38:05 crc kubenswrapper[4861]: I0309 09:38:05.205778 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550812-vzh87"] Mar 09 09:38:05 crc kubenswrapper[4861]: I0309 09:38:05.670606 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03740edc-76c4-4be9-8871-29035d061880" path="/var/lib/kubelet/pods/03740edc-76c4-4be9-8871-29035d061880/volumes" Mar 09 09:38:13 crc kubenswrapper[4861]: I0309 09:38:13.249861 4861 scope.go:117] "RemoveContainer" containerID="a329ea2a6dc542f07a16030075887e823b17c6ed8a244888d2f86c973035e7b4" Mar 09 09:38:13 crc kubenswrapper[4861]: I0309 09:38:13.294809 4861 scope.go:117] "RemoveContainer" 
containerID="6a6e0147999328f0e670af6e9773adc78a928a0f8004229f1539591977be0d16" Mar 09 09:38:13 crc kubenswrapper[4861]: I0309 09:38:13.345275 4861 scope.go:117] "RemoveContainer" containerID="960f160c4e088103d19e3ef98910c17562f52e181ad6218e1d8e809c962675bc" Mar 09 09:38:25 crc kubenswrapper[4861]: I0309 09:38:25.294404 4861 generic.go:334] "Generic (PLEG): container finished" podID="2420e9c4-faed-48f0-857d-4aba72c5cab2" containerID="230d296a8930811c5e831cc5b4c3d213c640acafc079c7c05ca0aee6c5d61879" exitCode=0 Mar 09 09:38:25 crc kubenswrapper[4861]: I0309 09:38:25.294486 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m4ds9" event={"ID":"2420e9c4-faed-48f0-857d-4aba72c5cab2","Type":"ContainerDied","Data":"230d296a8930811c5e831cc5b4c3d213c640acafc079c7c05ca0aee6c5d61879"} Mar 09 09:38:26 crc kubenswrapper[4861]: I0309 09:38:26.722715 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m4ds9" Mar 09 09:38:26 crc kubenswrapper[4861]: I0309 09:38:26.766814 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2420e9c4-faed-48f0-857d-4aba72c5cab2-inventory\") pod \"2420e9c4-faed-48f0-857d-4aba72c5cab2\" (UID: \"2420e9c4-faed-48f0-857d-4aba72c5cab2\") " Mar 09 09:38:26 crc kubenswrapper[4861]: I0309 09:38:26.766875 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/2420e9c4-faed-48f0-857d-4aba72c5cab2-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"2420e9c4-faed-48f0-857d-4aba72c5cab2\" (UID: \"2420e9c4-faed-48f0-857d-4aba72c5cab2\") " Mar 09 09:38:26 crc kubenswrapper[4861]: I0309 09:38:26.766903 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/2420e9c4-faed-48f0-857d-4aba72c5cab2-openstack-edpm-ipam-ovn-default-certs-0\") pod \"2420e9c4-faed-48f0-857d-4aba72c5cab2\" (UID: \"2420e9c4-faed-48f0-857d-4aba72c5cab2\") " Mar 09 09:38:26 crc kubenswrapper[4861]: I0309 09:38:26.766931 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2420e9c4-faed-48f0-857d-4aba72c5cab2-libvirt-combined-ca-bundle\") pod \"2420e9c4-faed-48f0-857d-4aba72c5cab2\" (UID: \"2420e9c4-faed-48f0-857d-4aba72c5cab2\") " Mar 09 09:38:26 crc kubenswrapper[4861]: I0309 09:38:26.766952 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2420e9c4-faed-48f0-857d-4aba72c5cab2-bootstrap-combined-ca-bundle\") pod \"2420e9c4-faed-48f0-857d-4aba72c5cab2\" (UID: \"2420e9c4-faed-48f0-857d-4aba72c5cab2\") " Mar 09 09:38:26 crc kubenswrapper[4861]: I0309 09:38:26.766999 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2420e9c4-faed-48f0-857d-4aba72c5cab2-ovn-combined-ca-bundle\") pod \"2420e9c4-faed-48f0-857d-4aba72c5cab2\" (UID: \"2420e9c4-faed-48f0-857d-4aba72c5cab2\") " Mar 09 09:38:26 crc kubenswrapper[4861]: I0309 09:38:26.767017 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2420e9c4-faed-48f0-857d-4aba72c5cab2-nova-combined-ca-bundle\") pod \"2420e9c4-faed-48f0-857d-4aba72c5cab2\" (UID: \"2420e9c4-faed-48f0-857d-4aba72c5cab2\") " Mar 09 09:38:26 crc kubenswrapper[4861]: I0309 09:38:26.767125 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/2420e9c4-faed-48f0-857d-4aba72c5cab2-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"2420e9c4-faed-48f0-857d-4aba72c5cab2\" (UID: \"2420e9c4-faed-48f0-857d-4aba72c5cab2\") " Mar 09 09:38:26 crc kubenswrapper[4861]: I0309 09:38:26.767749 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/2420e9c4-faed-48f0-857d-4aba72c5cab2-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"2420e9c4-faed-48f0-857d-4aba72c5cab2\" (UID: \"2420e9c4-faed-48f0-857d-4aba72c5cab2\") " Mar 09 09:38:26 crc kubenswrapper[4861]: I0309 09:38:26.767788 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2420e9c4-faed-48f0-857d-4aba72c5cab2-ssh-key-openstack-edpm-ipam\") pod \"2420e9c4-faed-48f0-857d-4aba72c5cab2\" (UID: \"2420e9c4-faed-48f0-857d-4aba72c5cab2\") " Mar 09 09:38:26 crc kubenswrapper[4861]: I0309 09:38:26.767816 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2420e9c4-faed-48f0-857d-4aba72c5cab2-neutron-metadata-combined-ca-bundle\") pod \"2420e9c4-faed-48f0-857d-4aba72c5cab2\" (UID: \"2420e9c4-faed-48f0-857d-4aba72c5cab2\") " Mar 09 09:38:26 crc kubenswrapper[4861]: I0309 09:38:26.767884 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2420e9c4-faed-48f0-857d-4aba72c5cab2-repo-setup-combined-ca-bundle\") pod \"2420e9c4-faed-48f0-857d-4aba72c5cab2\" (UID: \"2420e9c4-faed-48f0-857d-4aba72c5cab2\") " Mar 09 09:38:26 crc kubenswrapper[4861]: I0309 09:38:26.767941 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2420e9c4-faed-48f0-857d-4aba72c5cab2-telemetry-combined-ca-bundle\") pod \"2420e9c4-faed-48f0-857d-4aba72c5cab2\" (UID: \"2420e9c4-faed-48f0-857d-4aba72c5cab2\") " Mar 09 09:38:26 crc kubenswrapper[4861]: I0309 09:38:26.767961 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g9fkd\" (UniqueName: \"kubernetes.io/projected/2420e9c4-faed-48f0-857d-4aba72c5cab2-kube-api-access-g9fkd\") pod \"2420e9c4-faed-48f0-857d-4aba72c5cab2\" (UID: \"2420e9c4-faed-48f0-857d-4aba72c5cab2\") " Mar 09 09:38:26 crc kubenswrapper[4861]: I0309 09:38:26.797053 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2420e9c4-faed-48f0-857d-4aba72c5cab2-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "2420e9c4-faed-48f0-857d-4aba72c5cab2" (UID: "2420e9c4-faed-48f0-857d-4aba72c5cab2"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:38:26 crc kubenswrapper[4861]: I0309 09:38:26.797188 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2420e9c4-faed-48f0-857d-4aba72c5cab2-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "2420e9c4-faed-48f0-857d-4aba72c5cab2" (UID: "2420e9c4-faed-48f0-857d-4aba72c5cab2"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:38:26 crc kubenswrapper[4861]: I0309 09:38:26.800561 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2420e9c4-faed-48f0-857d-4aba72c5cab2-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "2420e9c4-faed-48f0-857d-4aba72c5cab2" (UID: "2420e9c4-faed-48f0-857d-4aba72c5cab2"). InnerVolumeSpecName "nova-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:38:26 crc kubenswrapper[4861]: I0309 09:38:26.801690 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2420e9c4-faed-48f0-857d-4aba72c5cab2-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "2420e9c4-faed-48f0-857d-4aba72c5cab2" (UID: "2420e9c4-faed-48f0-857d-4aba72c5cab2"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:38:26 crc kubenswrapper[4861]: I0309 09:38:26.801722 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2420e9c4-faed-48f0-857d-4aba72c5cab2-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "2420e9c4-faed-48f0-857d-4aba72c5cab2" (UID: "2420e9c4-faed-48f0-857d-4aba72c5cab2"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:38:26 crc kubenswrapper[4861]: I0309 09:38:26.807341 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2420e9c4-faed-48f0-857d-4aba72c5cab2-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "2420e9c4-faed-48f0-857d-4aba72c5cab2" (UID: "2420e9c4-faed-48f0-857d-4aba72c5cab2"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:38:26 crc kubenswrapper[4861]: I0309 09:38:26.807542 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2420e9c4-faed-48f0-857d-4aba72c5cab2-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "2420e9c4-faed-48f0-857d-4aba72c5cab2" (UID: "2420e9c4-faed-48f0-857d-4aba72c5cab2"). 
InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:38:26 crc kubenswrapper[4861]: I0309 09:38:26.841630 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2420e9c4-faed-48f0-857d-4aba72c5cab2-kube-api-access-g9fkd" (OuterVolumeSpecName: "kube-api-access-g9fkd") pod "2420e9c4-faed-48f0-857d-4aba72c5cab2" (UID: "2420e9c4-faed-48f0-857d-4aba72c5cab2"). InnerVolumeSpecName "kube-api-access-g9fkd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:38:26 crc kubenswrapper[4861]: I0309 09:38:26.841618 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2420e9c4-faed-48f0-857d-4aba72c5cab2-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "2420e9c4-faed-48f0-857d-4aba72c5cab2" (UID: "2420e9c4-faed-48f0-857d-4aba72c5cab2"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:38:26 crc kubenswrapper[4861]: I0309 09:38:26.841672 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2420e9c4-faed-48f0-857d-4aba72c5cab2-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "2420e9c4-faed-48f0-857d-4aba72c5cab2" (UID: "2420e9c4-faed-48f0-857d-4aba72c5cab2"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:38:26 crc kubenswrapper[4861]: I0309 09:38:26.841727 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2420e9c4-faed-48f0-857d-4aba72c5cab2-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "2420e9c4-faed-48f0-857d-4aba72c5cab2" (UID: "2420e9c4-faed-48f0-857d-4aba72c5cab2"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:38:26 crc kubenswrapper[4861]: I0309 09:38:26.842055 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2420e9c4-faed-48f0-857d-4aba72c5cab2-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "2420e9c4-faed-48f0-857d-4aba72c5cab2" (UID: "2420e9c4-faed-48f0-857d-4aba72c5cab2"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:38:26 crc kubenswrapper[4861]: I0309 09:38:26.870884 4861 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/2420e9c4-faed-48f0-857d-4aba72c5cab2-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 09 09:38:26 crc kubenswrapper[4861]: I0309 09:38:26.871138 4861 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/2420e9c4-faed-48f0-857d-4aba72c5cab2-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 09 09:38:26 crc kubenswrapper[4861]: I0309 09:38:26.871149 4861 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2420e9c4-faed-48f0-857d-4aba72c5cab2-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 09:38:26 crc kubenswrapper[4861]: I0309 09:38:26.871160 4861 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2420e9c4-faed-48f0-857d-4aba72c5cab2-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 09:38:26 crc kubenswrapper[4861]: I0309 09:38:26.871170 4861 reconciler_common.go:293] "Volume detached for 
volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2420e9c4-faed-48f0-857d-4aba72c5cab2-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 09:38:26 crc kubenswrapper[4861]: I0309 09:38:26.871178 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g9fkd\" (UniqueName: \"kubernetes.io/projected/2420e9c4-faed-48f0-857d-4aba72c5cab2-kube-api-access-g9fkd\") on node \"crc\" DevicePath \"\"" Mar 09 09:38:26 crc kubenswrapper[4861]: I0309 09:38:26.871186 4861 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/2420e9c4-faed-48f0-857d-4aba72c5cab2-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 09 09:38:26 crc kubenswrapper[4861]: I0309 09:38:26.871194 4861 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/2420e9c4-faed-48f0-857d-4aba72c5cab2-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 09 09:38:26 crc kubenswrapper[4861]: I0309 09:38:26.871203 4861 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2420e9c4-faed-48f0-857d-4aba72c5cab2-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 09:38:26 crc kubenswrapper[4861]: I0309 09:38:26.871211 4861 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2420e9c4-faed-48f0-857d-4aba72c5cab2-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 09:38:26 crc kubenswrapper[4861]: I0309 09:38:26.871220 4861 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2420e9c4-faed-48f0-857d-4aba72c5cab2-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 09:38:26 crc 
kubenswrapper[4861]: I0309 09:38:26.871229 4861 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2420e9c4-faed-48f0-857d-4aba72c5cab2-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 09:38:26 crc kubenswrapper[4861]: I0309 09:38:26.877134 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2420e9c4-faed-48f0-857d-4aba72c5cab2-inventory" (OuterVolumeSpecName: "inventory") pod "2420e9c4-faed-48f0-857d-4aba72c5cab2" (UID: "2420e9c4-faed-48f0-857d-4aba72c5cab2"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:38:26 crc kubenswrapper[4861]: I0309 09:38:26.892951 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2420e9c4-faed-48f0-857d-4aba72c5cab2-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "2420e9c4-faed-48f0-857d-4aba72c5cab2" (UID: "2420e9c4-faed-48f0-857d-4aba72c5cab2"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:38:26 crc kubenswrapper[4861]: I0309 09:38:26.972064 4861 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2420e9c4-faed-48f0-857d-4aba72c5cab2-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 09 09:38:26 crc kubenswrapper[4861]: I0309 09:38:26.972095 4861 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2420e9c4-faed-48f0-857d-4aba72c5cab2-inventory\") on node \"crc\" DevicePath \"\"" Mar 09 09:38:27 crc kubenswrapper[4861]: I0309 09:38:27.311453 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m4ds9" event={"ID":"2420e9c4-faed-48f0-857d-4aba72c5cab2","Type":"ContainerDied","Data":"11e8a983af6ea10a419499dcf14109f5a91cc19917f550f88b30848fa71cc589"} Mar 09 09:38:27 crc kubenswrapper[4861]: I0309 09:38:27.311491 4861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="11e8a983af6ea10a419499dcf14109f5a91cc19917f550f88b30848fa71cc589" Mar 09 09:38:27 crc kubenswrapper[4861]: I0309 09:38:27.311498 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-m4ds9" Mar 09 09:38:27 crc kubenswrapper[4861]: I0309 09:38:27.409226 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-9rkkl"] Mar 09 09:38:27 crc kubenswrapper[4861]: E0309 09:38:27.409690 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2420e9c4-faed-48f0-857d-4aba72c5cab2" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 09 09:38:27 crc kubenswrapper[4861]: I0309 09:38:27.409711 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="2420e9c4-faed-48f0-857d-4aba72c5cab2" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 09 09:38:27 crc kubenswrapper[4861]: E0309 09:38:27.409728 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eaaefb0a-991f-42b0-9474-43af65c61889" containerName="oc" Mar 09 09:38:27 crc kubenswrapper[4861]: I0309 09:38:27.409733 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="eaaefb0a-991f-42b0-9474-43af65c61889" containerName="oc" Mar 09 09:38:27 crc kubenswrapper[4861]: I0309 09:38:27.409923 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="2420e9c4-faed-48f0-857d-4aba72c5cab2" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 09 09:38:27 crc kubenswrapper[4861]: I0309 09:38:27.409953 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="eaaefb0a-991f-42b0-9474-43af65c61889" containerName="oc" Mar 09 09:38:27 crc kubenswrapper[4861]: I0309 09:38:27.410503 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9rkkl" Mar 09 09:38:27 crc kubenswrapper[4861]: I0309 09:38:27.415547 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Mar 09 09:38:27 crc kubenswrapper[4861]: I0309 09:38:27.415795 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 09 09:38:27 crc kubenswrapper[4861]: I0309 09:38:27.416032 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lkd5q" Mar 09 09:38:27 crc kubenswrapper[4861]: I0309 09:38:27.416278 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 09 09:38:27 crc kubenswrapper[4861]: I0309 09:38:27.418701 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 09 09:38:27 crc kubenswrapper[4861]: I0309 09:38:27.421170 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-9rkkl"] Mar 09 09:38:27 crc kubenswrapper[4861]: I0309 09:38:27.482902 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/47423b67-9acf-48b2-b8b5-d47b822ad425-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-9rkkl\" (UID: \"47423b67-9acf-48b2-b8b5-d47b822ad425\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9rkkl" Mar 09 09:38:27 crc kubenswrapper[4861]: I0309 09:38:27.482973 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfzxk\" (UniqueName: \"kubernetes.io/projected/47423b67-9acf-48b2-b8b5-d47b822ad425-kube-api-access-nfzxk\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-9rkkl\" (UID: \"47423b67-9acf-48b2-b8b5-d47b822ad425\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9rkkl" Mar 09 09:38:27 crc kubenswrapper[4861]: I0309 09:38:27.483084 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/47423b67-9acf-48b2-b8b5-d47b822ad425-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-9rkkl\" (UID: \"47423b67-9acf-48b2-b8b5-d47b822ad425\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9rkkl" Mar 09 09:38:27 crc kubenswrapper[4861]: I0309 09:38:27.483147 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/47423b67-9acf-48b2-b8b5-d47b822ad425-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-9rkkl\" (UID: \"47423b67-9acf-48b2-b8b5-d47b822ad425\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9rkkl" Mar 09 09:38:27 crc kubenswrapper[4861]: I0309 09:38:27.483222 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47423b67-9acf-48b2-b8b5-d47b822ad425-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-9rkkl\" (UID: \"47423b67-9acf-48b2-b8b5-d47b822ad425\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9rkkl" Mar 09 09:38:27 crc kubenswrapper[4861]: I0309 09:38:27.584003 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47423b67-9acf-48b2-b8b5-d47b822ad425-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-9rkkl\" (UID: \"47423b67-9acf-48b2-b8b5-d47b822ad425\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9rkkl" Mar 09 09:38:27 crc kubenswrapper[4861]: I0309 09:38:27.584073 4861 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/47423b67-9acf-48b2-b8b5-d47b822ad425-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-9rkkl\" (UID: \"47423b67-9acf-48b2-b8b5-d47b822ad425\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9rkkl" Mar 09 09:38:27 crc kubenswrapper[4861]: I0309 09:38:27.584138 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nfzxk\" (UniqueName: \"kubernetes.io/projected/47423b67-9acf-48b2-b8b5-d47b822ad425-kube-api-access-nfzxk\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-9rkkl\" (UID: \"47423b67-9acf-48b2-b8b5-d47b822ad425\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9rkkl" Mar 09 09:38:27 crc kubenswrapper[4861]: I0309 09:38:27.584255 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/47423b67-9acf-48b2-b8b5-d47b822ad425-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-9rkkl\" (UID: \"47423b67-9acf-48b2-b8b5-d47b822ad425\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9rkkl" Mar 09 09:38:27 crc kubenswrapper[4861]: I0309 09:38:27.584323 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/47423b67-9acf-48b2-b8b5-d47b822ad425-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-9rkkl\" (UID: \"47423b67-9acf-48b2-b8b5-d47b822ad425\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9rkkl" Mar 09 09:38:27 crc kubenswrapper[4861]: I0309 09:38:27.585187 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/47423b67-9acf-48b2-b8b5-d47b822ad425-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-9rkkl\" (UID: 
\"47423b67-9acf-48b2-b8b5-d47b822ad425\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9rkkl" Mar 09 09:38:27 crc kubenswrapper[4861]: I0309 09:38:27.588663 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/47423b67-9acf-48b2-b8b5-d47b822ad425-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-9rkkl\" (UID: \"47423b67-9acf-48b2-b8b5-d47b822ad425\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9rkkl" Mar 09 09:38:27 crc kubenswrapper[4861]: I0309 09:38:27.589404 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/47423b67-9acf-48b2-b8b5-d47b822ad425-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-9rkkl\" (UID: \"47423b67-9acf-48b2-b8b5-d47b822ad425\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9rkkl" Mar 09 09:38:27 crc kubenswrapper[4861]: I0309 09:38:27.590207 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47423b67-9acf-48b2-b8b5-d47b822ad425-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-9rkkl\" (UID: \"47423b67-9acf-48b2-b8b5-d47b822ad425\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9rkkl" Mar 09 09:38:27 crc kubenswrapper[4861]: I0309 09:38:27.600070 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfzxk\" (UniqueName: \"kubernetes.io/projected/47423b67-9acf-48b2-b8b5-d47b822ad425-kube-api-access-nfzxk\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-9rkkl\" (UID: \"47423b67-9acf-48b2-b8b5-d47b822ad425\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9rkkl" Mar 09 09:38:27 crc kubenswrapper[4861]: I0309 09:38:27.727678 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9rkkl" Mar 09 09:38:28 crc kubenswrapper[4861]: I0309 09:38:28.241169 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-9rkkl"] Mar 09 09:38:28 crc kubenswrapper[4861]: I0309 09:38:28.323424 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9rkkl" event={"ID":"47423b67-9acf-48b2-b8b5-d47b822ad425","Type":"ContainerStarted","Data":"791672ce13fbdfcb3b1452a8e36ec8e3cb73b3f927e9fe8cd02e8e0e7b3e334e"} Mar 09 09:38:29 crc kubenswrapper[4861]: I0309 09:38:29.332702 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9rkkl" event={"ID":"47423b67-9acf-48b2-b8b5-d47b822ad425","Type":"ContainerStarted","Data":"71b14997157d3ad27da29b770e33454f94b3840defca276fa6d3dc186b05d235"} Mar 09 09:38:29 crc kubenswrapper[4861]: I0309 09:38:29.363653 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9rkkl" podStartSLOduration=1.932468007 podStartE2EDuration="2.363633736s" podCreationTimestamp="2026-03-09 09:38:27 +0000 UTC" firstStartedPulling="2026-03-09 09:38:28.244398767 +0000 UTC m=+1951.329438168" lastFinishedPulling="2026-03-09 09:38:28.675564486 +0000 UTC m=+1951.760603897" observedRunningTime="2026-03-09 09:38:29.354201706 +0000 UTC m=+1952.439241107" watchObservedRunningTime="2026-03-09 09:38:29.363633736 +0000 UTC m=+1952.448673137" Mar 09 09:39:29 crc kubenswrapper[4861]: I0309 09:39:29.859683 4861 generic.go:334] "Generic (PLEG): container finished" podID="47423b67-9acf-48b2-b8b5-d47b822ad425" containerID="71b14997157d3ad27da29b770e33454f94b3840defca276fa6d3dc186b05d235" exitCode=0 Mar 09 09:39:29 crc kubenswrapper[4861]: I0309 09:39:29.859755 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9rkkl" event={"ID":"47423b67-9acf-48b2-b8b5-d47b822ad425","Type":"ContainerDied","Data":"71b14997157d3ad27da29b770e33454f94b3840defca276fa6d3dc186b05d235"} Mar 09 09:39:31 crc kubenswrapper[4861]: I0309 09:39:31.294744 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9rkkl" Mar 09 09:39:31 crc kubenswrapper[4861]: I0309 09:39:31.386110 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47423b67-9acf-48b2-b8b5-d47b822ad425-ovn-combined-ca-bundle\") pod \"47423b67-9acf-48b2-b8b5-d47b822ad425\" (UID: \"47423b67-9acf-48b2-b8b5-d47b822ad425\") " Mar 09 09:39:31 crc kubenswrapper[4861]: I0309 09:39:31.386290 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/47423b67-9acf-48b2-b8b5-d47b822ad425-ssh-key-openstack-edpm-ipam\") pod \"47423b67-9acf-48b2-b8b5-d47b822ad425\" (UID: \"47423b67-9acf-48b2-b8b5-d47b822ad425\") " Mar 09 09:39:31 crc kubenswrapper[4861]: I0309 09:39:31.386390 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nfzxk\" (UniqueName: \"kubernetes.io/projected/47423b67-9acf-48b2-b8b5-d47b822ad425-kube-api-access-nfzxk\") pod \"47423b67-9acf-48b2-b8b5-d47b822ad425\" (UID: \"47423b67-9acf-48b2-b8b5-d47b822ad425\") " Mar 09 09:39:31 crc kubenswrapper[4861]: I0309 09:39:31.386425 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/47423b67-9acf-48b2-b8b5-d47b822ad425-ovncontroller-config-0\") pod \"47423b67-9acf-48b2-b8b5-d47b822ad425\" (UID: \"47423b67-9acf-48b2-b8b5-d47b822ad425\") " Mar 09 09:39:31 crc kubenswrapper[4861]: I0309 09:39:31.387354 4861 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/47423b67-9acf-48b2-b8b5-d47b822ad425-inventory\") pod \"47423b67-9acf-48b2-b8b5-d47b822ad425\" (UID: \"47423b67-9acf-48b2-b8b5-d47b822ad425\") " Mar 09 09:39:31 crc kubenswrapper[4861]: I0309 09:39:31.391299 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47423b67-9acf-48b2-b8b5-d47b822ad425-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "47423b67-9acf-48b2-b8b5-d47b822ad425" (UID: "47423b67-9acf-48b2-b8b5-d47b822ad425"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:39:31 crc kubenswrapper[4861]: I0309 09:39:31.392708 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47423b67-9acf-48b2-b8b5-d47b822ad425-kube-api-access-nfzxk" (OuterVolumeSpecName: "kube-api-access-nfzxk") pod "47423b67-9acf-48b2-b8b5-d47b822ad425" (UID: "47423b67-9acf-48b2-b8b5-d47b822ad425"). InnerVolumeSpecName "kube-api-access-nfzxk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:39:31 crc kubenswrapper[4861]: I0309 09:39:31.418396 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47423b67-9acf-48b2-b8b5-d47b822ad425-inventory" (OuterVolumeSpecName: "inventory") pod "47423b67-9acf-48b2-b8b5-d47b822ad425" (UID: "47423b67-9acf-48b2-b8b5-d47b822ad425"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:39:31 crc kubenswrapper[4861]: I0309 09:39:31.425531 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/47423b67-9acf-48b2-b8b5-d47b822ad425-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "47423b67-9acf-48b2-b8b5-d47b822ad425" (UID: "47423b67-9acf-48b2-b8b5-d47b822ad425"). 
InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:39:31 crc kubenswrapper[4861]: I0309 09:39:31.425545 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47423b67-9acf-48b2-b8b5-d47b822ad425-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "47423b67-9acf-48b2-b8b5-d47b822ad425" (UID: "47423b67-9acf-48b2-b8b5-d47b822ad425"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:39:31 crc kubenswrapper[4861]: I0309 09:39:31.489915 4861 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/47423b67-9acf-48b2-b8b5-d47b822ad425-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 09 09:39:31 crc kubenswrapper[4861]: I0309 09:39:31.489962 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nfzxk\" (UniqueName: \"kubernetes.io/projected/47423b67-9acf-48b2-b8b5-d47b822ad425-kube-api-access-nfzxk\") on node \"crc\" DevicePath \"\"" Mar 09 09:39:31 crc kubenswrapper[4861]: I0309 09:39:31.489974 4861 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/47423b67-9acf-48b2-b8b5-d47b822ad425-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Mar 09 09:39:31 crc kubenswrapper[4861]: I0309 09:39:31.489988 4861 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/47423b67-9acf-48b2-b8b5-d47b822ad425-inventory\") on node \"crc\" DevicePath \"\"" Mar 09 09:39:31 crc kubenswrapper[4861]: I0309 09:39:31.490000 4861 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47423b67-9acf-48b2-b8b5-d47b822ad425-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 09:39:31 crc 
kubenswrapper[4861]: I0309 09:39:31.877816 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9rkkl" event={"ID":"47423b67-9acf-48b2-b8b5-d47b822ad425","Type":"ContainerDied","Data":"791672ce13fbdfcb3b1452a8e36ec8e3cb73b3f927e9fe8cd02e8e0e7b3e334e"} Mar 09 09:39:31 crc kubenswrapper[4861]: I0309 09:39:31.877860 4861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="791672ce13fbdfcb3b1452a8e36ec8e3cb73b3f927e9fe8cd02e8e0e7b3e334e" Mar 09 09:39:31 crc kubenswrapper[4861]: I0309 09:39:31.877862 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-9rkkl" Mar 09 09:39:31 crc kubenswrapper[4861]: I0309 09:39:31.966868 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d9zgg"] Mar 09 09:39:31 crc kubenswrapper[4861]: E0309 09:39:31.967474 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47423b67-9acf-48b2-b8b5-d47b822ad425" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 09 09:39:31 crc kubenswrapper[4861]: I0309 09:39:31.967499 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="47423b67-9acf-48b2-b8b5-d47b822ad425" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 09 09:39:31 crc kubenswrapper[4861]: I0309 09:39:31.967712 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="47423b67-9acf-48b2-b8b5-d47b822ad425" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 09 09:39:31 crc kubenswrapper[4861]: I0309 09:39:31.968517 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d9zgg" Mar 09 09:39:31 crc kubenswrapper[4861]: I0309 09:39:31.970506 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 09 09:39:31 crc kubenswrapper[4861]: I0309 09:39:31.970561 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Mar 09 09:39:31 crc kubenswrapper[4861]: I0309 09:39:31.970853 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 09 09:39:31 crc kubenswrapper[4861]: I0309 09:39:31.970957 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lkd5q" Mar 09 09:39:31 crc kubenswrapper[4861]: I0309 09:39:31.971121 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 09 09:39:31 crc kubenswrapper[4861]: I0309 09:39:31.971670 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Mar 09 09:39:31 crc kubenswrapper[4861]: I0309 09:39:31.977035 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d9zgg"] Mar 09 09:39:32 crc kubenswrapper[4861]: I0309 09:39:32.000305 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e525f15-c77e-4a1c-a161-4db82064bf70-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-d9zgg\" (UID: \"9e525f15-c77e-4a1c-a161-4db82064bf70\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d9zgg" Mar 09 09:39:32 crc kubenswrapper[4861]: I0309 09:39:32.000404 4861 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9e525f15-c77e-4a1c-a161-4db82064bf70-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-d9zgg\" (UID: \"9e525f15-c77e-4a1c-a161-4db82064bf70\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d9zgg" Mar 09 09:39:32 crc kubenswrapper[4861]: I0309 09:39:32.000441 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9g9wr\" (UniqueName: \"kubernetes.io/projected/9e525f15-c77e-4a1c-a161-4db82064bf70-kube-api-access-9g9wr\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-d9zgg\" (UID: \"9e525f15-c77e-4a1c-a161-4db82064bf70\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d9zgg" Mar 09 09:39:32 crc kubenswrapper[4861]: I0309 09:39:32.000479 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9e525f15-c77e-4a1c-a161-4db82064bf70-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-d9zgg\" (UID: \"9e525f15-c77e-4a1c-a161-4db82064bf70\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d9zgg" Mar 09 09:39:32 crc kubenswrapper[4861]: I0309 09:39:32.000796 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9e525f15-c77e-4a1c-a161-4db82064bf70-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-d9zgg\" (UID: \"9e525f15-c77e-4a1c-a161-4db82064bf70\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d9zgg" Mar 09 09:39:32 crc kubenswrapper[4861]: I0309 09:39:32.000958 4861 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9e525f15-c77e-4a1c-a161-4db82064bf70-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-d9zgg\" (UID: \"9e525f15-c77e-4a1c-a161-4db82064bf70\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d9zgg" Mar 09 09:39:32 crc kubenswrapper[4861]: I0309 09:39:32.103295 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9e525f15-c77e-4a1c-a161-4db82064bf70-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-d9zgg\" (UID: \"9e525f15-c77e-4a1c-a161-4db82064bf70\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d9zgg" Mar 09 09:39:32 crc kubenswrapper[4861]: I0309 09:39:32.103601 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9e525f15-c77e-4a1c-a161-4db82064bf70-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-d9zgg\" (UID: \"9e525f15-c77e-4a1c-a161-4db82064bf70\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d9zgg" Mar 09 09:39:32 crc kubenswrapper[4861]: I0309 09:39:32.103711 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e525f15-c77e-4a1c-a161-4db82064bf70-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-d9zgg\" (UID: \"9e525f15-c77e-4a1c-a161-4db82064bf70\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d9zgg" Mar 09 09:39:32 crc kubenswrapper[4861]: I0309 09:39:32.103766 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9e525f15-c77e-4a1c-a161-4db82064bf70-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-d9zgg\" (UID: \"9e525f15-c77e-4a1c-a161-4db82064bf70\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d9zgg" Mar 09 09:39:32 crc kubenswrapper[4861]: I0309 09:39:32.103823 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9g9wr\" (UniqueName: \"kubernetes.io/projected/9e525f15-c77e-4a1c-a161-4db82064bf70-kube-api-access-9g9wr\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-d9zgg\" (UID: \"9e525f15-c77e-4a1c-a161-4db82064bf70\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d9zgg" Mar 09 09:39:32 crc kubenswrapper[4861]: I0309 09:39:32.103880 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9e525f15-c77e-4a1c-a161-4db82064bf70-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-d9zgg\" (UID: \"9e525f15-c77e-4a1c-a161-4db82064bf70\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d9zgg" Mar 09 09:39:32 crc kubenswrapper[4861]: I0309 09:39:32.108973 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9e525f15-c77e-4a1c-a161-4db82064bf70-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-d9zgg\" (UID: \"9e525f15-c77e-4a1c-a161-4db82064bf70\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d9zgg" Mar 09 09:39:32 crc kubenswrapper[4861]: I0309 09:39:32.109006 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9e525f15-c77e-4a1c-a161-4db82064bf70-ssh-key-openstack-edpm-ipam\") pod 
\"neutron-metadata-edpm-deployment-openstack-edpm-ipam-d9zgg\" (UID: \"9e525f15-c77e-4a1c-a161-4db82064bf70\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d9zgg" Mar 09 09:39:32 crc kubenswrapper[4861]: I0309 09:39:32.117674 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9e525f15-c77e-4a1c-a161-4db82064bf70-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-d9zgg\" (UID: \"9e525f15-c77e-4a1c-a161-4db82064bf70\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d9zgg" Mar 09 09:39:32 crc kubenswrapper[4861]: I0309 09:39:32.118156 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e525f15-c77e-4a1c-a161-4db82064bf70-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-d9zgg\" (UID: \"9e525f15-c77e-4a1c-a161-4db82064bf70\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d9zgg" Mar 09 09:39:32 crc kubenswrapper[4861]: I0309 09:39:32.125142 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9e525f15-c77e-4a1c-a161-4db82064bf70-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-d9zgg\" (UID: \"9e525f15-c77e-4a1c-a161-4db82064bf70\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d9zgg" Mar 09 09:39:32 crc kubenswrapper[4861]: I0309 09:39:32.125788 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9g9wr\" (UniqueName: \"kubernetes.io/projected/9e525f15-c77e-4a1c-a161-4db82064bf70-kube-api-access-9g9wr\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-d9zgg\" (UID: \"9e525f15-c77e-4a1c-a161-4db82064bf70\") " 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d9zgg" Mar 09 09:39:32 crc kubenswrapper[4861]: I0309 09:39:32.283082 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d9zgg" Mar 09 09:39:32 crc kubenswrapper[4861]: W0309 09:39:32.903752 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9e525f15_c77e_4a1c_a161_4db82064bf70.slice/crio-753ee8c4ed321eeb7285bbab02764169154e7740bafaa3000e0d5231219f42f9 WatchSource:0}: Error finding container 753ee8c4ed321eeb7285bbab02764169154e7740bafaa3000e0d5231219f42f9: Status 404 returned error can't find the container with id 753ee8c4ed321eeb7285bbab02764169154e7740bafaa3000e0d5231219f42f9 Mar 09 09:39:32 crc kubenswrapper[4861]: I0309 09:39:32.905653 4861 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 09 09:39:32 crc kubenswrapper[4861]: I0309 09:39:32.906546 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d9zgg"] Mar 09 09:39:33 crc kubenswrapper[4861]: I0309 09:39:33.902809 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d9zgg" event={"ID":"9e525f15-c77e-4a1c-a161-4db82064bf70","Type":"ContainerStarted","Data":"753ee8c4ed321eeb7285bbab02764169154e7740bafaa3000e0d5231219f42f9"} Mar 09 09:39:34 crc kubenswrapper[4861]: I0309 09:39:34.912471 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d9zgg" event={"ID":"9e525f15-c77e-4a1c-a161-4db82064bf70","Type":"ContainerStarted","Data":"e48d4f898d2531d44bceb7d457aa754b59e07b72a921dd50fea17caae3a2cd1c"} Mar 09 09:39:34 crc kubenswrapper[4861]: I0309 09:39:34.930831 4861 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d9zgg" podStartSLOduration=3.156536733 podStartE2EDuration="3.930813376s" podCreationTimestamp="2026-03-09 09:39:31 +0000 UTC" firstStartedPulling="2026-03-09 09:39:32.905341546 +0000 UTC m=+2015.990380947" lastFinishedPulling="2026-03-09 09:39:33.679618189 +0000 UTC m=+2016.764657590" observedRunningTime="2026-03-09 09:39:34.927372581 +0000 UTC m=+2018.012411982" watchObservedRunningTime="2026-03-09 09:39:34.930813376 +0000 UTC m=+2018.015852787"
Mar 09 09:39:54 crc kubenswrapper[4861]: I0309 09:39:54.605925 4861 patch_prober.go:28] interesting pod/machine-config-daemon-5g7gc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 09 09:39:54 crc kubenswrapper[4861]: I0309 09:39:54.606523 4861 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 09 09:40:00 crc kubenswrapper[4861]: I0309 09:40:00.134957 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550820-zv5t2"]
Mar 09 09:40:00 crc kubenswrapper[4861]: I0309 09:40:00.138390 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550820-zv5t2"
Mar 09 09:40:00 crc kubenswrapper[4861]: I0309 09:40:00.144496 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 09 09:40:00 crc kubenswrapper[4861]: I0309 09:40:00.144555 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 09 09:40:00 crc kubenswrapper[4861]: I0309 09:40:00.144690 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k86l8"
Mar 09 09:40:00 crc kubenswrapper[4861]: I0309 09:40:00.149480 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550820-zv5t2"]
Mar 09 09:40:00 crc kubenswrapper[4861]: I0309 09:40:00.233136 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rpsj\" (UniqueName: \"kubernetes.io/projected/1574dfe2-f3f7-4f85-9a01-4b436cb68a9c-kube-api-access-7rpsj\") pod \"auto-csr-approver-29550820-zv5t2\" (UID: \"1574dfe2-f3f7-4f85-9a01-4b436cb68a9c\") " pod="openshift-infra/auto-csr-approver-29550820-zv5t2"
Mar 09 09:40:00 crc kubenswrapper[4861]: I0309 09:40:00.335091 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rpsj\" (UniqueName: \"kubernetes.io/projected/1574dfe2-f3f7-4f85-9a01-4b436cb68a9c-kube-api-access-7rpsj\") pod \"auto-csr-approver-29550820-zv5t2\" (UID: \"1574dfe2-f3f7-4f85-9a01-4b436cb68a9c\") " pod="openshift-infra/auto-csr-approver-29550820-zv5t2"
Mar 09 09:40:00 crc kubenswrapper[4861]: I0309 09:40:00.353909 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rpsj\" (UniqueName: \"kubernetes.io/projected/1574dfe2-f3f7-4f85-9a01-4b436cb68a9c-kube-api-access-7rpsj\") pod \"auto-csr-approver-29550820-zv5t2\" (UID: \"1574dfe2-f3f7-4f85-9a01-4b436cb68a9c\") " pod="openshift-infra/auto-csr-approver-29550820-zv5t2"
Mar 09 09:40:00 crc kubenswrapper[4861]: I0309 09:40:00.460151 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550820-zv5t2"
Mar 09 09:40:00 crc kubenswrapper[4861]: I0309 09:40:00.912552 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550820-zv5t2"]
Mar 09 09:40:00 crc kubenswrapper[4861]: W0309 09:40:00.917549 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1574dfe2_f3f7_4f85_9a01_4b436cb68a9c.slice/crio-9125280b726bed76875dde627af2daecb820a2f8beb943ccc9ee9682c7af5ad1 WatchSource:0}: Error finding container 9125280b726bed76875dde627af2daecb820a2f8beb943ccc9ee9682c7af5ad1: Status 404 returned error can't find the container with id 9125280b726bed76875dde627af2daecb820a2f8beb943ccc9ee9682c7af5ad1
Mar 09 09:40:01 crc kubenswrapper[4861]: I0309 09:40:01.146557 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550820-zv5t2" event={"ID":"1574dfe2-f3f7-4f85-9a01-4b436cb68a9c","Type":"ContainerStarted","Data":"9125280b726bed76875dde627af2daecb820a2f8beb943ccc9ee9682c7af5ad1"}
Mar 09 09:40:03 crc kubenswrapper[4861]: I0309 09:40:03.164665 4861 generic.go:334] "Generic (PLEG): container finished" podID="1574dfe2-f3f7-4f85-9a01-4b436cb68a9c" containerID="68ea9335b1b681a65de954bfe47de66d6b6b1f67d9d8220c96e492d1331065c6" exitCode=0
Mar 09 09:40:03 crc kubenswrapper[4861]: I0309 09:40:03.164859 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550820-zv5t2" event={"ID":"1574dfe2-f3f7-4f85-9a01-4b436cb68a9c","Type":"ContainerDied","Data":"68ea9335b1b681a65de954bfe47de66d6b6b1f67d9d8220c96e492d1331065c6"}
Mar 09 09:40:04 crc kubenswrapper[4861]: I0309 09:40:04.498638 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550820-zv5t2"
Mar 09 09:40:04 crc kubenswrapper[4861]: I0309 09:40:04.625630 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7rpsj\" (UniqueName: \"kubernetes.io/projected/1574dfe2-f3f7-4f85-9a01-4b436cb68a9c-kube-api-access-7rpsj\") pod \"1574dfe2-f3f7-4f85-9a01-4b436cb68a9c\" (UID: \"1574dfe2-f3f7-4f85-9a01-4b436cb68a9c\") "
Mar 09 09:40:04 crc kubenswrapper[4861]: I0309 09:40:04.632183 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1574dfe2-f3f7-4f85-9a01-4b436cb68a9c-kube-api-access-7rpsj" (OuterVolumeSpecName: "kube-api-access-7rpsj") pod "1574dfe2-f3f7-4f85-9a01-4b436cb68a9c" (UID: "1574dfe2-f3f7-4f85-9a01-4b436cb68a9c"). InnerVolumeSpecName "kube-api-access-7rpsj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:40:04 crc kubenswrapper[4861]: I0309 09:40:04.727690 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7rpsj\" (UniqueName: \"kubernetes.io/projected/1574dfe2-f3f7-4f85-9a01-4b436cb68a9c-kube-api-access-7rpsj\") on node \"crc\" DevicePath \"\""
Mar 09 09:40:05 crc kubenswrapper[4861]: I0309 09:40:05.186558 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550820-zv5t2" event={"ID":"1574dfe2-f3f7-4f85-9a01-4b436cb68a9c","Type":"ContainerDied","Data":"9125280b726bed76875dde627af2daecb820a2f8beb943ccc9ee9682c7af5ad1"}
Mar 09 09:40:05 crc kubenswrapper[4861]: I0309 09:40:05.186597 4861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9125280b726bed76875dde627af2daecb820a2f8beb943ccc9ee9682c7af5ad1"
Mar 09 09:40:05 crc kubenswrapper[4861]: I0309 09:40:05.186623 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550820-zv5t2"
Mar 09 09:40:05 crc kubenswrapper[4861]: I0309 09:40:05.562975 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550814-zp5g2"]
Mar 09 09:40:05 crc kubenswrapper[4861]: I0309 09:40:05.571468 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550814-zp5g2"]
Mar 09 09:40:05 crc kubenswrapper[4861]: I0309 09:40:05.668201 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2c0233c-02dd-4711-8633-a75b72a8fb19" path="/var/lib/kubelet/pods/e2c0233c-02dd-4711-8633-a75b72a8fb19/volumes"
Mar 09 09:40:13 crc kubenswrapper[4861]: I0309 09:40:13.468101 4861 scope.go:117] "RemoveContainer" containerID="694cf3cfb68f6b432415aaab5d6235d47fd0f6ad980609da705a1c42dfab4c46"
Mar 09 09:40:18 crc kubenswrapper[4861]: I0309 09:40:18.515003 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-qjh8b"]
Mar 09 09:40:18 crc kubenswrapper[4861]: E0309 09:40:18.516438 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1574dfe2-f3f7-4f85-9a01-4b436cb68a9c" containerName="oc"
Mar 09 09:40:18 crc kubenswrapper[4861]: I0309 09:40:18.516489 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="1574dfe2-f3f7-4f85-9a01-4b436cb68a9c" containerName="oc"
Mar 09 09:40:18 crc kubenswrapper[4861]: I0309 09:40:18.516949 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="1574dfe2-f3f7-4f85-9a01-4b436cb68a9c" containerName="oc"
Mar 09 09:40:18 crc kubenswrapper[4861]: I0309 09:40:18.519188 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qjh8b"
Mar 09 09:40:18 crc kubenswrapper[4861]: I0309 09:40:18.526748 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qjh8b"]
Mar 09 09:40:18 crc kubenswrapper[4861]: I0309 09:40:18.596316 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gsq4j\" (UniqueName: \"kubernetes.io/projected/d0523999-9c2d-4335-8c1e-249abc1099b9-kube-api-access-gsq4j\") pod \"community-operators-qjh8b\" (UID: \"d0523999-9c2d-4335-8c1e-249abc1099b9\") " pod="openshift-marketplace/community-operators-qjh8b"
Mar 09 09:40:18 crc kubenswrapper[4861]: I0309 09:40:18.596437 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0523999-9c2d-4335-8c1e-249abc1099b9-catalog-content\") pod \"community-operators-qjh8b\" (UID: \"d0523999-9c2d-4335-8c1e-249abc1099b9\") " pod="openshift-marketplace/community-operators-qjh8b"
Mar 09 09:40:18 crc kubenswrapper[4861]: I0309 09:40:18.596558 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0523999-9c2d-4335-8c1e-249abc1099b9-utilities\") pod \"community-operators-qjh8b\" (UID: \"d0523999-9c2d-4335-8c1e-249abc1099b9\") " pod="openshift-marketplace/community-operators-qjh8b"
Mar 09 09:40:18 crc kubenswrapper[4861]: I0309 09:40:18.697902 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gsq4j\" (UniqueName: \"kubernetes.io/projected/d0523999-9c2d-4335-8c1e-249abc1099b9-kube-api-access-gsq4j\") pod \"community-operators-qjh8b\" (UID: \"d0523999-9c2d-4335-8c1e-249abc1099b9\") " pod="openshift-marketplace/community-operators-qjh8b"
Mar 09 09:40:18 crc kubenswrapper[4861]: I0309 09:40:18.698046 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0523999-9c2d-4335-8c1e-249abc1099b9-catalog-content\") pod \"community-operators-qjh8b\" (UID: \"d0523999-9c2d-4335-8c1e-249abc1099b9\") " pod="openshift-marketplace/community-operators-qjh8b"
Mar 09 09:40:18 crc kubenswrapper[4861]: I0309 09:40:18.698112 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0523999-9c2d-4335-8c1e-249abc1099b9-utilities\") pod \"community-operators-qjh8b\" (UID: \"d0523999-9c2d-4335-8c1e-249abc1099b9\") " pod="openshift-marketplace/community-operators-qjh8b"
Mar 09 09:40:18 crc kubenswrapper[4861]: I0309 09:40:18.698663 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0523999-9c2d-4335-8c1e-249abc1099b9-utilities\") pod \"community-operators-qjh8b\" (UID: \"d0523999-9c2d-4335-8c1e-249abc1099b9\") " pod="openshift-marketplace/community-operators-qjh8b"
Mar 09 09:40:18 crc kubenswrapper[4861]: I0309 09:40:18.698668 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0523999-9c2d-4335-8c1e-249abc1099b9-catalog-content\") pod \"community-operators-qjh8b\" (UID: \"d0523999-9c2d-4335-8c1e-249abc1099b9\") " pod="openshift-marketplace/community-operators-qjh8b"
Mar 09 09:40:18 crc kubenswrapper[4861]: I0309 09:40:18.719523 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gsq4j\" (UniqueName: \"kubernetes.io/projected/d0523999-9c2d-4335-8c1e-249abc1099b9-kube-api-access-gsq4j\") pod \"community-operators-qjh8b\" (UID: \"d0523999-9c2d-4335-8c1e-249abc1099b9\") " pod="openshift-marketplace/community-operators-qjh8b"
Mar 09 09:40:18 crc kubenswrapper[4861]: I0309 09:40:18.882898 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qjh8b"
Mar 09 09:40:19 crc kubenswrapper[4861]: I0309 09:40:19.449207 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qjh8b"]
Mar 09 09:40:20 crc kubenswrapper[4861]: I0309 09:40:20.325974 4861 generic.go:334] "Generic (PLEG): container finished" podID="d0523999-9c2d-4335-8c1e-249abc1099b9" containerID="536a65f3794ddab69cd20b3ff11ae82e2e93cf196dede9489d1db44acd8c348f" exitCode=0
Mar 09 09:40:20 crc kubenswrapper[4861]: I0309 09:40:20.326338 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qjh8b" event={"ID":"d0523999-9c2d-4335-8c1e-249abc1099b9","Type":"ContainerDied","Data":"536a65f3794ddab69cd20b3ff11ae82e2e93cf196dede9489d1db44acd8c348f"}
Mar 09 09:40:20 crc kubenswrapper[4861]: I0309 09:40:20.326367 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qjh8b" event={"ID":"d0523999-9c2d-4335-8c1e-249abc1099b9","Type":"ContainerStarted","Data":"e9fa7b58ea53a76f5cf3afb21b30566280e43e7e5f11cfd1f96365846f1d0f59"}
Mar 09 09:40:22 crc kubenswrapper[4861]: I0309 09:40:22.309675 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-5khxx"]
Mar 09 09:40:22 crc kubenswrapper[4861]: I0309 09:40:22.311840 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5khxx"
Mar 09 09:40:22 crc kubenswrapper[4861]: I0309 09:40:22.321662 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5khxx"]
Mar 09 09:40:22 crc kubenswrapper[4861]: I0309 09:40:22.347296 4861 generic.go:334] "Generic (PLEG): container finished" podID="9e525f15-c77e-4a1c-a161-4db82064bf70" containerID="e48d4f898d2531d44bceb7d457aa754b59e07b72a921dd50fea17caae3a2cd1c" exitCode=0
Mar 09 09:40:22 crc kubenswrapper[4861]: I0309 09:40:22.347341 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d9zgg" event={"ID":"9e525f15-c77e-4a1c-a161-4db82064bf70","Type":"ContainerDied","Data":"e48d4f898d2531d44bceb7d457aa754b59e07b72a921dd50fea17caae3a2cd1c"}
Mar 09 09:40:22 crc kubenswrapper[4861]: I0309 09:40:22.371449 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8bec06ec-962e-4cef-81a7-f81708d72f2c-catalog-content\") pod \"certified-operators-5khxx\" (UID: \"8bec06ec-962e-4cef-81a7-f81708d72f2c\") " pod="openshift-marketplace/certified-operators-5khxx"
Mar 09 09:40:22 crc kubenswrapper[4861]: I0309 09:40:22.371490 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8z82\" (UniqueName: \"kubernetes.io/projected/8bec06ec-962e-4cef-81a7-f81708d72f2c-kube-api-access-k8z82\") pod \"certified-operators-5khxx\" (UID: \"8bec06ec-962e-4cef-81a7-f81708d72f2c\") " pod="openshift-marketplace/certified-operators-5khxx"
Mar 09 09:40:22 crc kubenswrapper[4861]: I0309 09:40:22.371836 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8bec06ec-962e-4cef-81a7-f81708d72f2c-utilities\") pod \"certified-operators-5khxx\" (UID: \"8bec06ec-962e-4cef-81a7-f81708d72f2c\") " pod="openshift-marketplace/certified-operators-5khxx"
Mar 09 09:40:22 crc kubenswrapper[4861]: I0309 09:40:22.475654 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8bec06ec-962e-4cef-81a7-f81708d72f2c-catalog-content\") pod \"certified-operators-5khxx\" (UID: \"8bec06ec-962e-4cef-81a7-f81708d72f2c\") " pod="openshift-marketplace/certified-operators-5khxx"
Mar 09 09:40:22 crc kubenswrapper[4861]: I0309 09:40:22.475728 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k8z82\" (UniqueName: \"kubernetes.io/projected/8bec06ec-962e-4cef-81a7-f81708d72f2c-kube-api-access-k8z82\") pod \"certified-operators-5khxx\" (UID: \"8bec06ec-962e-4cef-81a7-f81708d72f2c\") " pod="openshift-marketplace/certified-operators-5khxx"
Mar 09 09:40:22 crc kubenswrapper[4861]: I0309 09:40:22.475911 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8bec06ec-962e-4cef-81a7-f81708d72f2c-utilities\") pod \"certified-operators-5khxx\" (UID: \"8bec06ec-962e-4cef-81a7-f81708d72f2c\") " pod="openshift-marketplace/certified-operators-5khxx"
Mar 09 09:40:22 crc kubenswrapper[4861]: I0309 09:40:22.476148 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8bec06ec-962e-4cef-81a7-f81708d72f2c-catalog-content\") pod \"certified-operators-5khxx\" (UID: \"8bec06ec-962e-4cef-81a7-f81708d72f2c\") " pod="openshift-marketplace/certified-operators-5khxx"
Mar 09 09:40:22 crc kubenswrapper[4861]: I0309 09:40:22.476408 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8bec06ec-962e-4cef-81a7-f81708d72f2c-utilities\") pod \"certified-operators-5khxx\" (UID: \"8bec06ec-962e-4cef-81a7-f81708d72f2c\") " pod="openshift-marketplace/certified-operators-5khxx"
Mar 09 09:40:22 crc kubenswrapper[4861]: I0309 09:40:22.499106 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8z82\" (UniqueName: \"kubernetes.io/projected/8bec06ec-962e-4cef-81a7-f81708d72f2c-kube-api-access-k8z82\") pod \"certified-operators-5khxx\" (UID: \"8bec06ec-962e-4cef-81a7-f81708d72f2c\") " pod="openshift-marketplace/certified-operators-5khxx"
Mar 09 09:40:22 crc kubenswrapper[4861]: I0309 09:40:22.640534 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5khxx"
Mar 09 09:40:24 crc kubenswrapper[4861]: I0309 09:40:24.260444 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d9zgg"
Mar 09 09:40:24 crc kubenswrapper[4861]: I0309 09:40:24.363911 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d9zgg" event={"ID":"9e525f15-c77e-4a1c-a161-4db82064bf70","Type":"ContainerDied","Data":"753ee8c4ed321eeb7285bbab02764169154e7740bafaa3000e0d5231219f42f9"}
Mar 09 09:40:24 crc kubenswrapper[4861]: I0309 09:40:24.363950 4861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="753ee8c4ed321eeb7285bbab02764169154e7740bafaa3000e0d5231219f42f9"
Mar 09 09:40:24 crc kubenswrapper[4861]: I0309 09:40:24.364008 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-d9zgg"
Mar 09 09:40:24 crc kubenswrapper[4861]: I0309 09:40:24.406174 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9g9wr\" (UniqueName: \"kubernetes.io/projected/9e525f15-c77e-4a1c-a161-4db82064bf70-kube-api-access-9g9wr\") pod \"9e525f15-c77e-4a1c-a161-4db82064bf70\" (UID: \"9e525f15-c77e-4a1c-a161-4db82064bf70\") "
Mar 09 09:40:24 crc kubenswrapper[4861]: I0309 09:40:24.406274 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9e525f15-c77e-4a1c-a161-4db82064bf70-ssh-key-openstack-edpm-ipam\") pod \"9e525f15-c77e-4a1c-a161-4db82064bf70\" (UID: \"9e525f15-c77e-4a1c-a161-4db82064bf70\") "
Mar 09 09:40:24 crc kubenswrapper[4861]: I0309 09:40:24.406390 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9e525f15-c77e-4a1c-a161-4db82064bf70-inventory\") pod \"9e525f15-c77e-4a1c-a161-4db82064bf70\" (UID: \"9e525f15-c77e-4a1c-a161-4db82064bf70\") "
Mar 09 09:40:24 crc kubenswrapper[4861]: I0309 09:40:24.406431 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9e525f15-c77e-4a1c-a161-4db82064bf70-neutron-ovn-metadata-agent-neutron-config-0\") pod \"9e525f15-c77e-4a1c-a161-4db82064bf70\" (UID: \"9e525f15-c77e-4a1c-a161-4db82064bf70\") "
Mar 09 09:40:24 crc kubenswrapper[4861]: I0309 09:40:24.406467 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9e525f15-c77e-4a1c-a161-4db82064bf70-nova-metadata-neutron-config-0\") pod \"9e525f15-c77e-4a1c-a161-4db82064bf70\" (UID: \"9e525f15-c77e-4a1c-a161-4db82064bf70\") "
Mar 09 09:40:24 crc kubenswrapper[4861]: I0309 09:40:24.406742 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e525f15-c77e-4a1c-a161-4db82064bf70-neutron-metadata-combined-ca-bundle\") pod \"9e525f15-c77e-4a1c-a161-4db82064bf70\" (UID: \"9e525f15-c77e-4a1c-a161-4db82064bf70\") "
Mar 09 09:40:24 crc kubenswrapper[4861]: I0309 09:40:24.420566 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e525f15-c77e-4a1c-a161-4db82064bf70-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "9e525f15-c77e-4a1c-a161-4db82064bf70" (UID: "9e525f15-c77e-4a1c-a161-4db82064bf70"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:40:24 crc kubenswrapper[4861]: I0309 09:40:24.420616 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e525f15-c77e-4a1c-a161-4db82064bf70-kube-api-access-9g9wr" (OuterVolumeSpecName: "kube-api-access-9g9wr") pod "9e525f15-c77e-4a1c-a161-4db82064bf70" (UID: "9e525f15-c77e-4a1c-a161-4db82064bf70"). InnerVolumeSpecName "kube-api-access-9g9wr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:40:24 crc kubenswrapper[4861]: I0309 09:40:24.437196 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e525f15-c77e-4a1c-a161-4db82064bf70-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "9e525f15-c77e-4a1c-a161-4db82064bf70" (UID: "9e525f15-c77e-4a1c-a161-4db82064bf70"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:40:24 crc kubenswrapper[4861]: I0309 09:40:24.439306 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e525f15-c77e-4a1c-a161-4db82064bf70-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "9e525f15-c77e-4a1c-a161-4db82064bf70" (UID: "9e525f15-c77e-4a1c-a161-4db82064bf70"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:40:24 crc kubenswrapper[4861]: I0309 09:40:24.446360 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e525f15-c77e-4a1c-a161-4db82064bf70-inventory" (OuterVolumeSpecName: "inventory") pod "9e525f15-c77e-4a1c-a161-4db82064bf70" (UID: "9e525f15-c77e-4a1c-a161-4db82064bf70"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:40:24 crc kubenswrapper[4861]: I0309 09:40:24.447235 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e525f15-c77e-4a1c-a161-4db82064bf70-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "9e525f15-c77e-4a1c-a161-4db82064bf70" (UID: "9e525f15-c77e-4a1c-a161-4db82064bf70"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 09:40:24 crc kubenswrapper[4861]: I0309 09:40:24.509020 4861 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9e525f15-c77e-4a1c-a161-4db82064bf70-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 09 09:40:24 crc kubenswrapper[4861]: I0309 09:40:24.509076 4861 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9e525f15-c77e-4a1c-a161-4db82064bf70-inventory\") on node \"crc\" DevicePath \"\""
Mar 09 09:40:24 crc kubenswrapper[4861]: I0309 09:40:24.509092 4861 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9e525f15-c77e-4a1c-a161-4db82064bf70-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\""
Mar 09 09:40:24 crc kubenswrapper[4861]: I0309 09:40:24.509106 4861 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9e525f15-c77e-4a1c-a161-4db82064bf70-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\""
Mar 09 09:40:24 crc kubenswrapper[4861]: I0309 09:40:24.509119 4861 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e525f15-c77e-4a1c-a161-4db82064bf70-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 09 09:40:24 crc kubenswrapper[4861]: I0309 09:40:24.509146 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9g9wr\" (UniqueName: \"kubernetes.io/projected/9e525f15-c77e-4a1c-a161-4db82064bf70-kube-api-access-9g9wr\") on node \"crc\" DevicePath \"\""
Mar 09 09:40:24 crc kubenswrapper[4861]: I0309 09:40:24.605589 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ngps7"]
Mar 09 09:40:24 crc kubenswrapper[4861]: I0309 09:40:24.606067 4861 patch_prober.go:28] interesting pod/machine-config-daemon-5g7gc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 09 09:40:24 crc kubenswrapper[4861]: I0309 09:40:24.606118 4861 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 09 09:40:24 crc kubenswrapper[4861]: E0309 09:40:24.606201 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e525f15-c77e-4a1c-a161-4db82064bf70" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam"
Mar 09 09:40:24 crc kubenswrapper[4861]: I0309 09:40:24.606232 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e525f15-c77e-4a1c-a161-4db82064bf70" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam"
Mar 09 09:40:24 crc kubenswrapper[4861]: I0309 09:40:24.607535 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e525f15-c77e-4a1c-a161-4db82064bf70" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam"
Mar 09 09:40:24 crc kubenswrapper[4861]: I0309 09:40:24.608415 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ngps7"
Mar 09 09:40:24 crc kubenswrapper[4861]: I0309 09:40:24.611705 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret"
Mar 09 09:40:24 crc kubenswrapper[4861]: I0309 09:40:24.616547 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ngps7"]
Mar 09 09:40:24 crc kubenswrapper[4861]: I0309 09:40:24.655611 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5khxx"]
Mar 09 09:40:24 crc kubenswrapper[4861]: I0309 09:40:24.711851 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d783d4c7-dfa9-4783-a80c-2938d2a5841d-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ngps7\" (UID: \"d783d4c7-dfa9-4783-a80c-2938d2a5841d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ngps7"
Mar 09 09:40:24 crc kubenswrapper[4861]: I0309 09:40:24.712157 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/d783d4c7-dfa9-4783-a80c-2938d2a5841d-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ngps7\" (UID: \"d783d4c7-dfa9-4783-a80c-2938d2a5841d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ngps7"
Mar 09 09:40:24 crc kubenswrapper[4861]: I0309 09:40:24.712184 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d783d4c7-dfa9-4783-a80c-2938d2a5841d-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ngps7\" (UID: \"d783d4c7-dfa9-4783-a80c-2938d2a5841d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ngps7"
Mar 09 09:40:24 crc kubenswrapper[4861]: I0309 09:40:24.712205 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rgcs\" (UniqueName: \"kubernetes.io/projected/d783d4c7-dfa9-4783-a80c-2938d2a5841d-kube-api-access-9rgcs\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ngps7\" (UID: \"d783d4c7-dfa9-4783-a80c-2938d2a5841d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ngps7"
Mar 09 09:40:24 crc kubenswrapper[4861]: I0309 09:40:24.712239 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d783d4c7-dfa9-4783-a80c-2938d2a5841d-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ngps7\" (UID: \"d783d4c7-dfa9-4783-a80c-2938d2a5841d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ngps7"
Mar 09 09:40:24 crc kubenswrapper[4861]: I0309 09:40:24.814821 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d783d4c7-dfa9-4783-a80c-2938d2a5841d-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ngps7\" (UID: \"d783d4c7-dfa9-4783-a80c-2938d2a5841d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ngps7"
Mar 09 09:40:24 crc kubenswrapper[4861]: I0309 09:40:24.814928 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/d783d4c7-dfa9-4783-a80c-2938d2a5841d-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ngps7\" (UID: \"d783d4c7-dfa9-4783-a80c-2938d2a5841d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ngps7"
Mar 09 09:40:24 crc kubenswrapper[4861]: I0309 09:40:24.814952 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d783d4c7-dfa9-4783-a80c-2938d2a5841d-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ngps7\" (UID: \"d783d4c7-dfa9-4783-a80c-2938d2a5841d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ngps7"
Mar 09 09:40:24 crc kubenswrapper[4861]: I0309 09:40:24.814978 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rgcs\" (UniqueName: \"kubernetes.io/projected/d783d4c7-dfa9-4783-a80c-2938d2a5841d-kube-api-access-9rgcs\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ngps7\" (UID: \"d783d4c7-dfa9-4783-a80c-2938d2a5841d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ngps7"
Mar 09 09:40:24 crc kubenswrapper[4861]: I0309 09:40:24.815029 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d783d4c7-dfa9-4783-a80c-2938d2a5841d-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ngps7\" (UID: \"d783d4c7-dfa9-4783-a80c-2938d2a5841d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ngps7"
Mar 09 09:40:24 crc kubenswrapper[4861]: I0309 09:40:24.823347 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/d783d4c7-dfa9-4783-a80c-2938d2a5841d-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ngps7\" (UID: \"d783d4c7-dfa9-4783-a80c-2938d2a5841d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ngps7"
Mar 09 09:40:24 crc kubenswrapper[4861]: I0309 09:40:24.823568 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d783d4c7-dfa9-4783-a80c-2938d2a5841d-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ngps7\" (UID: \"d783d4c7-dfa9-4783-a80c-2938d2a5841d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ngps7"
Mar 09 09:40:24 crc kubenswrapper[4861]: I0309 09:40:24.823901 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d783d4c7-dfa9-4783-a80c-2938d2a5841d-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ngps7\" (UID: \"d783d4c7-dfa9-4783-a80c-2938d2a5841d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ngps7"
Mar 09 09:40:24 crc kubenswrapper[4861]: I0309 09:40:24.825295 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d783d4c7-dfa9-4783-a80c-2938d2a5841d-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ngps7\" (UID: \"d783d4c7-dfa9-4783-a80c-2938d2a5841d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ngps7"
Mar 09 09:40:24 crc kubenswrapper[4861]: I0309 09:40:24.839515 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rgcs\" (UniqueName: \"kubernetes.io/projected/d783d4c7-dfa9-4783-a80c-2938d2a5841d-kube-api-access-9rgcs\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-ngps7\" (UID: \"d783d4c7-dfa9-4783-a80c-2938d2a5841d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ngps7"
Mar 09 09:40:24 crc kubenswrapper[4861]: I0309 09:40:24.942241 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ngps7"
Mar 09 09:40:25 crc kubenswrapper[4861]: I0309 09:40:25.261995 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ngps7"]
Mar 09 09:40:25 crc kubenswrapper[4861]: W0309 09:40:25.264317 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd783d4c7_dfa9_4783_a80c_2938d2a5841d.slice/crio-ac1bdee8e2a19add7534a707c6b9e5ca0ab5c79294e98a258e2f07f58f7d647c WatchSource:0}: Error finding container ac1bdee8e2a19add7534a707c6b9e5ca0ab5c79294e98a258e2f07f58f7d647c: Status 404 returned error can't find the container with id ac1bdee8e2a19add7534a707c6b9e5ca0ab5c79294e98a258e2f07f58f7d647c
Mar 09 09:40:25 crc kubenswrapper[4861]: I0309 09:40:25.375022 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ngps7" event={"ID":"d783d4c7-dfa9-4783-a80c-2938d2a5841d","Type":"ContainerStarted","Data":"ac1bdee8e2a19add7534a707c6b9e5ca0ab5c79294e98a258e2f07f58f7d647c"}
Mar 09 09:40:25 crc kubenswrapper[4861]: I0309 09:40:25.376708 4861 generic.go:334] "Generic (PLEG): container finished" podID="8bec06ec-962e-4cef-81a7-f81708d72f2c" containerID="3212fd9ed720b02f660ce47cea4956b86b519429b1811813a404249791b1dcfc" exitCode=0
Mar 09 09:40:25 crc kubenswrapper[4861]: I0309 09:40:25.376773 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5khxx" event={"ID":"8bec06ec-962e-4cef-81a7-f81708d72f2c","Type":"ContainerDied","Data":"3212fd9ed720b02f660ce47cea4956b86b519429b1811813a404249791b1dcfc"}
Mar 09 09:40:25 crc kubenswrapper[4861]: I0309 09:40:25.376807 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5khxx" event={"ID":"8bec06ec-962e-4cef-81a7-f81708d72f2c","Type":"ContainerStarted","Data":"a80989d77f8438cbc93c9335e2cb3b960c911eff537d868639781e55f1a8c2e3"}
Mar 09 09:40:25 crc kubenswrapper[4861]: I0309 09:40:25.379585 4861 generic.go:334] "Generic (PLEG): container finished" podID="d0523999-9c2d-4335-8c1e-249abc1099b9" containerID="9c00705657843f8a6366840a6ac70eaac2b613ad1f8766ce1e063a422529d8d5" exitCode=0
Mar 09 09:40:25 crc kubenswrapper[4861]: I0309 09:40:25.379639 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qjh8b" event={"ID":"d0523999-9c2d-4335-8c1e-249abc1099b9","Type":"ContainerDied","Data":"9c00705657843f8a6366840a6ac70eaac2b613ad1f8766ce1e063a422529d8d5"}
Mar 09 09:40:26 crc kubenswrapper[4861]: I0309 09:40:26.402920 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ngps7" event={"ID":"d783d4c7-dfa9-4783-a80c-2938d2a5841d","Type":"ContainerStarted","Data":"6d420dd64bb340980c9305c084a933532f6376a5f09a7aa911792aa8c9aec1a4"}
Mar 09 09:40:27 crc kubenswrapper[4861]: I0309 09:40:27.413208 4861 generic.go:334] "Generic (PLEG): container finished" podID="8bec06ec-962e-4cef-81a7-f81708d72f2c" containerID="8a9d4d6d9a83c4b4adca33da71daca39fc9ed3c45524a1171d03b2373445a024" exitCode=0
Mar 09 09:40:27 crc kubenswrapper[4861]: I0309 09:40:27.413278 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5khxx" event={"ID":"8bec06ec-962e-4cef-81a7-f81708d72f2c","Type":"ContainerDied","Data":"8a9d4d6d9a83c4b4adca33da71daca39fc9ed3c45524a1171d03b2373445a024"}
Mar 09 09:40:27 crc kubenswrapper[4861]: I0309 09:40:27.416510 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qjh8b" event={"ID":"d0523999-9c2d-4335-8c1e-249abc1099b9","Type":"ContainerStarted","Data":"0df84067cc98849729fc39e4d221ab3a8f3f17455530f3286215af520cb110f7"}
Mar 09 09:40:27 crc
kubenswrapper[4861]: I0309 09:40:27.444958 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ngps7" podStartSLOduration=2.87924298 podStartE2EDuration="3.444937927s" podCreationTimestamp="2026-03-09 09:40:24 +0000 UTC" firstStartedPulling="2026-03-09 09:40:25.267769514 +0000 UTC m=+2068.352808915" lastFinishedPulling="2026-03-09 09:40:25.833464461 +0000 UTC m=+2068.918503862" observedRunningTime="2026-03-09 09:40:26.424331255 +0000 UTC m=+2069.509370736" watchObservedRunningTime="2026-03-09 09:40:27.444937927 +0000 UTC m=+2070.529977328" Mar 09 09:40:27 crc kubenswrapper[4861]: I0309 09:40:27.461261 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-qjh8b" podStartSLOduration=3.339307219 podStartE2EDuration="9.461238888s" podCreationTimestamp="2026-03-09 09:40:18 +0000 UTC" firstStartedPulling="2026-03-09 09:40:20.32970747 +0000 UTC m=+2063.414746871" lastFinishedPulling="2026-03-09 09:40:26.451639129 +0000 UTC m=+2069.536678540" observedRunningTime="2026-03-09 09:40:27.459970202 +0000 UTC m=+2070.545009603" watchObservedRunningTime="2026-03-09 09:40:27.461238888 +0000 UTC m=+2070.546278289" Mar 09 09:40:28 crc kubenswrapper[4861]: I0309 09:40:28.427645 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5khxx" event={"ID":"8bec06ec-962e-4cef-81a7-f81708d72f2c","Type":"ContainerStarted","Data":"df2d435d0a51d419ddd04b1722e92c3ce0bcb0e968bc95163c20ff65d29fbc17"} Mar 09 09:40:28 crc kubenswrapper[4861]: I0309 09:40:28.448746 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-5khxx" podStartSLOduration=3.98284241 podStartE2EDuration="6.448726094s" podCreationTimestamp="2026-03-09 09:40:22 +0000 UTC" firstStartedPulling="2026-03-09 09:40:25.378948317 +0000 UTC m=+2068.463987718" 
lastFinishedPulling="2026-03-09 09:40:27.844832001 +0000 UTC m=+2070.929871402" observedRunningTime="2026-03-09 09:40:28.445600948 +0000 UTC m=+2071.530640349" watchObservedRunningTime="2026-03-09 09:40:28.448726094 +0000 UTC m=+2071.533765515" Mar 09 09:40:28 crc kubenswrapper[4861]: I0309 09:40:28.884082 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-qjh8b" Mar 09 09:40:28 crc kubenswrapper[4861]: I0309 09:40:28.884421 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-qjh8b" Mar 09 09:40:28 crc kubenswrapper[4861]: I0309 09:40:28.931814 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-qjh8b" Mar 09 09:40:32 crc kubenswrapper[4861]: I0309 09:40:32.641250 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-5khxx" Mar 09 09:40:32 crc kubenswrapper[4861]: I0309 09:40:32.641631 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-5khxx" Mar 09 09:40:32 crc kubenswrapper[4861]: I0309 09:40:32.700397 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-5khxx" Mar 09 09:40:33 crc kubenswrapper[4861]: I0309 09:40:33.508658 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-5khxx" Mar 09 09:40:34 crc kubenswrapper[4861]: I0309 09:40:34.489802 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5khxx"] Mar 09 09:40:35 crc kubenswrapper[4861]: I0309 09:40:35.489499 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-5khxx" podUID="8bec06ec-962e-4cef-81a7-f81708d72f2c" 
containerName="registry-server" containerID="cri-o://df2d435d0a51d419ddd04b1722e92c3ce0bcb0e968bc95163c20ff65d29fbc17" gracePeriod=2 Mar 09 09:40:35 crc kubenswrapper[4861]: I0309 09:40:35.914329 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-srlcz"] Mar 09 09:40:35 crc kubenswrapper[4861]: I0309 09:40:35.917439 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-srlcz" Mar 09 09:40:35 crc kubenswrapper[4861]: I0309 09:40:35.928384 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-srlcz"] Mar 09 09:40:35 crc kubenswrapper[4861]: I0309 09:40:35.982639 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5khxx" Mar 09 09:40:36 crc kubenswrapper[4861]: I0309 09:40:36.086935 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8bec06ec-962e-4cef-81a7-f81708d72f2c-utilities\") pod \"8bec06ec-962e-4cef-81a7-f81708d72f2c\" (UID: \"8bec06ec-962e-4cef-81a7-f81708d72f2c\") " Mar 09 09:40:36 crc kubenswrapper[4861]: I0309 09:40:36.087791 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k8z82\" (UniqueName: \"kubernetes.io/projected/8bec06ec-962e-4cef-81a7-f81708d72f2c-kube-api-access-k8z82\") pod \"8bec06ec-962e-4cef-81a7-f81708d72f2c\" (UID: \"8bec06ec-962e-4cef-81a7-f81708d72f2c\") " Mar 09 09:40:36 crc kubenswrapper[4861]: I0309 09:40:36.088016 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8bec06ec-962e-4cef-81a7-f81708d72f2c-catalog-content\") pod \"8bec06ec-962e-4cef-81a7-f81708d72f2c\" (UID: \"8bec06ec-962e-4cef-81a7-f81708d72f2c\") " Mar 09 09:40:36 crc kubenswrapper[4861]: I0309 09:40:36.088101 4861 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8bec06ec-962e-4cef-81a7-f81708d72f2c-utilities" (OuterVolumeSpecName: "utilities") pod "8bec06ec-962e-4cef-81a7-f81708d72f2c" (UID: "8bec06ec-962e-4cef-81a7-f81708d72f2c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:40:36 crc kubenswrapper[4861]: I0309 09:40:36.088418 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21ff87a9-2741-4c62-896a-98af7860574e-utilities\") pod \"redhat-operators-srlcz\" (UID: \"21ff87a9-2741-4c62-896a-98af7860574e\") " pod="openshift-marketplace/redhat-operators-srlcz" Mar 09 09:40:36 crc kubenswrapper[4861]: I0309 09:40:36.088542 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5sw7\" (UniqueName: \"kubernetes.io/projected/21ff87a9-2741-4c62-896a-98af7860574e-kube-api-access-t5sw7\") pod \"redhat-operators-srlcz\" (UID: \"21ff87a9-2741-4c62-896a-98af7860574e\") " pod="openshift-marketplace/redhat-operators-srlcz" Mar 09 09:40:36 crc kubenswrapper[4861]: I0309 09:40:36.088821 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21ff87a9-2741-4c62-896a-98af7860574e-catalog-content\") pod \"redhat-operators-srlcz\" (UID: \"21ff87a9-2741-4c62-896a-98af7860574e\") " pod="openshift-marketplace/redhat-operators-srlcz" Mar 09 09:40:36 crc kubenswrapper[4861]: I0309 09:40:36.088939 4861 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8bec06ec-962e-4cef-81a7-f81708d72f2c-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 09:40:36 crc kubenswrapper[4861]: I0309 09:40:36.101873 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/8bec06ec-962e-4cef-81a7-f81708d72f2c-kube-api-access-k8z82" (OuterVolumeSpecName: "kube-api-access-k8z82") pod "8bec06ec-962e-4cef-81a7-f81708d72f2c" (UID: "8bec06ec-962e-4cef-81a7-f81708d72f2c"). InnerVolumeSpecName "kube-api-access-k8z82". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:40:36 crc kubenswrapper[4861]: I0309 09:40:36.144184 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8bec06ec-962e-4cef-81a7-f81708d72f2c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8bec06ec-962e-4cef-81a7-f81708d72f2c" (UID: "8bec06ec-962e-4cef-81a7-f81708d72f2c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:40:36 crc kubenswrapper[4861]: I0309 09:40:36.191020 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21ff87a9-2741-4c62-896a-98af7860574e-catalog-content\") pod \"redhat-operators-srlcz\" (UID: \"21ff87a9-2741-4c62-896a-98af7860574e\") " pod="openshift-marketplace/redhat-operators-srlcz" Mar 09 09:40:36 crc kubenswrapper[4861]: I0309 09:40:36.191077 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21ff87a9-2741-4c62-896a-98af7860574e-utilities\") pod \"redhat-operators-srlcz\" (UID: \"21ff87a9-2741-4c62-896a-98af7860574e\") " pod="openshift-marketplace/redhat-operators-srlcz" Mar 09 09:40:36 crc kubenswrapper[4861]: I0309 09:40:36.191573 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5sw7\" (UniqueName: \"kubernetes.io/projected/21ff87a9-2741-4c62-896a-98af7860574e-kube-api-access-t5sw7\") pod \"redhat-operators-srlcz\" (UID: \"21ff87a9-2741-4c62-896a-98af7860574e\") " pod="openshift-marketplace/redhat-operators-srlcz" Mar 09 09:40:36 crc kubenswrapper[4861]: I0309 
09:40:36.191519 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21ff87a9-2741-4c62-896a-98af7860574e-utilities\") pod \"redhat-operators-srlcz\" (UID: \"21ff87a9-2741-4c62-896a-98af7860574e\") " pod="openshift-marketplace/redhat-operators-srlcz" Mar 09 09:40:36 crc kubenswrapper[4861]: I0309 09:40:36.191661 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21ff87a9-2741-4c62-896a-98af7860574e-catalog-content\") pod \"redhat-operators-srlcz\" (UID: \"21ff87a9-2741-4c62-896a-98af7860574e\") " pod="openshift-marketplace/redhat-operators-srlcz" Mar 09 09:40:36 crc kubenswrapper[4861]: I0309 09:40:36.191837 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k8z82\" (UniqueName: \"kubernetes.io/projected/8bec06ec-962e-4cef-81a7-f81708d72f2c-kube-api-access-k8z82\") on node \"crc\" DevicePath \"\"" Mar 09 09:40:36 crc kubenswrapper[4861]: I0309 09:40:36.191854 4861 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8bec06ec-962e-4cef-81a7-f81708d72f2c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 09:40:36 crc kubenswrapper[4861]: I0309 09:40:36.210706 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5sw7\" (UniqueName: \"kubernetes.io/projected/21ff87a9-2741-4c62-896a-98af7860574e-kube-api-access-t5sw7\") pod \"redhat-operators-srlcz\" (UID: \"21ff87a9-2741-4c62-896a-98af7860574e\") " pod="openshift-marketplace/redhat-operators-srlcz" Mar 09 09:40:36 crc kubenswrapper[4861]: I0309 09:40:36.301824 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-srlcz" Mar 09 09:40:36 crc kubenswrapper[4861]: I0309 09:40:36.499711 4861 generic.go:334] "Generic (PLEG): container finished" podID="8bec06ec-962e-4cef-81a7-f81708d72f2c" containerID="df2d435d0a51d419ddd04b1722e92c3ce0bcb0e968bc95163c20ff65d29fbc17" exitCode=0 Mar 09 09:40:36 crc kubenswrapper[4861]: I0309 09:40:36.499952 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5khxx" event={"ID":"8bec06ec-962e-4cef-81a7-f81708d72f2c","Type":"ContainerDied","Data":"df2d435d0a51d419ddd04b1722e92c3ce0bcb0e968bc95163c20ff65d29fbc17"} Mar 09 09:40:36 crc kubenswrapper[4861]: I0309 09:40:36.500041 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5khxx" event={"ID":"8bec06ec-962e-4cef-81a7-f81708d72f2c","Type":"ContainerDied","Data":"a80989d77f8438cbc93c9335e2cb3b960c911eff537d868639781e55f1a8c2e3"} Mar 09 09:40:36 crc kubenswrapper[4861]: I0309 09:40:36.500063 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5khxx" Mar 09 09:40:36 crc kubenswrapper[4861]: I0309 09:40:36.500068 4861 scope.go:117] "RemoveContainer" containerID="df2d435d0a51d419ddd04b1722e92c3ce0bcb0e968bc95163c20ff65d29fbc17" Mar 09 09:40:36 crc kubenswrapper[4861]: I0309 09:40:36.536043 4861 scope.go:117] "RemoveContainer" containerID="8a9d4d6d9a83c4b4adca33da71daca39fc9ed3c45524a1171d03b2373445a024" Mar 09 09:40:36 crc kubenswrapper[4861]: I0309 09:40:36.544853 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5khxx"] Mar 09 09:40:36 crc kubenswrapper[4861]: I0309 09:40:36.550728 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-5khxx"] Mar 09 09:40:36 crc kubenswrapper[4861]: I0309 09:40:36.569537 4861 scope.go:117] "RemoveContainer" containerID="3212fd9ed720b02f660ce47cea4956b86b519429b1811813a404249791b1dcfc" Mar 09 09:40:36 crc kubenswrapper[4861]: I0309 09:40:36.593492 4861 scope.go:117] "RemoveContainer" containerID="df2d435d0a51d419ddd04b1722e92c3ce0bcb0e968bc95163c20ff65d29fbc17" Mar 09 09:40:36 crc kubenswrapper[4861]: E0309 09:40:36.594001 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df2d435d0a51d419ddd04b1722e92c3ce0bcb0e968bc95163c20ff65d29fbc17\": container with ID starting with df2d435d0a51d419ddd04b1722e92c3ce0bcb0e968bc95163c20ff65d29fbc17 not found: ID does not exist" containerID="df2d435d0a51d419ddd04b1722e92c3ce0bcb0e968bc95163c20ff65d29fbc17" Mar 09 09:40:36 crc kubenswrapper[4861]: I0309 09:40:36.594039 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df2d435d0a51d419ddd04b1722e92c3ce0bcb0e968bc95163c20ff65d29fbc17"} err="failed to get container status \"df2d435d0a51d419ddd04b1722e92c3ce0bcb0e968bc95163c20ff65d29fbc17\": rpc error: code = NotFound desc = could not find 
container \"df2d435d0a51d419ddd04b1722e92c3ce0bcb0e968bc95163c20ff65d29fbc17\": container with ID starting with df2d435d0a51d419ddd04b1722e92c3ce0bcb0e968bc95163c20ff65d29fbc17 not found: ID does not exist" Mar 09 09:40:36 crc kubenswrapper[4861]: I0309 09:40:36.594061 4861 scope.go:117] "RemoveContainer" containerID="8a9d4d6d9a83c4b4adca33da71daca39fc9ed3c45524a1171d03b2373445a024" Mar 09 09:40:36 crc kubenswrapper[4861]: E0309 09:40:36.594463 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a9d4d6d9a83c4b4adca33da71daca39fc9ed3c45524a1171d03b2373445a024\": container with ID starting with 8a9d4d6d9a83c4b4adca33da71daca39fc9ed3c45524a1171d03b2373445a024 not found: ID does not exist" containerID="8a9d4d6d9a83c4b4adca33da71daca39fc9ed3c45524a1171d03b2373445a024" Mar 09 09:40:36 crc kubenswrapper[4861]: I0309 09:40:36.594519 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a9d4d6d9a83c4b4adca33da71daca39fc9ed3c45524a1171d03b2373445a024"} err="failed to get container status \"8a9d4d6d9a83c4b4adca33da71daca39fc9ed3c45524a1171d03b2373445a024\": rpc error: code = NotFound desc = could not find container \"8a9d4d6d9a83c4b4adca33da71daca39fc9ed3c45524a1171d03b2373445a024\": container with ID starting with 8a9d4d6d9a83c4b4adca33da71daca39fc9ed3c45524a1171d03b2373445a024 not found: ID does not exist" Mar 09 09:40:36 crc kubenswrapper[4861]: I0309 09:40:36.594553 4861 scope.go:117] "RemoveContainer" containerID="3212fd9ed720b02f660ce47cea4956b86b519429b1811813a404249791b1dcfc" Mar 09 09:40:36 crc kubenswrapper[4861]: E0309 09:40:36.597784 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3212fd9ed720b02f660ce47cea4956b86b519429b1811813a404249791b1dcfc\": container with ID starting with 3212fd9ed720b02f660ce47cea4956b86b519429b1811813a404249791b1dcfc not found: ID does 
not exist" containerID="3212fd9ed720b02f660ce47cea4956b86b519429b1811813a404249791b1dcfc" Mar 09 09:40:36 crc kubenswrapper[4861]: I0309 09:40:36.597824 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3212fd9ed720b02f660ce47cea4956b86b519429b1811813a404249791b1dcfc"} err="failed to get container status \"3212fd9ed720b02f660ce47cea4956b86b519429b1811813a404249791b1dcfc\": rpc error: code = NotFound desc = could not find container \"3212fd9ed720b02f660ce47cea4956b86b519429b1811813a404249791b1dcfc\": container with ID starting with 3212fd9ed720b02f660ce47cea4956b86b519429b1811813a404249791b1dcfc not found: ID does not exist" Mar 09 09:40:36 crc kubenswrapper[4861]: I0309 09:40:36.766467 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-srlcz"] Mar 09 09:40:37 crc kubenswrapper[4861]: I0309 09:40:37.509688 4861 generic.go:334] "Generic (PLEG): container finished" podID="21ff87a9-2741-4c62-896a-98af7860574e" containerID="a5bdb5ca4e0ede20fe927c02a957af790a14c8bb15569e7048d758ac840f3efa" exitCode=0 Mar 09 09:40:37 crc kubenswrapper[4861]: I0309 09:40:37.509762 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-srlcz" event={"ID":"21ff87a9-2741-4c62-896a-98af7860574e","Type":"ContainerDied","Data":"a5bdb5ca4e0ede20fe927c02a957af790a14c8bb15569e7048d758ac840f3efa"} Mar 09 09:40:37 crc kubenswrapper[4861]: I0309 09:40:37.510091 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-srlcz" event={"ID":"21ff87a9-2741-4c62-896a-98af7860574e","Type":"ContainerStarted","Data":"0b07a69138e0d36b4852f8eb1d0259ef03c709a2597e4c0ef7c8dffe7f0fd0ba"} Mar 09 09:40:37 crc kubenswrapper[4861]: I0309 09:40:37.667592 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8bec06ec-962e-4cef-81a7-f81708d72f2c" path="/var/lib/kubelet/pods/8bec06ec-962e-4cef-81a7-f81708d72f2c/volumes" 
Mar 09 09:40:38 crc kubenswrapper[4861]: I0309 09:40:38.525730 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-srlcz" event={"ID":"21ff87a9-2741-4c62-896a-98af7860574e","Type":"ContainerStarted","Data":"8028aca1b326c2c2e184cd1035cf168eb80ba0fdb44c92a97fddb48aa022e711"} Mar 09 09:40:38 crc kubenswrapper[4861]: I0309 09:40:38.925635 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-qjh8b" Mar 09 09:40:40 crc kubenswrapper[4861]: I0309 09:40:40.546305 4861 generic.go:334] "Generic (PLEG): container finished" podID="21ff87a9-2741-4c62-896a-98af7860574e" containerID="8028aca1b326c2c2e184cd1035cf168eb80ba0fdb44c92a97fddb48aa022e711" exitCode=0 Mar 09 09:40:40 crc kubenswrapper[4861]: I0309 09:40:40.546394 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-srlcz" event={"ID":"21ff87a9-2741-4c62-896a-98af7860574e","Type":"ContainerDied","Data":"8028aca1b326c2c2e184cd1035cf168eb80ba0fdb44c92a97fddb48aa022e711"} Mar 09 09:40:41 crc kubenswrapper[4861]: I0309 09:40:41.929683 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qjh8b"] Mar 09 09:40:42 crc kubenswrapper[4861]: I0309 09:40:42.287691 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-c7tjq"] Mar 09 09:40:42 crc kubenswrapper[4861]: I0309 09:40:42.287972 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-c7tjq" podUID="28409ae5-743b-4e9a-a432-7527ad656038" containerName="registry-server" containerID="cri-o://2476df60bfe8edebf5fbd478fc535b995acada9081305854ca705fbd49d8e73b" gracePeriod=2 Mar 09 09:40:42 crc kubenswrapper[4861]: I0309 09:40:42.566534 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-srlcz" 
event={"ID":"21ff87a9-2741-4c62-896a-98af7860574e","Type":"ContainerStarted","Data":"2d2057a818d159f9c85f7627dec1702a4b86d258b8a411e80c11f434ddc39dbb"} Mar 09 09:40:42 crc kubenswrapper[4861]: I0309 09:40:42.570051 4861 generic.go:334] "Generic (PLEG): container finished" podID="28409ae5-743b-4e9a-a432-7527ad656038" containerID="2476df60bfe8edebf5fbd478fc535b995acada9081305854ca705fbd49d8e73b" exitCode=0 Mar 09 09:40:42 crc kubenswrapper[4861]: I0309 09:40:42.570095 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c7tjq" event={"ID":"28409ae5-743b-4e9a-a432-7527ad656038","Type":"ContainerDied","Data":"2476df60bfe8edebf5fbd478fc535b995acada9081305854ca705fbd49d8e73b"} Mar 09 09:40:42 crc kubenswrapper[4861]: I0309 09:40:42.588168 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-srlcz" podStartSLOduration=3.3301701169999998 podStartE2EDuration="7.588151792s" podCreationTimestamp="2026-03-09 09:40:35 +0000 UTC" firstStartedPulling="2026-03-09 09:40:37.513431961 +0000 UTC m=+2080.598471362" lastFinishedPulling="2026-03-09 09:40:41.771413646 +0000 UTC m=+2084.856453037" observedRunningTime="2026-03-09 09:40:42.584353087 +0000 UTC m=+2085.669392498" watchObservedRunningTime="2026-03-09 09:40:42.588151792 +0000 UTC m=+2085.673191193" Mar 09 09:40:43 crc kubenswrapper[4861]: I0309 09:40:43.456824 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-c7tjq" Mar 09 09:40:43 crc kubenswrapper[4861]: I0309 09:40:43.584019 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c7tjq" event={"ID":"28409ae5-743b-4e9a-a432-7527ad656038","Type":"ContainerDied","Data":"abab6ab16dedab94a71262e85f2e109e9dfc1f0909d09c1369d17858a96c60bf"} Mar 09 09:40:43 crc kubenswrapper[4861]: I0309 09:40:43.584070 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-c7tjq" Mar 09 09:40:43 crc kubenswrapper[4861]: I0309 09:40:43.584099 4861 scope.go:117] "RemoveContainer" containerID="2476df60bfe8edebf5fbd478fc535b995acada9081305854ca705fbd49d8e73b" Mar 09 09:40:43 crc kubenswrapper[4861]: I0309 09:40:43.615612 4861 scope.go:117] "RemoveContainer" containerID="8167fe13a58e1a39990999833cbb0a4dace9670a61d2856f419c2e3301e4f4b0" Mar 09 09:40:43 crc kubenswrapper[4861]: I0309 09:40:43.636188 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nsg94\" (UniqueName: \"kubernetes.io/projected/28409ae5-743b-4e9a-a432-7527ad656038-kube-api-access-nsg94\") pod \"28409ae5-743b-4e9a-a432-7527ad656038\" (UID: \"28409ae5-743b-4e9a-a432-7527ad656038\") " Mar 09 09:40:43 crc kubenswrapper[4861]: I0309 09:40:43.636410 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28409ae5-743b-4e9a-a432-7527ad656038-catalog-content\") pod \"28409ae5-743b-4e9a-a432-7527ad656038\" (UID: \"28409ae5-743b-4e9a-a432-7527ad656038\") " Mar 09 09:40:43 crc kubenswrapper[4861]: I0309 09:40:43.636596 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28409ae5-743b-4e9a-a432-7527ad656038-utilities\") pod \"28409ae5-743b-4e9a-a432-7527ad656038\" (UID: 
\"28409ae5-743b-4e9a-a432-7527ad656038\") " Mar 09 09:40:43 crc kubenswrapper[4861]: I0309 09:40:43.636943 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28409ae5-743b-4e9a-a432-7527ad656038-utilities" (OuterVolumeSpecName: "utilities") pod "28409ae5-743b-4e9a-a432-7527ad656038" (UID: "28409ae5-743b-4e9a-a432-7527ad656038"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:40:43 crc kubenswrapper[4861]: I0309 09:40:43.637474 4861 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28409ae5-743b-4e9a-a432-7527ad656038-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 09:40:43 crc kubenswrapper[4861]: I0309 09:40:43.644631 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28409ae5-743b-4e9a-a432-7527ad656038-kube-api-access-nsg94" (OuterVolumeSpecName: "kube-api-access-nsg94") pod "28409ae5-743b-4e9a-a432-7527ad656038" (UID: "28409ae5-743b-4e9a-a432-7527ad656038"). InnerVolumeSpecName "kube-api-access-nsg94". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:40:43 crc kubenswrapper[4861]: I0309 09:40:43.656305 4861 scope.go:117] "RemoveContainer" containerID="49201029376bd3de9273d910d2d3937761ed9b1ade2b44032a47f919695bde9a"
Mar 09 09:40:43 crc kubenswrapper[4861]: I0309 09:40:43.739723 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nsg94\" (UniqueName: \"kubernetes.io/projected/28409ae5-743b-4e9a-a432-7527ad656038-kube-api-access-nsg94\") on node \"crc\" DevicePath \"\""
Mar 09 09:40:43 crc kubenswrapper[4861]: I0309 09:40:43.792303 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28409ae5-743b-4e9a-a432-7527ad656038-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "28409ae5-743b-4e9a-a432-7527ad656038" (UID: "28409ae5-743b-4e9a-a432-7527ad656038"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 09:40:43 crc kubenswrapper[4861]: I0309 09:40:43.841557 4861 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28409ae5-743b-4e9a-a432-7527ad656038-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 09 09:40:43 crc kubenswrapper[4861]: I0309 09:40:43.918200 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-c7tjq"]
Mar 09 09:40:43 crc kubenswrapper[4861]: I0309 09:40:43.930112 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-c7tjq"]
Mar 09 09:40:45 crc kubenswrapper[4861]: I0309 09:40:45.704599 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28409ae5-743b-4e9a-a432-7527ad656038" path="/var/lib/kubelet/pods/28409ae5-743b-4e9a-a432-7527ad656038/volumes"
Mar 09 09:40:46 crc kubenswrapper[4861]: I0309 09:40:46.302699 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-srlcz"
Mar 09 09:40:46 crc kubenswrapper[4861]: I0309 09:40:46.302775 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-srlcz"
Mar 09 09:40:47 crc kubenswrapper[4861]: I0309 09:40:47.365229 4861 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-srlcz" podUID="21ff87a9-2741-4c62-896a-98af7860574e" containerName="registry-server" probeResult="failure" output=<
Mar 09 09:40:47 crc kubenswrapper[4861]: timeout: failed to connect service ":50051" within 1s
Mar 09 09:40:47 crc kubenswrapper[4861]: >
Mar 09 09:40:52 crc kubenswrapper[4861]: I0309 09:40:52.914635 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-dt9hl"]
Mar 09 09:40:52 crc kubenswrapper[4861]: E0309 09:40:52.915540 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bec06ec-962e-4cef-81a7-f81708d72f2c" containerName="extract-utilities"
Mar 09 09:40:52 crc kubenswrapper[4861]: I0309 09:40:52.915553 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bec06ec-962e-4cef-81a7-f81708d72f2c" containerName="extract-utilities"
Mar 09 09:40:52 crc kubenswrapper[4861]: E0309 09:40:52.915565 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28409ae5-743b-4e9a-a432-7527ad656038" containerName="extract-utilities"
Mar 09 09:40:52 crc kubenswrapper[4861]: I0309 09:40:52.915572 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="28409ae5-743b-4e9a-a432-7527ad656038" containerName="extract-utilities"
Mar 09 09:40:52 crc kubenswrapper[4861]: E0309 09:40:52.915592 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bec06ec-962e-4cef-81a7-f81708d72f2c" containerName="registry-server"
Mar 09 09:40:52 crc kubenswrapper[4861]: I0309 09:40:52.915600 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bec06ec-962e-4cef-81a7-f81708d72f2c" containerName="registry-server"
Mar 09 09:40:52 crc kubenswrapper[4861]: E0309 09:40:52.915622 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28409ae5-743b-4e9a-a432-7527ad656038" containerName="registry-server"
Mar 09 09:40:52 crc kubenswrapper[4861]: I0309 09:40:52.915629 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="28409ae5-743b-4e9a-a432-7527ad656038" containerName="registry-server"
Mar 09 09:40:52 crc kubenswrapper[4861]: E0309 09:40:52.915643 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bec06ec-962e-4cef-81a7-f81708d72f2c" containerName="extract-content"
Mar 09 09:40:52 crc kubenswrapper[4861]: I0309 09:40:52.915650 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bec06ec-962e-4cef-81a7-f81708d72f2c" containerName="extract-content"
Mar 09 09:40:52 crc kubenswrapper[4861]: E0309 09:40:52.915676 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28409ae5-743b-4e9a-a432-7527ad656038" containerName="extract-content"
Mar 09 09:40:52 crc kubenswrapper[4861]: I0309 09:40:52.915686 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="28409ae5-743b-4e9a-a432-7527ad656038" containerName="extract-content"
Mar 09 09:40:52 crc kubenswrapper[4861]: I0309 09:40:52.915889 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="28409ae5-743b-4e9a-a432-7527ad656038" containerName="registry-server"
Mar 09 09:40:52 crc kubenswrapper[4861]: I0309 09:40:52.915901 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="8bec06ec-962e-4cef-81a7-f81708d72f2c" containerName="registry-server"
Mar 09 09:40:52 crc kubenswrapper[4861]: I0309 09:40:52.918144 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dt9hl"
Mar 09 09:40:52 crc kubenswrapper[4861]: I0309 09:40:52.926834 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dt9hl"]
Mar 09 09:40:53 crc kubenswrapper[4861]: I0309 09:40:53.032945 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6f03f22-bc4c-481e-a32f-c435c7d2207a-utilities\") pod \"redhat-marketplace-dt9hl\" (UID: \"c6f03f22-bc4c-481e-a32f-c435c7d2207a\") " pod="openshift-marketplace/redhat-marketplace-dt9hl"
Mar 09 09:40:53 crc kubenswrapper[4861]: I0309 09:40:53.033025 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7pnj\" (UniqueName: \"kubernetes.io/projected/c6f03f22-bc4c-481e-a32f-c435c7d2207a-kube-api-access-d7pnj\") pod \"redhat-marketplace-dt9hl\" (UID: \"c6f03f22-bc4c-481e-a32f-c435c7d2207a\") " pod="openshift-marketplace/redhat-marketplace-dt9hl"
Mar 09 09:40:53 crc kubenswrapper[4861]: I0309 09:40:53.033149 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6f03f22-bc4c-481e-a32f-c435c7d2207a-catalog-content\") pod \"redhat-marketplace-dt9hl\" (UID: \"c6f03f22-bc4c-481e-a32f-c435c7d2207a\") " pod="openshift-marketplace/redhat-marketplace-dt9hl"
Mar 09 09:40:53 crc kubenswrapper[4861]: I0309 09:40:53.135304 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7pnj\" (UniqueName: \"kubernetes.io/projected/c6f03f22-bc4c-481e-a32f-c435c7d2207a-kube-api-access-d7pnj\") pod \"redhat-marketplace-dt9hl\" (UID: \"c6f03f22-bc4c-481e-a32f-c435c7d2207a\") " pod="openshift-marketplace/redhat-marketplace-dt9hl"
Mar 09 09:40:53 crc kubenswrapper[4861]: I0309 09:40:53.135480 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6f03f22-bc4c-481e-a32f-c435c7d2207a-catalog-content\") pod \"redhat-marketplace-dt9hl\" (UID: \"c6f03f22-bc4c-481e-a32f-c435c7d2207a\") " pod="openshift-marketplace/redhat-marketplace-dt9hl"
Mar 09 09:40:53 crc kubenswrapper[4861]: I0309 09:40:53.135589 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6f03f22-bc4c-481e-a32f-c435c7d2207a-utilities\") pod \"redhat-marketplace-dt9hl\" (UID: \"c6f03f22-bc4c-481e-a32f-c435c7d2207a\") " pod="openshift-marketplace/redhat-marketplace-dt9hl"
Mar 09 09:40:53 crc kubenswrapper[4861]: I0309 09:40:53.136104 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6f03f22-bc4c-481e-a32f-c435c7d2207a-catalog-content\") pod \"redhat-marketplace-dt9hl\" (UID: \"c6f03f22-bc4c-481e-a32f-c435c7d2207a\") " pod="openshift-marketplace/redhat-marketplace-dt9hl"
Mar 09 09:40:53 crc kubenswrapper[4861]: I0309 09:40:53.136134 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6f03f22-bc4c-481e-a32f-c435c7d2207a-utilities\") pod \"redhat-marketplace-dt9hl\" (UID: \"c6f03f22-bc4c-481e-a32f-c435c7d2207a\") " pod="openshift-marketplace/redhat-marketplace-dt9hl"
Mar 09 09:40:53 crc kubenswrapper[4861]: I0309 09:40:53.155446 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7pnj\" (UniqueName: \"kubernetes.io/projected/c6f03f22-bc4c-481e-a32f-c435c7d2207a-kube-api-access-d7pnj\") pod \"redhat-marketplace-dt9hl\" (UID: \"c6f03f22-bc4c-481e-a32f-c435c7d2207a\") " pod="openshift-marketplace/redhat-marketplace-dt9hl"
Mar 09 09:40:53 crc kubenswrapper[4861]: I0309 09:40:53.243301 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dt9hl"
Mar 09 09:40:53 crc kubenswrapper[4861]: I0309 09:40:53.735506 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dt9hl"]
Mar 09 09:40:54 crc kubenswrapper[4861]: I0309 09:40:54.605976 4861 patch_prober.go:28] interesting pod/machine-config-daemon-5g7gc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 09 09:40:54 crc kubenswrapper[4861]: I0309 09:40:54.606039 4861 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 09 09:40:54 crc kubenswrapper[4861]: I0309 09:40:54.606082 4861 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc"
Mar 09 09:40:54 crc kubenswrapper[4861]: I0309 09:40:54.606900 4861 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2cf1bc55664e082ce607bad22b0e501635384de314ffbc4f1270ecbbe7d97b60"} pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 09 09:40:54 crc kubenswrapper[4861]: I0309 09:40:54.606957 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" containerName="machine-config-daemon" containerID="cri-o://2cf1bc55664e082ce607bad22b0e501635384de314ffbc4f1270ecbbe7d97b60" gracePeriod=600
Mar 09 09:40:54 crc kubenswrapper[4861]: I0309 09:40:54.691158 4861 generic.go:334] "Generic (PLEG): container finished" podID="c6f03f22-bc4c-481e-a32f-c435c7d2207a" containerID="865ba68ef9012188272282089420e87a38ea22f8270e5c7efcd3956aa54e9938" exitCode=0
Mar 09 09:40:54 crc kubenswrapper[4861]: I0309 09:40:54.691212 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dt9hl" event={"ID":"c6f03f22-bc4c-481e-a32f-c435c7d2207a","Type":"ContainerDied","Data":"865ba68ef9012188272282089420e87a38ea22f8270e5c7efcd3956aa54e9938"}
Mar 09 09:40:54 crc kubenswrapper[4861]: I0309 09:40:54.691243 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dt9hl" event={"ID":"c6f03f22-bc4c-481e-a32f-c435c7d2207a","Type":"ContainerStarted","Data":"8a8634bf355436eef43d56cb418faeb92f699d2691e66d92060fa6d0797f7cf4"}
Mar 09 09:40:55 crc kubenswrapper[4861]: I0309 09:40:55.701789 4861 generic.go:334] "Generic (PLEG): container finished" podID="c6f03f22-bc4c-481e-a32f-c435c7d2207a" containerID="778c9455d2136a198f151f68b56a1f72bca583d45797dba3b5eb605ffe1650dd" exitCode=0
Mar 09 09:40:55 crc kubenswrapper[4861]: I0309 09:40:55.701852 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dt9hl" event={"ID":"c6f03f22-bc4c-481e-a32f-c435c7d2207a","Type":"ContainerDied","Data":"778c9455d2136a198f151f68b56a1f72bca583d45797dba3b5eb605ffe1650dd"}
Mar 09 09:40:55 crc kubenswrapper[4861]: I0309 09:40:55.707871 4861 generic.go:334] "Generic (PLEG): container finished" podID="6f7875e3-174f-4c67-8675-d878de74aa4f" containerID="2cf1bc55664e082ce607bad22b0e501635384de314ffbc4f1270ecbbe7d97b60" exitCode=0
Mar 09 09:40:55 crc kubenswrapper[4861]: I0309 09:40:55.707905 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" event={"ID":"6f7875e3-174f-4c67-8675-d878de74aa4f","Type":"ContainerDied","Data":"2cf1bc55664e082ce607bad22b0e501635384de314ffbc4f1270ecbbe7d97b60"}
Mar 09 09:40:55 crc kubenswrapper[4861]: I0309 09:40:55.707932 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" event={"ID":"6f7875e3-174f-4c67-8675-d878de74aa4f","Type":"ContainerStarted","Data":"480298580e8917b47d66a4586a753752f58dc7e4c2678e95158c475a3b504483"}
Mar 09 09:40:55 crc kubenswrapper[4861]: I0309 09:40:55.707947 4861 scope.go:117] "RemoveContainer" containerID="c0726d3ac822004eacb4f8d12bb4cbaf2815fc9d29aaa8ba7db9d4fae1717ee1"
Mar 09 09:40:56 crc kubenswrapper[4861]: I0309 09:40:56.352005 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-srlcz"
Mar 09 09:40:56 crc kubenswrapper[4861]: I0309 09:40:56.410432 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-srlcz"
Mar 09 09:40:56 crc kubenswrapper[4861]: I0309 09:40:56.719702 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dt9hl" event={"ID":"c6f03f22-bc4c-481e-a32f-c435c7d2207a","Type":"ContainerStarted","Data":"21965a6f35a4f734ad156343019ebc131c8274c8404088647e24c5946082da18"}
Mar 09 09:40:56 crc kubenswrapper[4861]: I0309 09:40:56.742835 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-dt9hl" podStartSLOduration=3.323797405 podStartE2EDuration="4.742816801s" podCreationTimestamp="2026-03-09 09:40:52 +0000 UTC" firstStartedPulling="2026-03-09 09:40:54.693689167 +0000 UTC m=+2097.778728568" lastFinishedPulling="2026-03-09 09:40:56.112708563 +0000 UTC m=+2099.197747964" observedRunningTime="2026-03-09 09:40:56.736150957 +0000 UTC m=+2099.821190358" watchObservedRunningTime="2026-03-09 09:40:56.742816801 +0000 UTC m=+2099.827856202"
Mar 09 09:40:58 crc kubenswrapper[4861]: I0309 09:40:58.689343 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-srlcz"]
Mar 09 09:40:58 crc kubenswrapper[4861]: I0309 09:40:58.689624 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-srlcz" podUID="21ff87a9-2741-4c62-896a-98af7860574e" containerName="registry-server" containerID="cri-o://2d2057a818d159f9c85f7627dec1702a4b86d258b8a411e80c11f434ddc39dbb" gracePeriod=2
Mar 09 09:40:59 crc kubenswrapper[4861]: I0309 09:40:59.155494 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-srlcz"
Mar 09 09:40:59 crc kubenswrapper[4861]: I0309 09:40:59.261000 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t5sw7\" (UniqueName: \"kubernetes.io/projected/21ff87a9-2741-4c62-896a-98af7860574e-kube-api-access-t5sw7\") pod \"21ff87a9-2741-4c62-896a-98af7860574e\" (UID: \"21ff87a9-2741-4c62-896a-98af7860574e\") "
Mar 09 09:40:59 crc kubenswrapper[4861]: I0309 09:40:59.261476 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21ff87a9-2741-4c62-896a-98af7860574e-utilities\") pod \"21ff87a9-2741-4c62-896a-98af7860574e\" (UID: \"21ff87a9-2741-4c62-896a-98af7860574e\") "
Mar 09 09:40:59 crc kubenswrapper[4861]: I0309 09:40:59.261520 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21ff87a9-2741-4c62-896a-98af7860574e-catalog-content\") pod \"21ff87a9-2741-4c62-896a-98af7860574e\" (UID: \"21ff87a9-2741-4c62-896a-98af7860574e\") "
Mar 09 09:40:59 crc kubenswrapper[4861]: I0309 09:40:59.263222 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21ff87a9-2741-4c62-896a-98af7860574e-utilities" (OuterVolumeSpecName: "utilities") pod "21ff87a9-2741-4c62-896a-98af7860574e" (UID: "21ff87a9-2741-4c62-896a-98af7860574e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 09:40:59 crc kubenswrapper[4861]: I0309 09:40:59.268481 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21ff87a9-2741-4c62-896a-98af7860574e-kube-api-access-t5sw7" (OuterVolumeSpecName: "kube-api-access-t5sw7") pod "21ff87a9-2741-4c62-896a-98af7860574e" (UID: "21ff87a9-2741-4c62-896a-98af7860574e"). InnerVolumeSpecName "kube-api-access-t5sw7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:40:59 crc kubenswrapper[4861]: I0309 09:40:59.363292 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t5sw7\" (UniqueName: \"kubernetes.io/projected/21ff87a9-2741-4c62-896a-98af7860574e-kube-api-access-t5sw7\") on node \"crc\" DevicePath \"\""
Mar 09 09:40:59 crc kubenswrapper[4861]: I0309 09:40:59.363335 4861 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21ff87a9-2741-4c62-896a-98af7860574e-utilities\") on node \"crc\" DevicePath \"\""
Mar 09 09:40:59 crc kubenswrapper[4861]: I0309 09:40:59.387002 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21ff87a9-2741-4c62-896a-98af7860574e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "21ff87a9-2741-4c62-896a-98af7860574e" (UID: "21ff87a9-2741-4c62-896a-98af7860574e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 09:40:59 crc kubenswrapper[4861]: I0309 09:40:59.465000 4861 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21ff87a9-2741-4c62-896a-98af7860574e-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 09 09:40:59 crc kubenswrapper[4861]: I0309 09:40:59.748110 4861 generic.go:334] "Generic (PLEG): container finished" podID="21ff87a9-2741-4c62-896a-98af7860574e" containerID="2d2057a818d159f9c85f7627dec1702a4b86d258b8a411e80c11f434ddc39dbb" exitCode=0
Mar 09 09:40:59 crc kubenswrapper[4861]: I0309 09:40:59.748148 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-srlcz" event={"ID":"21ff87a9-2741-4c62-896a-98af7860574e","Type":"ContainerDied","Data":"2d2057a818d159f9c85f7627dec1702a4b86d258b8a411e80c11f434ddc39dbb"}
Mar 09 09:40:59 crc kubenswrapper[4861]: I0309 09:40:59.748173 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-srlcz" event={"ID":"21ff87a9-2741-4c62-896a-98af7860574e","Type":"ContainerDied","Data":"0b07a69138e0d36b4852f8eb1d0259ef03c709a2597e4c0ef7c8dffe7f0fd0ba"}
Mar 09 09:40:59 crc kubenswrapper[4861]: I0309 09:40:59.748188 4861 scope.go:117] "RemoveContainer" containerID="2d2057a818d159f9c85f7627dec1702a4b86d258b8a411e80c11f434ddc39dbb"
Mar 09 09:40:59 crc kubenswrapper[4861]: I0309 09:40:59.748247 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-srlcz"
Mar 09 09:40:59 crc kubenswrapper[4861]: I0309 09:40:59.773756 4861 scope.go:117] "RemoveContainer" containerID="8028aca1b326c2c2e184cd1035cf168eb80ba0fdb44c92a97fddb48aa022e711"
Mar 09 09:40:59 crc kubenswrapper[4861]: I0309 09:40:59.774555 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-srlcz"]
Mar 09 09:40:59 crc kubenswrapper[4861]: I0309 09:40:59.793816 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-srlcz"]
Mar 09 09:40:59 crc kubenswrapper[4861]: I0309 09:40:59.801819 4861 scope.go:117] "RemoveContainer" containerID="a5bdb5ca4e0ede20fe927c02a957af790a14c8bb15569e7048d758ac840f3efa"
Mar 09 09:40:59 crc kubenswrapper[4861]: I0309 09:40:59.847598 4861 scope.go:117] "RemoveContainer" containerID="2d2057a818d159f9c85f7627dec1702a4b86d258b8a411e80c11f434ddc39dbb"
Mar 09 09:40:59 crc kubenswrapper[4861]: E0309 09:40:59.848005 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d2057a818d159f9c85f7627dec1702a4b86d258b8a411e80c11f434ddc39dbb\": container with ID starting with 2d2057a818d159f9c85f7627dec1702a4b86d258b8a411e80c11f434ddc39dbb not found: ID does not exist" containerID="2d2057a818d159f9c85f7627dec1702a4b86d258b8a411e80c11f434ddc39dbb"
Mar 09 09:40:59 crc kubenswrapper[4861]: I0309 09:40:59.848142 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d2057a818d159f9c85f7627dec1702a4b86d258b8a411e80c11f434ddc39dbb"} err="failed to get container status \"2d2057a818d159f9c85f7627dec1702a4b86d258b8a411e80c11f434ddc39dbb\": rpc error: code = NotFound desc = could not find container \"2d2057a818d159f9c85f7627dec1702a4b86d258b8a411e80c11f434ddc39dbb\": container with ID starting with 2d2057a818d159f9c85f7627dec1702a4b86d258b8a411e80c11f434ddc39dbb not found: ID does not exist"
Mar 09 09:40:59 crc kubenswrapper[4861]: I0309 09:40:59.848185 4861 scope.go:117] "RemoveContainer" containerID="8028aca1b326c2c2e184cd1035cf168eb80ba0fdb44c92a97fddb48aa022e711"
Mar 09 09:40:59 crc kubenswrapper[4861]: E0309 09:40:59.848852 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8028aca1b326c2c2e184cd1035cf168eb80ba0fdb44c92a97fddb48aa022e711\": container with ID starting with 8028aca1b326c2c2e184cd1035cf168eb80ba0fdb44c92a97fddb48aa022e711 not found: ID does not exist" containerID="8028aca1b326c2c2e184cd1035cf168eb80ba0fdb44c92a97fddb48aa022e711"
Mar 09 09:40:59 crc kubenswrapper[4861]: I0309 09:40:59.848883 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8028aca1b326c2c2e184cd1035cf168eb80ba0fdb44c92a97fddb48aa022e711"} err="failed to get container status \"8028aca1b326c2c2e184cd1035cf168eb80ba0fdb44c92a97fddb48aa022e711\": rpc error: code = NotFound desc = could not find container \"8028aca1b326c2c2e184cd1035cf168eb80ba0fdb44c92a97fddb48aa022e711\": container with ID starting with 8028aca1b326c2c2e184cd1035cf168eb80ba0fdb44c92a97fddb48aa022e711 not found: ID does not exist"
Mar 09 09:40:59 crc kubenswrapper[4861]: I0309 09:40:59.848936 4861 scope.go:117] "RemoveContainer" containerID="a5bdb5ca4e0ede20fe927c02a957af790a14c8bb15569e7048d758ac840f3efa"
Mar 09 09:40:59 crc kubenswrapper[4861]: E0309 09:40:59.850339 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5bdb5ca4e0ede20fe927c02a957af790a14c8bb15569e7048d758ac840f3efa\": container with ID starting with a5bdb5ca4e0ede20fe927c02a957af790a14c8bb15569e7048d758ac840f3efa not found: ID does not exist" containerID="a5bdb5ca4e0ede20fe927c02a957af790a14c8bb15569e7048d758ac840f3efa"
Mar 09 09:40:59 crc kubenswrapper[4861]: I0309 09:40:59.850509 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5bdb5ca4e0ede20fe927c02a957af790a14c8bb15569e7048d758ac840f3efa"} err="failed to get container status \"a5bdb5ca4e0ede20fe927c02a957af790a14c8bb15569e7048d758ac840f3efa\": rpc error: code = NotFound desc = could not find container \"a5bdb5ca4e0ede20fe927c02a957af790a14c8bb15569e7048d758ac840f3efa\": container with ID starting with a5bdb5ca4e0ede20fe927c02a957af790a14c8bb15569e7048d758ac840f3efa not found: ID does not exist"
Mar 09 09:41:01 crc kubenswrapper[4861]: I0309 09:41:01.669893 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21ff87a9-2741-4c62-896a-98af7860574e" path="/var/lib/kubelet/pods/21ff87a9-2741-4c62-896a-98af7860574e/volumes"
Mar 09 09:41:03 crc kubenswrapper[4861]: I0309 09:41:03.244273 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-dt9hl"
Mar 09 09:41:03 crc kubenswrapper[4861]: I0309 09:41:03.244389 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-dt9hl"
Mar 09 09:41:03 crc kubenswrapper[4861]: I0309 09:41:03.294044 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-dt9hl"
Mar 09 09:41:03 crc kubenswrapper[4861]: I0309 09:41:03.859895 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-dt9hl"
Mar 09 09:41:03 crc kubenswrapper[4861]: I0309 09:41:03.903512 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dt9hl"]
Mar 09 09:41:06 crc kubenswrapper[4861]: I0309 09:41:06.233982 4861 patch_prober.go:28] interesting pod/route-controller-manager-bbc7557c8-pw74s container/route-controller-manager namespace/openshift-route-controller-manager: Liveness probe status=failure output="Get \"https://10.217.0.68:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 09 09:41:06 crc kubenswrapper[4861]: I0309 09:41:06.234255 4861 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-route-controller-manager/route-controller-manager-bbc7557c8-pw74s" podUID="f881fd60-0955-4706-8e9e-bdbc5259248a" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.68:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 09 09:41:07 crc kubenswrapper[4861]: I0309 09:41:07.279408 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-dt9hl" podUID="c6f03f22-bc4c-481e-a32f-c435c7d2207a" containerName="registry-server" containerID="cri-o://21965a6f35a4f734ad156343019ebc131c8274c8404088647e24c5946082da18" gracePeriod=2
Mar 09 09:41:07 crc kubenswrapper[4861]: I0309 09:41:07.732192 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dt9hl"
Mar 09 09:41:07 crc kubenswrapper[4861]: I0309 09:41:07.858779 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d7pnj\" (UniqueName: \"kubernetes.io/projected/c6f03f22-bc4c-481e-a32f-c435c7d2207a-kube-api-access-d7pnj\") pod \"c6f03f22-bc4c-481e-a32f-c435c7d2207a\" (UID: \"c6f03f22-bc4c-481e-a32f-c435c7d2207a\") "
Mar 09 09:41:07 crc kubenswrapper[4861]: I0309 09:41:07.858916 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6f03f22-bc4c-481e-a32f-c435c7d2207a-catalog-content\") pod \"c6f03f22-bc4c-481e-a32f-c435c7d2207a\" (UID: \"c6f03f22-bc4c-481e-a32f-c435c7d2207a\") "
Mar 09 09:41:07 crc kubenswrapper[4861]: I0309 09:41:07.859091 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6f03f22-bc4c-481e-a32f-c435c7d2207a-utilities\") pod \"c6f03f22-bc4c-481e-a32f-c435c7d2207a\" (UID: \"c6f03f22-bc4c-481e-a32f-c435c7d2207a\") "
Mar 09 09:41:07 crc kubenswrapper[4861]: I0309 09:41:07.861772 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6f03f22-bc4c-481e-a32f-c435c7d2207a-utilities" (OuterVolumeSpecName: "utilities") pod "c6f03f22-bc4c-481e-a32f-c435c7d2207a" (UID: "c6f03f22-bc4c-481e-a32f-c435c7d2207a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 09:41:07 crc kubenswrapper[4861]: I0309 09:41:07.868613 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6f03f22-bc4c-481e-a32f-c435c7d2207a-kube-api-access-d7pnj" (OuterVolumeSpecName: "kube-api-access-d7pnj") pod "c6f03f22-bc4c-481e-a32f-c435c7d2207a" (UID: "c6f03f22-bc4c-481e-a32f-c435c7d2207a"). InnerVolumeSpecName "kube-api-access-d7pnj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:41:07 crc kubenswrapper[4861]: I0309 09:41:07.887463 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6f03f22-bc4c-481e-a32f-c435c7d2207a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c6f03f22-bc4c-481e-a32f-c435c7d2207a" (UID: "c6f03f22-bc4c-481e-a32f-c435c7d2207a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 09:41:07 crc kubenswrapper[4861]: I0309 09:41:07.961944 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d7pnj\" (UniqueName: \"kubernetes.io/projected/c6f03f22-bc4c-481e-a32f-c435c7d2207a-kube-api-access-d7pnj\") on node \"crc\" DevicePath \"\""
Mar 09 09:41:07 crc kubenswrapper[4861]: I0309 09:41:07.961979 4861 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6f03f22-bc4c-481e-a32f-c435c7d2207a-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 09 09:41:07 crc kubenswrapper[4861]: I0309 09:41:07.961990 4861 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6f03f22-bc4c-481e-a32f-c435c7d2207a-utilities\") on node \"crc\" DevicePath \"\""
Mar 09 09:41:08 crc kubenswrapper[4861]: I0309 09:41:08.291355 4861 generic.go:334] "Generic (PLEG): container finished" podID="c6f03f22-bc4c-481e-a32f-c435c7d2207a" containerID="21965a6f35a4f734ad156343019ebc131c8274c8404088647e24c5946082da18" exitCode=0
Mar 09 09:41:08 crc kubenswrapper[4861]: I0309 09:41:08.291848 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dt9hl"
Mar 09 09:41:08 crc kubenswrapper[4861]: I0309 09:41:08.291845 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dt9hl" event={"ID":"c6f03f22-bc4c-481e-a32f-c435c7d2207a","Type":"ContainerDied","Data":"21965a6f35a4f734ad156343019ebc131c8274c8404088647e24c5946082da18"}
Mar 09 09:41:08 crc kubenswrapper[4861]: I0309 09:41:08.292683 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dt9hl" event={"ID":"c6f03f22-bc4c-481e-a32f-c435c7d2207a","Type":"ContainerDied","Data":"8a8634bf355436eef43d56cb418faeb92f699d2691e66d92060fa6d0797f7cf4"}
Mar 09 09:41:08 crc kubenswrapper[4861]: I0309 09:41:08.292715 4861 scope.go:117] "RemoveContainer" containerID="21965a6f35a4f734ad156343019ebc131c8274c8404088647e24c5946082da18"
Mar 09 09:41:08 crc kubenswrapper[4861]: I0309 09:41:08.318587 4861 scope.go:117] "RemoveContainer" containerID="778c9455d2136a198f151f68b56a1f72bca583d45797dba3b5eb605ffe1650dd"
Mar 09 09:41:08 crc kubenswrapper[4861]: I0309 09:41:08.331026 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dt9hl"]
Mar 09 09:41:08 crc kubenswrapper[4861]: I0309 09:41:08.342989 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-dt9hl"]
Mar 09 09:41:08 crc kubenswrapper[4861]: I0309 09:41:08.356116 4861 scope.go:117] "RemoveContainer" containerID="865ba68ef9012188272282089420e87a38ea22f8270e5c7efcd3956aa54e9938"
Mar 09 09:41:08 crc kubenswrapper[4861]: I0309 09:41:08.389204 4861 scope.go:117] "RemoveContainer" containerID="21965a6f35a4f734ad156343019ebc131c8274c8404088647e24c5946082da18"
Mar 09 09:41:08 crc kubenswrapper[4861]: E0309 09:41:08.389866 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21965a6f35a4f734ad156343019ebc131c8274c8404088647e24c5946082da18\": container with ID starting with 21965a6f35a4f734ad156343019ebc131c8274c8404088647e24c5946082da18 not found: ID does not exist" containerID="21965a6f35a4f734ad156343019ebc131c8274c8404088647e24c5946082da18"
Mar 09 09:41:08 crc kubenswrapper[4861]: I0309 09:41:08.390001 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21965a6f35a4f734ad156343019ebc131c8274c8404088647e24c5946082da18"} err="failed to get container status \"21965a6f35a4f734ad156343019ebc131c8274c8404088647e24c5946082da18\": rpc error: code = NotFound desc = could not find container \"21965a6f35a4f734ad156343019ebc131c8274c8404088647e24c5946082da18\": container with ID starting with 21965a6f35a4f734ad156343019ebc131c8274c8404088647e24c5946082da18 not found: ID does not exist"
Mar 09 09:41:08 crc kubenswrapper[4861]: I0309 09:41:08.390120 4861 scope.go:117] "RemoveContainer" containerID="778c9455d2136a198f151f68b56a1f72bca583d45797dba3b5eb605ffe1650dd"
Mar 09 09:41:08 crc kubenswrapper[4861]: E0309 09:41:08.390773 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"778c9455d2136a198f151f68b56a1f72bca583d45797dba3b5eb605ffe1650dd\": container with ID starting with 778c9455d2136a198f151f68b56a1f72bca583d45797dba3b5eb605ffe1650dd not found: ID does not exist" containerID="778c9455d2136a198f151f68b56a1f72bca583d45797dba3b5eb605ffe1650dd"
Mar 09 09:41:08 crc kubenswrapper[4861]: I0309 09:41:08.390860 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"778c9455d2136a198f151f68b56a1f72bca583d45797dba3b5eb605ffe1650dd"} err="failed to get container status \"778c9455d2136a198f151f68b56a1f72bca583d45797dba3b5eb605ffe1650dd\": rpc error: code = NotFound desc = could not find container \"778c9455d2136a198f151f68b56a1f72bca583d45797dba3b5eb605ffe1650dd\": container with ID starting with 778c9455d2136a198f151f68b56a1f72bca583d45797dba3b5eb605ffe1650dd not found: ID does not exist"
Mar 09 09:41:08 crc kubenswrapper[4861]: I0309 09:41:08.390893 4861 scope.go:117] "RemoveContainer" containerID="865ba68ef9012188272282089420e87a38ea22f8270e5c7efcd3956aa54e9938"
Mar 09 09:41:08 crc kubenswrapper[4861]: E0309 09:41:08.391356 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"865ba68ef9012188272282089420e87a38ea22f8270e5c7efcd3956aa54e9938\": container with ID starting with 865ba68ef9012188272282089420e87a38ea22f8270e5c7efcd3956aa54e9938 not found: ID does not exist" containerID="865ba68ef9012188272282089420e87a38ea22f8270e5c7efcd3956aa54e9938"
Mar 09 09:41:08 crc kubenswrapper[4861]: I0309 09:41:08.391403 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"865ba68ef9012188272282089420e87a38ea22f8270e5c7efcd3956aa54e9938"} err="failed to get container status \"865ba68ef9012188272282089420e87a38ea22f8270e5c7efcd3956aa54e9938\": rpc error: code = NotFound desc = could not find container \"865ba68ef9012188272282089420e87a38ea22f8270e5c7efcd3956aa54e9938\": container with ID starting with 865ba68ef9012188272282089420e87a38ea22f8270e5c7efcd3956aa54e9938 not found: ID does not exist"
Mar 09 09:41:09 crc kubenswrapper[4861]: I0309 09:41:09.666715 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6f03f22-bc4c-481e-a32f-c435c7d2207a" path="/var/lib/kubelet/pods/c6f03f22-bc4c-481e-a32f-c435c7d2207a/volumes"
Mar 09 09:42:00 crc kubenswrapper[4861]: I0309 09:42:00.156537 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550822-58lps"]
Mar 09 09:42:00 crc kubenswrapper[4861]: E0309 09:42:00.157457 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21ff87a9-2741-4c62-896a-98af7860574e" containerName="registry-server"
Mar 09 09:42:00 crc kubenswrapper[4861]: I0309 09:42:00.157474 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="21ff87a9-2741-4c62-896a-98af7860574e" containerName="registry-server"
Mar 09 09:42:00 crc kubenswrapper[4861]: E0309 09:42:00.157498 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6f03f22-bc4c-481e-a32f-c435c7d2207a" containerName="extract-utilities"
Mar 09 09:42:00 crc kubenswrapper[4861]: I0309 09:42:00.157506 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6f03f22-bc4c-481e-a32f-c435c7d2207a" containerName="extract-utilities"
Mar 09 09:42:00 crc kubenswrapper[4861]: E0309 09:42:00.157542 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6f03f22-bc4c-481e-a32f-c435c7d2207a" containerName="registry-server"
Mar 09 09:42:00 crc kubenswrapper[4861]: I0309 09:42:00.157550 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6f03f22-bc4c-481e-a32f-c435c7d2207a" containerName="registry-server"
Mar 09 09:42:00 crc kubenswrapper[4861]: E0309 09:42:00.157567 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6f03f22-bc4c-481e-a32f-c435c7d2207a" containerName="extract-content"
Mar 09 09:42:00 crc kubenswrapper[4861]: I0309 09:42:00.157574 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6f03f22-bc4c-481e-a32f-c435c7d2207a" containerName="extract-content"
Mar 09 09:42:00 crc kubenswrapper[4861]: E0309 09:42:00.157595 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21ff87a9-2741-4c62-896a-98af7860574e" containerName="extract-utilities"
Mar 09 09:42:00 crc kubenswrapper[4861]: I0309 09:42:00.157601 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="21ff87a9-2741-4c62-896a-98af7860574e" containerName="extract-utilities"
Mar 09 09:42:00 crc kubenswrapper[4861]: E0309 09:42:00.157612 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21ff87a9-2741-4c62-896a-98af7860574e" containerName="extract-content"
Mar 09 09:42:00 crc kubenswrapper[4861]: I0309 09:42:00.157617 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="21ff87a9-2741-4c62-896a-98af7860574e" containerName="extract-content"
Mar 09 09:42:00 crc kubenswrapper[4861]: I0309 09:42:00.157785 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6f03f22-bc4c-481e-a32f-c435c7d2207a" containerName="registry-server"
Mar 09 09:42:00 crc kubenswrapper[4861]: I0309 09:42:00.157802 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="21ff87a9-2741-4c62-896a-98af7860574e" containerName="registry-server"
Mar 09 09:42:00 crc kubenswrapper[4861]: I0309 09:42:00.158361 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550822-58lps"
Mar 09 09:42:00 crc kubenswrapper[4861]: I0309 09:42:00.160898 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k86l8"
Mar 09 09:42:00 crc kubenswrapper[4861]: I0309 09:42:00.161146 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 09 09:42:00 crc kubenswrapper[4861]: I0309 09:42:00.162921 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 09 09:42:00 crc kubenswrapper[4861]: I0309 09:42:00.170644 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550822-58lps"]
Mar 09 09:42:00 crc kubenswrapper[4861]: I0309 09:42:00.263988 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6jgk\" (UniqueName: \"kubernetes.io/projected/66860c2d-79a4-4a15-a54f-b73d7437ddf3-kube-api-access-p6jgk\") pod \"auto-csr-approver-29550822-58lps\" (UID: \"66860c2d-79a4-4a15-a54f-b73d7437ddf3\") " pod="openshift-infra/auto-csr-approver-29550822-58lps"
Mar 09 09:42:00 crc kubenswrapper[4861]: I0309 09:42:00.366052 4861
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6jgk\" (UniqueName: \"kubernetes.io/projected/66860c2d-79a4-4a15-a54f-b73d7437ddf3-kube-api-access-p6jgk\") pod \"auto-csr-approver-29550822-58lps\" (UID: \"66860c2d-79a4-4a15-a54f-b73d7437ddf3\") " pod="openshift-infra/auto-csr-approver-29550822-58lps" Mar 09 09:42:00 crc kubenswrapper[4861]: I0309 09:42:00.387534 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6jgk\" (UniqueName: \"kubernetes.io/projected/66860c2d-79a4-4a15-a54f-b73d7437ddf3-kube-api-access-p6jgk\") pod \"auto-csr-approver-29550822-58lps\" (UID: \"66860c2d-79a4-4a15-a54f-b73d7437ddf3\") " pod="openshift-infra/auto-csr-approver-29550822-58lps" Mar 09 09:42:00 crc kubenswrapper[4861]: I0309 09:42:00.476233 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550822-58lps" Mar 09 09:42:00 crc kubenswrapper[4861]: I0309 09:42:00.908929 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550822-58lps"] Mar 09 09:42:01 crc kubenswrapper[4861]: I0309 09:42:01.734426 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550822-58lps" event={"ID":"66860c2d-79a4-4a15-a54f-b73d7437ddf3","Type":"ContainerStarted","Data":"a17515d1c9951aded9dcc2a668ae6628762cbf6b22503bbdc01ed1d29a666f46"} Mar 09 09:42:02 crc kubenswrapper[4861]: I0309 09:42:02.747603 4861 generic.go:334] "Generic (PLEG): container finished" podID="66860c2d-79a4-4a15-a54f-b73d7437ddf3" containerID="8731182a628b4853ca5ecae70837211ca10e117901062127b0e4ec4447097c98" exitCode=0 Mar 09 09:42:02 crc kubenswrapper[4861]: I0309 09:42:02.751980 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550822-58lps" 
event={"ID":"66860c2d-79a4-4a15-a54f-b73d7437ddf3","Type":"ContainerDied","Data":"8731182a628b4853ca5ecae70837211ca10e117901062127b0e4ec4447097c98"} Mar 09 09:42:04 crc kubenswrapper[4861]: I0309 09:42:04.054366 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550822-58lps" Mar 09 09:42:04 crc kubenswrapper[4861]: I0309 09:42:04.135174 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p6jgk\" (UniqueName: \"kubernetes.io/projected/66860c2d-79a4-4a15-a54f-b73d7437ddf3-kube-api-access-p6jgk\") pod \"66860c2d-79a4-4a15-a54f-b73d7437ddf3\" (UID: \"66860c2d-79a4-4a15-a54f-b73d7437ddf3\") " Mar 09 09:42:04 crc kubenswrapper[4861]: I0309 09:42:04.141130 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66860c2d-79a4-4a15-a54f-b73d7437ddf3-kube-api-access-p6jgk" (OuterVolumeSpecName: "kube-api-access-p6jgk") pod "66860c2d-79a4-4a15-a54f-b73d7437ddf3" (UID: "66860c2d-79a4-4a15-a54f-b73d7437ddf3"). InnerVolumeSpecName "kube-api-access-p6jgk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:42:04 crc kubenswrapper[4861]: I0309 09:42:04.239422 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p6jgk\" (UniqueName: \"kubernetes.io/projected/66860c2d-79a4-4a15-a54f-b73d7437ddf3-kube-api-access-p6jgk\") on node \"crc\" DevicePath \"\"" Mar 09 09:42:04 crc kubenswrapper[4861]: I0309 09:42:04.765470 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550822-58lps" event={"ID":"66860c2d-79a4-4a15-a54f-b73d7437ddf3","Type":"ContainerDied","Data":"a17515d1c9951aded9dcc2a668ae6628762cbf6b22503bbdc01ed1d29a666f46"} Mar 09 09:42:04 crc kubenswrapper[4861]: I0309 09:42:04.765508 4861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a17515d1c9951aded9dcc2a668ae6628762cbf6b22503bbdc01ed1d29a666f46" Mar 09 09:42:04 crc kubenswrapper[4861]: I0309 09:42:04.765528 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550822-58lps" Mar 09 09:42:05 crc kubenswrapper[4861]: I0309 09:42:05.116590 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550816-kr9kp"] Mar 09 09:42:05 crc kubenswrapper[4861]: I0309 09:42:05.124217 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550816-kr9kp"] Mar 09 09:42:05 crc kubenswrapper[4861]: I0309 09:42:05.668002 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad9cdaac-b9a0-401a-8095-8094dee9ce05" path="/var/lib/kubelet/pods/ad9cdaac-b9a0-401a-8095-8094dee9ce05/volumes" Mar 09 09:42:13 crc kubenswrapper[4861]: I0309 09:42:13.654624 4861 scope.go:117] "RemoveContainer" containerID="f77a4307056f9ad78d18c7e95129f670ac38d9210893427691e9fcf63cfda0d0" Mar 09 09:42:54 crc kubenswrapper[4861]: I0309 09:42:54.606306 4861 patch_prober.go:28] interesting pod/machine-config-daemon-5g7gc 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 09:42:54 crc kubenswrapper[4861]: I0309 09:42:54.607056 4861 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 09:43:24 crc kubenswrapper[4861]: I0309 09:43:24.606349 4861 patch_prober.go:28] interesting pod/machine-config-daemon-5g7gc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 09:43:24 crc kubenswrapper[4861]: I0309 09:43:24.607153 4861 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 09:43:54 crc kubenswrapper[4861]: I0309 09:43:54.605770 4861 patch_prober.go:28] interesting pod/machine-config-daemon-5g7gc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 09:43:54 crc kubenswrapper[4861]: I0309 09:43:54.606289 4861 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 09:43:54 crc kubenswrapper[4861]: I0309 09:43:54.606339 4861 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" Mar 09 09:43:54 crc kubenswrapper[4861]: I0309 09:43:54.607183 4861 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"480298580e8917b47d66a4586a753752f58dc7e4c2678e95158c475a3b504483"} pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 09 09:43:54 crc kubenswrapper[4861]: I0309 09:43:54.607250 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" containerName="machine-config-daemon" containerID="cri-o://480298580e8917b47d66a4586a753752f58dc7e4c2678e95158c475a3b504483" gracePeriod=600 Mar 09 09:43:55 crc kubenswrapper[4861]: I0309 09:43:55.083255 4861 generic.go:334] "Generic (PLEG): container finished" podID="6f7875e3-174f-4c67-8675-d878de74aa4f" containerID="480298580e8917b47d66a4586a753752f58dc7e4c2678e95158c475a3b504483" exitCode=0 Mar 09 09:43:55 crc kubenswrapper[4861]: I0309 09:43:55.083283 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" event={"ID":"6f7875e3-174f-4c67-8675-d878de74aa4f","Type":"ContainerDied","Data":"480298580e8917b47d66a4586a753752f58dc7e4c2678e95158c475a3b504483"} Mar 09 09:43:55 crc kubenswrapper[4861]: I0309 09:43:55.083671 4861 scope.go:117] "RemoveContainer" containerID="2cf1bc55664e082ce607bad22b0e501635384de314ffbc4f1270ecbbe7d97b60" Mar 09 09:43:55 crc kubenswrapper[4861]: E0309 09:43:55.250558 4861 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5g7gc_openshift-machine-config-operator(6f7875e3-174f-4c67-8675-d878de74aa4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" Mar 09 09:43:56 crc kubenswrapper[4861]: I0309 09:43:56.094041 4861 scope.go:117] "RemoveContainer" containerID="480298580e8917b47d66a4586a753752f58dc7e4c2678e95158c475a3b504483" Mar 09 09:43:56 crc kubenswrapper[4861]: E0309 09:43:56.094322 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5g7gc_openshift-machine-config-operator(6f7875e3-174f-4c67-8675-d878de74aa4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" Mar 09 09:44:00 crc kubenswrapper[4861]: I0309 09:44:00.143866 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550824-mzf4x"] Mar 09 09:44:00 crc kubenswrapper[4861]: E0309 09:44:00.144826 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66860c2d-79a4-4a15-a54f-b73d7437ddf3" containerName="oc" Mar 09 09:44:00 crc kubenswrapper[4861]: I0309 09:44:00.144840 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="66860c2d-79a4-4a15-a54f-b73d7437ddf3" containerName="oc" Mar 09 09:44:00 crc kubenswrapper[4861]: I0309 09:44:00.145053 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="66860c2d-79a4-4a15-a54f-b73d7437ddf3" containerName="oc" Mar 09 09:44:00 crc kubenswrapper[4861]: I0309 09:44:00.145760 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550824-mzf4x" Mar 09 09:44:00 crc kubenswrapper[4861]: I0309 09:44:00.148125 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 09:44:00 crc kubenswrapper[4861]: I0309 09:44:00.148168 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 09:44:00 crc kubenswrapper[4861]: I0309 09:44:00.149234 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k86l8" Mar 09 09:44:00 crc kubenswrapper[4861]: I0309 09:44:00.153170 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550824-mzf4x"] Mar 09 09:44:00 crc kubenswrapper[4861]: I0309 09:44:00.277650 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nb8w\" (UniqueName: \"kubernetes.io/projected/42ad43dd-8bcf-403a-b428-cd7371f0bd1e-kube-api-access-9nb8w\") pod \"auto-csr-approver-29550824-mzf4x\" (UID: \"42ad43dd-8bcf-403a-b428-cd7371f0bd1e\") " pod="openshift-infra/auto-csr-approver-29550824-mzf4x" Mar 09 09:44:00 crc kubenswrapper[4861]: I0309 09:44:00.379325 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9nb8w\" (UniqueName: \"kubernetes.io/projected/42ad43dd-8bcf-403a-b428-cd7371f0bd1e-kube-api-access-9nb8w\") pod \"auto-csr-approver-29550824-mzf4x\" (UID: \"42ad43dd-8bcf-403a-b428-cd7371f0bd1e\") " pod="openshift-infra/auto-csr-approver-29550824-mzf4x" Mar 09 09:44:00 crc kubenswrapper[4861]: I0309 09:44:00.397878 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nb8w\" (UniqueName: \"kubernetes.io/projected/42ad43dd-8bcf-403a-b428-cd7371f0bd1e-kube-api-access-9nb8w\") pod \"auto-csr-approver-29550824-mzf4x\" (UID: \"42ad43dd-8bcf-403a-b428-cd7371f0bd1e\") " 
pod="openshift-infra/auto-csr-approver-29550824-mzf4x" Mar 09 09:44:00 crc kubenswrapper[4861]: I0309 09:44:00.462739 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550824-mzf4x" Mar 09 09:44:00 crc kubenswrapper[4861]: I0309 09:44:00.912108 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550824-mzf4x"] Mar 09 09:44:01 crc kubenswrapper[4861]: I0309 09:44:01.135389 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550824-mzf4x" event={"ID":"42ad43dd-8bcf-403a-b428-cd7371f0bd1e","Type":"ContainerStarted","Data":"3ee111b55bf4de58f2fb7266975f568a6c54b8a3e4289e3f7e550313db470e65"} Mar 09 09:44:03 crc kubenswrapper[4861]: I0309 09:44:03.154067 4861 generic.go:334] "Generic (PLEG): container finished" podID="42ad43dd-8bcf-403a-b428-cd7371f0bd1e" containerID="871a945e24976a7d9cd96a754a3e83c5bc50b4a40a0eca9b1e9acb585ac6fb7c" exitCode=0 Mar 09 09:44:03 crc kubenswrapper[4861]: I0309 09:44:03.154158 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550824-mzf4x" event={"ID":"42ad43dd-8bcf-403a-b428-cd7371f0bd1e","Type":"ContainerDied","Data":"871a945e24976a7d9cd96a754a3e83c5bc50b4a40a0eca9b1e9acb585ac6fb7c"} Mar 09 09:44:04 crc kubenswrapper[4861]: I0309 09:44:04.511884 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550824-mzf4x" Mar 09 09:44:04 crc kubenswrapper[4861]: I0309 09:44:04.664515 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9nb8w\" (UniqueName: \"kubernetes.io/projected/42ad43dd-8bcf-403a-b428-cd7371f0bd1e-kube-api-access-9nb8w\") pod \"42ad43dd-8bcf-403a-b428-cd7371f0bd1e\" (UID: \"42ad43dd-8bcf-403a-b428-cd7371f0bd1e\") " Mar 09 09:44:04 crc kubenswrapper[4861]: I0309 09:44:04.670874 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42ad43dd-8bcf-403a-b428-cd7371f0bd1e-kube-api-access-9nb8w" (OuterVolumeSpecName: "kube-api-access-9nb8w") pod "42ad43dd-8bcf-403a-b428-cd7371f0bd1e" (UID: "42ad43dd-8bcf-403a-b428-cd7371f0bd1e"). InnerVolumeSpecName "kube-api-access-9nb8w". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:44:04 crc kubenswrapper[4861]: I0309 09:44:04.768700 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9nb8w\" (UniqueName: \"kubernetes.io/projected/42ad43dd-8bcf-403a-b428-cd7371f0bd1e-kube-api-access-9nb8w\") on node \"crc\" DevicePath \"\"" Mar 09 09:44:05 crc kubenswrapper[4861]: I0309 09:44:05.173064 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550824-mzf4x" event={"ID":"42ad43dd-8bcf-403a-b428-cd7371f0bd1e","Type":"ContainerDied","Data":"3ee111b55bf4de58f2fb7266975f568a6c54b8a3e4289e3f7e550313db470e65"} Mar 09 09:44:05 crc kubenswrapper[4861]: I0309 09:44:05.173122 4861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3ee111b55bf4de58f2fb7266975f568a6c54b8a3e4289e3f7e550313db470e65" Mar 09 09:44:05 crc kubenswrapper[4861]: I0309 09:44:05.173458 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550824-mzf4x" Mar 09 09:44:05 crc kubenswrapper[4861]: I0309 09:44:05.575848 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550818-9ttd2"] Mar 09 09:44:05 crc kubenswrapper[4861]: I0309 09:44:05.584786 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550818-9ttd2"] Mar 09 09:44:05 crc kubenswrapper[4861]: I0309 09:44:05.734925 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eaaefb0a-991f-42b0-9474-43af65c61889" path="/var/lib/kubelet/pods/eaaefb0a-991f-42b0-9474-43af65c61889/volumes" Mar 09 09:44:09 crc kubenswrapper[4861]: I0309 09:44:09.658494 4861 scope.go:117] "RemoveContainer" containerID="480298580e8917b47d66a4586a753752f58dc7e4c2678e95158c475a3b504483" Mar 09 09:44:09 crc kubenswrapper[4861]: E0309 09:44:09.659005 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5g7gc_openshift-machine-config-operator(6f7875e3-174f-4c67-8675-d878de74aa4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" Mar 09 09:44:13 crc kubenswrapper[4861]: I0309 09:44:13.752672 4861 scope.go:117] "RemoveContainer" containerID="cbcf8d3ddb6b7e7c6231812579d51b9e4a1075aba3a1d47372344e8c62d3a1dd" Mar 09 09:44:22 crc kubenswrapper[4861]: I0309 09:44:22.658020 4861 scope.go:117] "RemoveContainer" containerID="480298580e8917b47d66a4586a753752f58dc7e4c2678e95158c475a3b504483" Mar 09 09:44:22 crc kubenswrapper[4861]: E0309 09:44:22.659059 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-5g7gc_openshift-machine-config-operator(6f7875e3-174f-4c67-8675-d878de74aa4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" Mar 09 09:44:36 crc kubenswrapper[4861]: I0309 09:44:36.658342 4861 scope.go:117] "RemoveContainer" containerID="480298580e8917b47d66a4586a753752f58dc7e4c2678e95158c475a3b504483" Mar 09 09:44:36 crc kubenswrapper[4861]: E0309 09:44:36.659102 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5g7gc_openshift-machine-config-operator(6f7875e3-174f-4c67-8675-d878de74aa4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" Mar 09 09:44:51 crc kubenswrapper[4861]: I0309 09:44:51.658185 4861 scope.go:117] "RemoveContainer" containerID="480298580e8917b47d66a4586a753752f58dc7e4c2678e95158c475a3b504483" Mar 09 09:44:51 crc kubenswrapper[4861]: E0309 09:44:51.658956 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5g7gc_openshift-machine-config-operator(6f7875e3-174f-4c67-8675-d878de74aa4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" Mar 09 09:45:00 crc kubenswrapper[4861]: I0309 09:45:00.150783 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550825-c44b9"] Mar 09 09:45:00 crc kubenswrapper[4861]: E0309 09:45:00.152153 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42ad43dd-8bcf-403a-b428-cd7371f0bd1e" containerName="oc" Mar 09 09:45:00 crc kubenswrapper[4861]: I0309 
09:45:00.152171 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="42ad43dd-8bcf-403a-b428-cd7371f0bd1e" containerName="oc" Mar 09 09:45:00 crc kubenswrapper[4861]: I0309 09:45:00.152407 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="42ad43dd-8bcf-403a-b428-cd7371f0bd1e" containerName="oc" Mar 09 09:45:00 crc kubenswrapper[4861]: I0309 09:45:00.153180 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550825-c44b9" Mar 09 09:45:00 crc kubenswrapper[4861]: I0309 09:45:00.165954 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550825-c44b9"] Mar 09 09:45:00 crc kubenswrapper[4861]: I0309 09:45:00.176637 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 09 09:45:00 crc kubenswrapper[4861]: I0309 09:45:00.176813 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 09 09:45:00 crc kubenswrapper[4861]: I0309 09:45:00.198017 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2cd5f8cb-9957-4a7e-939d-0ad3db002702-config-volume\") pod \"collect-profiles-29550825-c44b9\" (UID: \"2cd5f8cb-9957-4a7e-939d-0ad3db002702\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550825-c44b9" Mar 09 09:45:00 crc kubenswrapper[4861]: I0309 09:45:00.198546 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2cd5f8cb-9957-4a7e-939d-0ad3db002702-secret-volume\") pod \"collect-profiles-29550825-c44b9\" (UID: \"2cd5f8cb-9957-4a7e-939d-0ad3db002702\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29550825-c44b9" Mar 09 09:45:00 crc kubenswrapper[4861]: I0309 09:45:00.198592 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mfkd\" (UniqueName: \"kubernetes.io/projected/2cd5f8cb-9957-4a7e-939d-0ad3db002702-kube-api-access-7mfkd\") pod \"collect-profiles-29550825-c44b9\" (UID: \"2cd5f8cb-9957-4a7e-939d-0ad3db002702\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550825-c44b9" Mar 09 09:45:00 crc kubenswrapper[4861]: I0309 09:45:00.299967 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2cd5f8cb-9957-4a7e-939d-0ad3db002702-config-volume\") pod \"collect-profiles-29550825-c44b9\" (UID: \"2cd5f8cb-9957-4a7e-939d-0ad3db002702\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550825-c44b9" Mar 09 09:45:00 crc kubenswrapper[4861]: I0309 09:45:00.300333 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2cd5f8cb-9957-4a7e-939d-0ad3db002702-secret-volume\") pod \"collect-profiles-29550825-c44b9\" (UID: \"2cd5f8cb-9957-4a7e-939d-0ad3db002702\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550825-c44b9" Mar 09 09:45:00 crc kubenswrapper[4861]: I0309 09:45:00.300494 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mfkd\" (UniqueName: \"kubernetes.io/projected/2cd5f8cb-9957-4a7e-939d-0ad3db002702-kube-api-access-7mfkd\") pod \"collect-profiles-29550825-c44b9\" (UID: \"2cd5f8cb-9957-4a7e-939d-0ad3db002702\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550825-c44b9" Mar 09 09:45:00 crc kubenswrapper[4861]: I0309 09:45:00.300821 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/2cd5f8cb-9957-4a7e-939d-0ad3db002702-config-volume\") pod \"collect-profiles-29550825-c44b9\" (UID: \"2cd5f8cb-9957-4a7e-939d-0ad3db002702\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550825-c44b9" Mar 09 09:45:00 crc kubenswrapper[4861]: I0309 09:45:00.310082 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2cd5f8cb-9957-4a7e-939d-0ad3db002702-secret-volume\") pod \"collect-profiles-29550825-c44b9\" (UID: \"2cd5f8cb-9957-4a7e-939d-0ad3db002702\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550825-c44b9" Mar 09 09:45:00 crc kubenswrapper[4861]: I0309 09:45:00.318459 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mfkd\" (UniqueName: \"kubernetes.io/projected/2cd5f8cb-9957-4a7e-939d-0ad3db002702-kube-api-access-7mfkd\") pod \"collect-profiles-29550825-c44b9\" (UID: \"2cd5f8cb-9957-4a7e-939d-0ad3db002702\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550825-c44b9" Mar 09 09:45:00 crc kubenswrapper[4861]: I0309 09:45:00.499623 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550825-c44b9" Mar 09 09:45:00 crc kubenswrapper[4861]: I0309 09:45:00.939479 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550825-c44b9"] Mar 09 09:45:01 crc kubenswrapper[4861]: I0309 09:45:01.684106 4861 generic.go:334] "Generic (PLEG): container finished" podID="2cd5f8cb-9957-4a7e-939d-0ad3db002702" containerID="49bfb17a1e3fc0bd4a45ee62296ee7f712d97bad9399b83c2d5c54dd488895a4" exitCode=0 Mar 09 09:45:01 crc kubenswrapper[4861]: I0309 09:45:01.684150 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550825-c44b9" event={"ID":"2cd5f8cb-9957-4a7e-939d-0ad3db002702","Type":"ContainerDied","Data":"49bfb17a1e3fc0bd4a45ee62296ee7f712d97bad9399b83c2d5c54dd488895a4"} Mar 09 09:45:01 crc kubenswrapper[4861]: I0309 09:45:01.684174 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550825-c44b9" event={"ID":"2cd5f8cb-9957-4a7e-939d-0ad3db002702","Type":"ContainerStarted","Data":"b60da7992659ddf442d2f2b3ec8bda93a98ee75ac1ee32b3a88762d409bab283"} Mar 09 09:45:03 crc kubenswrapper[4861]: I0309 09:45:03.051182 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550825-c44b9" Mar 09 09:45:03 crc kubenswrapper[4861]: I0309 09:45:03.152416 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2cd5f8cb-9957-4a7e-939d-0ad3db002702-config-volume\") pod \"2cd5f8cb-9957-4a7e-939d-0ad3db002702\" (UID: \"2cd5f8cb-9957-4a7e-939d-0ad3db002702\") " Mar 09 09:45:03 crc kubenswrapper[4861]: I0309 09:45:03.152988 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2cd5f8cb-9957-4a7e-939d-0ad3db002702-secret-volume\") pod \"2cd5f8cb-9957-4a7e-939d-0ad3db002702\" (UID: \"2cd5f8cb-9957-4a7e-939d-0ad3db002702\") " Mar 09 09:45:03 crc kubenswrapper[4861]: I0309 09:45:03.153059 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7mfkd\" (UniqueName: \"kubernetes.io/projected/2cd5f8cb-9957-4a7e-939d-0ad3db002702-kube-api-access-7mfkd\") pod \"2cd5f8cb-9957-4a7e-939d-0ad3db002702\" (UID: \"2cd5f8cb-9957-4a7e-939d-0ad3db002702\") " Mar 09 09:45:03 crc kubenswrapper[4861]: I0309 09:45:03.153297 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2cd5f8cb-9957-4a7e-939d-0ad3db002702-config-volume" (OuterVolumeSpecName: "config-volume") pod "2cd5f8cb-9957-4a7e-939d-0ad3db002702" (UID: "2cd5f8cb-9957-4a7e-939d-0ad3db002702"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:45:03 crc kubenswrapper[4861]: I0309 09:45:03.153582 4861 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2cd5f8cb-9957-4a7e-939d-0ad3db002702-config-volume\") on node \"crc\" DevicePath \"\"" Mar 09 09:45:03 crc kubenswrapper[4861]: I0309 09:45:03.159187 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cd5f8cb-9957-4a7e-939d-0ad3db002702-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "2cd5f8cb-9957-4a7e-939d-0ad3db002702" (UID: "2cd5f8cb-9957-4a7e-939d-0ad3db002702"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:45:03 crc kubenswrapper[4861]: I0309 09:45:03.162395 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2cd5f8cb-9957-4a7e-939d-0ad3db002702-kube-api-access-7mfkd" (OuterVolumeSpecName: "kube-api-access-7mfkd") pod "2cd5f8cb-9957-4a7e-939d-0ad3db002702" (UID: "2cd5f8cb-9957-4a7e-939d-0ad3db002702"). InnerVolumeSpecName "kube-api-access-7mfkd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:45:03 crc kubenswrapper[4861]: I0309 09:45:03.256111 4861 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2cd5f8cb-9957-4a7e-939d-0ad3db002702-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 09 09:45:03 crc kubenswrapper[4861]: I0309 09:45:03.256188 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7mfkd\" (UniqueName: \"kubernetes.io/projected/2cd5f8cb-9957-4a7e-939d-0ad3db002702-kube-api-access-7mfkd\") on node \"crc\" DevicePath \"\"" Mar 09 09:45:03 crc kubenswrapper[4861]: I0309 09:45:03.699595 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550825-c44b9" event={"ID":"2cd5f8cb-9957-4a7e-939d-0ad3db002702","Type":"ContainerDied","Data":"b60da7992659ddf442d2f2b3ec8bda93a98ee75ac1ee32b3a88762d409bab283"} Mar 09 09:45:03 crc kubenswrapper[4861]: I0309 09:45:03.699639 4861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b60da7992659ddf442d2f2b3ec8bda93a98ee75ac1ee32b3a88762d409bab283" Mar 09 09:45:03 crc kubenswrapper[4861]: I0309 09:45:03.699666 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550825-c44b9" Mar 09 09:45:03 crc kubenswrapper[4861]: E0309 09:45:03.778820 4861 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2cd5f8cb_9957_4a7e_939d_0ad3db002702.slice/crio-b60da7992659ddf442d2f2b3ec8bda93a98ee75ac1ee32b3a88762d409bab283\": RecentStats: unable to find data in memory cache]" Mar 09 09:45:04 crc kubenswrapper[4861]: I0309 09:45:04.125936 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550780-wpnp9"] Mar 09 09:45:04 crc kubenswrapper[4861]: I0309 09:45:04.135942 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550780-wpnp9"] Mar 09 09:45:05 crc kubenswrapper[4861]: I0309 09:45:05.658579 4861 scope.go:117] "RemoveContainer" containerID="480298580e8917b47d66a4586a753752f58dc7e4c2678e95158c475a3b504483" Mar 09 09:45:05 crc kubenswrapper[4861]: E0309 09:45:05.659182 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5g7gc_openshift-machine-config-operator(6f7875e3-174f-4c67-8675-d878de74aa4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" Mar 09 09:45:05 crc kubenswrapper[4861]: I0309 09:45:05.669075 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="608b7b11-f38a-4c4b-9e61-dab4f84c34c1" path="/var/lib/kubelet/pods/608b7b11-f38a-4c4b-9e61-dab4f84c34c1/volumes" Mar 09 09:45:13 crc kubenswrapper[4861]: I0309 09:45:13.821125 4861 scope.go:117] "RemoveContainer" containerID="32abcafa9c87fb0e758e3fc32846e0cfb9678f874a5c21171ddebf46b78f0189" Mar 
09 09:45:16 crc kubenswrapper[4861]: I0309 09:45:16.657711 4861 scope.go:117] "RemoveContainer" containerID="480298580e8917b47d66a4586a753752f58dc7e4c2678e95158c475a3b504483" Mar 09 09:45:16 crc kubenswrapper[4861]: E0309 09:45:16.658410 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5g7gc_openshift-machine-config-operator(6f7875e3-174f-4c67-8675-d878de74aa4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" Mar 09 09:45:28 crc kubenswrapper[4861]: I0309 09:45:28.928301 4861 generic.go:334] "Generic (PLEG): container finished" podID="d783d4c7-dfa9-4783-a80c-2938d2a5841d" containerID="6d420dd64bb340980c9305c084a933532f6376a5f09a7aa911792aa8c9aec1a4" exitCode=0 Mar 09 09:45:28 crc kubenswrapper[4861]: I0309 09:45:28.928414 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ngps7" event={"ID":"d783d4c7-dfa9-4783-a80c-2938d2a5841d","Type":"ContainerDied","Data":"6d420dd64bb340980c9305c084a933532f6376a5f09a7aa911792aa8c9aec1a4"} Mar 09 09:45:29 crc kubenswrapper[4861]: I0309 09:45:29.659926 4861 scope.go:117] "RemoveContainer" containerID="480298580e8917b47d66a4586a753752f58dc7e4c2678e95158c475a3b504483" Mar 09 09:45:29 crc kubenswrapper[4861]: E0309 09:45:29.660329 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5g7gc_openshift-machine-config-operator(6f7875e3-174f-4c67-8675-d878de74aa4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" Mar 09 09:45:30 crc kubenswrapper[4861]: I0309 09:45:30.337668 4861 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ngps7" Mar 09 09:45:30 crc kubenswrapper[4861]: I0309 09:45:30.481753 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9rgcs\" (UniqueName: \"kubernetes.io/projected/d783d4c7-dfa9-4783-a80c-2938d2a5841d-kube-api-access-9rgcs\") pod \"d783d4c7-dfa9-4783-a80c-2938d2a5841d\" (UID: \"d783d4c7-dfa9-4783-a80c-2938d2a5841d\") " Mar 09 09:45:30 crc kubenswrapper[4861]: I0309 09:45:30.481847 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/d783d4c7-dfa9-4783-a80c-2938d2a5841d-libvirt-secret-0\") pod \"d783d4c7-dfa9-4783-a80c-2938d2a5841d\" (UID: \"d783d4c7-dfa9-4783-a80c-2938d2a5841d\") " Mar 09 09:45:30 crc kubenswrapper[4861]: I0309 09:45:30.481903 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d783d4c7-dfa9-4783-a80c-2938d2a5841d-ssh-key-openstack-edpm-ipam\") pod \"d783d4c7-dfa9-4783-a80c-2938d2a5841d\" (UID: \"d783d4c7-dfa9-4783-a80c-2938d2a5841d\") " Mar 09 09:45:30 crc kubenswrapper[4861]: I0309 09:45:30.482016 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d783d4c7-dfa9-4783-a80c-2938d2a5841d-libvirt-combined-ca-bundle\") pod \"d783d4c7-dfa9-4783-a80c-2938d2a5841d\" (UID: \"d783d4c7-dfa9-4783-a80c-2938d2a5841d\") " Mar 09 09:45:30 crc kubenswrapper[4861]: I0309 09:45:30.482084 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d783d4c7-dfa9-4783-a80c-2938d2a5841d-inventory\") pod \"d783d4c7-dfa9-4783-a80c-2938d2a5841d\" (UID: \"d783d4c7-dfa9-4783-a80c-2938d2a5841d\") " Mar 09 09:45:30 crc kubenswrapper[4861]: 
I0309 09:45:30.487667 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d783d4c7-dfa9-4783-a80c-2938d2a5841d-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "d783d4c7-dfa9-4783-a80c-2938d2a5841d" (UID: "d783d4c7-dfa9-4783-a80c-2938d2a5841d"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:45:30 crc kubenswrapper[4861]: I0309 09:45:30.488285 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d783d4c7-dfa9-4783-a80c-2938d2a5841d-kube-api-access-9rgcs" (OuterVolumeSpecName: "kube-api-access-9rgcs") pod "d783d4c7-dfa9-4783-a80c-2938d2a5841d" (UID: "d783d4c7-dfa9-4783-a80c-2938d2a5841d"). InnerVolumeSpecName "kube-api-access-9rgcs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:45:30 crc kubenswrapper[4861]: I0309 09:45:30.511711 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d783d4c7-dfa9-4783-a80c-2938d2a5841d-inventory" (OuterVolumeSpecName: "inventory") pod "d783d4c7-dfa9-4783-a80c-2938d2a5841d" (UID: "d783d4c7-dfa9-4783-a80c-2938d2a5841d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:45:30 crc kubenswrapper[4861]: I0309 09:45:30.512984 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d783d4c7-dfa9-4783-a80c-2938d2a5841d-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "d783d4c7-dfa9-4783-a80c-2938d2a5841d" (UID: "d783d4c7-dfa9-4783-a80c-2938d2a5841d"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:45:30 crc kubenswrapper[4861]: I0309 09:45:30.513873 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d783d4c7-dfa9-4783-a80c-2938d2a5841d-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "d783d4c7-dfa9-4783-a80c-2938d2a5841d" (UID: "d783d4c7-dfa9-4783-a80c-2938d2a5841d"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:45:30 crc kubenswrapper[4861]: I0309 09:45:30.584622 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9rgcs\" (UniqueName: \"kubernetes.io/projected/d783d4c7-dfa9-4783-a80c-2938d2a5841d-kube-api-access-9rgcs\") on node \"crc\" DevicePath \"\"" Mar 09 09:45:30 crc kubenswrapper[4861]: I0309 09:45:30.584673 4861 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/d783d4c7-dfa9-4783-a80c-2938d2a5841d-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Mar 09 09:45:30 crc kubenswrapper[4861]: I0309 09:45:30.584683 4861 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d783d4c7-dfa9-4783-a80c-2938d2a5841d-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 09 09:45:30 crc kubenswrapper[4861]: I0309 09:45:30.584692 4861 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d783d4c7-dfa9-4783-a80c-2938d2a5841d-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 09:45:30 crc kubenswrapper[4861]: I0309 09:45:30.584702 4861 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d783d4c7-dfa9-4783-a80c-2938d2a5841d-inventory\") on node \"crc\" DevicePath \"\"" Mar 09 09:45:30 crc kubenswrapper[4861]: I0309 09:45:30.945199 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ngps7" event={"ID":"d783d4c7-dfa9-4783-a80c-2938d2a5841d","Type":"ContainerDied","Data":"ac1bdee8e2a19add7534a707c6b9e5ca0ab5c79294e98a258e2f07f58f7d647c"} Mar 09 09:45:30 crc kubenswrapper[4861]: I0309 09:45:30.945484 4861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ac1bdee8e2a19add7534a707c6b9e5ca0ab5c79294e98a258e2f07f58f7d647c" Mar 09 09:45:30 crc kubenswrapper[4861]: I0309 09:45:30.945293 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-ngps7" Mar 09 09:45:31 crc kubenswrapper[4861]: I0309 09:45:31.043304 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-mkzpx"] Mar 09 09:45:31 crc kubenswrapper[4861]: E0309 09:45:31.043833 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cd5f8cb-9957-4a7e-939d-0ad3db002702" containerName="collect-profiles" Mar 09 09:45:31 crc kubenswrapper[4861]: I0309 09:45:31.043858 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cd5f8cb-9957-4a7e-939d-0ad3db002702" containerName="collect-profiles" Mar 09 09:45:31 crc kubenswrapper[4861]: E0309 09:45:31.043880 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d783d4c7-dfa9-4783-a80c-2938d2a5841d" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 09 09:45:31 crc kubenswrapper[4861]: I0309 09:45:31.043891 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="d783d4c7-dfa9-4783-a80c-2938d2a5841d" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 09 09:45:31 crc kubenswrapper[4861]: I0309 09:45:31.044156 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="2cd5f8cb-9957-4a7e-939d-0ad3db002702" containerName="collect-profiles" Mar 09 09:45:31 crc kubenswrapper[4861]: I0309 09:45:31.044179 4861 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="d783d4c7-dfa9-4783-a80c-2938d2a5841d" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 09 09:45:31 crc kubenswrapper[4861]: I0309 09:45:31.044990 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mkzpx" Mar 09 09:45:31 crc kubenswrapper[4861]: I0309 09:45:31.049250 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lkd5q" Mar 09 09:45:31 crc kubenswrapper[4861]: I0309 09:45:31.049611 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 09 09:45:31 crc kubenswrapper[4861]: I0309 09:45:31.049780 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 09 09:45:31 crc kubenswrapper[4861]: I0309 09:45:31.050044 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 09 09:45:31 crc kubenswrapper[4861]: I0309 09:45:31.050198 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Mar 09 09:45:31 crc kubenswrapper[4861]: I0309 09:45:31.050269 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Mar 09 09:45:31 crc kubenswrapper[4861]: I0309 09:45:31.050429 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Mar 09 09:45:31 crc kubenswrapper[4861]: I0309 09:45:31.058704 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-mkzpx"] Mar 09 09:45:31 crc kubenswrapper[4861]: I0309 09:45:31.196334 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: 
\"kubernetes.io/secret/23236f7d-915f-4619-b5ba-611375aef594-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-mkzpx\" (UID: \"23236f7d-915f-4619-b5ba-611375aef594\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mkzpx" Mar 09 09:45:31 crc kubenswrapper[4861]: I0309 09:45:31.196569 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g682l\" (UniqueName: \"kubernetes.io/projected/23236f7d-915f-4619-b5ba-611375aef594-kube-api-access-g682l\") pod \"nova-edpm-deployment-openstack-edpm-ipam-mkzpx\" (UID: \"23236f7d-915f-4619-b5ba-611375aef594\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mkzpx" Mar 09 09:45:31 crc kubenswrapper[4861]: I0309 09:45:31.196632 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23236f7d-915f-4619-b5ba-611375aef594-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-mkzpx\" (UID: \"23236f7d-915f-4619-b5ba-611375aef594\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mkzpx" Mar 09 09:45:31 crc kubenswrapper[4861]: I0309 09:45:31.196659 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/23236f7d-915f-4619-b5ba-611375aef594-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-mkzpx\" (UID: \"23236f7d-915f-4619-b5ba-611375aef594\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mkzpx" Mar 09 09:45:31 crc kubenswrapper[4861]: I0309 09:45:31.196724 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/23236f7d-915f-4619-b5ba-611375aef594-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-mkzpx\" 
(UID: \"23236f7d-915f-4619-b5ba-611375aef594\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mkzpx" Mar 09 09:45:31 crc kubenswrapper[4861]: I0309 09:45:31.196800 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/23236f7d-915f-4619-b5ba-611375aef594-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-mkzpx\" (UID: \"23236f7d-915f-4619-b5ba-611375aef594\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mkzpx" Mar 09 09:45:31 crc kubenswrapper[4861]: I0309 09:45:31.196929 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/23236f7d-915f-4619-b5ba-611375aef594-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-mkzpx\" (UID: \"23236f7d-915f-4619-b5ba-611375aef594\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mkzpx" Mar 09 09:45:31 crc kubenswrapper[4861]: I0309 09:45:31.196970 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/23236f7d-915f-4619-b5ba-611375aef594-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-mkzpx\" (UID: \"23236f7d-915f-4619-b5ba-611375aef594\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mkzpx" Mar 09 09:45:31 crc kubenswrapper[4861]: I0309 09:45:31.197000 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/23236f7d-915f-4619-b5ba-611375aef594-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-mkzpx\" (UID: \"23236f7d-915f-4619-b5ba-611375aef594\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mkzpx" Mar 09 09:45:31 crc kubenswrapper[4861]: I0309 09:45:31.197060 4861 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/23236f7d-915f-4619-b5ba-611375aef594-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-mkzpx\" (UID: \"23236f7d-915f-4619-b5ba-611375aef594\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mkzpx" Mar 09 09:45:31 crc kubenswrapper[4861]: I0309 09:45:31.197091 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/23236f7d-915f-4619-b5ba-611375aef594-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-mkzpx\" (UID: \"23236f7d-915f-4619-b5ba-611375aef594\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mkzpx" Mar 09 09:45:31 crc kubenswrapper[4861]: I0309 09:45:31.299196 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g682l\" (UniqueName: \"kubernetes.io/projected/23236f7d-915f-4619-b5ba-611375aef594-kube-api-access-g682l\") pod \"nova-edpm-deployment-openstack-edpm-ipam-mkzpx\" (UID: \"23236f7d-915f-4619-b5ba-611375aef594\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mkzpx" Mar 09 09:45:31 crc kubenswrapper[4861]: I0309 09:45:31.299672 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23236f7d-915f-4619-b5ba-611375aef594-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-mkzpx\" (UID: \"23236f7d-915f-4619-b5ba-611375aef594\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mkzpx" Mar 09 09:45:31 crc kubenswrapper[4861]: I0309 09:45:31.299696 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: 
\"kubernetes.io/secret/23236f7d-915f-4619-b5ba-611375aef594-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-mkzpx\" (UID: \"23236f7d-915f-4619-b5ba-611375aef594\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mkzpx" Mar 09 09:45:31 crc kubenswrapper[4861]: I0309 09:45:31.300329 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/23236f7d-915f-4619-b5ba-611375aef594-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-mkzpx\" (UID: \"23236f7d-915f-4619-b5ba-611375aef594\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mkzpx" Mar 09 09:45:31 crc kubenswrapper[4861]: I0309 09:45:31.300392 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/23236f7d-915f-4619-b5ba-611375aef594-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-mkzpx\" (UID: \"23236f7d-915f-4619-b5ba-611375aef594\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mkzpx" Mar 09 09:45:31 crc kubenswrapper[4861]: I0309 09:45:31.300462 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/23236f7d-915f-4619-b5ba-611375aef594-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-mkzpx\" (UID: \"23236f7d-915f-4619-b5ba-611375aef594\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mkzpx" Mar 09 09:45:31 crc kubenswrapper[4861]: I0309 09:45:31.300494 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/23236f7d-915f-4619-b5ba-611375aef594-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-mkzpx\" (UID: \"23236f7d-915f-4619-b5ba-611375aef594\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mkzpx" Mar 
09 09:45:31 crc kubenswrapper[4861]: I0309 09:45:31.300516 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/23236f7d-915f-4619-b5ba-611375aef594-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-mkzpx\" (UID: \"23236f7d-915f-4619-b5ba-611375aef594\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mkzpx" Mar 09 09:45:31 crc kubenswrapper[4861]: I0309 09:45:31.300547 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/23236f7d-915f-4619-b5ba-611375aef594-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-mkzpx\" (UID: \"23236f7d-915f-4619-b5ba-611375aef594\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mkzpx" Mar 09 09:45:31 crc kubenswrapper[4861]: I0309 09:45:31.300584 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/23236f7d-915f-4619-b5ba-611375aef594-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-mkzpx\" (UID: \"23236f7d-915f-4619-b5ba-611375aef594\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mkzpx" Mar 09 09:45:31 crc kubenswrapper[4861]: I0309 09:45:31.300646 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/23236f7d-915f-4619-b5ba-611375aef594-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-mkzpx\" (UID: \"23236f7d-915f-4619-b5ba-611375aef594\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mkzpx" Mar 09 09:45:31 crc kubenswrapper[4861]: I0309 09:45:31.301877 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: 
\"kubernetes.io/configmap/23236f7d-915f-4619-b5ba-611375aef594-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-mkzpx\" (UID: \"23236f7d-915f-4619-b5ba-611375aef594\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mkzpx" Mar 09 09:45:31 crc kubenswrapper[4861]: I0309 09:45:31.304848 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/23236f7d-915f-4619-b5ba-611375aef594-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-mkzpx\" (UID: \"23236f7d-915f-4619-b5ba-611375aef594\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mkzpx" Mar 09 09:45:31 crc kubenswrapper[4861]: I0309 09:45:31.304904 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/23236f7d-915f-4619-b5ba-611375aef594-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-mkzpx\" (UID: \"23236f7d-915f-4619-b5ba-611375aef594\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mkzpx" Mar 09 09:45:31 crc kubenswrapper[4861]: I0309 09:45:31.305352 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/23236f7d-915f-4619-b5ba-611375aef594-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-mkzpx\" (UID: \"23236f7d-915f-4619-b5ba-611375aef594\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mkzpx" Mar 09 09:45:31 crc kubenswrapper[4861]: I0309 09:45:31.305505 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/23236f7d-915f-4619-b5ba-611375aef594-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-mkzpx\" (UID: \"23236f7d-915f-4619-b5ba-611375aef594\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mkzpx" Mar 09 09:45:31 crc 
kubenswrapper[4861]: I0309 09:45:31.305632 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23236f7d-915f-4619-b5ba-611375aef594-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-mkzpx\" (UID: \"23236f7d-915f-4619-b5ba-611375aef594\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mkzpx" Mar 09 09:45:31 crc kubenswrapper[4861]: I0309 09:45:31.306310 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/23236f7d-915f-4619-b5ba-611375aef594-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-mkzpx\" (UID: \"23236f7d-915f-4619-b5ba-611375aef594\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mkzpx" Mar 09 09:45:31 crc kubenswrapper[4861]: I0309 09:45:31.307476 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/23236f7d-915f-4619-b5ba-611375aef594-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-mkzpx\" (UID: \"23236f7d-915f-4619-b5ba-611375aef594\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mkzpx" Mar 09 09:45:31 crc kubenswrapper[4861]: I0309 09:45:31.308178 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/23236f7d-915f-4619-b5ba-611375aef594-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-mkzpx\" (UID: \"23236f7d-915f-4619-b5ba-611375aef594\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mkzpx" Mar 09 09:45:31 crc kubenswrapper[4861]: I0309 09:45:31.313877 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/23236f7d-915f-4619-b5ba-611375aef594-nova-migration-ssh-key-1\") pod 
\"nova-edpm-deployment-openstack-edpm-ipam-mkzpx\" (UID: \"23236f7d-915f-4619-b5ba-611375aef594\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mkzpx" Mar 09 09:45:31 crc kubenswrapper[4861]: I0309 09:45:31.331978 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g682l\" (UniqueName: \"kubernetes.io/projected/23236f7d-915f-4619-b5ba-611375aef594-kube-api-access-g682l\") pod \"nova-edpm-deployment-openstack-edpm-ipam-mkzpx\" (UID: \"23236f7d-915f-4619-b5ba-611375aef594\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mkzpx" Mar 09 09:45:31 crc kubenswrapper[4861]: I0309 09:45:31.362268 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mkzpx" Mar 09 09:45:31 crc kubenswrapper[4861]: I0309 09:45:31.947237 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-mkzpx"] Mar 09 09:45:31 crc kubenswrapper[4861]: I0309 09:45:31.953018 4861 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 09 09:45:32 crc kubenswrapper[4861]: I0309 09:45:32.968411 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mkzpx" event={"ID":"23236f7d-915f-4619-b5ba-611375aef594","Type":"ContainerStarted","Data":"940efbf486e67295a669b29294f94d534ebd3c0351d2be20ee1ec7f02970902c"} Mar 09 09:45:32 crc kubenswrapper[4861]: I0309 09:45:32.968771 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mkzpx" event={"ID":"23236f7d-915f-4619-b5ba-611375aef594","Type":"ContainerStarted","Data":"bcf93ccc5094f213b7946b5b02304908fdf4b19f39de16134ebb696495a554c8"} Mar 09 09:45:32 crc kubenswrapper[4861]: I0309 09:45:32.990817 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mkzpx" podStartSLOduration=1.5283654420000001 podStartE2EDuration="1.990793459s" podCreationTimestamp="2026-03-09 09:45:31 +0000 UTC" firstStartedPulling="2026-03-09 09:45:31.952777878 +0000 UTC m=+2375.037817279" lastFinishedPulling="2026-03-09 09:45:32.415205895 +0000 UTC m=+2375.500245296" observedRunningTime="2026-03-09 09:45:32.986320464 +0000 UTC m=+2376.071359885" watchObservedRunningTime="2026-03-09 09:45:32.990793459 +0000 UTC m=+2376.075832860" Mar 09 09:45:41 crc kubenswrapper[4861]: I0309 09:45:41.658179 4861 scope.go:117] "RemoveContainer" containerID="480298580e8917b47d66a4586a753752f58dc7e4c2678e95158c475a3b504483" Mar 09 09:45:41 crc kubenswrapper[4861]: E0309 09:45:41.658857 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5g7gc_openshift-machine-config-operator(6f7875e3-174f-4c67-8675-d878de74aa4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" Mar 09 09:45:55 crc kubenswrapper[4861]: I0309 09:45:55.658349 4861 scope.go:117] "RemoveContainer" containerID="480298580e8917b47d66a4586a753752f58dc7e4c2678e95158c475a3b504483" Mar 09 09:45:55 crc kubenswrapper[4861]: E0309 09:45:55.659385 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5g7gc_openshift-machine-config-operator(6f7875e3-174f-4c67-8675-d878de74aa4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" Mar 09 09:46:00 crc kubenswrapper[4861]: I0309 09:46:00.141734 4861 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-infra/auto-csr-approver-29550826-ht7dz"] Mar 09 09:46:00 crc kubenswrapper[4861]: I0309 09:46:00.145123 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550826-ht7dz" Mar 09 09:46:00 crc kubenswrapper[4861]: I0309 09:46:00.147654 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k86l8" Mar 09 09:46:00 crc kubenswrapper[4861]: I0309 09:46:00.147878 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 09:46:00 crc kubenswrapper[4861]: I0309 09:46:00.147995 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 09:46:00 crc kubenswrapper[4861]: I0309 09:46:00.149665 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550826-ht7dz"] Mar 09 09:46:00 crc kubenswrapper[4861]: I0309 09:46:00.299025 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttnk5\" (UniqueName: \"kubernetes.io/projected/d4785c3f-1721-4529-b665-f43afc2e2691-kube-api-access-ttnk5\") pod \"auto-csr-approver-29550826-ht7dz\" (UID: \"d4785c3f-1721-4529-b665-f43afc2e2691\") " pod="openshift-infra/auto-csr-approver-29550826-ht7dz" Mar 09 09:46:00 crc kubenswrapper[4861]: I0309 09:46:00.400560 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ttnk5\" (UniqueName: \"kubernetes.io/projected/d4785c3f-1721-4529-b665-f43afc2e2691-kube-api-access-ttnk5\") pod \"auto-csr-approver-29550826-ht7dz\" (UID: \"d4785c3f-1721-4529-b665-f43afc2e2691\") " pod="openshift-infra/auto-csr-approver-29550826-ht7dz" Mar 09 09:46:00 crc kubenswrapper[4861]: I0309 09:46:00.421209 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttnk5\" (UniqueName: 
\"kubernetes.io/projected/d4785c3f-1721-4529-b665-f43afc2e2691-kube-api-access-ttnk5\") pod \"auto-csr-approver-29550826-ht7dz\" (UID: \"d4785c3f-1721-4529-b665-f43afc2e2691\") " pod="openshift-infra/auto-csr-approver-29550826-ht7dz" Mar 09 09:46:00 crc kubenswrapper[4861]: I0309 09:46:00.468239 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550826-ht7dz" Mar 09 09:46:00 crc kubenswrapper[4861]: I0309 09:46:00.893870 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550826-ht7dz"] Mar 09 09:46:01 crc kubenswrapper[4861]: I0309 09:46:01.195627 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550826-ht7dz" event={"ID":"d4785c3f-1721-4529-b665-f43afc2e2691","Type":"ContainerStarted","Data":"f3d13f714c37397bfd16fcb28bb1674408b812f38a9e9f2dc2a7c021f8bf1b8a"} Mar 09 09:46:03 crc kubenswrapper[4861]: I0309 09:46:03.214992 4861 generic.go:334] "Generic (PLEG): container finished" podID="d4785c3f-1721-4529-b665-f43afc2e2691" containerID="60af282ae6a3914029203361477e4ed680273abf708b540b0ee65159e0872db0" exitCode=0 Mar 09 09:46:03 crc kubenswrapper[4861]: I0309 09:46:03.215257 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550826-ht7dz" event={"ID":"d4785c3f-1721-4529-b665-f43afc2e2691","Type":"ContainerDied","Data":"60af282ae6a3914029203361477e4ed680273abf708b540b0ee65159e0872db0"} Mar 09 09:46:04 crc kubenswrapper[4861]: I0309 09:46:04.486190 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550826-ht7dz" Mar 09 09:46:04 crc kubenswrapper[4861]: I0309 09:46:04.585602 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ttnk5\" (UniqueName: \"kubernetes.io/projected/d4785c3f-1721-4529-b665-f43afc2e2691-kube-api-access-ttnk5\") pod \"d4785c3f-1721-4529-b665-f43afc2e2691\" (UID: \"d4785c3f-1721-4529-b665-f43afc2e2691\") " Mar 09 09:46:04 crc kubenswrapper[4861]: I0309 09:46:04.591429 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4785c3f-1721-4529-b665-f43afc2e2691-kube-api-access-ttnk5" (OuterVolumeSpecName: "kube-api-access-ttnk5") pod "d4785c3f-1721-4529-b665-f43afc2e2691" (UID: "d4785c3f-1721-4529-b665-f43afc2e2691"). InnerVolumeSpecName "kube-api-access-ttnk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:46:04 crc kubenswrapper[4861]: I0309 09:46:04.688139 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ttnk5\" (UniqueName: \"kubernetes.io/projected/d4785c3f-1721-4529-b665-f43afc2e2691-kube-api-access-ttnk5\") on node \"crc\" DevicePath \"\"" Mar 09 09:46:05 crc kubenswrapper[4861]: I0309 09:46:05.234353 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550826-ht7dz" event={"ID":"d4785c3f-1721-4529-b665-f43afc2e2691","Type":"ContainerDied","Data":"f3d13f714c37397bfd16fcb28bb1674408b812f38a9e9f2dc2a7c021f8bf1b8a"} Mar 09 09:46:05 crc kubenswrapper[4861]: I0309 09:46:05.234704 4861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f3d13f714c37397bfd16fcb28bb1674408b812f38a9e9f2dc2a7c021f8bf1b8a" Mar 09 09:46:05 crc kubenswrapper[4861]: I0309 09:46:05.234438 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550826-ht7dz" Mar 09 09:46:05 crc kubenswrapper[4861]: I0309 09:46:05.549684 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550820-zv5t2"] Mar 09 09:46:05 crc kubenswrapper[4861]: I0309 09:46:05.557262 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550820-zv5t2"] Mar 09 09:46:05 crc kubenswrapper[4861]: I0309 09:46:05.668672 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1574dfe2-f3f7-4f85-9a01-4b436cb68a9c" path="/var/lib/kubelet/pods/1574dfe2-f3f7-4f85-9a01-4b436cb68a9c/volumes" Mar 09 09:46:08 crc kubenswrapper[4861]: I0309 09:46:08.658722 4861 scope.go:117] "RemoveContainer" containerID="480298580e8917b47d66a4586a753752f58dc7e4c2678e95158c475a3b504483" Mar 09 09:46:08 crc kubenswrapper[4861]: E0309 09:46:08.659466 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5g7gc_openshift-machine-config-operator(6f7875e3-174f-4c67-8675-d878de74aa4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" Mar 09 09:46:13 crc kubenswrapper[4861]: I0309 09:46:13.884635 4861 scope.go:117] "RemoveContainer" containerID="68ea9335b1b681a65de954bfe47de66d6b6b1f67d9d8220c96e492d1331065c6" Mar 09 09:46:21 crc kubenswrapper[4861]: I0309 09:46:21.658464 4861 scope.go:117] "RemoveContainer" containerID="480298580e8917b47d66a4586a753752f58dc7e4c2678e95158c475a3b504483" Mar 09 09:46:21 crc kubenswrapper[4861]: E0309 09:46:21.660144 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-5g7gc_openshift-machine-config-operator(6f7875e3-174f-4c67-8675-d878de74aa4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" Mar 09 09:46:32 crc kubenswrapper[4861]: I0309 09:46:32.658436 4861 scope.go:117] "RemoveContainer" containerID="480298580e8917b47d66a4586a753752f58dc7e4c2678e95158c475a3b504483" Mar 09 09:46:32 crc kubenswrapper[4861]: E0309 09:46:32.659079 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5g7gc_openshift-machine-config-operator(6f7875e3-174f-4c67-8675-d878de74aa4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" Mar 09 09:46:44 crc kubenswrapper[4861]: I0309 09:46:44.658026 4861 scope.go:117] "RemoveContainer" containerID="480298580e8917b47d66a4586a753752f58dc7e4c2678e95158c475a3b504483" Mar 09 09:46:44 crc kubenswrapper[4861]: E0309 09:46:44.658830 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5g7gc_openshift-machine-config-operator(6f7875e3-174f-4c67-8675-d878de74aa4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" Mar 09 09:46:55 crc kubenswrapper[4861]: I0309 09:46:55.658315 4861 scope.go:117] "RemoveContainer" containerID="480298580e8917b47d66a4586a753752f58dc7e4c2678e95158c475a3b504483" Mar 09 09:46:55 crc kubenswrapper[4861]: E0309 09:46:55.659406 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-5g7gc_openshift-machine-config-operator(6f7875e3-174f-4c67-8675-d878de74aa4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" Mar 09 09:47:08 crc kubenswrapper[4861]: I0309 09:47:08.658801 4861 scope.go:117] "RemoveContainer" containerID="480298580e8917b47d66a4586a753752f58dc7e4c2678e95158c475a3b504483" Mar 09 09:47:08 crc kubenswrapper[4861]: E0309 09:47:08.659644 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5g7gc_openshift-machine-config-operator(6f7875e3-174f-4c67-8675-d878de74aa4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" Mar 09 09:47:22 crc kubenswrapper[4861]: I0309 09:47:22.658381 4861 scope.go:117] "RemoveContainer" containerID="480298580e8917b47d66a4586a753752f58dc7e4c2678e95158c475a3b504483" Mar 09 09:47:22 crc kubenswrapper[4861]: E0309 09:47:22.659082 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5g7gc_openshift-machine-config-operator(6f7875e3-174f-4c67-8675-d878de74aa4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" Mar 09 09:47:37 crc kubenswrapper[4861]: I0309 09:47:37.665520 4861 scope.go:117] "RemoveContainer" containerID="480298580e8917b47d66a4586a753752f58dc7e4c2678e95158c475a3b504483" Mar 09 09:47:37 crc kubenswrapper[4861]: E0309 09:47:37.666393 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-5g7gc_openshift-machine-config-operator(6f7875e3-174f-4c67-8675-d878de74aa4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" Mar 09 09:47:50 crc kubenswrapper[4861]: I0309 09:47:50.390258 4861 generic.go:334] "Generic (PLEG): container finished" podID="23236f7d-915f-4619-b5ba-611375aef594" containerID="940efbf486e67295a669b29294f94d534ebd3c0351d2be20ee1ec7f02970902c" exitCode=0 Mar 09 09:47:50 crc kubenswrapper[4861]: I0309 09:47:50.390355 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mkzpx" event={"ID":"23236f7d-915f-4619-b5ba-611375aef594","Type":"ContainerDied","Data":"940efbf486e67295a669b29294f94d534ebd3c0351d2be20ee1ec7f02970902c"} Mar 09 09:47:51 crc kubenswrapper[4861]: I0309 09:47:51.666000 4861 scope.go:117] "RemoveContainer" containerID="480298580e8917b47d66a4586a753752f58dc7e4c2678e95158c475a3b504483" Mar 09 09:47:51 crc kubenswrapper[4861]: E0309 09:47:51.666850 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5g7gc_openshift-machine-config-operator(6f7875e3-174f-4c67-8675-d878de74aa4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" Mar 09 09:47:51 crc kubenswrapper[4861]: I0309 09:47:51.853515 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mkzpx" Mar 09 09:47:51 crc kubenswrapper[4861]: I0309 09:47:51.988778 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/23236f7d-915f-4619-b5ba-611375aef594-nova-cell1-compute-config-0\") pod \"23236f7d-915f-4619-b5ba-611375aef594\" (UID: \"23236f7d-915f-4619-b5ba-611375aef594\") " Mar 09 09:47:51 crc kubenswrapper[4861]: I0309 09:47:51.988838 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/23236f7d-915f-4619-b5ba-611375aef594-nova-migration-ssh-key-0\") pod \"23236f7d-915f-4619-b5ba-611375aef594\" (UID: \"23236f7d-915f-4619-b5ba-611375aef594\") " Mar 09 09:47:51 crc kubenswrapper[4861]: I0309 09:47:51.988923 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/23236f7d-915f-4619-b5ba-611375aef594-nova-cell1-compute-config-2\") pod \"23236f7d-915f-4619-b5ba-611375aef594\" (UID: \"23236f7d-915f-4619-b5ba-611375aef594\") " Mar 09 09:47:51 crc kubenswrapper[4861]: I0309 09:47:51.988972 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/23236f7d-915f-4619-b5ba-611375aef594-nova-cell1-compute-config-1\") pod \"23236f7d-915f-4619-b5ba-611375aef594\" (UID: \"23236f7d-915f-4619-b5ba-611375aef594\") " Mar 09 09:47:51 crc kubenswrapper[4861]: I0309 09:47:51.989001 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/23236f7d-915f-4619-b5ba-611375aef594-nova-cell1-compute-config-3\") pod \"23236f7d-915f-4619-b5ba-611375aef594\" (UID: \"23236f7d-915f-4619-b5ba-611375aef594\") " Mar 09 09:47:51 crc 
kubenswrapper[4861]: I0309 09:47:51.989028 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g682l\" (UniqueName: \"kubernetes.io/projected/23236f7d-915f-4619-b5ba-611375aef594-kube-api-access-g682l\") pod \"23236f7d-915f-4619-b5ba-611375aef594\" (UID: \"23236f7d-915f-4619-b5ba-611375aef594\") " Mar 09 09:47:51 crc kubenswrapper[4861]: I0309 09:47:51.989117 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23236f7d-915f-4619-b5ba-611375aef594-nova-combined-ca-bundle\") pod \"23236f7d-915f-4619-b5ba-611375aef594\" (UID: \"23236f7d-915f-4619-b5ba-611375aef594\") " Mar 09 09:47:51 crc kubenswrapper[4861]: I0309 09:47:51.989143 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/23236f7d-915f-4619-b5ba-611375aef594-nova-migration-ssh-key-1\") pod \"23236f7d-915f-4619-b5ba-611375aef594\" (UID: \"23236f7d-915f-4619-b5ba-611375aef594\") " Mar 09 09:47:51 crc kubenswrapper[4861]: I0309 09:47:51.989229 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/23236f7d-915f-4619-b5ba-611375aef594-ssh-key-openstack-edpm-ipam\") pod \"23236f7d-915f-4619-b5ba-611375aef594\" (UID: \"23236f7d-915f-4619-b5ba-611375aef594\") " Mar 09 09:47:51 crc kubenswrapper[4861]: I0309 09:47:51.989262 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/23236f7d-915f-4619-b5ba-611375aef594-inventory\") pod \"23236f7d-915f-4619-b5ba-611375aef594\" (UID: \"23236f7d-915f-4619-b5ba-611375aef594\") " Mar 09 09:47:51 crc kubenswrapper[4861]: I0309 09:47:51.989302 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: 
\"kubernetes.io/configmap/23236f7d-915f-4619-b5ba-611375aef594-nova-extra-config-0\") pod \"23236f7d-915f-4619-b5ba-611375aef594\" (UID: \"23236f7d-915f-4619-b5ba-611375aef594\") " Mar 09 09:47:51 crc kubenswrapper[4861]: I0309 09:47:51.995572 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23236f7d-915f-4619-b5ba-611375aef594-kube-api-access-g682l" (OuterVolumeSpecName: "kube-api-access-g682l") pod "23236f7d-915f-4619-b5ba-611375aef594" (UID: "23236f7d-915f-4619-b5ba-611375aef594"). InnerVolumeSpecName "kube-api-access-g682l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:47:52 crc kubenswrapper[4861]: I0309 09:47:52.007586 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23236f7d-915f-4619-b5ba-611375aef594-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "23236f7d-915f-4619-b5ba-611375aef594" (UID: "23236f7d-915f-4619-b5ba-611375aef594"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:47:52 crc kubenswrapper[4861]: I0309 09:47:52.021285 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23236f7d-915f-4619-b5ba-611375aef594-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "23236f7d-915f-4619-b5ba-611375aef594" (UID: "23236f7d-915f-4619-b5ba-611375aef594"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 09:47:52 crc kubenswrapper[4861]: I0309 09:47:52.021690 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23236f7d-915f-4619-b5ba-611375aef594-nova-cell1-compute-config-3" (OuterVolumeSpecName: "nova-cell1-compute-config-3") pod "23236f7d-915f-4619-b5ba-611375aef594" (UID: "23236f7d-915f-4619-b5ba-611375aef594"). InnerVolumeSpecName "nova-cell1-compute-config-3". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:47:52 crc kubenswrapper[4861]: I0309 09:47:52.021903 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23236f7d-915f-4619-b5ba-611375aef594-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "23236f7d-915f-4619-b5ba-611375aef594" (UID: "23236f7d-915f-4619-b5ba-611375aef594"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:47:52 crc kubenswrapper[4861]: I0309 09:47:52.022250 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23236f7d-915f-4619-b5ba-611375aef594-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "23236f7d-915f-4619-b5ba-611375aef594" (UID: "23236f7d-915f-4619-b5ba-611375aef594"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:47:52 crc kubenswrapper[4861]: I0309 09:47:52.022706 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23236f7d-915f-4619-b5ba-611375aef594-inventory" (OuterVolumeSpecName: "inventory") pod "23236f7d-915f-4619-b5ba-611375aef594" (UID: "23236f7d-915f-4619-b5ba-611375aef594"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:47:52 crc kubenswrapper[4861]: I0309 09:47:52.023322 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23236f7d-915f-4619-b5ba-611375aef594-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "23236f7d-915f-4619-b5ba-611375aef594" (UID: "23236f7d-915f-4619-b5ba-611375aef594"). InnerVolumeSpecName "nova-migration-ssh-key-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:47:52 crc kubenswrapper[4861]: I0309 09:47:52.029350 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23236f7d-915f-4619-b5ba-611375aef594-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "23236f7d-915f-4619-b5ba-611375aef594" (UID: "23236f7d-915f-4619-b5ba-611375aef594"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:47:52 crc kubenswrapper[4861]: I0309 09:47:52.031305 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23236f7d-915f-4619-b5ba-611375aef594-nova-cell1-compute-config-2" (OuterVolumeSpecName: "nova-cell1-compute-config-2") pod "23236f7d-915f-4619-b5ba-611375aef594" (UID: "23236f7d-915f-4619-b5ba-611375aef594"). InnerVolumeSpecName "nova-cell1-compute-config-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:47:52 crc kubenswrapper[4861]: I0309 09:47:52.031589 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23236f7d-915f-4619-b5ba-611375aef594-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "23236f7d-915f-4619-b5ba-611375aef594" (UID: "23236f7d-915f-4619-b5ba-611375aef594"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:47:52 crc kubenswrapper[4861]: I0309 09:47:52.091511 4861 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/23236f7d-915f-4619-b5ba-611375aef594-nova-cell1-compute-config-2\") on node \"crc\" DevicePath \"\"" Mar 09 09:47:52 crc kubenswrapper[4861]: I0309 09:47:52.091551 4861 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/23236f7d-915f-4619-b5ba-611375aef594-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Mar 09 09:47:52 crc kubenswrapper[4861]: I0309 09:47:52.091565 4861 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/23236f7d-915f-4619-b5ba-611375aef594-nova-cell1-compute-config-3\") on node \"crc\" DevicePath \"\"" Mar 09 09:47:52 crc kubenswrapper[4861]: I0309 09:47:52.091579 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g682l\" (UniqueName: \"kubernetes.io/projected/23236f7d-915f-4619-b5ba-611375aef594-kube-api-access-g682l\") on node \"crc\" DevicePath \"\"" Mar 09 09:47:52 crc kubenswrapper[4861]: I0309 09:47:52.091590 4861 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23236f7d-915f-4619-b5ba-611375aef594-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 09:47:52 crc kubenswrapper[4861]: I0309 09:47:52.091602 4861 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/23236f7d-915f-4619-b5ba-611375aef594-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Mar 09 09:47:52 crc kubenswrapper[4861]: I0309 09:47:52.091614 4861 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/23236f7d-915f-4619-b5ba-611375aef594-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 09 09:47:52 crc kubenswrapper[4861]: I0309 09:47:52.091625 4861 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/23236f7d-915f-4619-b5ba-611375aef594-inventory\") on node \"crc\" DevicePath \"\"" Mar 09 09:47:52 crc kubenswrapper[4861]: I0309 09:47:52.091636 4861 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/23236f7d-915f-4619-b5ba-611375aef594-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Mar 09 09:47:52 crc kubenswrapper[4861]: I0309 09:47:52.091647 4861 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/23236f7d-915f-4619-b5ba-611375aef594-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Mar 09 09:47:52 crc kubenswrapper[4861]: I0309 09:47:52.091657 4861 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/23236f7d-915f-4619-b5ba-611375aef594-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Mar 09 09:47:52 crc kubenswrapper[4861]: I0309 09:47:52.409530 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mkzpx" event={"ID":"23236f7d-915f-4619-b5ba-611375aef594","Type":"ContainerDied","Data":"bcf93ccc5094f213b7946b5b02304908fdf4b19f39de16134ebb696495a554c8"} Mar 09 09:47:52 crc kubenswrapper[4861]: I0309 09:47:52.409622 4861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bcf93ccc5094f213b7946b5b02304908fdf4b19f39de16134ebb696495a554c8" Mar 09 09:47:52 crc kubenswrapper[4861]: I0309 09:47:52.409656 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mkzpx" Mar 09 09:47:52 crc kubenswrapper[4861]: I0309 09:47:52.522509 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jbbn4"] Mar 09 09:47:52 crc kubenswrapper[4861]: E0309 09:47:52.522956 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23236f7d-915f-4619-b5ba-611375aef594" containerName="nova-edpm-deployment-openstack-edpm-ipam" Mar 09 09:47:52 crc kubenswrapper[4861]: I0309 09:47:52.522974 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="23236f7d-915f-4619-b5ba-611375aef594" containerName="nova-edpm-deployment-openstack-edpm-ipam" Mar 09 09:47:52 crc kubenswrapper[4861]: E0309 09:47:52.522996 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4785c3f-1721-4529-b665-f43afc2e2691" containerName="oc" Mar 09 09:47:52 crc kubenswrapper[4861]: I0309 09:47:52.523004 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4785c3f-1721-4529-b665-f43afc2e2691" containerName="oc" Mar 09 09:47:52 crc kubenswrapper[4861]: I0309 09:47:52.523238 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="23236f7d-915f-4619-b5ba-611375aef594" containerName="nova-edpm-deployment-openstack-edpm-ipam" Mar 09 09:47:52 crc kubenswrapper[4861]: I0309 09:47:52.523267 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4785c3f-1721-4529-b665-f43afc2e2691" containerName="oc" Mar 09 09:47:52 crc kubenswrapper[4861]: I0309 09:47:52.524133 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jbbn4" Mar 09 09:47:52 crc kubenswrapper[4861]: I0309 09:47:52.526202 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 09 09:47:52 crc kubenswrapper[4861]: I0309 09:47:52.526608 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 09 09:47:52 crc kubenswrapper[4861]: I0309 09:47:52.526745 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Mar 09 09:47:52 crc kubenswrapper[4861]: I0309 09:47:52.526777 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lkd5q" Mar 09 09:47:52 crc kubenswrapper[4861]: I0309 09:47:52.528077 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 09 09:47:52 crc kubenswrapper[4861]: I0309 09:47:52.531142 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jbbn4"] Mar 09 09:47:52 crc kubenswrapper[4861]: I0309 09:47:52.599015 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c47d068-c590-40cb-aeb0-1cc5132d40dd-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-jbbn4\" (UID: \"7c47d068-c590-40cb-aeb0-1cc5132d40dd\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jbbn4" Mar 09 09:47:52 crc kubenswrapper[4861]: I0309 09:47:52.599118 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7c47d068-c590-40cb-aeb0-1cc5132d40dd-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-jbbn4\" (UID: 
\"7c47d068-c590-40cb-aeb0-1cc5132d40dd\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jbbn4" Mar 09 09:47:52 crc kubenswrapper[4861]: I0309 09:47:52.599183 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/7c47d068-c590-40cb-aeb0-1cc5132d40dd-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-jbbn4\" (UID: \"7c47d068-c590-40cb-aeb0-1cc5132d40dd\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jbbn4" Mar 09 09:47:52 crc kubenswrapper[4861]: I0309 09:47:52.599210 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgpgj\" (UniqueName: \"kubernetes.io/projected/7c47d068-c590-40cb-aeb0-1cc5132d40dd-kube-api-access-zgpgj\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-jbbn4\" (UID: \"7c47d068-c590-40cb-aeb0-1cc5132d40dd\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jbbn4" Mar 09 09:47:52 crc kubenswrapper[4861]: I0309 09:47:52.599227 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7c47d068-c590-40cb-aeb0-1cc5132d40dd-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-jbbn4\" (UID: \"7c47d068-c590-40cb-aeb0-1cc5132d40dd\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jbbn4" Mar 09 09:47:52 crc kubenswrapper[4861]: I0309 09:47:52.599294 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/7c47d068-c590-40cb-aeb0-1cc5132d40dd-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-jbbn4\" (UID: \"7c47d068-c590-40cb-aeb0-1cc5132d40dd\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jbbn4" Mar 09 09:47:52 crc kubenswrapper[4861]: I0309 09:47:52.599311 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/7c47d068-c590-40cb-aeb0-1cc5132d40dd-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-jbbn4\" (UID: \"7c47d068-c590-40cb-aeb0-1cc5132d40dd\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jbbn4" Mar 09 09:47:52 crc kubenswrapper[4861]: I0309 09:47:52.701042 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zgpgj\" (UniqueName: \"kubernetes.io/projected/7c47d068-c590-40cb-aeb0-1cc5132d40dd-kube-api-access-zgpgj\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-jbbn4\" (UID: \"7c47d068-c590-40cb-aeb0-1cc5132d40dd\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jbbn4" Mar 09 09:47:52 crc kubenswrapper[4861]: I0309 09:47:52.701097 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7c47d068-c590-40cb-aeb0-1cc5132d40dd-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-jbbn4\" (UID: \"7c47d068-c590-40cb-aeb0-1cc5132d40dd\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jbbn4" Mar 09 09:47:52 crc kubenswrapper[4861]: I0309 09:47:52.701209 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/7c47d068-c590-40cb-aeb0-1cc5132d40dd-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-jbbn4\" (UID: \"7c47d068-c590-40cb-aeb0-1cc5132d40dd\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jbbn4" Mar 09 09:47:52 crc kubenswrapper[4861]: I0309 09:47:52.701235 4861 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/7c47d068-c590-40cb-aeb0-1cc5132d40dd-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-jbbn4\" (UID: \"7c47d068-c590-40cb-aeb0-1cc5132d40dd\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jbbn4" Mar 09 09:47:52 crc kubenswrapper[4861]: I0309 09:47:52.701289 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c47d068-c590-40cb-aeb0-1cc5132d40dd-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-jbbn4\" (UID: \"7c47d068-c590-40cb-aeb0-1cc5132d40dd\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jbbn4" Mar 09 09:47:52 crc kubenswrapper[4861]: I0309 09:47:52.701397 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7c47d068-c590-40cb-aeb0-1cc5132d40dd-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-jbbn4\" (UID: \"7c47d068-c590-40cb-aeb0-1cc5132d40dd\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jbbn4" Mar 09 09:47:52 crc kubenswrapper[4861]: I0309 09:47:52.701496 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/7c47d068-c590-40cb-aeb0-1cc5132d40dd-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-jbbn4\" (UID: \"7c47d068-c590-40cb-aeb0-1cc5132d40dd\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jbbn4" Mar 09 09:47:52 crc kubenswrapper[4861]: I0309 09:47:52.705257 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: 
\"kubernetes.io/secret/7c47d068-c590-40cb-aeb0-1cc5132d40dd-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-jbbn4\" (UID: \"7c47d068-c590-40cb-aeb0-1cc5132d40dd\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jbbn4" Mar 09 09:47:52 crc kubenswrapper[4861]: I0309 09:47:52.706408 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7c47d068-c590-40cb-aeb0-1cc5132d40dd-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-jbbn4\" (UID: \"7c47d068-c590-40cb-aeb0-1cc5132d40dd\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jbbn4" Mar 09 09:47:52 crc kubenswrapper[4861]: I0309 09:47:52.707340 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7c47d068-c590-40cb-aeb0-1cc5132d40dd-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-jbbn4\" (UID: \"7c47d068-c590-40cb-aeb0-1cc5132d40dd\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jbbn4" Mar 09 09:47:52 crc kubenswrapper[4861]: I0309 09:47:52.707447 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c47d068-c590-40cb-aeb0-1cc5132d40dd-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-jbbn4\" (UID: \"7c47d068-c590-40cb-aeb0-1cc5132d40dd\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jbbn4" Mar 09 09:47:52 crc kubenswrapper[4861]: I0309 09:47:52.707767 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/7c47d068-c590-40cb-aeb0-1cc5132d40dd-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-jbbn4\" (UID: \"7c47d068-c590-40cb-aeb0-1cc5132d40dd\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jbbn4" Mar 09 09:47:52 crc kubenswrapper[4861]: I0309 09:47:52.708588 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/7c47d068-c590-40cb-aeb0-1cc5132d40dd-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-jbbn4\" (UID: \"7c47d068-c590-40cb-aeb0-1cc5132d40dd\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jbbn4" Mar 09 09:47:52 crc kubenswrapper[4861]: I0309 09:47:52.723295 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgpgj\" (UniqueName: \"kubernetes.io/projected/7c47d068-c590-40cb-aeb0-1cc5132d40dd-kube-api-access-zgpgj\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-jbbn4\" (UID: \"7c47d068-c590-40cb-aeb0-1cc5132d40dd\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jbbn4" Mar 09 09:47:52 crc kubenswrapper[4861]: I0309 09:47:52.846650 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jbbn4" Mar 09 09:47:53 crc kubenswrapper[4861]: I0309 09:47:53.377849 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jbbn4"] Mar 09 09:47:53 crc kubenswrapper[4861]: I0309 09:47:53.423793 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jbbn4" event={"ID":"7c47d068-c590-40cb-aeb0-1cc5132d40dd","Type":"ContainerStarted","Data":"4d67ee5c709b4749c8e321dd249b807841e4764d228d2bb3f5b950ed87db18fa"} Mar 09 09:47:54 crc kubenswrapper[4861]: I0309 09:47:54.433455 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jbbn4" event={"ID":"7c47d068-c590-40cb-aeb0-1cc5132d40dd","Type":"ContainerStarted","Data":"446f3208fec488850997331fb4b8bebe46db061f436e28183892be6177203e1b"} Mar 09 09:47:54 crc kubenswrapper[4861]: I0309 09:47:54.459868 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jbbn4" podStartSLOduration=1.995280732 podStartE2EDuration="2.459838489s" podCreationTimestamp="2026-03-09 09:47:52 +0000 UTC" firstStartedPulling="2026-03-09 09:47:53.389687474 +0000 UTC m=+2516.474726875" lastFinishedPulling="2026-03-09 09:47:53.854245241 +0000 UTC m=+2516.939284632" observedRunningTime="2026-03-09 09:47:54.453024903 +0000 UTC m=+2517.538064314" watchObservedRunningTime="2026-03-09 09:47:54.459838489 +0000 UTC m=+2517.544877890" Mar 09 09:48:00 crc kubenswrapper[4861]: I0309 09:48:00.135568 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550828-2595v"] Mar 09 09:48:00 crc kubenswrapper[4861]: I0309 09:48:00.137982 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550828-2595v" Mar 09 09:48:00 crc kubenswrapper[4861]: I0309 09:48:00.141000 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 09:48:00 crc kubenswrapper[4861]: I0309 09:48:00.141020 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 09:48:00 crc kubenswrapper[4861]: I0309 09:48:00.141132 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k86l8" Mar 09 09:48:00 crc kubenswrapper[4861]: I0309 09:48:00.160949 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550828-2595v"] Mar 09 09:48:00 crc kubenswrapper[4861]: I0309 09:48:00.250515 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8q6hj\" (UniqueName: \"kubernetes.io/projected/ae7917dd-e857-4eb8-bbc3-bcc88452f4d5-kube-api-access-8q6hj\") pod \"auto-csr-approver-29550828-2595v\" (UID: \"ae7917dd-e857-4eb8-bbc3-bcc88452f4d5\") " pod="openshift-infra/auto-csr-approver-29550828-2595v" Mar 09 09:48:00 crc kubenswrapper[4861]: I0309 09:48:00.352547 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8q6hj\" (UniqueName: \"kubernetes.io/projected/ae7917dd-e857-4eb8-bbc3-bcc88452f4d5-kube-api-access-8q6hj\") pod \"auto-csr-approver-29550828-2595v\" (UID: \"ae7917dd-e857-4eb8-bbc3-bcc88452f4d5\") " pod="openshift-infra/auto-csr-approver-29550828-2595v" Mar 09 09:48:00 crc kubenswrapper[4861]: I0309 09:48:00.373172 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8q6hj\" (UniqueName: \"kubernetes.io/projected/ae7917dd-e857-4eb8-bbc3-bcc88452f4d5-kube-api-access-8q6hj\") pod \"auto-csr-approver-29550828-2595v\" (UID: \"ae7917dd-e857-4eb8-bbc3-bcc88452f4d5\") " 
pod="openshift-infra/auto-csr-approver-29550828-2595v" Mar 09 09:48:00 crc kubenswrapper[4861]: I0309 09:48:00.460041 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550828-2595v" Mar 09 09:48:00 crc kubenswrapper[4861]: I0309 09:48:00.891426 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550828-2595v"] Mar 09 09:48:01 crc kubenswrapper[4861]: I0309 09:48:01.494149 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550828-2595v" event={"ID":"ae7917dd-e857-4eb8-bbc3-bcc88452f4d5","Type":"ContainerStarted","Data":"f2304e9da715dc2d01b45461f360a961d44b7008acd79f4ca451f570fb3b97e3"} Mar 09 09:48:02 crc kubenswrapper[4861]: I0309 09:48:02.504059 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550828-2595v" event={"ID":"ae7917dd-e857-4eb8-bbc3-bcc88452f4d5","Type":"ContainerStarted","Data":"9ec00bfcb40742a5d129a60b8b76d443a9f3b4fc4afda680a4afb795f16d066e"} Mar 09 09:48:02 crc kubenswrapper[4861]: I0309 09:48:02.522160 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29550828-2595v" podStartSLOduration=1.228551491 podStartE2EDuration="2.522136507s" podCreationTimestamp="2026-03-09 09:48:00 +0000 UTC" firstStartedPulling="2026-03-09 09:48:00.903626828 +0000 UTC m=+2523.988666229" lastFinishedPulling="2026-03-09 09:48:02.197211854 +0000 UTC m=+2525.282251245" observedRunningTime="2026-03-09 09:48:02.518319333 +0000 UTC m=+2525.603358734" watchObservedRunningTime="2026-03-09 09:48:02.522136507 +0000 UTC m=+2525.607175908" Mar 09 09:48:02 crc kubenswrapper[4861]: I0309 09:48:02.658565 4861 scope.go:117] "RemoveContainer" containerID="480298580e8917b47d66a4586a753752f58dc7e4c2678e95158c475a3b504483" Mar 09 09:48:02 crc kubenswrapper[4861]: E0309 09:48:02.658842 4861 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5g7gc_openshift-machine-config-operator(6f7875e3-174f-4c67-8675-d878de74aa4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" Mar 09 09:48:03 crc kubenswrapper[4861]: I0309 09:48:03.513965 4861 generic.go:334] "Generic (PLEG): container finished" podID="ae7917dd-e857-4eb8-bbc3-bcc88452f4d5" containerID="9ec00bfcb40742a5d129a60b8b76d443a9f3b4fc4afda680a4afb795f16d066e" exitCode=0 Mar 09 09:48:03 crc kubenswrapper[4861]: I0309 09:48:03.514016 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550828-2595v" event={"ID":"ae7917dd-e857-4eb8-bbc3-bcc88452f4d5","Type":"ContainerDied","Data":"9ec00bfcb40742a5d129a60b8b76d443a9f3b4fc4afda680a4afb795f16d066e"} Mar 09 09:48:04 crc kubenswrapper[4861]: I0309 09:48:04.845736 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550828-2595v" Mar 09 09:48:04 crc kubenswrapper[4861]: I0309 09:48:04.947923 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8q6hj\" (UniqueName: \"kubernetes.io/projected/ae7917dd-e857-4eb8-bbc3-bcc88452f4d5-kube-api-access-8q6hj\") pod \"ae7917dd-e857-4eb8-bbc3-bcc88452f4d5\" (UID: \"ae7917dd-e857-4eb8-bbc3-bcc88452f4d5\") " Mar 09 09:48:04 crc kubenswrapper[4861]: I0309 09:48:04.953831 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae7917dd-e857-4eb8-bbc3-bcc88452f4d5-kube-api-access-8q6hj" (OuterVolumeSpecName: "kube-api-access-8q6hj") pod "ae7917dd-e857-4eb8-bbc3-bcc88452f4d5" (UID: "ae7917dd-e857-4eb8-bbc3-bcc88452f4d5"). InnerVolumeSpecName "kube-api-access-8q6hj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:48:05 crc kubenswrapper[4861]: I0309 09:48:05.050240 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8q6hj\" (UniqueName: \"kubernetes.io/projected/ae7917dd-e857-4eb8-bbc3-bcc88452f4d5-kube-api-access-8q6hj\") on node \"crc\" DevicePath \"\"" Mar 09 09:48:05 crc kubenswrapper[4861]: I0309 09:48:05.531881 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550828-2595v" event={"ID":"ae7917dd-e857-4eb8-bbc3-bcc88452f4d5","Type":"ContainerDied","Data":"f2304e9da715dc2d01b45461f360a961d44b7008acd79f4ca451f570fb3b97e3"} Mar 09 09:48:05 crc kubenswrapper[4861]: I0309 09:48:05.531952 4861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f2304e9da715dc2d01b45461f360a961d44b7008acd79f4ca451f570fb3b97e3" Mar 09 09:48:05 crc kubenswrapper[4861]: I0309 09:48:05.531923 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550828-2595v" Mar 09 09:48:05 crc kubenswrapper[4861]: I0309 09:48:05.586560 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550822-58lps"] Mar 09 09:48:05 crc kubenswrapper[4861]: I0309 09:48:05.593803 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550822-58lps"] Mar 09 09:48:05 crc kubenswrapper[4861]: I0309 09:48:05.669248 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66860c2d-79a4-4a15-a54f-b73d7437ddf3" path="/var/lib/kubelet/pods/66860c2d-79a4-4a15-a54f-b73d7437ddf3/volumes" Mar 09 09:48:13 crc kubenswrapper[4861]: I0309 09:48:13.658282 4861 scope.go:117] "RemoveContainer" containerID="480298580e8917b47d66a4586a753752f58dc7e4c2678e95158c475a3b504483" Mar 09 09:48:13 crc kubenswrapper[4861]: E0309 09:48:13.659195 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5g7gc_openshift-machine-config-operator(6f7875e3-174f-4c67-8675-d878de74aa4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" Mar 09 09:48:13 crc kubenswrapper[4861]: I0309 09:48:13.982635 4861 scope.go:117] "RemoveContainer" containerID="8731182a628b4853ca5ecae70837211ca10e117901062127b0e4ec4447097c98" Mar 09 09:48:27 crc kubenswrapper[4861]: I0309 09:48:27.665565 4861 scope.go:117] "RemoveContainer" containerID="480298580e8917b47d66a4586a753752f58dc7e4c2678e95158c475a3b504483" Mar 09 09:48:27 crc kubenswrapper[4861]: E0309 09:48:27.666650 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5g7gc_openshift-machine-config-operator(6f7875e3-174f-4c67-8675-d878de74aa4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" Mar 09 09:48:42 crc kubenswrapper[4861]: I0309 09:48:42.658100 4861 scope.go:117] "RemoveContainer" containerID="480298580e8917b47d66a4586a753752f58dc7e4c2678e95158c475a3b504483" Mar 09 09:48:42 crc kubenswrapper[4861]: E0309 09:48:42.658821 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5g7gc_openshift-machine-config-operator(6f7875e3-174f-4c67-8675-d878de74aa4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" Mar 09 09:48:53 crc kubenswrapper[4861]: I0309 09:48:53.658930 4861 scope.go:117] "RemoveContainer" 
containerID="480298580e8917b47d66a4586a753752f58dc7e4c2678e95158c475a3b504483" Mar 09 09:48:53 crc kubenswrapper[4861]: E0309 09:48:53.660007 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5g7gc_openshift-machine-config-operator(6f7875e3-174f-4c67-8675-d878de74aa4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" Mar 09 09:49:06 crc kubenswrapper[4861]: I0309 09:49:06.658430 4861 scope.go:117] "RemoveContainer" containerID="480298580e8917b47d66a4586a753752f58dc7e4c2678e95158c475a3b504483" Mar 09 09:49:08 crc kubenswrapper[4861]: I0309 09:49:08.300790 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" event={"ID":"6f7875e3-174f-4c67-8675-d878de74aa4f","Type":"ContainerStarted","Data":"e048477124348cf603efe8e3f38bd683df1e311fb377d2679ee9dcf66a493f5e"} Mar 09 09:50:00 crc kubenswrapper[4861]: I0309 09:50:00.147185 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550830-xb7vc"] Mar 09 09:50:00 crc kubenswrapper[4861]: E0309 09:50:00.148153 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae7917dd-e857-4eb8-bbc3-bcc88452f4d5" containerName="oc" Mar 09 09:50:00 crc kubenswrapper[4861]: I0309 09:50:00.148166 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae7917dd-e857-4eb8-bbc3-bcc88452f4d5" containerName="oc" Mar 09 09:50:00 crc kubenswrapper[4861]: I0309 09:50:00.148387 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae7917dd-e857-4eb8-bbc3-bcc88452f4d5" containerName="oc" Mar 09 09:50:00 crc kubenswrapper[4861]: I0309 09:50:00.148993 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550830-xb7vc" Mar 09 09:50:00 crc kubenswrapper[4861]: I0309 09:50:00.151315 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 09:50:00 crc kubenswrapper[4861]: I0309 09:50:00.151390 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 09:50:00 crc kubenswrapper[4861]: I0309 09:50:00.151424 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k86l8" Mar 09 09:50:00 crc kubenswrapper[4861]: I0309 09:50:00.158763 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550830-xb7vc"] Mar 09 09:50:00 crc kubenswrapper[4861]: I0309 09:50:00.309931 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lk7ms\" (UniqueName: \"kubernetes.io/projected/4c10f203-a354-498a-9c0e-07a612cd16b0-kube-api-access-lk7ms\") pod \"auto-csr-approver-29550830-xb7vc\" (UID: \"4c10f203-a354-498a-9c0e-07a612cd16b0\") " pod="openshift-infra/auto-csr-approver-29550830-xb7vc" Mar 09 09:50:00 crc kubenswrapper[4861]: I0309 09:50:00.412285 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lk7ms\" (UniqueName: \"kubernetes.io/projected/4c10f203-a354-498a-9c0e-07a612cd16b0-kube-api-access-lk7ms\") pod \"auto-csr-approver-29550830-xb7vc\" (UID: \"4c10f203-a354-498a-9c0e-07a612cd16b0\") " pod="openshift-infra/auto-csr-approver-29550830-xb7vc" Mar 09 09:50:00 crc kubenswrapper[4861]: I0309 09:50:00.436335 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lk7ms\" (UniqueName: \"kubernetes.io/projected/4c10f203-a354-498a-9c0e-07a612cd16b0-kube-api-access-lk7ms\") pod \"auto-csr-approver-29550830-xb7vc\" (UID: \"4c10f203-a354-498a-9c0e-07a612cd16b0\") " 
pod="openshift-infra/auto-csr-approver-29550830-xb7vc" Mar 09 09:50:00 crc kubenswrapper[4861]: I0309 09:50:00.472950 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550830-xb7vc" Mar 09 09:50:00 crc kubenswrapper[4861]: I0309 09:50:00.917586 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550830-xb7vc"] Mar 09 09:50:01 crc kubenswrapper[4861]: I0309 09:50:01.796014 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550830-xb7vc" event={"ID":"4c10f203-a354-498a-9c0e-07a612cd16b0","Type":"ContainerStarted","Data":"d1c879ad95fec791f0d4c4bc17f06a055b2ca09ad0faea2b581b3a810cdaf8cd"} Mar 09 09:50:02 crc kubenswrapper[4861]: I0309 09:50:02.806811 4861 generic.go:334] "Generic (PLEG): container finished" podID="4c10f203-a354-498a-9c0e-07a612cd16b0" containerID="c57bf3d45e8f27c7e72879defa1c19e7e9eef9510d7a7f05aa700a45703446dc" exitCode=0 Mar 09 09:50:02 crc kubenswrapper[4861]: I0309 09:50:02.806894 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550830-xb7vc" event={"ID":"4c10f203-a354-498a-9c0e-07a612cd16b0","Type":"ContainerDied","Data":"c57bf3d45e8f27c7e72879defa1c19e7e9eef9510d7a7f05aa700a45703446dc"} Mar 09 09:50:04 crc kubenswrapper[4861]: I0309 09:50:04.132680 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550830-xb7vc" Mar 09 09:50:04 crc kubenswrapper[4861]: I0309 09:50:04.286527 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lk7ms\" (UniqueName: \"kubernetes.io/projected/4c10f203-a354-498a-9c0e-07a612cd16b0-kube-api-access-lk7ms\") pod \"4c10f203-a354-498a-9c0e-07a612cd16b0\" (UID: \"4c10f203-a354-498a-9c0e-07a612cd16b0\") " Mar 09 09:50:04 crc kubenswrapper[4861]: I0309 09:50:04.294098 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c10f203-a354-498a-9c0e-07a612cd16b0-kube-api-access-lk7ms" (OuterVolumeSpecName: "kube-api-access-lk7ms") pod "4c10f203-a354-498a-9c0e-07a612cd16b0" (UID: "4c10f203-a354-498a-9c0e-07a612cd16b0"). InnerVolumeSpecName "kube-api-access-lk7ms". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:50:04 crc kubenswrapper[4861]: I0309 09:50:04.388715 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lk7ms\" (UniqueName: \"kubernetes.io/projected/4c10f203-a354-498a-9c0e-07a612cd16b0-kube-api-access-lk7ms\") on node \"crc\" DevicePath \"\"" Mar 09 09:50:04 crc kubenswrapper[4861]: I0309 09:50:04.830708 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550830-xb7vc" event={"ID":"4c10f203-a354-498a-9c0e-07a612cd16b0","Type":"ContainerDied","Data":"d1c879ad95fec791f0d4c4bc17f06a055b2ca09ad0faea2b581b3a810cdaf8cd"} Mar 09 09:50:04 crc kubenswrapper[4861]: I0309 09:50:04.830773 4861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d1c879ad95fec791f0d4c4bc17f06a055b2ca09ad0faea2b581b3a810cdaf8cd" Mar 09 09:50:04 crc kubenswrapper[4861]: I0309 09:50:04.830836 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550830-xb7vc" Mar 09 09:50:05 crc kubenswrapper[4861]: I0309 09:50:05.204145 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550824-mzf4x"] Mar 09 09:50:05 crc kubenswrapper[4861]: I0309 09:50:05.213259 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550824-mzf4x"] Mar 09 09:50:05 crc kubenswrapper[4861]: I0309 09:50:05.671531 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42ad43dd-8bcf-403a-b428-cd7371f0bd1e" path="/var/lib/kubelet/pods/42ad43dd-8bcf-403a-b428-cd7371f0bd1e/volumes" Mar 09 09:50:07 crc kubenswrapper[4861]: I0309 09:50:07.859603 4861 generic.go:334] "Generic (PLEG): container finished" podID="7c47d068-c590-40cb-aeb0-1cc5132d40dd" containerID="446f3208fec488850997331fb4b8bebe46db061f436e28183892be6177203e1b" exitCode=0 Mar 09 09:50:07 crc kubenswrapper[4861]: I0309 09:50:07.859672 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jbbn4" event={"ID":"7c47d068-c590-40cb-aeb0-1cc5132d40dd","Type":"ContainerDied","Data":"446f3208fec488850997331fb4b8bebe46db061f436e28183892be6177203e1b"} Mar 09 09:50:09 crc kubenswrapper[4861]: I0309 09:50:09.299601 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jbbn4" Mar 09 09:50:09 crc kubenswrapper[4861]: I0309 09:50:09.483616 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/7c47d068-c590-40cb-aeb0-1cc5132d40dd-ceilometer-compute-config-data-1\") pod \"7c47d068-c590-40cb-aeb0-1cc5132d40dd\" (UID: \"7c47d068-c590-40cb-aeb0-1cc5132d40dd\") " Mar 09 09:50:09 crc kubenswrapper[4861]: I0309 09:50:09.483686 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7c47d068-c590-40cb-aeb0-1cc5132d40dd-ssh-key-openstack-edpm-ipam\") pod \"7c47d068-c590-40cb-aeb0-1cc5132d40dd\" (UID: \"7c47d068-c590-40cb-aeb0-1cc5132d40dd\") " Mar 09 09:50:09 crc kubenswrapper[4861]: I0309 09:50:09.483849 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7c47d068-c590-40cb-aeb0-1cc5132d40dd-inventory\") pod \"7c47d068-c590-40cb-aeb0-1cc5132d40dd\" (UID: \"7c47d068-c590-40cb-aeb0-1cc5132d40dd\") " Mar 09 09:50:09 crc kubenswrapper[4861]: I0309 09:50:09.483914 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/7c47d068-c590-40cb-aeb0-1cc5132d40dd-ceilometer-compute-config-data-0\") pod \"7c47d068-c590-40cb-aeb0-1cc5132d40dd\" (UID: \"7c47d068-c590-40cb-aeb0-1cc5132d40dd\") " Mar 09 09:50:09 crc kubenswrapper[4861]: I0309 09:50:09.483975 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c47d068-c590-40cb-aeb0-1cc5132d40dd-telemetry-combined-ca-bundle\") pod \"7c47d068-c590-40cb-aeb0-1cc5132d40dd\" (UID: \"7c47d068-c590-40cb-aeb0-1cc5132d40dd\") " Mar 09 09:50:09 crc 
kubenswrapper[4861]: I0309 09:50:09.484070 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/7c47d068-c590-40cb-aeb0-1cc5132d40dd-ceilometer-compute-config-data-2\") pod \"7c47d068-c590-40cb-aeb0-1cc5132d40dd\" (UID: \"7c47d068-c590-40cb-aeb0-1cc5132d40dd\") " Mar 09 09:50:09 crc kubenswrapper[4861]: I0309 09:50:09.484119 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgpgj\" (UniqueName: \"kubernetes.io/projected/7c47d068-c590-40cb-aeb0-1cc5132d40dd-kube-api-access-zgpgj\") pod \"7c47d068-c590-40cb-aeb0-1cc5132d40dd\" (UID: \"7c47d068-c590-40cb-aeb0-1cc5132d40dd\") " Mar 09 09:50:09 crc kubenswrapper[4861]: I0309 09:50:09.491879 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c47d068-c590-40cb-aeb0-1cc5132d40dd-kube-api-access-zgpgj" (OuterVolumeSpecName: "kube-api-access-zgpgj") pod "7c47d068-c590-40cb-aeb0-1cc5132d40dd" (UID: "7c47d068-c590-40cb-aeb0-1cc5132d40dd"). InnerVolumeSpecName "kube-api-access-zgpgj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:50:09 crc kubenswrapper[4861]: I0309 09:50:09.494192 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c47d068-c590-40cb-aeb0-1cc5132d40dd-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "7c47d068-c590-40cb-aeb0-1cc5132d40dd" (UID: "7c47d068-c590-40cb-aeb0-1cc5132d40dd"). InnerVolumeSpecName "telemetry-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:50:09 crc kubenswrapper[4861]: I0309 09:50:09.519698 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c47d068-c590-40cb-aeb0-1cc5132d40dd-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "7c47d068-c590-40cb-aeb0-1cc5132d40dd" (UID: "7c47d068-c590-40cb-aeb0-1cc5132d40dd"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:50:09 crc kubenswrapper[4861]: I0309 09:50:09.519848 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c47d068-c590-40cb-aeb0-1cc5132d40dd-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "7c47d068-c590-40cb-aeb0-1cc5132d40dd" (UID: "7c47d068-c590-40cb-aeb0-1cc5132d40dd"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:50:09 crc kubenswrapper[4861]: I0309 09:50:09.523255 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c47d068-c590-40cb-aeb0-1cc5132d40dd-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "7c47d068-c590-40cb-aeb0-1cc5132d40dd" (UID: "7c47d068-c590-40cb-aeb0-1cc5132d40dd"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:50:09 crc kubenswrapper[4861]: I0309 09:50:09.523882 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c47d068-c590-40cb-aeb0-1cc5132d40dd-inventory" (OuterVolumeSpecName: "inventory") pod "7c47d068-c590-40cb-aeb0-1cc5132d40dd" (UID: "7c47d068-c590-40cb-aeb0-1cc5132d40dd"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:50:09 crc kubenswrapper[4861]: I0309 09:50:09.528584 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c47d068-c590-40cb-aeb0-1cc5132d40dd-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "7c47d068-c590-40cb-aeb0-1cc5132d40dd" (UID: "7c47d068-c590-40cb-aeb0-1cc5132d40dd"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 09:50:09 crc kubenswrapper[4861]: I0309 09:50:09.586815 4861 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7c47d068-c590-40cb-aeb0-1cc5132d40dd-inventory\") on node \"crc\" DevicePath \"\"" Mar 09 09:50:09 crc kubenswrapper[4861]: I0309 09:50:09.587043 4861 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/7c47d068-c590-40cb-aeb0-1cc5132d40dd-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Mar 09 09:50:09 crc kubenswrapper[4861]: I0309 09:50:09.587128 4861 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c47d068-c590-40cb-aeb0-1cc5132d40dd-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 09:50:09 crc kubenswrapper[4861]: I0309 09:50:09.587201 4861 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/7c47d068-c590-40cb-aeb0-1cc5132d40dd-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Mar 09 09:50:09 crc kubenswrapper[4861]: I0309 09:50:09.587305 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgpgj\" (UniqueName: \"kubernetes.io/projected/7c47d068-c590-40cb-aeb0-1cc5132d40dd-kube-api-access-zgpgj\") on node \"crc\" DevicePath \"\"" Mar 09 09:50:09 crc 
kubenswrapper[4861]: I0309 09:50:09.587406 4861 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/7c47d068-c590-40cb-aeb0-1cc5132d40dd-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Mar 09 09:50:09 crc kubenswrapper[4861]: I0309 09:50:09.587498 4861 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7c47d068-c590-40cb-aeb0-1cc5132d40dd-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 09 09:50:09 crc kubenswrapper[4861]: I0309 09:50:09.879852 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jbbn4" event={"ID":"7c47d068-c590-40cb-aeb0-1cc5132d40dd","Type":"ContainerDied","Data":"4d67ee5c709b4749c8e321dd249b807841e4764d228d2bb3f5b950ed87db18fa"} Mar 09 09:50:09 crc kubenswrapper[4861]: I0309 09:50:09.880095 4861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4d67ee5c709b4749c8e321dd249b807841e4764d228d2bb3f5b950ed87db18fa" Mar 09 09:50:09 crc kubenswrapper[4861]: I0309 09:50:09.879914 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jbbn4" Mar 09 09:50:14 crc kubenswrapper[4861]: I0309 09:50:14.103282 4861 scope.go:117] "RemoveContainer" containerID="871a945e24976a7d9cd96a754a3e83c5bc50b4a40a0eca9b1e9acb585ac6fb7c" Mar 09 09:50:38 crc kubenswrapper[4861]: I0309 09:50:38.351009 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-r4fdl"] Mar 09 09:50:38 crc kubenswrapper[4861]: E0309 09:50:38.352029 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c10f203-a354-498a-9c0e-07a612cd16b0" containerName="oc" Mar 09 09:50:38 crc kubenswrapper[4861]: I0309 09:50:38.352042 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c10f203-a354-498a-9c0e-07a612cd16b0" containerName="oc" Mar 09 09:50:38 crc kubenswrapper[4861]: E0309 09:50:38.352070 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c47d068-c590-40cb-aeb0-1cc5132d40dd" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 09 09:50:38 crc kubenswrapper[4861]: I0309 09:50:38.352078 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c47d068-c590-40cb-aeb0-1cc5132d40dd" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 09 09:50:38 crc kubenswrapper[4861]: I0309 09:50:38.352261 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c10f203-a354-498a-9c0e-07a612cd16b0" containerName="oc" Mar 09 09:50:38 crc kubenswrapper[4861]: I0309 09:50:38.352287 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c47d068-c590-40cb-aeb0-1cc5132d40dd" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 09 09:50:38 crc kubenswrapper[4861]: I0309 09:50:38.362239 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-r4fdl"] Mar 09 09:50:38 crc kubenswrapper[4861]: I0309 09:50:38.362348 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-r4fdl" Mar 09 09:50:38 crc kubenswrapper[4861]: I0309 09:50:38.436852 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0245bd08-438e-4a4a-820f-68b40ce22632-utilities\") pod \"community-operators-r4fdl\" (UID: \"0245bd08-438e-4a4a-820f-68b40ce22632\") " pod="openshift-marketplace/community-operators-r4fdl" Mar 09 09:50:38 crc kubenswrapper[4861]: I0309 09:50:38.436910 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqt6k\" (UniqueName: \"kubernetes.io/projected/0245bd08-438e-4a4a-820f-68b40ce22632-kube-api-access-mqt6k\") pod \"community-operators-r4fdl\" (UID: \"0245bd08-438e-4a4a-820f-68b40ce22632\") " pod="openshift-marketplace/community-operators-r4fdl" Mar 09 09:50:38 crc kubenswrapper[4861]: I0309 09:50:38.437024 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0245bd08-438e-4a4a-820f-68b40ce22632-catalog-content\") pod \"community-operators-r4fdl\" (UID: \"0245bd08-438e-4a4a-820f-68b40ce22632\") " pod="openshift-marketplace/community-operators-r4fdl" Mar 09 09:50:38 crc kubenswrapper[4861]: I0309 09:50:38.539454 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0245bd08-438e-4a4a-820f-68b40ce22632-catalog-content\") pod \"community-operators-r4fdl\" (UID: \"0245bd08-438e-4a4a-820f-68b40ce22632\") " pod="openshift-marketplace/community-operators-r4fdl" Mar 09 09:50:38 crc kubenswrapper[4861]: I0309 09:50:38.539599 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0245bd08-438e-4a4a-820f-68b40ce22632-utilities\") pod 
\"community-operators-r4fdl\" (UID: \"0245bd08-438e-4a4a-820f-68b40ce22632\") " pod="openshift-marketplace/community-operators-r4fdl" Mar 09 09:50:38 crc kubenswrapper[4861]: I0309 09:50:38.539641 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqt6k\" (UniqueName: \"kubernetes.io/projected/0245bd08-438e-4a4a-820f-68b40ce22632-kube-api-access-mqt6k\") pod \"community-operators-r4fdl\" (UID: \"0245bd08-438e-4a4a-820f-68b40ce22632\") " pod="openshift-marketplace/community-operators-r4fdl" Mar 09 09:50:38 crc kubenswrapper[4861]: I0309 09:50:38.540031 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0245bd08-438e-4a4a-820f-68b40ce22632-catalog-content\") pod \"community-operators-r4fdl\" (UID: \"0245bd08-438e-4a4a-820f-68b40ce22632\") " pod="openshift-marketplace/community-operators-r4fdl" Mar 09 09:50:38 crc kubenswrapper[4861]: I0309 09:50:38.540144 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0245bd08-438e-4a4a-820f-68b40ce22632-utilities\") pod \"community-operators-r4fdl\" (UID: \"0245bd08-438e-4a4a-820f-68b40ce22632\") " pod="openshift-marketplace/community-operators-r4fdl" Mar 09 09:50:38 crc kubenswrapper[4861]: I0309 09:50:38.560225 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqt6k\" (UniqueName: \"kubernetes.io/projected/0245bd08-438e-4a4a-820f-68b40ce22632-kube-api-access-mqt6k\") pod \"community-operators-r4fdl\" (UID: \"0245bd08-438e-4a4a-820f-68b40ce22632\") " pod="openshift-marketplace/community-operators-r4fdl" Mar 09 09:50:38 crc kubenswrapper[4861]: I0309 09:50:38.690414 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-r4fdl" Mar 09 09:50:39 crc kubenswrapper[4861]: I0309 09:50:39.317249 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-r4fdl"] Mar 09 09:50:40 crc kubenswrapper[4861]: I0309 09:50:40.146923 4861 generic.go:334] "Generic (PLEG): container finished" podID="0245bd08-438e-4a4a-820f-68b40ce22632" containerID="8dfd68a5130da373c0ade3c9149e3ca9b9e97035785c50fe209a9d18b08799e4" exitCode=0 Mar 09 09:50:40 crc kubenswrapper[4861]: I0309 09:50:40.146986 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r4fdl" event={"ID":"0245bd08-438e-4a4a-820f-68b40ce22632","Type":"ContainerDied","Data":"8dfd68a5130da373c0ade3c9149e3ca9b9e97035785c50fe209a9d18b08799e4"} Mar 09 09:50:40 crc kubenswrapper[4861]: I0309 09:50:40.147228 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r4fdl" event={"ID":"0245bd08-438e-4a4a-820f-68b40ce22632","Type":"ContainerStarted","Data":"4e19012ef39a8c2a4bd8b9f53e2bbea80e6a6186cefd8ac4d86417d8ecf5c2fa"} Mar 09 09:50:40 crc kubenswrapper[4861]: I0309 09:50:40.151531 4861 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 09 09:50:41 crc kubenswrapper[4861]: I0309 09:50:41.156890 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r4fdl" event={"ID":"0245bd08-438e-4a4a-820f-68b40ce22632","Type":"ContainerStarted","Data":"7102b122279fce487db974a970aa7efecc8b7368691c6f6b1a39a0d755425956"} Mar 09 09:50:42 crc kubenswrapper[4861]: I0309 09:50:42.168409 4861 generic.go:334] "Generic (PLEG): container finished" podID="0245bd08-438e-4a4a-820f-68b40ce22632" containerID="7102b122279fce487db974a970aa7efecc8b7368691c6f6b1a39a0d755425956" exitCode=0 Mar 09 09:50:42 crc kubenswrapper[4861]: I0309 09:50:42.168523 4861 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-r4fdl" event={"ID":"0245bd08-438e-4a4a-820f-68b40ce22632","Type":"ContainerDied","Data":"7102b122279fce487db974a970aa7efecc8b7368691c6f6b1a39a0d755425956"} Mar 09 09:50:43 crc kubenswrapper[4861]: I0309 09:50:43.182232 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r4fdl" event={"ID":"0245bd08-438e-4a4a-820f-68b40ce22632","Type":"ContainerStarted","Data":"c2703b5623481d90feb61c00fb002d129859c87b37289f0c70d66d313e3b1f83"} Mar 09 09:50:43 crc kubenswrapper[4861]: I0309 09:50:43.209112 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-r4fdl" podStartSLOduration=2.762430979 podStartE2EDuration="5.209087664s" podCreationTimestamp="2026-03-09 09:50:38 +0000 UTC" firstStartedPulling="2026-03-09 09:50:40.151235139 +0000 UTC m=+2683.236274540" lastFinishedPulling="2026-03-09 09:50:42.597891824 +0000 UTC m=+2685.682931225" observedRunningTime="2026-03-09 09:50:43.202508495 +0000 UTC m=+2686.287547916" watchObservedRunningTime="2026-03-09 09:50:43.209087664 +0000 UTC m=+2686.294127085" Mar 09 09:50:48 crc kubenswrapper[4861]: I0309 09:50:48.691331 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-r4fdl" Mar 09 09:50:48 crc kubenswrapper[4861]: I0309 09:50:48.691950 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-r4fdl" Mar 09 09:50:48 crc kubenswrapper[4861]: I0309 09:50:48.741885 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-r4fdl" Mar 09 09:50:49 crc kubenswrapper[4861]: I0309 09:50:49.288783 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-r4fdl" Mar 09 09:50:49 crc kubenswrapper[4861]: I0309 
09:50:49.346122 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-r4fdl"] Mar 09 09:50:51 crc kubenswrapper[4861]: I0309 09:50:51.259528 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-r4fdl" podUID="0245bd08-438e-4a4a-820f-68b40ce22632" containerName="registry-server" containerID="cri-o://c2703b5623481d90feb61c00fb002d129859c87b37289f0c70d66d313e3b1f83" gracePeriod=2 Mar 09 09:50:51 crc kubenswrapper[4861]: I0309 09:50:51.698836 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-r4fdl" Mar 09 09:50:51 crc kubenswrapper[4861]: I0309 09:50:51.827881 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0245bd08-438e-4a4a-820f-68b40ce22632-utilities\") pod \"0245bd08-438e-4a4a-820f-68b40ce22632\" (UID: \"0245bd08-438e-4a4a-820f-68b40ce22632\") " Mar 09 09:50:51 crc kubenswrapper[4861]: I0309 09:50:51.827936 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mqt6k\" (UniqueName: \"kubernetes.io/projected/0245bd08-438e-4a4a-820f-68b40ce22632-kube-api-access-mqt6k\") pod \"0245bd08-438e-4a4a-820f-68b40ce22632\" (UID: \"0245bd08-438e-4a4a-820f-68b40ce22632\") " Mar 09 09:50:51 crc kubenswrapper[4861]: I0309 09:50:51.828199 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0245bd08-438e-4a4a-820f-68b40ce22632-catalog-content\") pod \"0245bd08-438e-4a4a-820f-68b40ce22632\" (UID: \"0245bd08-438e-4a4a-820f-68b40ce22632\") " Mar 09 09:50:51 crc kubenswrapper[4861]: I0309 09:50:51.829593 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0245bd08-438e-4a4a-820f-68b40ce22632-utilities" (OuterVolumeSpecName: 
"utilities") pod "0245bd08-438e-4a4a-820f-68b40ce22632" (UID: "0245bd08-438e-4a4a-820f-68b40ce22632"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:50:51 crc kubenswrapper[4861]: I0309 09:50:51.833805 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0245bd08-438e-4a4a-820f-68b40ce22632-kube-api-access-mqt6k" (OuterVolumeSpecName: "kube-api-access-mqt6k") pod "0245bd08-438e-4a4a-820f-68b40ce22632" (UID: "0245bd08-438e-4a4a-820f-68b40ce22632"). InnerVolumeSpecName "kube-api-access-mqt6k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:50:51 crc kubenswrapper[4861]: I0309 09:50:51.887810 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0245bd08-438e-4a4a-820f-68b40ce22632-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0245bd08-438e-4a4a-820f-68b40ce22632" (UID: "0245bd08-438e-4a4a-820f-68b40ce22632"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:50:51 crc kubenswrapper[4861]: I0309 09:50:51.930025 4861 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0245bd08-438e-4a4a-820f-68b40ce22632-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 09:50:51 crc kubenswrapper[4861]: I0309 09:50:51.930063 4861 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0245bd08-438e-4a4a-820f-68b40ce22632-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 09:50:51 crc kubenswrapper[4861]: I0309 09:50:51.930075 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mqt6k\" (UniqueName: \"kubernetes.io/projected/0245bd08-438e-4a4a-820f-68b40ce22632-kube-api-access-mqt6k\") on node \"crc\" DevicePath \"\"" Mar 09 09:50:52 crc kubenswrapper[4861]: I0309 09:50:52.270512 4861 generic.go:334] "Generic (PLEG): container finished" podID="0245bd08-438e-4a4a-820f-68b40ce22632" containerID="c2703b5623481d90feb61c00fb002d129859c87b37289f0c70d66d313e3b1f83" exitCode=0 Mar 09 09:50:52 crc kubenswrapper[4861]: I0309 09:50:52.270563 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-r4fdl" Mar 09 09:50:52 crc kubenswrapper[4861]: I0309 09:50:52.270598 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r4fdl" event={"ID":"0245bd08-438e-4a4a-820f-68b40ce22632","Type":"ContainerDied","Data":"c2703b5623481d90feb61c00fb002d129859c87b37289f0c70d66d313e3b1f83"} Mar 09 09:50:52 crc kubenswrapper[4861]: I0309 09:50:52.270955 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r4fdl" event={"ID":"0245bd08-438e-4a4a-820f-68b40ce22632","Type":"ContainerDied","Data":"4e19012ef39a8c2a4bd8b9f53e2bbea80e6a6186cefd8ac4d86417d8ecf5c2fa"} Mar 09 09:50:52 crc kubenswrapper[4861]: I0309 09:50:52.271007 4861 scope.go:117] "RemoveContainer" containerID="c2703b5623481d90feb61c00fb002d129859c87b37289f0c70d66d313e3b1f83" Mar 09 09:50:52 crc kubenswrapper[4861]: I0309 09:50:52.299631 4861 scope.go:117] "RemoveContainer" containerID="7102b122279fce487db974a970aa7efecc8b7368691c6f6b1a39a0d755425956" Mar 09 09:50:52 crc kubenswrapper[4861]: I0309 09:50:52.308634 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-r4fdl"] Mar 09 09:50:52 crc kubenswrapper[4861]: I0309 09:50:52.317964 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-r4fdl"] Mar 09 09:50:52 crc kubenswrapper[4861]: I0309 09:50:52.321083 4861 scope.go:117] "RemoveContainer" containerID="8dfd68a5130da373c0ade3c9149e3ca9b9e97035785c50fe209a9d18b08799e4" Mar 09 09:50:52 crc kubenswrapper[4861]: I0309 09:50:52.374253 4861 scope.go:117] "RemoveContainer" containerID="c2703b5623481d90feb61c00fb002d129859c87b37289f0c70d66d313e3b1f83" Mar 09 09:50:52 crc kubenswrapper[4861]: E0309 09:50:52.374710 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"c2703b5623481d90feb61c00fb002d129859c87b37289f0c70d66d313e3b1f83\": container with ID starting with c2703b5623481d90feb61c00fb002d129859c87b37289f0c70d66d313e3b1f83 not found: ID does not exist" containerID="c2703b5623481d90feb61c00fb002d129859c87b37289f0c70d66d313e3b1f83" Mar 09 09:50:52 crc kubenswrapper[4861]: I0309 09:50:52.374750 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c2703b5623481d90feb61c00fb002d129859c87b37289f0c70d66d313e3b1f83"} err="failed to get container status \"c2703b5623481d90feb61c00fb002d129859c87b37289f0c70d66d313e3b1f83\": rpc error: code = NotFound desc = could not find container \"c2703b5623481d90feb61c00fb002d129859c87b37289f0c70d66d313e3b1f83\": container with ID starting with c2703b5623481d90feb61c00fb002d129859c87b37289f0c70d66d313e3b1f83 not found: ID does not exist" Mar 09 09:50:52 crc kubenswrapper[4861]: I0309 09:50:52.374772 4861 scope.go:117] "RemoveContainer" containerID="7102b122279fce487db974a970aa7efecc8b7368691c6f6b1a39a0d755425956" Mar 09 09:50:52 crc kubenswrapper[4861]: E0309 09:50:52.375052 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7102b122279fce487db974a970aa7efecc8b7368691c6f6b1a39a0d755425956\": container with ID starting with 7102b122279fce487db974a970aa7efecc8b7368691c6f6b1a39a0d755425956 not found: ID does not exist" containerID="7102b122279fce487db974a970aa7efecc8b7368691c6f6b1a39a0d755425956" Mar 09 09:50:52 crc kubenswrapper[4861]: I0309 09:50:52.375079 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7102b122279fce487db974a970aa7efecc8b7368691c6f6b1a39a0d755425956"} err="failed to get container status \"7102b122279fce487db974a970aa7efecc8b7368691c6f6b1a39a0d755425956\": rpc error: code = NotFound desc = could not find container \"7102b122279fce487db974a970aa7efecc8b7368691c6f6b1a39a0d755425956\": container with ID 
starting with 7102b122279fce487db974a970aa7efecc8b7368691c6f6b1a39a0d755425956 not found: ID does not exist" Mar 09 09:50:52 crc kubenswrapper[4861]: I0309 09:50:52.375093 4861 scope.go:117] "RemoveContainer" containerID="8dfd68a5130da373c0ade3c9149e3ca9b9e97035785c50fe209a9d18b08799e4" Mar 09 09:50:52 crc kubenswrapper[4861]: E0309 09:50:52.375363 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8dfd68a5130da373c0ade3c9149e3ca9b9e97035785c50fe209a9d18b08799e4\": container with ID starting with 8dfd68a5130da373c0ade3c9149e3ca9b9e97035785c50fe209a9d18b08799e4 not found: ID does not exist" containerID="8dfd68a5130da373c0ade3c9149e3ca9b9e97035785c50fe209a9d18b08799e4" Mar 09 09:50:52 crc kubenswrapper[4861]: I0309 09:50:52.375406 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8dfd68a5130da373c0ade3c9149e3ca9b9e97035785c50fe209a9d18b08799e4"} err="failed to get container status \"8dfd68a5130da373c0ade3c9149e3ca9b9e97035785c50fe209a9d18b08799e4\": rpc error: code = NotFound desc = could not find container \"8dfd68a5130da373c0ade3c9149e3ca9b9e97035785c50fe209a9d18b08799e4\": container with ID starting with 8dfd68a5130da373c0ade3c9149e3ca9b9e97035785c50fe209a9d18b08799e4 not found: ID does not exist" Mar 09 09:50:53 crc kubenswrapper[4861]: I0309 09:50:53.668394 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0245bd08-438e-4a4a-820f-68b40ce22632" path="/var/lib/kubelet/pods/0245bd08-438e-4a4a-820f-68b40ce22632/volumes" Mar 09 09:50:55 crc kubenswrapper[4861]: I0309 09:50:55.387130 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-7sklc"] Mar 09 09:50:55 crc kubenswrapper[4861]: E0309 09:50:55.388326 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0245bd08-438e-4a4a-820f-68b40ce22632" containerName="extract-utilities" Mar 09 09:50:55 crc 
kubenswrapper[4861]: I0309 09:50:55.388422 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="0245bd08-438e-4a4a-820f-68b40ce22632" containerName="extract-utilities" Mar 09 09:50:55 crc kubenswrapper[4861]: E0309 09:50:55.388495 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0245bd08-438e-4a4a-820f-68b40ce22632" containerName="registry-server" Mar 09 09:50:55 crc kubenswrapper[4861]: I0309 09:50:55.388548 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="0245bd08-438e-4a4a-820f-68b40ce22632" containerName="registry-server" Mar 09 09:50:55 crc kubenswrapper[4861]: E0309 09:50:55.388602 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0245bd08-438e-4a4a-820f-68b40ce22632" containerName="extract-content" Mar 09 09:50:55 crc kubenswrapper[4861]: I0309 09:50:55.388652 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="0245bd08-438e-4a4a-820f-68b40ce22632" containerName="extract-content" Mar 09 09:50:55 crc kubenswrapper[4861]: I0309 09:50:55.388925 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="0245bd08-438e-4a4a-820f-68b40ce22632" containerName="registry-server" Mar 09 09:50:55 crc kubenswrapper[4861]: I0309 09:50:55.390512 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7sklc" Mar 09 09:50:55 crc kubenswrapper[4861]: I0309 09:50:55.404614 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7sklc"] Mar 09 09:50:55 crc kubenswrapper[4861]: I0309 09:50:55.495707 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8757a5e-38eb-423f-b20c-e5ef7d947958-catalog-content\") pod \"redhat-operators-7sklc\" (UID: \"e8757a5e-38eb-423f-b20c-e5ef7d947958\") " pod="openshift-marketplace/redhat-operators-7sklc" Mar 09 09:50:55 crc kubenswrapper[4861]: I0309 09:50:55.496115 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlkvx\" (UniqueName: \"kubernetes.io/projected/e8757a5e-38eb-423f-b20c-e5ef7d947958-kube-api-access-qlkvx\") pod \"redhat-operators-7sklc\" (UID: \"e8757a5e-38eb-423f-b20c-e5ef7d947958\") " pod="openshift-marketplace/redhat-operators-7sklc" Mar 09 09:50:55 crc kubenswrapper[4861]: I0309 09:50:55.496313 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8757a5e-38eb-423f-b20c-e5ef7d947958-utilities\") pod \"redhat-operators-7sklc\" (UID: \"e8757a5e-38eb-423f-b20c-e5ef7d947958\") " pod="openshift-marketplace/redhat-operators-7sklc" Mar 09 09:50:55 crc kubenswrapper[4861]: I0309 09:50:55.597803 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8757a5e-38eb-423f-b20c-e5ef7d947958-catalog-content\") pod \"redhat-operators-7sklc\" (UID: \"e8757a5e-38eb-423f-b20c-e5ef7d947958\") " pod="openshift-marketplace/redhat-operators-7sklc" Mar 09 09:50:55 crc kubenswrapper[4861]: I0309 09:50:55.597944 4861 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-qlkvx\" (UniqueName: \"kubernetes.io/projected/e8757a5e-38eb-423f-b20c-e5ef7d947958-kube-api-access-qlkvx\") pod \"redhat-operators-7sklc\" (UID: \"e8757a5e-38eb-423f-b20c-e5ef7d947958\") " pod="openshift-marketplace/redhat-operators-7sklc" Mar 09 09:50:55 crc kubenswrapper[4861]: I0309 09:50:55.598016 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8757a5e-38eb-423f-b20c-e5ef7d947958-utilities\") pod \"redhat-operators-7sklc\" (UID: \"e8757a5e-38eb-423f-b20c-e5ef7d947958\") " pod="openshift-marketplace/redhat-operators-7sklc" Mar 09 09:50:55 crc kubenswrapper[4861]: I0309 09:50:55.598339 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8757a5e-38eb-423f-b20c-e5ef7d947958-utilities\") pod \"redhat-operators-7sklc\" (UID: \"e8757a5e-38eb-423f-b20c-e5ef7d947958\") " pod="openshift-marketplace/redhat-operators-7sklc" Mar 09 09:50:55 crc kubenswrapper[4861]: I0309 09:50:55.598340 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8757a5e-38eb-423f-b20c-e5ef7d947958-catalog-content\") pod \"redhat-operators-7sklc\" (UID: \"e8757a5e-38eb-423f-b20c-e5ef7d947958\") " pod="openshift-marketplace/redhat-operators-7sklc" Mar 09 09:50:55 crc kubenswrapper[4861]: I0309 09:50:55.621744 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlkvx\" (UniqueName: \"kubernetes.io/projected/e8757a5e-38eb-423f-b20c-e5ef7d947958-kube-api-access-qlkvx\") pod \"redhat-operators-7sklc\" (UID: \"e8757a5e-38eb-423f-b20c-e5ef7d947958\") " pod="openshift-marketplace/redhat-operators-7sklc" Mar 09 09:50:55 crc kubenswrapper[4861]: I0309 09:50:55.723357 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7sklc" Mar 09 09:50:56 crc kubenswrapper[4861]: I0309 09:50:56.235241 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7sklc"] Mar 09 09:50:56 crc kubenswrapper[4861]: I0309 09:50:56.307294 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7sklc" event={"ID":"e8757a5e-38eb-423f-b20c-e5ef7d947958","Type":"ContainerStarted","Data":"61a4b3595991412cb0aa0df46ee07f12e6224038c37477cc9c05e810835a30ec"} Mar 09 09:50:57 crc kubenswrapper[4861]: I0309 09:50:57.317241 4861 generic.go:334] "Generic (PLEG): container finished" podID="e8757a5e-38eb-423f-b20c-e5ef7d947958" containerID="ca30a97092e41c8173c8748073949870c7034239781f56c0bc1310049508a473" exitCode=0 Mar 09 09:50:57 crc kubenswrapper[4861]: I0309 09:50:57.317355 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7sklc" event={"ID":"e8757a5e-38eb-423f-b20c-e5ef7d947958","Type":"ContainerDied","Data":"ca30a97092e41c8173c8748073949870c7034239781f56c0bc1310049508a473"} Mar 09 09:50:58 crc kubenswrapper[4861]: I0309 09:50:58.330324 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7sklc" event={"ID":"e8757a5e-38eb-423f-b20c-e5ef7d947958","Type":"ContainerStarted","Data":"b8fd48a0256a65eca391fe60597d733d78940edeb4189c48255e0b5193c8b286"} Mar 09 09:50:59 crc kubenswrapper[4861]: I0309 09:50:59.340693 4861 generic.go:334] "Generic (PLEG): container finished" podID="e8757a5e-38eb-423f-b20c-e5ef7d947958" containerID="b8fd48a0256a65eca391fe60597d733d78940edeb4189c48255e0b5193c8b286" exitCode=0 Mar 09 09:50:59 crc kubenswrapper[4861]: I0309 09:50:59.340896 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7sklc" 
event={"ID":"e8757a5e-38eb-423f-b20c-e5ef7d947958","Type":"ContainerDied","Data":"b8fd48a0256a65eca391fe60597d733d78940edeb4189c48255e0b5193c8b286"} Mar 09 09:50:59 crc kubenswrapper[4861]: I0309 09:50:59.351290 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Mar 09 09:50:59 crc kubenswrapper[4861]: I0309 09:50:59.353021 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 09 09:50:59 crc kubenswrapper[4861]: I0309 09:50:59.355487 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Mar 09 09:50:59 crc kubenswrapper[4861]: I0309 09:50:59.355534 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Mar 09 09:50:59 crc kubenswrapper[4861]: I0309 09:50:59.355721 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Mar 09 09:50:59 crc kubenswrapper[4861]: I0309 09:50:59.355823 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-mkjkn" Mar 09 09:50:59 crc kubenswrapper[4861]: I0309 09:50:59.368634 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Mar 09 09:50:59 crc kubenswrapper[4861]: I0309 09:50:59.371422 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"tempest-tests-tempest\" (UID: \"f7ed5e40-0dc4-417c-bef9-cbf919777c67\") " pod="openstack/tempest-tests-tempest" Mar 09 09:50:59 crc kubenswrapper[4861]: I0309 09:50:59.371580 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: 
\"kubernetes.io/empty-dir/f7ed5e40-0dc4-417c-bef9-cbf919777c67-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"f7ed5e40-0dc4-417c-bef9-cbf919777c67\") " pod="openstack/tempest-tests-tempest" Mar 09 09:50:59 crc kubenswrapper[4861]: I0309 09:50:59.371654 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f7ed5e40-0dc4-417c-bef9-cbf919777c67-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"f7ed5e40-0dc4-417c-bef9-cbf919777c67\") " pod="openstack/tempest-tests-tempest" Mar 09 09:50:59 crc kubenswrapper[4861]: I0309 09:50:59.371685 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f7ed5e40-0dc4-417c-bef9-cbf919777c67-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"f7ed5e40-0dc4-417c-bef9-cbf919777c67\") " pod="openstack/tempest-tests-tempest" Mar 09 09:50:59 crc kubenswrapper[4861]: I0309 09:50:59.371715 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/f7ed5e40-0dc4-417c-bef9-cbf919777c67-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"f7ed5e40-0dc4-417c-bef9-cbf919777c67\") " pod="openstack/tempest-tests-tempest" Mar 09 09:50:59 crc kubenswrapper[4861]: I0309 09:50:59.371753 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/f7ed5e40-0dc4-417c-bef9-cbf919777c67-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"f7ed5e40-0dc4-417c-bef9-cbf919777c67\") " pod="openstack/tempest-tests-tempest" Mar 09 09:50:59 crc kubenswrapper[4861]: I0309 09:50:59.371779 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/configmap/f7ed5e40-0dc4-417c-bef9-cbf919777c67-config-data\") pod \"tempest-tests-tempest\" (UID: \"f7ed5e40-0dc4-417c-bef9-cbf919777c67\") " pod="openstack/tempest-tests-tempest" Mar 09 09:50:59 crc kubenswrapper[4861]: I0309 09:50:59.371811 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bghsg\" (UniqueName: \"kubernetes.io/projected/f7ed5e40-0dc4-417c-bef9-cbf919777c67-kube-api-access-bghsg\") pod \"tempest-tests-tempest\" (UID: \"f7ed5e40-0dc4-417c-bef9-cbf919777c67\") " pod="openstack/tempest-tests-tempest" Mar 09 09:50:59 crc kubenswrapper[4861]: I0309 09:50:59.371863 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f7ed5e40-0dc4-417c-bef9-cbf919777c67-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"f7ed5e40-0dc4-417c-bef9-cbf919777c67\") " pod="openstack/tempest-tests-tempest" Mar 09 09:50:59 crc kubenswrapper[4861]: I0309 09:50:59.474301 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/f7ed5e40-0dc4-417c-bef9-cbf919777c67-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"f7ed5e40-0dc4-417c-bef9-cbf919777c67\") " pod="openstack/tempest-tests-tempest" Mar 09 09:50:59 crc kubenswrapper[4861]: I0309 09:50:59.474387 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f7ed5e40-0dc4-417c-bef9-cbf919777c67-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"f7ed5e40-0dc4-417c-bef9-cbf919777c67\") " pod="openstack/tempest-tests-tempest" Mar 09 09:50:59 crc kubenswrapper[4861]: I0309 09:50:59.474407 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/f7ed5e40-0dc4-417c-bef9-cbf919777c67-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"f7ed5e40-0dc4-417c-bef9-cbf919777c67\") " pod="openstack/tempest-tests-tempest" Mar 09 09:50:59 crc kubenswrapper[4861]: I0309 09:50:59.474431 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/f7ed5e40-0dc4-417c-bef9-cbf919777c67-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"f7ed5e40-0dc4-417c-bef9-cbf919777c67\") " pod="openstack/tempest-tests-tempest" Mar 09 09:50:59 crc kubenswrapper[4861]: I0309 09:50:59.474460 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/f7ed5e40-0dc4-417c-bef9-cbf919777c67-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"f7ed5e40-0dc4-417c-bef9-cbf919777c67\") " pod="openstack/tempest-tests-tempest" Mar 09 09:50:59 crc kubenswrapper[4861]: I0309 09:50:59.474480 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f7ed5e40-0dc4-417c-bef9-cbf919777c67-config-data\") pod \"tempest-tests-tempest\" (UID: \"f7ed5e40-0dc4-417c-bef9-cbf919777c67\") " pod="openstack/tempest-tests-tempest" Mar 09 09:50:59 crc kubenswrapper[4861]: I0309 09:50:59.474505 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bghsg\" (UniqueName: \"kubernetes.io/projected/f7ed5e40-0dc4-417c-bef9-cbf919777c67-kube-api-access-bghsg\") pod \"tempest-tests-tempest\" (UID: \"f7ed5e40-0dc4-417c-bef9-cbf919777c67\") " pod="openstack/tempest-tests-tempest" Mar 09 09:50:59 crc kubenswrapper[4861]: I0309 09:50:59.474543 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f7ed5e40-0dc4-417c-bef9-cbf919777c67-openstack-config\") pod 
\"tempest-tests-tempest\" (UID: \"f7ed5e40-0dc4-417c-bef9-cbf919777c67\") " pod="openstack/tempest-tests-tempest" Mar 09 09:50:59 crc kubenswrapper[4861]: I0309 09:50:59.474578 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"tempest-tests-tempest\" (UID: \"f7ed5e40-0dc4-417c-bef9-cbf919777c67\") " pod="openstack/tempest-tests-tempest" Mar 09 09:50:59 crc kubenswrapper[4861]: I0309 09:50:59.475039 4861 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"tempest-tests-tempest\" (UID: \"f7ed5e40-0dc4-417c-bef9-cbf919777c67\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/tempest-tests-tempest" Mar 09 09:50:59 crc kubenswrapper[4861]: I0309 09:50:59.476264 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f7ed5e40-0dc4-417c-bef9-cbf919777c67-config-data\") pod \"tempest-tests-tempest\" (UID: \"f7ed5e40-0dc4-417c-bef9-cbf919777c67\") " pod="openstack/tempest-tests-tempest" Mar 09 09:50:59 crc kubenswrapper[4861]: I0309 09:50:59.476264 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/f7ed5e40-0dc4-417c-bef9-cbf919777c67-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"f7ed5e40-0dc4-417c-bef9-cbf919777c67\") " pod="openstack/tempest-tests-tempest" Mar 09 09:50:59 crc kubenswrapper[4861]: I0309 09:50:59.476349 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/f7ed5e40-0dc4-417c-bef9-cbf919777c67-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"f7ed5e40-0dc4-417c-bef9-cbf919777c67\") " 
pod="openstack/tempest-tests-tempest" Mar 09 09:50:59 crc kubenswrapper[4861]: I0309 09:50:59.476703 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f7ed5e40-0dc4-417c-bef9-cbf919777c67-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"f7ed5e40-0dc4-417c-bef9-cbf919777c67\") " pod="openstack/tempest-tests-tempest" Mar 09 09:50:59 crc kubenswrapper[4861]: I0309 09:50:59.480856 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f7ed5e40-0dc4-417c-bef9-cbf919777c67-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"f7ed5e40-0dc4-417c-bef9-cbf919777c67\") " pod="openstack/tempest-tests-tempest" Mar 09 09:50:59 crc kubenswrapper[4861]: I0309 09:50:59.480921 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/f7ed5e40-0dc4-417c-bef9-cbf919777c67-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"f7ed5e40-0dc4-417c-bef9-cbf919777c67\") " pod="openstack/tempest-tests-tempest" Mar 09 09:50:59 crc kubenswrapper[4861]: I0309 09:50:59.482665 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f7ed5e40-0dc4-417c-bef9-cbf919777c67-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"f7ed5e40-0dc4-417c-bef9-cbf919777c67\") " pod="openstack/tempest-tests-tempest" Mar 09 09:50:59 crc kubenswrapper[4861]: I0309 09:50:59.492574 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bghsg\" (UniqueName: \"kubernetes.io/projected/f7ed5e40-0dc4-417c-bef9-cbf919777c67-kube-api-access-bghsg\") pod \"tempest-tests-tempest\" (UID: \"f7ed5e40-0dc4-417c-bef9-cbf919777c67\") " pod="openstack/tempest-tests-tempest" Mar 09 09:50:59 crc kubenswrapper[4861]: I0309 09:50:59.508939 4861 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"tempest-tests-tempest\" (UID: \"f7ed5e40-0dc4-417c-bef9-cbf919777c67\") " pod="openstack/tempest-tests-tempest" Mar 09 09:50:59 crc kubenswrapper[4861]: I0309 09:50:59.676811 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 09 09:51:00 crc kubenswrapper[4861]: I0309 09:51:00.175314 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Mar 09 09:51:00 crc kubenswrapper[4861]: W0309 09:51:00.178851 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf7ed5e40_0dc4_417c_bef9_cbf919777c67.slice/crio-99742d2827fede0f84ffd50e88e846f6a600874c6d9967389bfb447acab02880 WatchSource:0}: Error finding container 99742d2827fede0f84ffd50e88e846f6a600874c6d9967389bfb447acab02880: Status 404 returned error can't find the container with id 99742d2827fede0f84ffd50e88e846f6a600874c6d9967389bfb447acab02880 Mar 09 09:51:00 crc kubenswrapper[4861]: I0309 09:51:00.352745 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7sklc" event={"ID":"e8757a5e-38eb-423f-b20c-e5ef7d947958","Type":"ContainerStarted","Data":"6d87b94fa8c494ba9f7624501951f42fc38c72dd3a3afd36bd28ac69a6932f3e"} Mar 09 09:51:00 crc kubenswrapper[4861]: I0309 09:51:00.356230 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"f7ed5e40-0dc4-417c-bef9-cbf919777c67","Type":"ContainerStarted","Data":"99742d2827fede0f84ffd50e88e846f6a600874c6d9967389bfb447acab02880"} Mar 09 09:51:00 crc kubenswrapper[4861]: I0309 09:51:00.375242 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-7sklc" podStartSLOduration=2.858213011 podStartE2EDuration="5.375220026s" 
podCreationTimestamp="2026-03-09 09:50:55 +0000 UTC" firstStartedPulling="2026-03-09 09:50:57.319461298 +0000 UTC m=+2700.404500699" lastFinishedPulling="2026-03-09 09:50:59.836468313 +0000 UTC m=+2702.921507714" observedRunningTime="2026-03-09 09:51:00.37025084 +0000 UTC m=+2703.455290261" watchObservedRunningTime="2026-03-09 09:51:00.375220026 +0000 UTC m=+2703.460259437" Mar 09 09:51:05 crc kubenswrapper[4861]: I0309 09:51:05.724211 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-7sklc" Mar 09 09:51:05 crc kubenswrapper[4861]: I0309 09:51:05.724829 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-7sklc" Mar 09 09:51:06 crc kubenswrapper[4861]: I0309 09:51:06.777079 4861 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7sklc" podUID="e8757a5e-38eb-423f-b20c-e5ef7d947958" containerName="registry-server" probeResult="failure" output=< Mar 09 09:51:06 crc kubenswrapper[4861]: timeout: failed to connect service ":50051" within 1s Mar 09 09:51:06 crc kubenswrapper[4861]: > Mar 09 09:51:16 crc kubenswrapper[4861]: I0309 09:51:16.781064 4861 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7sklc" podUID="e8757a5e-38eb-423f-b20c-e5ef7d947958" containerName="registry-server" probeResult="failure" output=< Mar 09 09:51:16 crc kubenswrapper[4861]: timeout: failed to connect service ":50051" within 1s Mar 09 09:51:16 crc kubenswrapper[4861]: > Mar 09 09:51:24 crc kubenswrapper[4861]: I0309 09:51:24.605814 4861 patch_prober.go:28] interesting pod/machine-config-daemon-5g7gc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 09:51:24 crc kubenswrapper[4861]: I0309 09:51:24.606299 
4861 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 09:51:26 crc kubenswrapper[4861]: I0309 09:51:26.776348 4861 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7sklc" podUID="e8757a5e-38eb-423f-b20c-e5ef7d947958" containerName="registry-server" probeResult="failure" output=< Mar 09 09:51:26 crc kubenswrapper[4861]: timeout: failed to connect service ":50051" within 1s Mar 09 09:51:26 crc kubenswrapper[4861]: > Mar 09 09:51:36 crc kubenswrapper[4861]: I0309 09:51:36.781572 4861 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7sklc" podUID="e8757a5e-38eb-423f-b20c-e5ef7d947958" containerName="registry-server" probeResult="failure" output=< Mar 09 09:51:36 crc kubenswrapper[4861]: timeout: failed to connect service ":50051" within 1s Mar 09 09:51:36 crc kubenswrapper[4861]: > Mar 09 09:51:37 crc kubenswrapper[4861]: E0309 09:51:37.398763 4861 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Mar 09 09:51:37 crc kubenswrapper[4861]: E0309 09:51:37.398998 4861 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bghsg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:n
il,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(f7ed5e40-0dc4-417c-bef9-cbf919777c67): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 09 09:51:37 crc kubenswrapper[4861]: E0309 09:51:37.400238 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="f7ed5e40-0dc4-417c-bef9-cbf919777c67" Mar 09 09:51:37 crc kubenswrapper[4861]: E0309 09:51:37.747413 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="f7ed5e40-0dc4-417c-bef9-cbf919777c67" Mar 09 09:51:46 crc 
kubenswrapper[4861]: I0309 09:51:46.771213 4861 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7sklc" podUID="e8757a5e-38eb-423f-b20c-e5ef7d947958" containerName="registry-server" probeResult="failure" output=< Mar 09 09:51:46 crc kubenswrapper[4861]: timeout: failed to connect service ":50051" within 1s Mar 09 09:51:46 crc kubenswrapper[4861]: > Mar 09 09:51:52 crc kubenswrapper[4861]: I0309 09:51:52.902659 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"f7ed5e40-0dc4-417c-bef9-cbf919777c67","Type":"ContainerStarted","Data":"be13d99d2b9a22aeacccb32a154d4a7177d840c23730e15e6012891c8a5b43dc"} Mar 09 09:51:52 crc kubenswrapper[4861]: I0309 09:51:52.924813 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=3.946583017 podStartE2EDuration="54.924788243s" podCreationTimestamp="2026-03-09 09:50:58 +0000 UTC" firstStartedPulling="2026-03-09 09:51:00.182152814 +0000 UTC m=+2703.267192215" lastFinishedPulling="2026-03-09 09:51:51.16035803 +0000 UTC m=+2754.245397441" observedRunningTime="2026-03-09 09:51:52.918321966 +0000 UTC m=+2756.003361387" watchObservedRunningTime="2026-03-09 09:51:52.924788243 +0000 UTC m=+2756.009827644" Mar 09 09:51:54 crc kubenswrapper[4861]: I0309 09:51:54.605935 4861 patch_prober.go:28] interesting pod/machine-config-daemon-5g7gc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 09:51:54 crc kubenswrapper[4861]: I0309 09:51:54.607653 4861 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 09:51:55 crc kubenswrapper[4861]: I0309 09:51:55.772218 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-7sklc" Mar 09 09:51:55 crc kubenswrapper[4861]: I0309 09:51:55.821545 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-7sklc" Mar 09 09:51:56 crc kubenswrapper[4861]: I0309 09:51:56.604493 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7sklc"] Mar 09 09:51:56 crc kubenswrapper[4861]: I0309 09:51:56.940296 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-7sklc" podUID="e8757a5e-38eb-423f-b20c-e5ef7d947958" containerName="registry-server" containerID="cri-o://6d87b94fa8c494ba9f7624501951f42fc38c72dd3a3afd36bd28ac69a6932f3e" gracePeriod=2 Mar 09 09:51:57 crc kubenswrapper[4861]: I0309 09:51:57.448644 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7sklc" Mar 09 09:51:57 crc kubenswrapper[4861]: I0309 09:51:57.590084 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qlkvx\" (UniqueName: \"kubernetes.io/projected/e8757a5e-38eb-423f-b20c-e5ef7d947958-kube-api-access-qlkvx\") pod \"e8757a5e-38eb-423f-b20c-e5ef7d947958\" (UID: \"e8757a5e-38eb-423f-b20c-e5ef7d947958\") " Mar 09 09:51:57 crc kubenswrapper[4861]: I0309 09:51:57.590719 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8757a5e-38eb-423f-b20c-e5ef7d947958-catalog-content\") pod \"e8757a5e-38eb-423f-b20c-e5ef7d947958\" (UID: \"e8757a5e-38eb-423f-b20c-e5ef7d947958\") " Mar 09 09:51:57 crc kubenswrapper[4861]: I0309 09:51:57.590784 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8757a5e-38eb-423f-b20c-e5ef7d947958-utilities\") pod \"e8757a5e-38eb-423f-b20c-e5ef7d947958\" (UID: \"e8757a5e-38eb-423f-b20c-e5ef7d947958\") " Mar 09 09:51:57 crc kubenswrapper[4861]: I0309 09:51:57.592467 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8757a5e-38eb-423f-b20c-e5ef7d947958-utilities" (OuterVolumeSpecName: "utilities") pod "e8757a5e-38eb-423f-b20c-e5ef7d947958" (UID: "e8757a5e-38eb-423f-b20c-e5ef7d947958"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:51:57 crc kubenswrapper[4861]: I0309 09:51:57.599606 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8757a5e-38eb-423f-b20c-e5ef7d947958-kube-api-access-qlkvx" (OuterVolumeSpecName: "kube-api-access-qlkvx") pod "e8757a5e-38eb-423f-b20c-e5ef7d947958" (UID: "e8757a5e-38eb-423f-b20c-e5ef7d947958"). InnerVolumeSpecName "kube-api-access-qlkvx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:51:57 crc kubenswrapper[4861]: I0309 09:51:57.692833 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qlkvx\" (UniqueName: \"kubernetes.io/projected/e8757a5e-38eb-423f-b20c-e5ef7d947958-kube-api-access-qlkvx\") on node \"crc\" DevicePath \"\"" Mar 09 09:51:57 crc kubenswrapper[4861]: I0309 09:51:57.692885 4861 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8757a5e-38eb-423f-b20c-e5ef7d947958-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 09:51:57 crc kubenswrapper[4861]: I0309 09:51:57.719093 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8757a5e-38eb-423f-b20c-e5ef7d947958-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e8757a5e-38eb-423f-b20c-e5ef7d947958" (UID: "e8757a5e-38eb-423f-b20c-e5ef7d947958"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 09:51:57 crc kubenswrapper[4861]: I0309 09:51:57.795937 4861 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8757a5e-38eb-423f-b20c-e5ef7d947958-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 09:51:57 crc kubenswrapper[4861]: I0309 09:51:57.953230 4861 generic.go:334] "Generic (PLEG): container finished" podID="e8757a5e-38eb-423f-b20c-e5ef7d947958" containerID="6d87b94fa8c494ba9f7624501951f42fc38c72dd3a3afd36bd28ac69a6932f3e" exitCode=0 Mar 09 09:51:57 crc kubenswrapper[4861]: I0309 09:51:57.953281 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7sklc" event={"ID":"e8757a5e-38eb-423f-b20c-e5ef7d947958","Type":"ContainerDied","Data":"6d87b94fa8c494ba9f7624501951f42fc38c72dd3a3afd36bd28ac69a6932f3e"} Mar 09 09:51:57 crc kubenswrapper[4861]: I0309 09:51:57.953309 4861 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-7sklc" event={"ID":"e8757a5e-38eb-423f-b20c-e5ef7d947958","Type":"ContainerDied","Data":"61a4b3595991412cb0aa0df46ee07f12e6224038c37477cc9c05e810835a30ec"} Mar 09 09:51:57 crc kubenswrapper[4861]: I0309 09:51:57.953327 4861 scope.go:117] "RemoveContainer" containerID="6d87b94fa8c494ba9f7624501951f42fc38c72dd3a3afd36bd28ac69a6932f3e" Mar 09 09:51:57 crc kubenswrapper[4861]: I0309 09:51:57.953517 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7sklc" Mar 09 09:51:58 crc kubenswrapper[4861]: I0309 09:51:58.002724 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7sklc"] Mar 09 09:51:58 crc kubenswrapper[4861]: I0309 09:51:58.004459 4861 scope.go:117] "RemoveContainer" containerID="b8fd48a0256a65eca391fe60597d733d78940edeb4189c48255e0b5193c8b286" Mar 09 09:51:58 crc kubenswrapper[4861]: I0309 09:51:58.010049 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-7sklc"] Mar 09 09:51:58 crc kubenswrapper[4861]: I0309 09:51:58.031107 4861 scope.go:117] "RemoveContainer" containerID="ca30a97092e41c8173c8748073949870c7034239781f56c0bc1310049508a473" Mar 09 09:51:58 crc kubenswrapper[4861]: I0309 09:51:58.079550 4861 scope.go:117] "RemoveContainer" containerID="6d87b94fa8c494ba9f7624501951f42fc38c72dd3a3afd36bd28ac69a6932f3e" Mar 09 09:51:58 crc kubenswrapper[4861]: E0309 09:51:58.080115 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d87b94fa8c494ba9f7624501951f42fc38c72dd3a3afd36bd28ac69a6932f3e\": container with ID starting with 6d87b94fa8c494ba9f7624501951f42fc38c72dd3a3afd36bd28ac69a6932f3e not found: ID does not exist" containerID="6d87b94fa8c494ba9f7624501951f42fc38c72dd3a3afd36bd28ac69a6932f3e" Mar 09 09:51:58 crc kubenswrapper[4861]: I0309 09:51:58.080169 4861 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d87b94fa8c494ba9f7624501951f42fc38c72dd3a3afd36bd28ac69a6932f3e"} err="failed to get container status \"6d87b94fa8c494ba9f7624501951f42fc38c72dd3a3afd36bd28ac69a6932f3e\": rpc error: code = NotFound desc = could not find container \"6d87b94fa8c494ba9f7624501951f42fc38c72dd3a3afd36bd28ac69a6932f3e\": container with ID starting with 6d87b94fa8c494ba9f7624501951f42fc38c72dd3a3afd36bd28ac69a6932f3e not found: ID does not exist" Mar 09 09:51:58 crc kubenswrapper[4861]: I0309 09:51:58.080201 4861 scope.go:117] "RemoveContainer" containerID="b8fd48a0256a65eca391fe60597d733d78940edeb4189c48255e0b5193c8b286" Mar 09 09:51:58 crc kubenswrapper[4861]: E0309 09:51:58.080767 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b8fd48a0256a65eca391fe60597d733d78940edeb4189c48255e0b5193c8b286\": container with ID starting with b8fd48a0256a65eca391fe60597d733d78940edeb4189c48255e0b5193c8b286 not found: ID does not exist" containerID="b8fd48a0256a65eca391fe60597d733d78940edeb4189c48255e0b5193c8b286" Mar 09 09:51:58 crc kubenswrapper[4861]: I0309 09:51:58.080803 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8fd48a0256a65eca391fe60597d733d78940edeb4189c48255e0b5193c8b286"} err="failed to get container status \"b8fd48a0256a65eca391fe60597d733d78940edeb4189c48255e0b5193c8b286\": rpc error: code = NotFound desc = could not find container \"b8fd48a0256a65eca391fe60597d733d78940edeb4189c48255e0b5193c8b286\": container with ID starting with b8fd48a0256a65eca391fe60597d733d78940edeb4189c48255e0b5193c8b286 not found: ID does not exist" Mar 09 09:51:58 crc kubenswrapper[4861]: I0309 09:51:58.080827 4861 scope.go:117] "RemoveContainer" containerID="ca30a97092e41c8173c8748073949870c7034239781f56c0bc1310049508a473" Mar 09 09:51:58 crc kubenswrapper[4861]: E0309 
09:51:58.081053 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca30a97092e41c8173c8748073949870c7034239781f56c0bc1310049508a473\": container with ID starting with ca30a97092e41c8173c8748073949870c7034239781f56c0bc1310049508a473 not found: ID does not exist" containerID="ca30a97092e41c8173c8748073949870c7034239781f56c0bc1310049508a473"
Mar 09 09:51:58 crc kubenswrapper[4861]: I0309 09:51:58.081072 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca30a97092e41c8173c8748073949870c7034239781f56c0bc1310049508a473"} err="failed to get container status \"ca30a97092e41c8173c8748073949870c7034239781f56c0bc1310049508a473\": rpc error: code = NotFound desc = could not find container \"ca30a97092e41c8173c8748073949870c7034239781f56c0bc1310049508a473\": container with ID starting with ca30a97092e41c8173c8748073949870c7034239781f56c0bc1310049508a473 not found: ID does not exist"
Mar 09 09:51:59 crc kubenswrapper[4861]: I0309 09:51:59.672413 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8757a5e-38eb-423f-b20c-e5ef7d947958" path="/var/lib/kubelet/pods/e8757a5e-38eb-423f-b20c-e5ef7d947958/volumes"
Mar 09 09:52:00 crc kubenswrapper[4861]: I0309 09:52:00.165171 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550832-8pgc6"]
Mar 09 09:52:00 crc kubenswrapper[4861]: E0309 09:52:00.165764 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8757a5e-38eb-423f-b20c-e5ef7d947958" containerName="extract-utilities"
Mar 09 09:52:00 crc kubenswrapper[4861]: I0309 09:52:00.168043 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8757a5e-38eb-423f-b20c-e5ef7d947958" containerName="extract-utilities"
Mar 09 09:52:00 crc kubenswrapper[4861]: E0309 09:52:00.168122 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8757a5e-38eb-423f-b20c-e5ef7d947958" containerName="extract-content"
Mar 09 09:52:00 crc kubenswrapper[4861]: I0309 09:52:00.168131 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8757a5e-38eb-423f-b20c-e5ef7d947958" containerName="extract-content"
Mar 09 09:52:00 crc kubenswrapper[4861]: E0309 09:52:00.168190 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8757a5e-38eb-423f-b20c-e5ef7d947958" containerName="registry-server"
Mar 09 09:52:00 crc kubenswrapper[4861]: I0309 09:52:00.168199 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8757a5e-38eb-423f-b20c-e5ef7d947958" containerName="registry-server"
Mar 09 09:52:00 crc kubenswrapper[4861]: I0309 09:52:00.168606 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8757a5e-38eb-423f-b20c-e5ef7d947958" containerName="registry-server"
Mar 09 09:52:00 crc kubenswrapper[4861]: I0309 09:52:00.169489 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550832-8pgc6"
Mar 09 09:52:00 crc kubenswrapper[4861]: I0309 09:52:00.171639 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 09 09:52:00 crc kubenswrapper[4861]: I0309 09:52:00.172913 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 09 09:52:00 crc kubenswrapper[4861]: I0309 09:52:00.173099 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k86l8"
Mar 09 09:52:00 crc kubenswrapper[4861]: I0309 09:52:00.177030 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550832-8pgc6"]
Mar 09 09:52:00 crc kubenswrapper[4861]: I0309 09:52:00.344064 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwml2\" (UniqueName: \"kubernetes.io/projected/d1e3de6d-29be-4668-8739-c5d0afd5b183-kube-api-access-qwml2\") pod \"auto-csr-approver-29550832-8pgc6\" (UID: \"d1e3de6d-29be-4668-8739-c5d0afd5b183\") " pod="openshift-infra/auto-csr-approver-29550832-8pgc6"
Mar 09 09:52:00 crc kubenswrapper[4861]: I0309 09:52:00.446830 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwml2\" (UniqueName: \"kubernetes.io/projected/d1e3de6d-29be-4668-8739-c5d0afd5b183-kube-api-access-qwml2\") pod \"auto-csr-approver-29550832-8pgc6\" (UID: \"d1e3de6d-29be-4668-8739-c5d0afd5b183\") " pod="openshift-infra/auto-csr-approver-29550832-8pgc6"
Mar 09 09:52:00 crc kubenswrapper[4861]: I0309 09:52:00.473656 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwml2\" (UniqueName: \"kubernetes.io/projected/d1e3de6d-29be-4668-8739-c5d0afd5b183-kube-api-access-qwml2\") pod \"auto-csr-approver-29550832-8pgc6\" (UID: \"d1e3de6d-29be-4668-8739-c5d0afd5b183\") " pod="openshift-infra/auto-csr-approver-29550832-8pgc6"
Mar 09 09:52:00 crc kubenswrapper[4861]: I0309 09:52:00.495839 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550832-8pgc6"
Mar 09 09:52:00 crc kubenswrapper[4861]: I0309 09:52:00.972073 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550832-8pgc6"]
Mar 09 09:52:00 crc kubenswrapper[4861]: W0309 09:52:00.980405 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd1e3de6d_29be_4668_8739_c5d0afd5b183.slice/crio-132e425805ef7e8dd17be6ebf0bbae5728546a8bf3e367e9a9397999332691e3 WatchSource:0}: Error finding container 132e425805ef7e8dd17be6ebf0bbae5728546a8bf3e367e9a9397999332691e3: Status 404 returned error can't find the container with id 132e425805ef7e8dd17be6ebf0bbae5728546a8bf3e367e9a9397999332691e3
Mar 09 09:52:01 crc kubenswrapper[4861]: I0309 09:52:01.991875 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550832-8pgc6" event={"ID":"d1e3de6d-29be-4668-8739-c5d0afd5b183","Type":"ContainerStarted","Data":"132e425805ef7e8dd17be6ebf0bbae5728546a8bf3e367e9a9397999332691e3"}
Mar 09 09:52:03 crc kubenswrapper[4861]: I0309 09:52:03.002237 4861 generic.go:334] "Generic (PLEG): container finished" podID="d1e3de6d-29be-4668-8739-c5d0afd5b183" containerID="e570b3aa786918ee88159aca1f61a5919dd5545daffb5779dec81db86b1a9551" exitCode=0
Mar 09 09:52:03 crc kubenswrapper[4861]: I0309 09:52:03.002572 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550832-8pgc6" event={"ID":"d1e3de6d-29be-4668-8739-c5d0afd5b183","Type":"ContainerDied","Data":"e570b3aa786918ee88159aca1f61a5919dd5545daffb5779dec81db86b1a9551"}
Mar 09 09:52:04 crc kubenswrapper[4861]: I0309 09:52:04.391426 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550832-8pgc6"
Mar 09 09:52:04 crc kubenswrapper[4861]: I0309 09:52:04.532312 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qwml2\" (UniqueName: \"kubernetes.io/projected/d1e3de6d-29be-4668-8739-c5d0afd5b183-kube-api-access-qwml2\") pod \"d1e3de6d-29be-4668-8739-c5d0afd5b183\" (UID: \"d1e3de6d-29be-4668-8739-c5d0afd5b183\") "
Mar 09 09:52:04 crc kubenswrapper[4861]: I0309 09:52:04.544042 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1e3de6d-29be-4668-8739-c5d0afd5b183-kube-api-access-qwml2" (OuterVolumeSpecName: "kube-api-access-qwml2") pod "d1e3de6d-29be-4668-8739-c5d0afd5b183" (UID: "d1e3de6d-29be-4668-8739-c5d0afd5b183"). InnerVolumeSpecName "kube-api-access-qwml2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:52:04 crc kubenswrapper[4861]: I0309 09:52:04.637058 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qwml2\" (UniqueName: \"kubernetes.io/projected/d1e3de6d-29be-4668-8739-c5d0afd5b183-kube-api-access-qwml2\") on node \"crc\" DevicePath \"\""
Mar 09 09:52:05 crc kubenswrapper[4861]: I0309 09:52:05.022995 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550832-8pgc6" event={"ID":"d1e3de6d-29be-4668-8739-c5d0afd5b183","Type":"ContainerDied","Data":"132e425805ef7e8dd17be6ebf0bbae5728546a8bf3e367e9a9397999332691e3"}
Mar 09 09:52:05 crc kubenswrapper[4861]: I0309 09:52:05.023431 4861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="132e425805ef7e8dd17be6ebf0bbae5728546a8bf3e367e9a9397999332691e3"
Mar 09 09:52:05 crc kubenswrapper[4861]: I0309 09:52:05.023168 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550832-8pgc6"
Mar 09 09:52:05 crc kubenswrapper[4861]: I0309 09:52:05.461750 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550826-ht7dz"]
Mar 09 09:52:05 crc kubenswrapper[4861]: I0309 09:52:05.472035 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550826-ht7dz"]
Mar 09 09:52:05 crc kubenswrapper[4861]: I0309 09:52:05.668350 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4785c3f-1721-4529-b665-f43afc2e2691" path="/var/lib/kubelet/pods/d4785c3f-1721-4529-b665-f43afc2e2691/volumes"
Mar 09 09:52:24 crc kubenswrapper[4861]: I0309 09:52:24.605990 4861 patch_prober.go:28] interesting pod/machine-config-daemon-5g7gc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 09 09:52:24 crc kubenswrapper[4861]: I0309 09:52:24.606550 4861 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 09 09:52:24 crc kubenswrapper[4861]: I0309 09:52:24.606613 4861 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc"
Mar 09 09:52:24 crc kubenswrapper[4861]: I0309 09:52:24.607469 4861 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e048477124348cf603efe8e3f38bd683df1e311fb377d2679ee9dcf66a493f5e"} pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 09 09:52:24 crc kubenswrapper[4861]: I0309 09:52:24.607536 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" containerName="machine-config-daemon" containerID="cri-o://e048477124348cf603efe8e3f38bd683df1e311fb377d2679ee9dcf66a493f5e" gracePeriod=600
Mar 09 09:52:25 crc kubenswrapper[4861]: I0309 09:52:25.235100 4861 generic.go:334] "Generic (PLEG): container finished" podID="6f7875e3-174f-4c67-8675-d878de74aa4f" containerID="e048477124348cf603efe8e3f38bd683df1e311fb377d2679ee9dcf66a493f5e" exitCode=0
Mar 09 09:52:25 crc kubenswrapper[4861]: I0309 09:52:25.235479 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" event={"ID":"6f7875e3-174f-4c67-8675-d878de74aa4f","Type":"ContainerDied","Data":"e048477124348cf603efe8e3f38bd683df1e311fb377d2679ee9dcf66a493f5e"}
Mar 09 09:52:25 crc kubenswrapper[4861]: I0309 09:52:25.235513 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" event={"ID":"6f7875e3-174f-4c67-8675-d878de74aa4f","Type":"ContainerStarted","Data":"77c857d062282bd5e967d0e73907a96e77dbe37daf546bb7546f15dc1d364592"}
Mar 09 09:52:25 crc kubenswrapper[4861]: I0309 09:52:25.235533 4861 scope.go:117] "RemoveContainer" containerID="480298580e8917b47d66a4586a753752f58dc7e4c2678e95158c475a3b504483"
Mar 09 09:52:37 crc kubenswrapper[4861]: I0309 09:52:37.314242 4861 scope.go:117] "RemoveContainer" containerID="60af282ae6a3914029203361477e4ed680273abf708b540b0ee65159e0872db0"
Mar 09 09:54:00 crc kubenswrapper[4861]: I0309 09:54:00.148237 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550834-4cfvx"]
Mar 09 09:54:00 crc kubenswrapper[4861]: E0309 09:54:00.149948 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1e3de6d-29be-4668-8739-c5d0afd5b183" containerName="oc"
Mar 09 09:54:00 crc kubenswrapper[4861]: I0309 09:54:00.149968 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1e3de6d-29be-4668-8739-c5d0afd5b183" containerName="oc"
Mar 09 09:54:00 crc kubenswrapper[4861]: I0309 09:54:00.150200 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1e3de6d-29be-4668-8739-c5d0afd5b183" containerName="oc"
Mar 09 09:54:00 crc kubenswrapper[4861]: I0309 09:54:00.151129 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550834-4cfvx"
Mar 09 09:54:00 crc kubenswrapper[4861]: I0309 09:54:00.154355 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 09 09:54:00 crc kubenswrapper[4861]: I0309 09:54:00.154640 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k86l8"
Mar 09 09:54:00 crc kubenswrapper[4861]: I0309 09:54:00.155404 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 09 09:54:00 crc kubenswrapper[4861]: I0309 09:54:00.170842 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550834-4cfvx"]
Mar 09 09:54:00 crc kubenswrapper[4861]: I0309 09:54:00.251148 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2wpj\" (UniqueName: \"kubernetes.io/projected/91e023dd-cb71-4232-9d33-545f75430736-kube-api-access-s2wpj\") pod \"auto-csr-approver-29550834-4cfvx\" (UID: \"91e023dd-cb71-4232-9d33-545f75430736\") " pod="openshift-infra/auto-csr-approver-29550834-4cfvx"
Mar 09 09:54:00 crc kubenswrapper[4861]: I0309 09:54:00.353436 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2wpj\" (UniqueName: \"kubernetes.io/projected/91e023dd-cb71-4232-9d33-545f75430736-kube-api-access-s2wpj\") pod \"auto-csr-approver-29550834-4cfvx\" (UID: \"91e023dd-cb71-4232-9d33-545f75430736\") " pod="openshift-infra/auto-csr-approver-29550834-4cfvx"
Mar 09 09:54:00 crc kubenswrapper[4861]: I0309 09:54:00.373143 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2wpj\" (UniqueName: \"kubernetes.io/projected/91e023dd-cb71-4232-9d33-545f75430736-kube-api-access-s2wpj\") pod \"auto-csr-approver-29550834-4cfvx\" (UID: \"91e023dd-cb71-4232-9d33-545f75430736\") " pod="openshift-infra/auto-csr-approver-29550834-4cfvx"
Mar 09 09:54:00 crc kubenswrapper[4861]: I0309 09:54:00.474492 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550834-4cfvx"
Mar 09 09:54:00 crc kubenswrapper[4861]: I0309 09:54:00.935349 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550834-4cfvx"]
Mar 09 09:54:01 crc kubenswrapper[4861]: I0309 09:54:01.068603 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550834-4cfvx" event={"ID":"91e023dd-cb71-4232-9d33-545f75430736","Type":"ContainerStarted","Data":"6525f36d01b9bfb9ca98a39e73138fe5965d94d1c5debd445c5bdcb47db80d96"}
Mar 09 09:54:03 crc kubenswrapper[4861]: I0309 09:54:03.089491 4861 generic.go:334] "Generic (PLEG): container finished" podID="91e023dd-cb71-4232-9d33-545f75430736" containerID="2b853a2364d2632e02bbc3723618e85a5fec323cffb1d47850fd9c5abfc9b9a1" exitCode=0
Mar 09 09:54:03 crc kubenswrapper[4861]: I0309 09:54:03.089611 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550834-4cfvx" event={"ID":"91e023dd-cb71-4232-9d33-545f75430736","Type":"ContainerDied","Data":"2b853a2364d2632e02bbc3723618e85a5fec323cffb1d47850fd9c5abfc9b9a1"}
Mar 09 09:54:04 crc kubenswrapper[4861]: I0309 09:54:04.478627 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550834-4cfvx"
Mar 09 09:54:04 crc kubenswrapper[4861]: I0309 09:54:04.562020 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s2wpj\" (UniqueName: \"kubernetes.io/projected/91e023dd-cb71-4232-9d33-545f75430736-kube-api-access-s2wpj\") pod \"91e023dd-cb71-4232-9d33-545f75430736\" (UID: \"91e023dd-cb71-4232-9d33-545f75430736\") "
Mar 09 09:54:04 crc kubenswrapper[4861]: I0309 09:54:04.568572 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91e023dd-cb71-4232-9d33-545f75430736-kube-api-access-s2wpj" (OuterVolumeSpecName: "kube-api-access-s2wpj") pod "91e023dd-cb71-4232-9d33-545f75430736" (UID: "91e023dd-cb71-4232-9d33-545f75430736"). InnerVolumeSpecName "kube-api-access-s2wpj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:54:04 crc kubenswrapper[4861]: I0309 09:54:04.663824 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s2wpj\" (UniqueName: \"kubernetes.io/projected/91e023dd-cb71-4232-9d33-545f75430736-kube-api-access-s2wpj\") on node \"crc\" DevicePath \"\""
Mar 09 09:54:05 crc kubenswrapper[4861]: I0309 09:54:05.109144 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550834-4cfvx" event={"ID":"91e023dd-cb71-4232-9d33-545f75430736","Type":"ContainerDied","Data":"6525f36d01b9bfb9ca98a39e73138fe5965d94d1c5debd445c5bdcb47db80d96"}
Mar 09 09:54:05 crc kubenswrapper[4861]: I0309 09:54:05.109192 4861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6525f36d01b9bfb9ca98a39e73138fe5965d94d1c5debd445c5bdcb47db80d96"
Mar 09 09:54:05 crc kubenswrapper[4861]: I0309 09:54:05.109203 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550834-4cfvx"
Mar 09 09:54:05 crc kubenswrapper[4861]: I0309 09:54:05.569609 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550828-2595v"]
Mar 09 09:54:05 crc kubenswrapper[4861]: I0309 09:54:05.579886 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550828-2595v"]
Mar 09 09:54:05 crc kubenswrapper[4861]: I0309 09:54:05.669528 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae7917dd-e857-4eb8-bbc3-bcc88452f4d5" path="/var/lib/kubelet/pods/ae7917dd-e857-4eb8-bbc3-bcc88452f4d5/volumes"
Mar 09 09:54:24 crc kubenswrapper[4861]: I0309 09:54:24.607322 4861 patch_prober.go:28] interesting pod/machine-config-daemon-5g7gc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 09 09:54:24 crc kubenswrapper[4861]: I0309 09:54:24.608092 4861 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 09 09:54:37 crc kubenswrapper[4861]: I0309 09:54:37.426888 4861 scope.go:117] "RemoveContainer" containerID="9ec00bfcb40742a5d129a60b8b76d443a9f3b4fc4afda680a4afb795f16d066e"
Mar 09 09:54:54 crc kubenswrapper[4861]: I0309 09:54:54.606170 4861 patch_prober.go:28] interesting pod/machine-config-daemon-5g7gc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 09 09:54:54 crc kubenswrapper[4861]: I0309 09:54:54.606742 4861 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 09 09:55:24 crc kubenswrapper[4861]: I0309 09:55:24.605994 4861 patch_prober.go:28] interesting pod/machine-config-daemon-5g7gc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 09 09:55:24 crc kubenswrapper[4861]: I0309 09:55:24.606662 4861 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 09 09:55:24 crc kubenswrapper[4861]: I0309 09:55:24.606716 4861 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc"
Mar 09 09:55:24 crc kubenswrapper[4861]: I0309 09:55:24.607611 4861 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"77c857d062282bd5e967d0e73907a96e77dbe37daf546bb7546f15dc1d364592"} pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 09 09:55:24 crc kubenswrapper[4861]: I0309 09:55:24.607672 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" containerName="machine-config-daemon" containerID="cri-o://77c857d062282bd5e967d0e73907a96e77dbe37daf546bb7546f15dc1d364592" gracePeriod=600
Mar 09 09:55:24 crc kubenswrapper[4861]: E0309 09:55:24.729978 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5g7gc_openshift-machine-config-operator(6f7875e3-174f-4c67-8675-d878de74aa4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f"
Mar 09 09:55:24 crc kubenswrapper[4861]: I0309 09:55:24.779550 4861 generic.go:334] "Generic (PLEG): container finished" podID="6f7875e3-174f-4c67-8675-d878de74aa4f" containerID="77c857d062282bd5e967d0e73907a96e77dbe37daf546bb7546f15dc1d364592" exitCode=0
Mar 09 09:55:24 crc kubenswrapper[4861]: I0309 09:55:24.779601 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" event={"ID":"6f7875e3-174f-4c67-8675-d878de74aa4f","Type":"ContainerDied","Data":"77c857d062282bd5e967d0e73907a96e77dbe37daf546bb7546f15dc1d364592"}
Mar 09 09:55:24 crc kubenswrapper[4861]: I0309 09:55:24.779636 4861 scope.go:117] "RemoveContainer" containerID="e048477124348cf603efe8e3f38bd683df1e311fb377d2679ee9dcf66a493f5e"
Mar 09 09:55:24 crc kubenswrapper[4861]: I0309 09:55:24.780403 4861 scope.go:117] "RemoveContainer" containerID="77c857d062282bd5e967d0e73907a96e77dbe37daf546bb7546f15dc1d364592"
Mar 09 09:55:24 crc kubenswrapper[4861]: E0309 09:55:24.780669 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5g7gc_openshift-machine-config-operator(6f7875e3-174f-4c67-8675-d878de74aa4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f"
Mar 09 09:55:39 crc kubenswrapper[4861]: I0309 09:55:39.658878 4861 scope.go:117] "RemoveContainer" containerID="77c857d062282bd5e967d0e73907a96e77dbe37daf546bb7546f15dc1d364592"
Mar 09 09:55:39 crc kubenswrapper[4861]: E0309 09:55:39.659667 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5g7gc_openshift-machine-config-operator(6f7875e3-174f-4c67-8675-d878de74aa4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f"
Mar 09 09:55:54 crc kubenswrapper[4861]: I0309 09:55:54.658539 4861 scope.go:117] "RemoveContainer" containerID="77c857d062282bd5e967d0e73907a96e77dbe37daf546bb7546f15dc1d364592"
Mar 09 09:55:54 crc kubenswrapper[4861]: E0309 09:55:54.659338 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5g7gc_openshift-machine-config-operator(6f7875e3-174f-4c67-8675-d878de74aa4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f"
Mar 09 09:56:00 crc kubenswrapper[4861]: I0309 09:56:00.149466 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550836-mbsg5"]
Mar 09 09:56:00 crc kubenswrapper[4861]: E0309 09:56:00.150782 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91e023dd-cb71-4232-9d33-545f75430736" containerName="oc"
Mar 09 09:56:00 crc kubenswrapper[4861]: I0309 09:56:00.150808 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="91e023dd-cb71-4232-9d33-545f75430736" containerName="oc"
Mar 09 09:56:00 crc kubenswrapper[4861]: I0309 09:56:00.151163 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="91e023dd-cb71-4232-9d33-545f75430736" containerName="oc"
Mar 09 09:56:00 crc kubenswrapper[4861]: I0309 09:56:00.151973 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550836-mbsg5"
Mar 09 09:56:00 crc kubenswrapper[4861]: I0309 09:56:00.156429 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 09 09:56:00 crc kubenswrapper[4861]: I0309 09:56:00.156512 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k86l8"
Mar 09 09:56:00 crc kubenswrapper[4861]: I0309 09:56:00.157180 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 09 09:56:00 crc kubenswrapper[4861]: I0309 09:56:00.169113 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550836-mbsg5"]
Mar 09 09:56:00 crc kubenswrapper[4861]: I0309 09:56:00.282059 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2s9x\" (UniqueName: \"kubernetes.io/projected/a189714d-1d52-456d-a04b-a5a8fbee6087-kube-api-access-l2s9x\") pod \"auto-csr-approver-29550836-mbsg5\" (UID: \"a189714d-1d52-456d-a04b-a5a8fbee6087\") " pod="openshift-infra/auto-csr-approver-29550836-mbsg5"
Mar 09 09:56:00 crc kubenswrapper[4861]: I0309 09:56:00.384273 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2s9x\" (UniqueName: \"kubernetes.io/projected/a189714d-1d52-456d-a04b-a5a8fbee6087-kube-api-access-l2s9x\") pod \"auto-csr-approver-29550836-mbsg5\" (UID: \"a189714d-1d52-456d-a04b-a5a8fbee6087\") " pod="openshift-infra/auto-csr-approver-29550836-mbsg5"
Mar 09 09:56:00 crc kubenswrapper[4861]: I0309 09:56:00.404914 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2s9x\" (UniqueName: \"kubernetes.io/projected/a189714d-1d52-456d-a04b-a5a8fbee6087-kube-api-access-l2s9x\") pod \"auto-csr-approver-29550836-mbsg5\" (UID: \"a189714d-1d52-456d-a04b-a5a8fbee6087\") " pod="openshift-infra/auto-csr-approver-29550836-mbsg5"
Mar 09 09:56:00 crc kubenswrapper[4861]: I0309 09:56:00.471903 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550836-mbsg5"
Mar 09 09:56:00 crc kubenswrapper[4861]: I0309 09:56:00.983073 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550836-mbsg5"]
Mar 09 09:56:00 crc kubenswrapper[4861]: I0309 09:56:00.983810 4861 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 09 09:56:01 crc kubenswrapper[4861]: I0309 09:56:01.111427 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550836-mbsg5" event={"ID":"a189714d-1d52-456d-a04b-a5a8fbee6087","Type":"ContainerStarted","Data":"760b62b4b0dca271fc8873140f0fb7bde42e4d8d34f73cefe45700bf047fcd0b"}
Mar 09 09:56:03 crc kubenswrapper[4861]: I0309 09:56:03.151338 4861 generic.go:334] "Generic (PLEG): container finished" podID="a189714d-1d52-456d-a04b-a5a8fbee6087" containerID="0037ed70b65b374a3333ec5dbabfe26eb321f8e5020a738b7abfcdd9189053ec" exitCode=0
Mar 09 09:56:03 crc kubenswrapper[4861]: I0309 09:56:03.151410 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550836-mbsg5" event={"ID":"a189714d-1d52-456d-a04b-a5a8fbee6087","Type":"ContainerDied","Data":"0037ed70b65b374a3333ec5dbabfe26eb321f8e5020a738b7abfcdd9189053ec"}
Mar 09 09:56:04 crc kubenswrapper[4861]: I0309 09:56:04.595087 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550836-mbsg5"
Mar 09 09:56:04 crc kubenswrapper[4861]: I0309 09:56:04.768772 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l2s9x\" (UniqueName: \"kubernetes.io/projected/a189714d-1d52-456d-a04b-a5a8fbee6087-kube-api-access-l2s9x\") pod \"a189714d-1d52-456d-a04b-a5a8fbee6087\" (UID: \"a189714d-1d52-456d-a04b-a5a8fbee6087\") "
Mar 09 09:56:04 crc kubenswrapper[4861]: I0309 09:56:04.782105 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a189714d-1d52-456d-a04b-a5a8fbee6087-kube-api-access-l2s9x" (OuterVolumeSpecName: "kube-api-access-l2s9x") pod "a189714d-1d52-456d-a04b-a5a8fbee6087" (UID: "a189714d-1d52-456d-a04b-a5a8fbee6087"). InnerVolumeSpecName "kube-api-access-l2s9x". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:56:04 crc kubenswrapper[4861]: I0309 09:56:04.872561 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l2s9x\" (UniqueName: \"kubernetes.io/projected/a189714d-1d52-456d-a04b-a5a8fbee6087-kube-api-access-l2s9x\") on node \"crc\" DevicePath \"\""
Mar 09 09:56:05 crc kubenswrapper[4861]: I0309 09:56:05.170218 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550836-mbsg5" event={"ID":"a189714d-1d52-456d-a04b-a5a8fbee6087","Type":"ContainerDied","Data":"760b62b4b0dca271fc8873140f0fb7bde42e4d8d34f73cefe45700bf047fcd0b"}
Mar 09 09:56:05 crc kubenswrapper[4861]: I0309 09:56:05.170253 4861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="760b62b4b0dca271fc8873140f0fb7bde42e4d8d34f73cefe45700bf047fcd0b"
Mar 09 09:56:05 crc kubenswrapper[4861]: I0309 09:56:05.170280 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550836-mbsg5"
Mar 09 09:56:05 crc kubenswrapper[4861]: I0309 09:56:05.667722 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550830-xb7vc"]
Mar 09 09:56:05 crc kubenswrapper[4861]: I0309 09:56:05.671914 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550830-xb7vc"]
Mar 09 09:56:06 crc kubenswrapper[4861]: I0309 09:56:06.658614 4861 scope.go:117] "RemoveContainer" containerID="77c857d062282bd5e967d0e73907a96e77dbe37daf546bb7546f15dc1d364592"
Mar 09 09:56:06 crc kubenswrapper[4861]: E0309 09:56:06.659146 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5g7gc_openshift-machine-config-operator(6f7875e3-174f-4c67-8675-d878de74aa4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f"
Mar 09 09:56:07 crc kubenswrapper[4861]: I0309 09:56:07.844921 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c10f203-a354-498a-9c0e-07a612cd16b0" path="/var/lib/kubelet/pods/4c10f203-a354-498a-9c0e-07a612cd16b0/volumes"
Mar 09 09:56:18 crc kubenswrapper[4861]: I0309 09:56:18.657728 4861 scope.go:117] "RemoveContainer" containerID="77c857d062282bd5e967d0e73907a96e77dbe37daf546bb7546f15dc1d364592"
Mar 09 09:56:18 crc kubenswrapper[4861]: E0309 09:56:18.658570 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5g7gc_openshift-machine-config-operator(6f7875e3-174f-4c67-8675-d878de74aa4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f"
Mar 09 09:56:33 crc kubenswrapper[4861]: I0309 09:56:33.658081 4861 scope.go:117] "RemoveContainer" containerID="77c857d062282bd5e967d0e73907a96e77dbe37daf546bb7546f15dc1d364592"
Mar 09 09:56:33 crc kubenswrapper[4861]: E0309 09:56:33.658927 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5g7gc_openshift-machine-config-operator(6f7875e3-174f-4c67-8675-d878de74aa4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f"
Mar 09 09:56:37 crc kubenswrapper[4861]: I0309 09:56:37.513531 4861 scope.go:117] "RemoveContainer" containerID="c57bf3d45e8f27c7e72879defa1c19e7e9eef9510d7a7f05aa700a45703446dc"
Mar 09 09:56:47 crc kubenswrapper[4861]: I0309 09:56:47.663758 4861 scope.go:117] "RemoveContainer" containerID="77c857d062282bd5e967d0e73907a96e77dbe37daf546bb7546f15dc1d364592"
Mar 09 09:56:47 crc kubenswrapper[4861]: E0309 09:56:47.664505 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5g7gc_openshift-machine-config-operator(6f7875e3-174f-4c67-8675-d878de74aa4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f"
Mar 09 09:56:59 crc kubenswrapper[4861]: I0309 09:56:59.658280 4861 scope.go:117] "RemoveContainer" containerID="77c857d062282bd5e967d0e73907a96e77dbe37daf546bb7546f15dc1d364592"
Mar 09 09:56:59 crc kubenswrapper[4861]: E0309 09:56:59.659071 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5g7gc_openshift-machine-config-operator(6f7875e3-174f-4c67-8675-d878de74aa4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f"
Mar 09 09:57:13 crc kubenswrapper[4861]: I0309 09:57:13.658811 4861 scope.go:117] "RemoveContainer" containerID="77c857d062282bd5e967d0e73907a96e77dbe37daf546bb7546f15dc1d364592"
Mar 09 09:57:13 crc kubenswrapper[4861]: E0309 09:57:13.659623 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5g7gc_openshift-machine-config-operator(6f7875e3-174f-4c67-8675-d878de74aa4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f"
Mar 09 09:57:26 crc kubenswrapper[4861]: I0309 09:57:26.658329 4861 scope.go:117] "RemoveContainer" containerID="77c857d062282bd5e967d0e73907a96e77dbe37daf546bb7546f15dc1d364592"
Mar 09 09:57:26 crc kubenswrapper[4861]: E0309 09:57:26.659208 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5g7gc_openshift-machine-config-operator(6f7875e3-174f-4c67-8675-d878de74aa4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f"
Mar 09 09:57:38 crc kubenswrapper[4861]: I0309 09:57:38.657900 4861 scope.go:117] "RemoveContainer" containerID="77c857d062282bd5e967d0e73907a96e77dbe37daf546bb7546f15dc1d364592"
Mar 09 09:57:38 crc kubenswrapper[4861]: E0309 09:57:38.658705 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5g7gc_openshift-machine-config-operator(6f7875e3-174f-4c67-8675-d878de74aa4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f"
Mar 09 09:57:53 crc kubenswrapper[4861]: I0309 09:57:53.658557 4861 scope.go:117] "RemoveContainer" containerID="77c857d062282bd5e967d0e73907a96e77dbe37daf546bb7546f15dc1d364592"
Mar 09 09:57:53 crc kubenswrapper[4861]: E0309 09:57:53.659523 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5g7gc_openshift-machine-config-operator(6f7875e3-174f-4c67-8675-d878de74aa4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f"
Mar 09 09:58:00 crc kubenswrapper[4861]: I0309 09:58:00.149213 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550838-g4gxr"]
Mar 09 09:58:00 crc kubenswrapper[4861]: E0309 09:58:00.150347 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a189714d-1d52-456d-a04b-a5a8fbee6087" containerName="oc"
Mar 09 09:58:00 crc kubenswrapper[4861]: I0309 09:58:00.150368 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="a189714d-1d52-456d-a04b-a5a8fbee6087" containerName="oc"
Mar 09 09:58:00 crc kubenswrapper[4861]: I0309 09:58:00.150751 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="a189714d-1d52-456d-a04b-a5a8fbee6087" containerName="oc"
Mar 09 09:58:00 crc kubenswrapper[4861]: I0309 09:58:00.151578 4861 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550838-g4gxr" Mar 09 09:58:00 crc kubenswrapper[4861]: I0309 09:58:00.154432 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k86l8" Mar 09 09:58:00 crc kubenswrapper[4861]: I0309 09:58:00.154449 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 09:58:00 crc kubenswrapper[4861]: I0309 09:58:00.155256 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 09:58:00 crc kubenswrapper[4861]: I0309 09:58:00.159681 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550838-g4gxr"] Mar 09 09:58:00 crc kubenswrapper[4861]: I0309 09:58:00.183836 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpnbq\" (UniqueName: \"kubernetes.io/projected/2e005695-9c62-4320-a9f8-192525751618-kube-api-access-vpnbq\") pod \"auto-csr-approver-29550838-g4gxr\" (UID: \"2e005695-9c62-4320-a9f8-192525751618\") " pod="openshift-infra/auto-csr-approver-29550838-g4gxr" Mar 09 09:58:00 crc kubenswrapper[4861]: I0309 09:58:00.288459 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vpnbq\" (UniqueName: \"kubernetes.io/projected/2e005695-9c62-4320-a9f8-192525751618-kube-api-access-vpnbq\") pod \"auto-csr-approver-29550838-g4gxr\" (UID: \"2e005695-9c62-4320-a9f8-192525751618\") " pod="openshift-infra/auto-csr-approver-29550838-g4gxr" Mar 09 09:58:00 crc kubenswrapper[4861]: I0309 09:58:00.310735 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpnbq\" (UniqueName: \"kubernetes.io/projected/2e005695-9c62-4320-a9f8-192525751618-kube-api-access-vpnbq\") pod \"auto-csr-approver-29550838-g4gxr\" (UID: \"2e005695-9c62-4320-a9f8-192525751618\") " 
pod="openshift-infra/auto-csr-approver-29550838-g4gxr" Mar 09 09:58:00 crc kubenswrapper[4861]: I0309 09:58:00.494533 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550838-g4gxr" Mar 09 09:58:00 crc kubenswrapper[4861]: I0309 09:58:00.963655 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550838-g4gxr"] Mar 09 09:58:01 crc kubenswrapper[4861]: I0309 09:58:01.208744 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550838-g4gxr" event={"ID":"2e005695-9c62-4320-a9f8-192525751618","Type":"ContainerStarted","Data":"579d8a421d9e9c5d4b438f76b40f53abd51c8ab109b86234abe11d7ce3d7a313"} Mar 09 09:58:02 crc kubenswrapper[4861]: I0309 09:58:02.219649 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550838-g4gxr" event={"ID":"2e005695-9c62-4320-a9f8-192525751618","Type":"ContainerStarted","Data":"0dae7640fa970823283fa042152987d7e0ece6b44431e37ea15241134f184d51"} Mar 09 09:58:02 crc kubenswrapper[4861]: I0309 09:58:02.245353 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29550838-g4gxr" podStartSLOduration=1.322008033 podStartE2EDuration="2.24532842s" podCreationTimestamp="2026-03-09 09:58:00 +0000 UTC" firstStartedPulling="2026-03-09 09:58:00.969548833 +0000 UTC m=+3124.054588234" lastFinishedPulling="2026-03-09 09:58:01.89286922 +0000 UTC m=+3124.977908621" observedRunningTime="2026-03-09 09:58:02.235837414 +0000 UTC m=+3125.320876815" watchObservedRunningTime="2026-03-09 09:58:02.24532842 +0000 UTC m=+3125.330367821" Mar 09 09:58:03 crc kubenswrapper[4861]: I0309 09:58:03.232127 4861 generic.go:334] "Generic (PLEG): container finished" podID="2e005695-9c62-4320-a9f8-192525751618" containerID="0dae7640fa970823283fa042152987d7e0ece6b44431e37ea15241134f184d51" exitCode=0 Mar 09 09:58:03 crc 
kubenswrapper[4861]: I0309 09:58:03.232248 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550838-g4gxr" event={"ID":"2e005695-9c62-4320-a9f8-192525751618","Type":"ContainerDied","Data":"0dae7640fa970823283fa042152987d7e0ece6b44431e37ea15241134f184d51"} Mar 09 09:58:04 crc kubenswrapper[4861]: I0309 09:58:04.634970 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550838-g4gxr" Mar 09 09:58:04 crc kubenswrapper[4861]: I0309 09:58:04.691010 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vpnbq\" (UniqueName: \"kubernetes.io/projected/2e005695-9c62-4320-a9f8-192525751618-kube-api-access-vpnbq\") pod \"2e005695-9c62-4320-a9f8-192525751618\" (UID: \"2e005695-9c62-4320-a9f8-192525751618\") " Mar 09 09:58:04 crc kubenswrapper[4861]: I0309 09:58:04.698939 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e005695-9c62-4320-a9f8-192525751618-kube-api-access-vpnbq" (OuterVolumeSpecName: "kube-api-access-vpnbq") pod "2e005695-9c62-4320-a9f8-192525751618" (UID: "2e005695-9c62-4320-a9f8-192525751618"). InnerVolumeSpecName "kube-api-access-vpnbq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 09:58:04 crc kubenswrapper[4861]: I0309 09:58:04.793393 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vpnbq\" (UniqueName: \"kubernetes.io/projected/2e005695-9c62-4320-a9f8-192525751618-kube-api-access-vpnbq\") on node \"crc\" DevicePath \"\"" Mar 09 09:58:05 crc kubenswrapper[4861]: I0309 09:58:05.250569 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550838-g4gxr" event={"ID":"2e005695-9c62-4320-a9f8-192525751618","Type":"ContainerDied","Data":"579d8a421d9e9c5d4b438f76b40f53abd51c8ab109b86234abe11d7ce3d7a313"} Mar 09 09:58:05 crc kubenswrapper[4861]: I0309 09:58:05.250608 4861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="579d8a421d9e9c5d4b438f76b40f53abd51c8ab109b86234abe11d7ce3d7a313" Mar 09 09:58:05 crc kubenswrapper[4861]: I0309 09:58:05.250658 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550838-g4gxr" Mar 09 09:58:05 crc kubenswrapper[4861]: I0309 09:58:05.378110 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550832-8pgc6"] Mar 09 09:58:05 crc kubenswrapper[4861]: I0309 09:58:05.388440 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550832-8pgc6"] Mar 09 09:58:05 crc kubenswrapper[4861]: I0309 09:58:05.658891 4861 scope.go:117] "RemoveContainer" containerID="77c857d062282bd5e967d0e73907a96e77dbe37daf546bb7546f15dc1d364592" Mar 09 09:58:05 crc kubenswrapper[4861]: E0309 09:58:05.659342 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5g7gc_openshift-machine-config-operator(6f7875e3-174f-4c67-8675-d878de74aa4f)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" Mar 09 09:58:05 crc kubenswrapper[4861]: I0309 09:58:05.670053 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1e3de6d-29be-4668-8739-c5d0afd5b183" path="/var/lib/kubelet/pods/d1e3de6d-29be-4668-8739-c5d0afd5b183/volumes" Mar 09 09:58:17 crc kubenswrapper[4861]: I0309 09:58:17.664573 4861 scope.go:117] "RemoveContainer" containerID="77c857d062282bd5e967d0e73907a96e77dbe37daf546bb7546f15dc1d364592" Mar 09 09:58:17 crc kubenswrapper[4861]: E0309 09:58:17.666403 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5g7gc_openshift-machine-config-operator(6f7875e3-174f-4c67-8675-d878de74aa4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" Mar 09 09:58:29 crc kubenswrapper[4861]: I0309 09:58:29.657940 4861 scope.go:117] "RemoveContainer" containerID="77c857d062282bd5e967d0e73907a96e77dbe37daf546bb7546f15dc1d364592" Mar 09 09:58:29 crc kubenswrapper[4861]: E0309 09:58:29.658682 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5g7gc_openshift-machine-config-operator(6f7875e3-174f-4c67-8675-d878de74aa4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" Mar 09 09:58:37 crc kubenswrapper[4861]: I0309 09:58:37.630861 4861 scope.go:117] "RemoveContainer" containerID="e570b3aa786918ee88159aca1f61a5919dd5545daffb5779dec81db86b1a9551" Mar 09 09:58:40 crc kubenswrapper[4861]: I0309 09:58:40.658301 4861 scope.go:117] "RemoveContainer" 
containerID="77c857d062282bd5e967d0e73907a96e77dbe37daf546bb7546f15dc1d364592" Mar 09 09:58:40 crc kubenswrapper[4861]: E0309 09:58:40.659323 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5g7gc_openshift-machine-config-operator(6f7875e3-174f-4c67-8675-d878de74aa4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" Mar 09 09:58:51 crc kubenswrapper[4861]: I0309 09:58:51.658329 4861 scope.go:117] "RemoveContainer" containerID="77c857d062282bd5e967d0e73907a96e77dbe37daf546bb7546f15dc1d364592" Mar 09 09:58:51 crc kubenswrapper[4861]: E0309 09:58:51.659278 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5g7gc_openshift-machine-config-operator(6f7875e3-174f-4c67-8675-d878de74aa4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" Mar 09 09:58:53 crc kubenswrapper[4861]: I0309 09:58:53.887811 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-l7qh2"] Mar 09 09:58:53 crc kubenswrapper[4861]: E0309 09:58:53.892193 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e005695-9c62-4320-a9f8-192525751618" containerName="oc" Mar 09 09:58:53 crc kubenswrapper[4861]: I0309 09:58:53.892232 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e005695-9c62-4320-a9f8-192525751618" containerName="oc" Mar 09 09:58:53 crc kubenswrapper[4861]: I0309 09:58:53.892459 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e005695-9c62-4320-a9f8-192525751618" containerName="oc" Mar 09 09:58:53 crc 
kubenswrapper[4861]: I0309 09:58:53.893893 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-l7qh2" Mar 09 09:58:53 crc kubenswrapper[4861]: I0309 09:58:53.905204 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-l7qh2"] Mar 09 09:58:53 crc kubenswrapper[4861]: I0309 09:58:53.944238 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7xvr\" (UniqueName: \"kubernetes.io/projected/bf340526-0d0c-4767-a1d2-8ae3930b32ce-kube-api-access-l7xvr\") pod \"certified-operators-l7qh2\" (UID: \"bf340526-0d0c-4767-a1d2-8ae3930b32ce\") " pod="openshift-marketplace/certified-operators-l7qh2" Mar 09 09:58:53 crc kubenswrapper[4861]: I0309 09:58:53.944409 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf340526-0d0c-4767-a1d2-8ae3930b32ce-utilities\") pod \"certified-operators-l7qh2\" (UID: \"bf340526-0d0c-4767-a1d2-8ae3930b32ce\") " pod="openshift-marketplace/certified-operators-l7qh2" Mar 09 09:58:53 crc kubenswrapper[4861]: I0309 09:58:53.944565 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf340526-0d0c-4767-a1d2-8ae3930b32ce-catalog-content\") pod \"certified-operators-l7qh2\" (UID: \"bf340526-0d0c-4767-a1d2-8ae3930b32ce\") " pod="openshift-marketplace/certified-operators-l7qh2" Mar 09 09:58:54 crc kubenswrapper[4861]: I0309 09:58:54.045949 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7xvr\" (UniqueName: \"kubernetes.io/projected/bf340526-0d0c-4767-a1d2-8ae3930b32ce-kube-api-access-l7xvr\") pod \"certified-operators-l7qh2\" (UID: \"bf340526-0d0c-4767-a1d2-8ae3930b32ce\") " 
pod="openshift-marketplace/certified-operators-l7qh2" Mar 09 09:58:54 crc kubenswrapper[4861]: I0309 09:58:54.046069 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf340526-0d0c-4767-a1d2-8ae3930b32ce-utilities\") pod \"certified-operators-l7qh2\" (UID: \"bf340526-0d0c-4767-a1d2-8ae3930b32ce\") " pod="openshift-marketplace/certified-operators-l7qh2" Mar 09 09:58:54 crc kubenswrapper[4861]: I0309 09:58:54.046100 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf340526-0d0c-4767-a1d2-8ae3930b32ce-catalog-content\") pod \"certified-operators-l7qh2\" (UID: \"bf340526-0d0c-4767-a1d2-8ae3930b32ce\") " pod="openshift-marketplace/certified-operators-l7qh2" Mar 09 09:58:54 crc kubenswrapper[4861]: I0309 09:58:54.046533 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf340526-0d0c-4767-a1d2-8ae3930b32ce-catalog-content\") pod \"certified-operators-l7qh2\" (UID: \"bf340526-0d0c-4767-a1d2-8ae3930b32ce\") " pod="openshift-marketplace/certified-operators-l7qh2" Mar 09 09:58:54 crc kubenswrapper[4861]: I0309 09:58:54.046601 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf340526-0d0c-4767-a1d2-8ae3930b32ce-utilities\") pod \"certified-operators-l7qh2\" (UID: \"bf340526-0d0c-4767-a1d2-8ae3930b32ce\") " pod="openshift-marketplace/certified-operators-l7qh2" Mar 09 09:58:54 crc kubenswrapper[4861]: I0309 09:58:54.071345 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7xvr\" (UniqueName: \"kubernetes.io/projected/bf340526-0d0c-4767-a1d2-8ae3930b32ce-kube-api-access-l7xvr\") pod \"certified-operators-l7qh2\" (UID: \"bf340526-0d0c-4767-a1d2-8ae3930b32ce\") " 
pod="openshift-marketplace/certified-operators-l7qh2" Mar 09 09:58:54 crc kubenswrapper[4861]: I0309 09:58:54.089848 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-l5wnv"] Mar 09 09:58:54 crc kubenswrapper[4861]: I0309 09:58:54.092203 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l5wnv" Mar 09 09:58:54 crc kubenswrapper[4861]: I0309 09:58:54.106779 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-l5wnv"] Mar 09 09:58:54 crc kubenswrapper[4861]: I0309 09:58:54.148054 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38539b65-3d4b-442d-a287-ba22875226f9-utilities\") pod \"redhat-marketplace-l5wnv\" (UID: \"38539b65-3d4b-442d-a287-ba22875226f9\") " pod="openshift-marketplace/redhat-marketplace-l5wnv" Mar 09 09:58:54 crc kubenswrapper[4861]: I0309 09:58:54.148283 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdhlm\" (UniqueName: \"kubernetes.io/projected/38539b65-3d4b-442d-a287-ba22875226f9-kube-api-access-sdhlm\") pod \"redhat-marketplace-l5wnv\" (UID: \"38539b65-3d4b-442d-a287-ba22875226f9\") " pod="openshift-marketplace/redhat-marketplace-l5wnv" Mar 09 09:58:54 crc kubenswrapper[4861]: I0309 09:58:54.148323 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38539b65-3d4b-442d-a287-ba22875226f9-catalog-content\") pod \"redhat-marketplace-l5wnv\" (UID: \"38539b65-3d4b-442d-a287-ba22875226f9\") " pod="openshift-marketplace/redhat-marketplace-l5wnv" Mar 09 09:58:54 crc kubenswrapper[4861]: I0309 09:58:54.218088 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-l7qh2" Mar 09 09:58:54 crc kubenswrapper[4861]: I0309 09:58:54.249637 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38539b65-3d4b-442d-a287-ba22875226f9-utilities\") pod \"redhat-marketplace-l5wnv\" (UID: \"38539b65-3d4b-442d-a287-ba22875226f9\") " pod="openshift-marketplace/redhat-marketplace-l5wnv" Mar 09 09:58:54 crc kubenswrapper[4861]: I0309 09:58:54.250083 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdhlm\" (UniqueName: \"kubernetes.io/projected/38539b65-3d4b-442d-a287-ba22875226f9-kube-api-access-sdhlm\") pod \"redhat-marketplace-l5wnv\" (UID: \"38539b65-3d4b-442d-a287-ba22875226f9\") " pod="openshift-marketplace/redhat-marketplace-l5wnv" Mar 09 09:58:54 crc kubenswrapper[4861]: I0309 09:58:54.250213 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38539b65-3d4b-442d-a287-ba22875226f9-catalog-content\") pod \"redhat-marketplace-l5wnv\" (UID: \"38539b65-3d4b-442d-a287-ba22875226f9\") " pod="openshift-marketplace/redhat-marketplace-l5wnv" Mar 09 09:58:54 crc kubenswrapper[4861]: I0309 09:58:54.250126 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38539b65-3d4b-442d-a287-ba22875226f9-utilities\") pod \"redhat-marketplace-l5wnv\" (UID: \"38539b65-3d4b-442d-a287-ba22875226f9\") " pod="openshift-marketplace/redhat-marketplace-l5wnv" Mar 09 09:58:54 crc kubenswrapper[4861]: I0309 09:58:54.250735 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38539b65-3d4b-442d-a287-ba22875226f9-catalog-content\") pod \"redhat-marketplace-l5wnv\" (UID: \"38539b65-3d4b-442d-a287-ba22875226f9\") " 
pod="openshift-marketplace/redhat-marketplace-l5wnv" Mar 09 09:58:54 crc kubenswrapper[4861]: I0309 09:58:54.269143 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdhlm\" (UniqueName: \"kubernetes.io/projected/38539b65-3d4b-442d-a287-ba22875226f9-kube-api-access-sdhlm\") pod \"redhat-marketplace-l5wnv\" (UID: \"38539b65-3d4b-442d-a287-ba22875226f9\") " pod="openshift-marketplace/redhat-marketplace-l5wnv" Mar 09 09:58:54 crc kubenswrapper[4861]: I0309 09:58:54.439230 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l5wnv" Mar 09 09:58:54 crc kubenswrapper[4861]: I0309 09:58:54.787661 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-l7qh2"] Mar 09 09:58:54 crc kubenswrapper[4861]: I0309 09:58:54.842878 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-l5wnv"] Mar 09 09:58:55 crc kubenswrapper[4861]: I0309 09:58:55.693013 4861 generic.go:334] "Generic (PLEG): container finished" podID="bf340526-0d0c-4767-a1d2-8ae3930b32ce" containerID="a790766d902224d75e3979c374e7475069a1291ecb69f60351063a10f6558231" exitCode=0 Mar 09 09:58:55 crc kubenswrapper[4861]: I0309 09:58:55.693106 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l7qh2" event={"ID":"bf340526-0d0c-4767-a1d2-8ae3930b32ce","Type":"ContainerDied","Data":"a790766d902224d75e3979c374e7475069a1291ecb69f60351063a10f6558231"} Mar 09 09:58:55 crc kubenswrapper[4861]: I0309 09:58:55.693466 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l7qh2" event={"ID":"bf340526-0d0c-4767-a1d2-8ae3930b32ce","Type":"ContainerStarted","Data":"a053340c674d6ae4b8fae7a5c589d5b7a00c3e644be0540590329d7b2b322dbb"} Mar 09 09:58:55 crc kubenswrapper[4861]: I0309 09:58:55.699017 4861 generic.go:334] "Generic (PLEG): 
container finished" podID="38539b65-3d4b-442d-a287-ba22875226f9" containerID="5f64984784a3a92e97b4cf99ca7f70374a9ef1d0f134adca0757be1926785cd3" exitCode=0 Mar 09 09:58:55 crc kubenswrapper[4861]: I0309 09:58:55.699072 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l5wnv" event={"ID":"38539b65-3d4b-442d-a287-ba22875226f9","Type":"ContainerDied","Data":"5f64984784a3a92e97b4cf99ca7f70374a9ef1d0f134adca0757be1926785cd3"} Mar 09 09:58:55 crc kubenswrapper[4861]: I0309 09:58:55.699101 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l5wnv" event={"ID":"38539b65-3d4b-442d-a287-ba22875226f9","Type":"ContainerStarted","Data":"712c2f5cc15e8794f3be89a8ba2e57a6451ca09dd380acc075b14d3f6b8b3a58"} Mar 09 09:58:56 crc kubenswrapper[4861]: I0309 09:58:56.719337 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l7qh2" event={"ID":"bf340526-0d0c-4767-a1d2-8ae3930b32ce","Type":"ContainerStarted","Data":"6e9f3d0c4a7b4f8b90a4faed06d032c037d0392df1d2bcb0dbb0b662d950cb85"} Mar 09 09:58:56 crc kubenswrapper[4861]: I0309 09:58:56.723102 4861 generic.go:334] "Generic (PLEG): container finished" podID="38539b65-3d4b-442d-a287-ba22875226f9" containerID="5d3212bc046193621c441a590ecc8f7ae8612d04240600fbfb53f2c5f715f2c9" exitCode=0 Mar 09 09:58:56 crc kubenswrapper[4861]: I0309 09:58:56.723187 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l5wnv" event={"ID":"38539b65-3d4b-442d-a287-ba22875226f9","Type":"ContainerDied","Data":"5d3212bc046193621c441a590ecc8f7ae8612d04240600fbfb53f2c5f715f2c9"} Mar 09 09:58:57 crc kubenswrapper[4861]: I0309 09:58:57.734472 4861 generic.go:334] "Generic (PLEG): container finished" podID="bf340526-0d0c-4767-a1d2-8ae3930b32ce" containerID="6e9f3d0c4a7b4f8b90a4faed06d032c037d0392df1d2bcb0dbb0b662d950cb85" exitCode=0 Mar 09 09:58:57 crc kubenswrapper[4861]: 
I0309 09:58:57.734536 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l7qh2" event={"ID":"bf340526-0d0c-4767-a1d2-8ae3930b32ce","Type":"ContainerDied","Data":"6e9f3d0c4a7b4f8b90a4faed06d032c037d0392df1d2bcb0dbb0b662d950cb85"} Mar 09 09:58:57 crc kubenswrapper[4861]: I0309 09:58:57.738722 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l5wnv" event={"ID":"38539b65-3d4b-442d-a287-ba22875226f9","Type":"ContainerStarted","Data":"e9bec3797e1e86773da9cd93f7948062c5e9c2c4445b8274a929eedf760c1002"} Mar 09 09:58:57 crc kubenswrapper[4861]: I0309 09:58:57.775129 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-l5wnv" podStartSLOduration=2.344864249 podStartE2EDuration="3.775109675s" podCreationTimestamp="2026-03-09 09:58:54 +0000 UTC" firstStartedPulling="2026-03-09 09:58:55.701743787 +0000 UTC m=+3178.786783198" lastFinishedPulling="2026-03-09 09:58:57.131989223 +0000 UTC m=+3180.217028624" observedRunningTime="2026-03-09 09:58:57.768391875 +0000 UTC m=+3180.853431286" watchObservedRunningTime="2026-03-09 09:58:57.775109675 +0000 UTC m=+3180.860149076" Mar 09 09:58:58 crc kubenswrapper[4861]: I0309 09:58:58.750472 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l7qh2" event={"ID":"bf340526-0d0c-4767-a1d2-8ae3930b32ce","Type":"ContainerStarted","Data":"0d157eb5a35ac3cd612692606d3d6d102186bce27f824fad09bc534cb6b5180d"} Mar 09 09:58:58 crc kubenswrapper[4861]: I0309 09:58:58.775606 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-l7qh2" podStartSLOduration=3.158892903 podStartE2EDuration="5.775584023s" podCreationTimestamp="2026-03-09 09:58:53 +0000 UTC" firstStartedPulling="2026-03-09 09:58:55.695401347 +0000 UTC m=+3178.780440768" lastFinishedPulling="2026-03-09 
09:58:58.312092487 +0000 UTC m=+3181.397131888" observedRunningTime="2026-03-09 09:58:58.770748479 +0000 UTC m=+3181.855787880" watchObservedRunningTime="2026-03-09 09:58:58.775584023 +0000 UTC m=+3181.860623424" Mar 09 09:59:04 crc kubenswrapper[4861]: I0309 09:59:04.218516 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-l7qh2" Mar 09 09:59:04 crc kubenswrapper[4861]: I0309 09:59:04.218837 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-l7qh2" Mar 09 09:59:04 crc kubenswrapper[4861]: I0309 09:59:04.440140 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-l5wnv" Mar 09 09:59:04 crc kubenswrapper[4861]: I0309 09:59:04.440199 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-l5wnv" Mar 09 09:59:04 crc kubenswrapper[4861]: I0309 09:59:04.490623 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-l5wnv" Mar 09 09:59:04 crc kubenswrapper[4861]: I0309 09:59:04.864331 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-l5wnv" Mar 09 09:59:05 crc kubenswrapper[4861]: I0309 09:59:05.260467 4861 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-l7qh2" podUID="bf340526-0d0c-4767-a1d2-8ae3930b32ce" containerName="registry-server" probeResult="failure" output=< Mar 09 09:59:05 crc kubenswrapper[4861]: timeout: failed to connect service ":50051" within 1s Mar 09 09:59:05 crc kubenswrapper[4861]: > Mar 09 09:59:06 crc kubenswrapper[4861]: I0309 09:59:06.087781 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-l5wnv"] Mar 09 09:59:06 crc kubenswrapper[4861]: I0309 
09:59:06.658724 4861 scope.go:117] "RemoveContainer" containerID="77c857d062282bd5e967d0e73907a96e77dbe37daf546bb7546f15dc1d364592"
Mar 09 09:59:06 crc kubenswrapper[4861]: E0309 09:59:06.659003 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5g7gc_openshift-machine-config-operator(6f7875e3-174f-4c67-8675-d878de74aa4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f"
Mar 09 09:59:06 crc kubenswrapper[4861]: I0309 09:59:06.842264 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-l5wnv" podUID="38539b65-3d4b-442d-a287-ba22875226f9" containerName="registry-server" containerID="cri-o://e9bec3797e1e86773da9cd93f7948062c5e9c2c4445b8274a929eedf760c1002" gracePeriod=2
Mar 09 09:59:07 crc kubenswrapper[4861]: I0309 09:59:07.402196 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l5wnv"
Mar 09 09:59:07 crc kubenswrapper[4861]: I0309 09:59:07.516867 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38539b65-3d4b-442d-a287-ba22875226f9-utilities\") pod \"38539b65-3d4b-442d-a287-ba22875226f9\" (UID: \"38539b65-3d4b-442d-a287-ba22875226f9\") "
Mar 09 09:59:07 crc kubenswrapper[4861]: I0309 09:59:07.516991 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sdhlm\" (UniqueName: \"kubernetes.io/projected/38539b65-3d4b-442d-a287-ba22875226f9-kube-api-access-sdhlm\") pod \"38539b65-3d4b-442d-a287-ba22875226f9\" (UID: \"38539b65-3d4b-442d-a287-ba22875226f9\") "
Mar 09 09:59:07 crc kubenswrapper[4861]: I0309 09:59:07.517115 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38539b65-3d4b-442d-a287-ba22875226f9-catalog-content\") pod \"38539b65-3d4b-442d-a287-ba22875226f9\" (UID: \"38539b65-3d4b-442d-a287-ba22875226f9\") "
Mar 09 09:59:07 crc kubenswrapper[4861]: I0309 09:59:07.518501 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38539b65-3d4b-442d-a287-ba22875226f9-utilities" (OuterVolumeSpecName: "utilities") pod "38539b65-3d4b-442d-a287-ba22875226f9" (UID: "38539b65-3d4b-442d-a287-ba22875226f9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 09:59:07 crc kubenswrapper[4861]: I0309 09:59:07.525700 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38539b65-3d4b-442d-a287-ba22875226f9-kube-api-access-sdhlm" (OuterVolumeSpecName: "kube-api-access-sdhlm") pod "38539b65-3d4b-442d-a287-ba22875226f9" (UID: "38539b65-3d4b-442d-a287-ba22875226f9"). InnerVolumeSpecName "kube-api-access-sdhlm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:59:07 crc kubenswrapper[4861]: I0309 09:59:07.549955 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38539b65-3d4b-442d-a287-ba22875226f9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "38539b65-3d4b-442d-a287-ba22875226f9" (UID: "38539b65-3d4b-442d-a287-ba22875226f9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 09:59:07 crc kubenswrapper[4861]: I0309 09:59:07.622579 4861 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38539b65-3d4b-442d-a287-ba22875226f9-utilities\") on node \"crc\" DevicePath \"\""
Mar 09 09:59:07 crc kubenswrapper[4861]: I0309 09:59:07.622616 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sdhlm\" (UniqueName: \"kubernetes.io/projected/38539b65-3d4b-442d-a287-ba22875226f9-kube-api-access-sdhlm\") on node \"crc\" DevicePath \"\""
Mar 09 09:59:07 crc kubenswrapper[4861]: I0309 09:59:07.622628 4861 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38539b65-3d4b-442d-a287-ba22875226f9-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 09 09:59:07 crc kubenswrapper[4861]: I0309 09:59:07.853160 4861 generic.go:334] "Generic (PLEG): container finished" podID="38539b65-3d4b-442d-a287-ba22875226f9" containerID="e9bec3797e1e86773da9cd93f7948062c5e9c2c4445b8274a929eedf760c1002" exitCode=0
Mar 09 09:59:07 crc kubenswrapper[4861]: I0309 09:59:07.853222 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l5wnv"
Mar 09 09:59:07 crc kubenswrapper[4861]: I0309 09:59:07.853238 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l5wnv" event={"ID":"38539b65-3d4b-442d-a287-ba22875226f9","Type":"ContainerDied","Data":"e9bec3797e1e86773da9cd93f7948062c5e9c2c4445b8274a929eedf760c1002"}
Mar 09 09:59:07 crc kubenswrapper[4861]: I0309 09:59:07.853309 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l5wnv" event={"ID":"38539b65-3d4b-442d-a287-ba22875226f9","Type":"ContainerDied","Data":"712c2f5cc15e8794f3be89a8ba2e57a6451ca09dd380acc075b14d3f6b8b3a58"}
Mar 09 09:59:07 crc kubenswrapper[4861]: I0309 09:59:07.853337 4861 scope.go:117] "RemoveContainer" containerID="e9bec3797e1e86773da9cd93f7948062c5e9c2c4445b8274a929eedf760c1002"
Mar 09 09:59:07 crc kubenswrapper[4861]: I0309 09:59:07.883653 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-l5wnv"]
Mar 09 09:59:07 crc kubenswrapper[4861]: I0309 09:59:07.889346 4861 scope.go:117] "RemoveContainer" containerID="5d3212bc046193621c441a590ecc8f7ae8612d04240600fbfb53f2c5f715f2c9"
Mar 09 09:59:07 crc kubenswrapper[4861]: I0309 09:59:07.904145 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-l5wnv"]
Mar 09 09:59:07 crc kubenswrapper[4861]: I0309 09:59:07.945034 4861 scope.go:117] "RemoveContainer" containerID="5f64984784a3a92e97b4cf99ca7f70374a9ef1d0f134adca0757be1926785cd3"
Mar 09 09:59:07 crc kubenswrapper[4861]: I0309 09:59:07.992222 4861 scope.go:117] "RemoveContainer" containerID="e9bec3797e1e86773da9cd93f7948062c5e9c2c4445b8274a929eedf760c1002"
Mar 09 09:59:07 crc kubenswrapper[4861]: E0309 09:59:07.992779 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9bec3797e1e86773da9cd93f7948062c5e9c2c4445b8274a929eedf760c1002\": container with ID starting with e9bec3797e1e86773da9cd93f7948062c5e9c2c4445b8274a929eedf760c1002 not found: ID does not exist" containerID="e9bec3797e1e86773da9cd93f7948062c5e9c2c4445b8274a929eedf760c1002"
Mar 09 09:59:07 crc kubenswrapper[4861]: I0309 09:59:07.992825 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9bec3797e1e86773da9cd93f7948062c5e9c2c4445b8274a929eedf760c1002"} err="failed to get container status \"e9bec3797e1e86773da9cd93f7948062c5e9c2c4445b8274a929eedf760c1002\": rpc error: code = NotFound desc = could not find container \"e9bec3797e1e86773da9cd93f7948062c5e9c2c4445b8274a929eedf760c1002\": container with ID starting with e9bec3797e1e86773da9cd93f7948062c5e9c2c4445b8274a929eedf760c1002 not found: ID does not exist"
Mar 09 09:59:07 crc kubenswrapper[4861]: I0309 09:59:07.992853 4861 scope.go:117] "RemoveContainer" containerID="5d3212bc046193621c441a590ecc8f7ae8612d04240600fbfb53f2c5f715f2c9"
Mar 09 09:59:07 crc kubenswrapper[4861]: E0309 09:59:07.993150 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d3212bc046193621c441a590ecc8f7ae8612d04240600fbfb53f2c5f715f2c9\": container with ID starting with 5d3212bc046193621c441a590ecc8f7ae8612d04240600fbfb53f2c5f715f2c9 not found: ID does not exist" containerID="5d3212bc046193621c441a590ecc8f7ae8612d04240600fbfb53f2c5f715f2c9"
Mar 09 09:59:07 crc kubenswrapper[4861]: I0309 09:59:07.993180 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d3212bc046193621c441a590ecc8f7ae8612d04240600fbfb53f2c5f715f2c9"} err="failed to get container status \"5d3212bc046193621c441a590ecc8f7ae8612d04240600fbfb53f2c5f715f2c9\": rpc error: code = NotFound desc = could not find container \"5d3212bc046193621c441a590ecc8f7ae8612d04240600fbfb53f2c5f715f2c9\": container with ID starting with 5d3212bc046193621c441a590ecc8f7ae8612d04240600fbfb53f2c5f715f2c9 not found: ID does not exist"
Mar 09 09:59:07 crc kubenswrapper[4861]: I0309 09:59:07.993195 4861 scope.go:117] "RemoveContainer" containerID="5f64984784a3a92e97b4cf99ca7f70374a9ef1d0f134adca0757be1926785cd3"
Mar 09 09:59:07 crc kubenswrapper[4861]: E0309 09:59:07.993429 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f64984784a3a92e97b4cf99ca7f70374a9ef1d0f134adca0757be1926785cd3\": container with ID starting with 5f64984784a3a92e97b4cf99ca7f70374a9ef1d0f134adca0757be1926785cd3 not found: ID does not exist" containerID="5f64984784a3a92e97b4cf99ca7f70374a9ef1d0f134adca0757be1926785cd3"
Mar 09 09:59:07 crc kubenswrapper[4861]: I0309 09:59:07.993454 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f64984784a3a92e97b4cf99ca7f70374a9ef1d0f134adca0757be1926785cd3"} err="failed to get container status \"5f64984784a3a92e97b4cf99ca7f70374a9ef1d0f134adca0757be1926785cd3\": rpc error: code = NotFound desc = could not find container \"5f64984784a3a92e97b4cf99ca7f70374a9ef1d0f134adca0757be1926785cd3\": container with ID starting with 5f64984784a3a92e97b4cf99ca7f70374a9ef1d0f134adca0757be1926785cd3 not found: ID does not exist"
Mar 09 09:59:09 crc kubenswrapper[4861]: I0309 09:59:09.669911 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38539b65-3d4b-442d-a287-ba22875226f9" path="/var/lib/kubelet/pods/38539b65-3d4b-442d-a287-ba22875226f9/volumes"
Mar 09 09:59:14 crc kubenswrapper[4861]: I0309 09:59:14.268846 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-l7qh2"
Mar 09 09:59:14 crc kubenswrapper[4861]: I0309 09:59:14.320937 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-l7qh2"
Mar 09 09:59:14 crc kubenswrapper[4861]: I0309 09:59:14.879029 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-l7qh2"]
Mar 09 09:59:15 crc kubenswrapper[4861]: I0309 09:59:15.929970 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-l7qh2" podUID="bf340526-0d0c-4767-a1d2-8ae3930b32ce" containerName="registry-server" containerID="cri-o://0d157eb5a35ac3cd612692606d3d6d102186bce27f824fad09bc534cb6b5180d" gracePeriod=2
Mar 09 09:59:16 crc kubenswrapper[4861]: I0309 09:59:16.410699 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-l7qh2"
Mar 09 09:59:16 crc kubenswrapper[4861]: I0309 09:59:16.528604 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l7xvr\" (UniqueName: \"kubernetes.io/projected/bf340526-0d0c-4767-a1d2-8ae3930b32ce-kube-api-access-l7xvr\") pod \"bf340526-0d0c-4767-a1d2-8ae3930b32ce\" (UID: \"bf340526-0d0c-4767-a1d2-8ae3930b32ce\") "
Mar 09 09:59:16 crc kubenswrapper[4861]: I0309 09:59:16.528674 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf340526-0d0c-4767-a1d2-8ae3930b32ce-utilities\") pod \"bf340526-0d0c-4767-a1d2-8ae3930b32ce\" (UID: \"bf340526-0d0c-4767-a1d2-8ae3930b32ce\") "
Mar 09 09:59:16 crc kubenswrapper[4861]: I0309 09:59:16.528930 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf340526-0d0c-4767-a1d2-8ae3930b32ce-catalog-content\") pod \"bf340526-0d0c-4767-a1d2-8ae3930b32ce\" (UID: \"bf340526-0d0c-4767-a1d2-8ae3930b32ce\") "
Mar 09 09:59:16 crc kubenswrapper[4861]: I0309 09:59:16.529867 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf340526-0d0c-4767-a1d2-8ae3930b32ce-utilities" (OuterVolumeSpecName: "utilities") pod "bf340526-0d0c-4767-a1d2-8ae3930b32ce" (UID: "bf340526-0d0c-4767-a1d2-8ae3930b32ce"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 09:59:16 crc kubenswrapper[4861]: I0309 09:59:16.541007 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf340526-0d0c-4767-a1d2-8ae3930b32ce-kube-api-access-l7xvr" (OuterVolumeSpecName: "kube-api-access-l7xvr") pod "bf340526-0d0c-4767-a1d2-8ae3930b32ce" (UID: "bf340526-0d0c-4767-a1d2-8ae3930b32ce"). InnerVolumeSpecName "kube-api-access-l7xvr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 09:59:16 crc kubenswrapper[4861]: I0309 09:59:16.593863 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf340526-0d0c-4767-a1d2-8ae3930b32ce-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bf340526-0d0c-4767-a1d2-8ae3930b32ce" (UID: "bf340526-0d0c-4767-a1d2-8ae3930b32ce"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 09:59:16 crc kubenswrapper[4861]: I0309 09:59:16.631860 4861 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf340526-0d0c-4767-a1d2-8ae3930b32ce-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 09 09:59:16 crc kubenswrapper[4861]: I0309 09:59:16.631916 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l7xvr\" (UniqueName: \"kubernetes.io/projected/bf340526-0d0c-4767-a1d2-8ae3930b32ce-kube-api-access-l7xvr\") on node \"crc\" DevicePath \"\""
Mar 09 09:59:16 crc kubenswrapper[4861]: I0309 09:59:16.631930 4861 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf340526-0d0c-4767-a1d2-8ae3930b32ce-utilities\") on node \"crc\" DevicePath \"\""
Mar 09 09:59:16 crc kubenswrapper[4861]: I0309 09:59:16.940754 4861 generic.go:334] "Generic (PLEG): container finished" podID="bf340526-0d0c-4767-a1d2-8ae3930b32ce" containerID="0d157eb5a35ac3cd612692606d3d6d102186bce27f824fad09bc534cb6b5180d" exitCode=0
Mar 09 09:59:16 crc kubenswrapper[4861]: I0309 09:59:16.940818 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-l7qh2"
Mar 09 09:59:16 crc kubenswrapper[4861]: I0309 09:59:16.940838 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l7qh2" event={"ID":"bf340526-0d0c-4767-a1d2-8ae3930b32ce","Type":"ContainerDied","Data":"0d157eb5a35ac3cd612692606d3d6d102186bce27f824fad09bc534cb6b5180d"}
Mar 09 09:59:16 crc kubenswrapper[4861]: I0309 09:59:16.940871 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l7qh2" event={"ID":"bf340526-0d0c-4767-a1d2-8ae3930b32ce","Type":"ContainerDied","Data":"a053340c674d6ae4b8fae7a5c589d5b7a00c3e644be0540590329d7b2b322dbb"}
Mar 09 09:59:16 crc kubenswrapper[4861]: I0309 09:59:16.940897 4861 scope.go:117] "RemoveContainer" containerID="0d157eb5a35ac3cd612692606d3d6d102186bce27f824fad09bc534cb6b5180d"
Mar 09 09:59:16 crc kubenswrapper[4861]: I0309 09:59:16.967665 4861 scope.go:117] "RemoveContainer" containerID="6e9f3d0c4a7b4f8b90a4faed06d032c037d0392df1d2bcb0dbb0b662d950cb85"
Mar 09 09:59:16 crc kubenswrapper[4861]: I0309 09:59:16.987896 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-l7qh2"]
Mar 09 09:59:17 crc kubenswrapper[4861]: I0309 09:59:17.004208 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-l7qh2"]
Mar 09 09:59:17 crc kubenswrapper[4861]: I0309 09:59:17.020583 4861 scope.go:117] "RemoveContainer" containerID="a790766d902224d75e3979c374e7475069a1291ecb69f60351063a10f6558231"
Mar 09 09:59:17 crc kubenswrapper[4861]: I0309 09:59:17.059270 4861 scope.go:117] "RemoveContainer" containerID="0d157eb5a35ac3cd612692606d3d6d102186bce27f824fad09bc534cb6b5180d"
Mar 09 09:59:17 crc kubenswrapper[4861]: E0309 09:59:17.060539 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d157eb5a35ac3cd612692606d3d6d102186bce27f824fad09bc534cb6b5180d\": container with ID starting with 0d157eb5a35ac3cd612692606d3d6d102186bce27f824fad09bc534cb6b5180d not found: ID does not exist" containerID="0d157eb5a35ac3cd612692606d3d6d102186bce27f824fad09bc534cb6b5180d"
Mar 09 09:59:17 crc kubenswrapper[4861]: I0309 09:59:17.060659 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d157eb5a35ac3cd612692606d3d6d102186bce27f824fad09bc534cb6b5180d"} err="failed to get container status \"0d157eb5a35ac3cd612692606d3d6d102186bce27f824fad09bc534cb6b5180d\": rpc error: code = NotFound desc = could not find container \"0d157eb5a35ac3cd612692606d3d6d102186bce27f824fad09bc534cb6b5180d\": container with ID starting with 0d157eb5a35ac3cd612692606d3d6d102186bce27f824fad09bc534cb6b5180d not found: ID does not exist"
Mar 09 09:59:17 crc kubenswrapper[4861]: I0309 09:59:17.060710 4861 scope.go:117] "RemoveContainer" containerID="6e9f3d0c4a7b4f8b90a4faed06d032c037d0392df1d2bcb0dbb0b662d950cb85"
Mar 09 09:59:17 crc kubenswrapper[4861]: E0309 09:59:17.062007 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e9f3d0c4a7b4f8b90a4faed06d032c037d0392df1d2bcb0dbb0b662d950cb85\": container with ID starting with 6e9f3d0c4a7b4f8b90a4faed06d032c037d0392df1d2bcb0dbb0b662d950cb85 not found: ID does not exist" containerID="6e9f3d0c4a7b4f8b90a4faed06d032c037d0392df1d2bcb0dbb0b662d950cb85"
Mar 09 09:59:17 crc kubenswrapper[4861]: I0309 09:59:17.062102 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e9f3d0c4a7b4f8b90a4faed06d032c037d0392df1d2bcb0dbb0b662d950cb85"} err="failed to get container status \"6e9f3d0c4a7b4f8b90a4faed06d032c037d0392df1d2bcb0dbb0b662d950cb85\": rpc error: code = NotFound desc = could not find container \"6e9f3d0c4a7b4f8b90a4faed06d032c037d0392df1d2bcb0dbb0b662d950cb85\": container with ID starting with 6e9f3d0c4a7b4f8b90a4faed06d032c037d0392df1d2bcb0dbb0b662d950cb85 not found: ID does not exist"
Mar 09 09:59:17 crc kubenswrapper[4861]: I0309 09:59:17.062143 4861 scope.go:117] "RemoveContainer" containerID="a790766d902224d75e3979c374e7475069a1291ecb69f60351063a10f6558231"
Mar 09 09:59:17 crc kubenswrapper[4861]: E0309 09:59:17.063548 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a790766d902224d75e3979c374e7475069a1291ecb69f60351063a10f6558231\": container with ID starting with a790766d902224d75e3979c374e7475069a1291ecb69f60351063a10f6558231 not found: ID does not exist" containerID="a790766d902224d75e3979c374e7475069a1291ecb69f60351063a10f6558231"
Mar 09 09:59:17 crc kubenswrapper[4861]: I0309 09:59:17.064144 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a790766d902224d75e3979c374e7475069a1291ecb69f60351063a10f6558231"} err="failed to get container status \"a790766d902224d75e3979c374e7475069a1291ecb69f60351063a10f6558231\": rpc error: code = NotFound desc = could not find container \"a790766d902224d75e3979c374e7475069a1291ecb69f60351063a10f6558231\": container with ID starting with a790766d902224d75e3979c374e7475069a1291ecb69f60351063a10f6558231 not found: ID does not exist"
Mar 09 09:59:17 crc kubenswrapper[4861]: I0309 09:59:17.673226 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf340526-0d0c-4767-a1d2-8ae3930b32ce" path="/var/lib/kubelet/pods/bf340526-0d0c-4767-a1d2-8ae3930b32ce/volumes"
Mar 09 09:59:21 crc kubenswrapper[4861]: I0309 09:59:21.658202 4861 scope.go:117] "RemoveContainer" containerID="77c857d062282bd5e967d0e73907a96e77dbe37daf546bb7546f15dc1d364592"
Mar 09 09:59:21 crc kubenswrapper[4861]: E0309 09:59:21.659071 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5g7gc_openshift-machine-config-operator(6f7875e3-174f-4c67-8675-d878de74aa4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f"
Mar 09 09:59:32 crc kubenswrapper[4861]: I0309 09:59:32.658115 4861 scope.go:117] "RemoveContainer" containerID="77c857d062282bd5e967d0e73907a96e77dbe37daf546bb7546f15dc1d364592"
Mar 09 09:59:32 crc kubenswrapper[4861]: E0309 09:59:32.659033 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5g7gc_openshift-machine-config-operator(6f7875e3-174f-4c67-8675-d878de74aa4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f"
Mar 09 09:59:47 crc kubenswrapper[4861]: I0309 09:59:47.664436 4861 scope.go:117] "RemoveContainer" containerID="77c857d062282bd5e967d0e73907a96e77dbe37daf546bb7546f15dc1d364592"
Mar 09 09:59:47 crc kubenswrapper[4861]: E0309 09:59:47.665298 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5g7gc_openshift-machine-config-operator(6f7875e3-174f-4c67-8675-d878de74aa4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f"
Mar 09 10:00:00 crc kubenswrapper[4861]: I0309 10:00:00.145117 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550840-nlgn5"]
Mar 09 10:00:00 crc kubenswrapper[4861]: E0309 10:00:00.146147 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38539b65-3d4b-442d-a287-ba22875226f9" containerName="registry-server"
Mar 09 10:00:00 crc kubenswrapper[4861]: I0309 10:00:00.146166 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="38539b65-3d4b-442d-a287-ba22875226f9" containerName="registry-server"
Mar 09 10:00:00 crc kubenswrapper[4861]: E0309 10:00:00.146190 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38539b65-3d4b-442d-a287-ba22875226f9" containerName="extract-utilities"
Mar 09 10:00:00 crc kubenswrapper[4861]: I0309 10:00:00.146198 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="38539b65-3d4b-442d-a287-ba22875226f9" containerName="extract-utilities"
Mar 09 10:00:00 crc kubenswrapper[4861]: E0309 10:00:00.146209 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf340526-0d0c-4767-a1d2-8ae3930b32ce" containerName="registry-server"
Mar 09 10:00:00 crc kubenswrapper[4861]: I0309 10:00:00.146218 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf340526-0d0c-4767-a1d2-8ae3930b32ce" containerName="registry-server"
Mar 09 10:00:00 crc kubenswrapper[4861]: E0309 10:00:00.146239 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38539b65-3d4b-442d-a287-ba22875226f9" containerName="extract-content"
Mar 09 10:00:00 crc kubenswrapper[4861]: I0309 10:00:00.146249 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="38539b65-3d4b-442d-a287-ba22875226f9" containerName="extract-content"
Mar 09 10:00:00 crc kubenswrapper[4861]: E0309 10:00:00.146259 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf340526-0d0c-4767-a1d2-8ae3930b32ce" containerName="extract-content"
Mar 09 10:00:00 crc kubenswrapper[4861]: I0309 10:00:00.146266 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf340526-0d0c-4767-a1d2-8ae3930b32ce" containerName="extract-content"
Mar 09 10:00:00 crc kubenswrapper[4861]: E0309 10:00:00.146287 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf340526-0d0c-4767-a1d2-8ae3930b32ce" containerName="extract-utilities"
Mar 09 10:00:00 crc kubenswrapper[4861]: I0309 10:00:00.146294 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf340526-0d0c-4767-a1d2-8ae3930b32ce" containerName="extract-utilities"
Mar 09 10:00:00 crc kubenswrapper[4861]: I0309 10:00:00.146548 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf340526-0d0c-4767-a1d2-8ae3930b32ce" containerName="registry-server"
Mar 09 10:00:00 crc kubenswrapper[4861]: I0309 10:00:00.146588 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="38539b65-3d4b-442d-a287-ba22875226f9" containerName="registry-server"
Mar 09 10:00:00 crc kubenswrapper[4861]: I0309 10:00:00.147291 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550840-nlgn5"
Mar 09 10:00:00 crc kubenswrapper[4861]: I0309 10:00:00.149590 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 09 10:00:00 crc kubenswrapper[4861]: I0309 10:00:00.149715 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k86l8"
Mar 09 10:00:00 crc kubenswrapper[4861]: I0309 10:00:00.153835 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 09 10:00:00 crc kubenswrapper[4861]: I0309 10:00:00.154788 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550840-vr69x"]
Mar 09 10:00:00 crc kubenswrapper[4861]: I0309 10:00:00.156338 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550840-vr69x"
Mar 09 10:00:00 crc kubenswrapper[4861]: I0309 10:00:00.162034 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 09 10:00:00 crc kubenswrapper[4861]: I0309 10:00:00.162262 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 09 10:00:00 crc kubenswrapper[4861]: I0309 10:00:00.185285 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550840-vr69x"]
Mar 09 10:00:00 crc kubenswrapper[4861]: I0309 10:00:00.226715 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550840-nlgn5"]
Mar 09 10:00:00 crc kubenswrapper[4861]: I0309 10:00:00.284069 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzbdw\" (UniqueName: \"kubernetes.io/projected/97334747-9900-462d-b915-910d721ee722-kube-api-access-kzbdw\") pod \"auto-csr-approver-29550840-nlgn5\" (UID: \"97334747-9900-462d-b915-910d721ee722\") " pod="openshift-infra/auto-csr-approver-29550840-nlgn5"
Mar 09 10:00:00 crc kubenswrapper[4861]: I0309 10:00:00.284306 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdg6n\" (UniqueName: \"kubernetes.io/projected/8f3eab37-2f60-4fbb-9d02-fde35d4a68e1-kube-api-access-rdg6n\") pod \"collect-profiles-29550840-vr69x\" (UID: \"8f3eab37-2f60-4fbb-9d02-fde35d4a68e1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550840-vr69x"
Mar 09 10:00:00 crc kubenswrapper[4861]: I0309 10:00:00.284474 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8f3eab37-2f60-4fbb-9d02-fde35d4a68e1-secret-volume\") pod \"collect-profiles-29550840-vr69x\" (UID: \"8f3eab37-2f60-4fbb-9d02-fde35d4a68e1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550840-vr69x"
Mar 09 10:00:00 crc kubenswrapper[4861]: I0309 10:00:00.284573 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8f3eab37-2f60-4fbb-9d02-fde35d4a68e1-config-volume\") pod \"collect-profiles-29550840-vr69x\" (UID: \"8f3eab37-2f60-4fbb-9d02-fde35d4a68e1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550840-vr69x"
Mar 09 10:00:00 crc kubenswrapper[4861]: I0309 10:00:00.385815 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8f3eab37-2f60-4fbb-9d02-fde35d4a68e1-secret-volume\") pod \"collect-profiles-29550840-vr69x\" (UID: \"8f3eab37-2f60-4fbb-9d02-fde35d4a68e1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550840-vr69x"
Mar 09 10:00:00 crc kubenswrapper[4861]: I0309 10:00:00.385869 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8f3eab37-2f60-4fbb-9d02-fde35d4a68e1-config-volume\") pod \"collect-profiles-29550840-vr69x\" (UID: \"8f3eab37-2f60-4fbb-9d02-fde35d4a68e1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550840-vr69x"
Mar 09 10:00:00 crc kubenswrapper[4861]: I0309 10:00:00.385975 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kzbdw\" (UniqueName: \"kubernetes.io/projected/97334747-9900-462d-b915-910d721ee722-kube-api-access-kzbdw\") pod \"auto-csr-approver-29550840-nlgn5\" (UID: \"97334747-9900-462d-b915-910d721ee722\") " pod="openshift-infra/auto-csr-approver-29550840-nlgn5"
Mar 09 10:00:00 crc kubenswrapper[4861]: I0309 10:00:00.386087 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdg6n\" (UniqueName: \"kubernetes.io/projected/8f3eab37-2f60-4fbb-9d02-fde35d4a68e1-kube-api-access-rdg6n\") pod \"collect-profiles-29550840-vr69x\" (UID: \"8f3eab37-2f60-4fbb-9d02-fde35d4a68e1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550840-vr69x"
Mar 09 10:00:00 crc kubenswrapper[4861]: I0309 10:00:00.387582 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8f3eab37-2f60-4fbb-9d02-fde35d4a68e1-config-volume\") pod \"collect-profiles-29550840-vr69x\" (UID: \"8f3eab37-2f60-4fbb-9d02-fde35d4a68e1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550840-vr69x"
Mar 09 10:00:00 crc kubenswrapper[4861]: I0309 10:00:00.392959 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8f3eab37-2f60-4fbb-9d02-fde35d4a68e1-secret-volume\") pod \"collect-profiles-29550840-vr69x\" (UID: \"8f3eab37-2f60-4fbb-9d02-fde35d4a68e1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550840-vr69x"
Mar 09 10:00:00 crc kubenswrapper[4861]: I0309 10:00:00.405191 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzbdw\" (UniqueName: \"kubernetes.io/projected/97334747-9900-462d-b915-910d721ee722-kube-api-access-kzbdw\") pod \"auto-csr-approver-29550840-nlgn5\" (UID: \"97334747-9900-462d-b915-910d721ee722\") " pod="openshift-infra/auto-csr-approver-29550840-nlgn5"
Mar 09 10:00:00 crc kubenswrapper[4861]: I0309 10:00:00.405849 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdg6n\" (UniqueName: \"kubernetes.io/projected/8f3eab37-2f60-4fbb-9d02-fde35d4a68e1-kube-api-access-rdg6n\") pod \"collect-profiles-29550840-vr69x\" (UID: \"8f3eab37-2f60-4fbb-9d02-fde35d4a68e1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550840-vr69x"
Mar 09 10:00:00 crc kubenswrapper[4861]: I0309 10:00:00.489823 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550840-nlgn5"
Mar 09 10:00:00 crc kubenswrapper[4861]: I0309 10:00:00.525959 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550840-vr69x"
Mar 09 10:00:00 crc kubenswrapper[4861]: I0309 10:00:00.994334 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550840-nlgn5"]
Mar 09 10:00:01 crc kubenswrapper[4861]: I0309 10:00:01.083319 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550840-vr69x"]
Mar 09 10:00:01 crc kubenswrapper[4861]: W0309 10:00:01.086708 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8f3eab37_2f60_4fbb_9d02_fde35d4a68e1.slice/crio-e250ec4fe57b6f0e1cd925088b2bfb589631bdcf6b9d99edd43da60f75e21f33 WatchSource:0}: Error finding container e250ec4fe57b6f0e1cd925088b2bfb589631bdcf6b9d99edd43da60f75e21f33: Status 404 returned error can't find the container with id e250ec4fe57b6f0e1cd925088b2bfb589631bdcf6b9d99edd43da60f75e21f33
Mar 09 10:00:01 crc kubenswrapper[4861]: I0309 10:00:01.346101 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550840-vr69x" event={"ID":"8f3eab37-2f60-4fbb-9d02-fde35d4a68e1","Type":"ContainerStarted","Data":"66bfc60437ffdf44a1211c95098ed79c9c0c1ee7d69d31679490b02e610a6689"}
Mar 09 10:00:01 crc kubenswrapper[4861]: I0309 10:00:01.347509 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550840-vr69x" event={"ID":"8f3eab37-2f60-4fbb-9d02-fde35d4a68e1","Type":"ContainerStarted","Data":"e250ec4fe57b6f0e1cd925088b2bfb589631bdcf6b9d99edd43da60f75e21f33"}
Mar 09 10:00:01 crc kubenswrapper[4861]: I0309 10:00:01.347614 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550840-nlgn5" event={"ID":"97334747-9900-462d-b915-910d721ee722","Type":"ContainerStarted","Data":"4f45fe0b3796a41f914c90c7f3ab4bf86cbf3cb4a0accd2596619bc4e58a7256"}
Mar 09 10:00:01 crc kubenswrapper[4861]: I0309 10:00:01.372406 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29550840-vr69x" podStartSLOduration=1.372385402 podStartE2EDuration="1.372385402s" podCreationTimestamp="2026-03-09 10:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 10:00:01.369038303 +0000 UTC m=+3244.454077704" watchObservedRunningTime="2026-03-09 10:00:01.372385402 +0000 UTC m=+3244.457424793"
Mar 09 10:00:01 crc kubenswrapper[4861]: I0309 10:00:01.659789 4861 scope.go:117] "RemoveContainer" containerID="77c857d062282bd5e967d0e73907a96e77dbe37daf546bb7546f15dc1d364592"
Mar 09 10:00:01 crc kubenswrapper[4861]: E0309 10:00:01.660409 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5g7gc_openshift-machine-config-operator(6f7875e3-174f-4c67-8675-d878de74aa4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f"
Mar 09 10:00:02 crc kubenswrapper[4861]: I0309 10:00:02.358544 4861 generic.go:334] "Generic (PLEG): container finished" podID="8f3eab37-2f60-4fbb-9d02-fde35d4a68e1" containerID="66bfc60437ffdf44a1211c95098ed79c9c0c1ee7d69d31679490b02e610a6689" exitCode=0
Mar 09 10:00:02 crc kubenswrapper[4861]: I0309 10:00:02.358617 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550840-vr69x" event={"ID":"8f3eab37-2f60-4fbb-9d02-fde35d4a68e1","Type":"ContainerDied","Data":"66bfc60437ffdf44a1211c95098ed79c9c0c1ee7d69d31679490b02e610a6689"}
Mar 09 10:00:03 crc kubenswrapper[4861]: I0309 10:00:03.798101 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550840-vr69x"
Mar 09 10:00:03 crc kubenswrapper[4861]: I0309 10:00:03.867614 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8f3eab37-2f60-4fbb-9d02-fde35d4a68e1-config-volume\") pod \"8f3eab37-2f60-4fbb-9d02-fde35d4a68e1\" (UID: \"8f3eab37-2f60-4fbb-9d02-fde35d4a68e1\") "
Mar 09 10:00:03 crc kubenswrapper[4861]: I0309 10:00:03.867978 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rdg6n\" (UniqueName: \"kubernetes.io/projected/8f3eab37-2f60-4fbb-9d02-fde35d4a68e1-kube-api-access-rdg6n\") pod \"8f3eab37-2f60-4fbb-9d02-fde35d4a68e1\" (UID: \"8f3eab37-2f60-4fbb-9d02-fde35d4a68e1\") "
Mar 09 10:00:03 crc kubenswrapper[4861]: I0309 10:00:03.868088 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8f3eab37-2f60-4fbb-9d02-fde35d4a68e1-secret-volume\") pod \"8f3eab37-2f60-4fbb-9d02-fde35d4a68e1\" (UID: \"8f3eab37-2f60-4fbb-9d02-fde35d4a68e1\") "
Mar 09 10:00:03 crc kubenswrapper[4861]: I0309 10:00:03.869560 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f3eab37-2f60-4fbb-9d02-fde35d4a68e1-config-volume" (OuterVolumeSpecName: "config-volume") pod "8f3eab37-2f60-4fbb-9d02-fde35d4a68e1" (UID: "8f3eab37-2f60-4fbb-9d02-fde35d4a68e1"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 10:00:03 crc kubenswrapper[4861]: I0309 10:00:03.874943 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f3eab37-2f60-4fbb-9d02-fde35d4a68e1-kube-api-access-rdg6n" (OuterVolumeSpecName: "kube-api-access-rdg6n") pod "8f3eab37-2f60-4fbb-9d02-fde35d4a68e1" (UID: "8f3eab37-2f60-4fbb-9d02-fde35d4a68e1"). InnerVolumeSpecName "kube-api-access-rdg6n". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 10:00:03 crc kubenswrapper[4861]: I0309 10:00:03.874946 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f3eab37-2f60-4fbb-9d02-fde35d4a68e1-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "8f3eab37-2f60-4fbb-9d02-fde35d4a68e1" (UID: "8f3eab37-2f60-4fbb-9d02-fde35d4a68e1"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 10:00:03 crc kubenswrapper[4861]: I0309 10:00:03.970983 4861 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8f3eab37-2f60-4fbb-9d02-fde35d4a68e1-config-volume\") on node \"crc\" DevicePath \"\""
Mar 09 10:00:03 crc kubenswrapper[4861]: I0309 10:00:03.971029 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rdg6n\" (UniqueName: \"kubernetes.io/projected/8f3eab37-2f60-4fbb-9d02-fde35d4a68e1-kube-api-access-rdg6n\") on node \"crc\" DevicePath \"\""
Mar 09 10:00:03 crc kubenswrapper[4861]: I0309 10:00:03.971045 4861 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8f3eab37-2f60-4fbb-9d02-fde35d4a68e1-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 09 10:00:04 crc kubenswrapper[4861]: I0309 10:00:04.397288 4861 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550840-vr69x" Mar 09 10:00:04 crc kubenswrapper[4861]: I0309 10:00:04.396561 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550840-vr69x" event={"ID":"8f3eab37-2f60-4fbb-9d02-fde35d4a68e1","Type":"ContainerDied","Data":"e250ec4fe57b6f0e1cd925088b2bfb589631bdcf6b9d99edd43da60f75e21f33"} Mar 09 10:00:04 crc kubenswrapper[4861]: I0309 10:00:04.398639 4861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e250ec4fe57b6f0e1cd925088b2bfb589631bdcf6b9d99edd43da60f75e21f33" Mar 09 10:00:04 crc kubenswrapper[4861]: I0309 10:00:04.452205 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550795-k4z5f"] Mar 09 10:00:04 crc kubenswrapper[4861]: I0309 10:00:04.461192 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550795-k4z5f"] Mar 09 10:00:04 crc kubenswrapper[4861]: E0309 10:00:04.902980 4861 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod97334747_9900_462d_b915_910d721ee722.slice/crio-e0779ff71ef1d50137bdf8113f7509f5ee9b094ac47c1e7095475b66e02d3aab.scope\": RecentStats: unable to find data in memory cache]" Mar 09 10:00:05 crc kubenswrapper[4861]: I0309 10:00:05.405477 4861 generic.go:334] "Generic (PLEG): container finished" podID="97334747-9900-462d-b915-910d721ee722" containerID="e0779ff71ef1d50137bdf8113f7509f5ee9b094ac47c1e7095475b66e02d3aab" exitCode=0 Mar 09 10:00:05 crc kubenswrapper[4861]: I0309 10:00:05.405553 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550840-nlgn5" 
event={"ID":"97334747-9900-462d-b915-910d721ee722","Type":"ContainerDied","Data":"e0779ff71ef1d50137bdf8113f7509f5ee9b094ac47c1e7095475b66e02d3aab"} Mar 09 10:00:05 crc kubenswrapper[4861]: I0309 10:00:05.667633 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb0f3303-62a4-48b9-859e-0abd577ad301" path="/var/lib/kubelet/pods/cb0f3303-62a4-48b9-859e-0abd577ad301/volumes" Mar 09 10:00:06 crc kubenswrapper[4861]: I0309 10:00:06.783855 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550840-nlgn5" Mar 09 10:00:06 crc kubenswrapper[4861]: I0309 10:00:06.929587 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kzbdw\" (UniqueName: \"kubernetes.io/projected/97334747-9900-462d-b915-910d721ee722-kube-api-access-kzbdw\") pod \"97334747-9900-462d-b915-910d721ee722\" (UID: \"97334747-9900-462d-b915-910d721ee722\") " Mar 09 10:00:06 crc kubenswrapper[4861]: I0309 10:00:06.941181 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97334747-9900-462d-b915-910d721ee722-kube-api-access-kzbdw" (OuterVolumeSpecName: "kube-api-access-kzbdw") pod "97334747-9900-462d-b915-910d721ee722" (UID: "97334747-9900-462d-b915-910d721ee722"). InnerVolumeSpecName "kube-api-access-kzbdw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 10:00:07 crc kubenswrapper[4861]: I0309 10:00:07.031452 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kzbdw\" (UniqueName: \"kubernetes.io/projected/97334747-9900-462d-b915-910d721ee722-kube-api-access-kzbdw\") on node \"crc\" DevicePath \"\"" Mar 09 10:00:07 crc kubenswrapper[4861]: I0309 10:00:07.425423 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550840-nlgn5" event={"ID":"97334747-9900-462d-b915-910d721ee722","Type":"ContainerDied","Data":"4f45fe0b3796a41f914c90c7f3ab4bf86cbf3cb4a0accd2596619bc4e58a7256"} Mar 09 10:00:07 crc kubenswrapper[4861]: I0309 10:00:07.425490 4861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4f45fe0b3796a41f914c90c7f3ab4bf86cbf3cb4a0accd2596619bc4e58a7256" Mar 09 10:00:07 crc kubenswrapper[4861]: I0309 10:00:07.425510 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550840-nlgn5" Mar 09 10:00:08 crc kubenswrapper[4861]: I0309 10:00:08.668107 4861 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-s96f9 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.17:8443/healthz\": context deadline exceeded" start-of-body= Mar 09 10:00:08 crc kubenswrapper[4861]: I0309 10:00:08.668426 4861 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-s96f9" podUID="ad46ee5f-9c08-438c-8284-aa488f48e522" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.17:8443/healthz\": context deadline exceeded" Mar 09 10:00:08 crc kubenswrapper[4861]: I0309 10:00:08.732442 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550834-4cfvx"] Mar 09 10:00:08 crc kubenswrapper[4861]: I0309 
10:00:08.742869 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550834-4cfvx"] Mar 09 10:00:09 crc kubenswrapper[4861]: I0309 10:00:09.670668 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91e023dd-cb71-4232-9d33-545f75430736" path="/var/lib/kubelet/pods/91e023dd-cb71-4232-9d33-545f75430736/volumes" Mar 09 10:00:14 crc kubenswrapper[4861]: I0309 10:00:14.658224 4861 scope.go:117] "RemoveContainer" containerID="77c857d062282bd5e967d0e73907a96e77dbe37daf546bb7546f15dc1d364592" Mar 09 10:00:14 crc kubenswrapper[4861]: E0309 10:00:14.658883 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5g7gc_openshift-machine-config-operator(6f7875e3-174f-4c67-8675-d878de74aa4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" Mar 09 10:00:27 crc kubenswrapper[4861]: I0309 10:00:27.664700 4861 scope.go:117] "RemoveContainer" containerID="77c857d062282bd5e967d0e73907a96e77dbe37daf546bb7546f15dc1d364592" Mar 09 10:00:28 crc kubenswrapper[4861]: I0309 10:00:28.858739 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" event={"ID":"6f7875e3-174f-4c67-8675-d878de74aa4f","Type":"ContainerStarted","Data":"7e8b0aadd87c38d58bff92f69ce179da04d7d49c2a37109605ba66bad8f23ee6"} Mar 09 10:00:37 crc kubenswrapper[4861]: I0309 10:00:37.747433 4861 scope.go:117] "RemoveContainer" containerID="21979918b0a41c070adc11f01f1153fc0640c1ad3848685946ff780bc76726d7" Mar 09 10:00:37 crc kubenswrapper[4861]: I0309 10:00:37.780902 4861 scope.go:117] "RemoveContainer" containerID="2b853a2364d2632e02bbc3723618e85a5fec323cffb1d47850fd9c5abfc9b9a1" Mar 09 10:00:53 crc kubenswrapper[4861]: I0309 10:00:53.643560 4861 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-4cnqc"] Mar 09 10:00:53 crc kubenswrapper[4861]: E0309 10:00:53.647767 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97334747-9900-462d-b915-910d721ee722" containerName="oc" Mar 09 10:00:53 crc kubenswrapper[4861]: I0309 10:00:53.647795 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="97334747-9900-462d-b915-910d721ee722" containerName="oc" Mar 09 10:00:53 crc kubenswrapper[4861]: E0309 10:00:53.647837 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f3eab37-2f60-4fbb-9d02-fde35d4a68e1" containerName="collect-profiles" Mar 09 10:00:53 crc kubenswrapper[4861]: I0309 10:00:53.647846 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f3eab37-2f60-4fbb-9d02-fde35d4a68e1" containerName="collect-profiles" Mar 09 10:00:53 crc kubenswrapper[4861]: I0309 10:00:53.648074 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="97334747-9900-462d-b915-910d721ee722" containerName="oc" Mar 09 10:00:53 crc kubenswrapper[4861]: I0309 10:00:53.648096 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f3eab37-2f60-4fbb-9d02-fde35d4a68e1" containerName="collect-profiles" Mar 09 10:00:53 crc kubenswrapper[4861]: I0309 10:00:53.649645 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4cnqc" Mar 09 10:00:53 crc kubenswrapper[4861]: I0309 10:00:53.688137 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4cnqc"] Mar 09 10:00:53 crc kubenswrapper[4861]: I0309 10:00:53.741632 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6z9p4\" (UniqueName: \"kubernetes.io/projected/b788a03d-dd18-43b8-a8d8-652c1e498505-kube-api-access-6z9p4\") pod \"community-operators-4cnqc\" (UID: \"b788a03d-dd18-43b8-a8d8-652c1e498505\") " pod="openshift-marketplace/community-operators-4cnqc" Mar 09 10:00:53 crc kubenswrapper[4861]: I0309 10:00:53.741684 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b788a03d-dd18-43b8-a8d8-652c1e498505-catalog-content\") pod \"community-operators-4cnqc\" (UID: \"b788a03d-dd18-43b8-a8d8-652c1e498505\") " pod="openshift-marketplace/community-operators-4cnqc" Mar 09 10:00:53 crc kubenswrapper[4861]: I0309 10:00:53.741773 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b788a03d-dd18-43b8-a8d8-652c1e498505-utilities\") pod \"community-operators-4cnqc\" (UID: \"b788a03d-dd18-43b8-a8d8-652c1e498505\") " pod="openshift-marketplace/community-operators-4cnqc" Mar 09 10:00:53 crc kubenswrapper[4861]: I0309 10:00:53.843655 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b788a03d-dd18-43b8-a8d8-652c1e498505-utilities\") pod \"community-operators-4cnqc\" (UID: \"b788a03d-dd18-43b8-a8d8-652c1e498505\") " pod="openshift-marketplace/community-operators-4cnqc" Mar 09 10:00:53 crc kubenswrapper[4861]: I0309 10:00:53.843851 4861 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-6z9p4\" (UniqueName: \"kubernetes.io/projected/b788a03d-dd18-43b8-a8d8-652c1e498505-kube-api-access-6z9p4\") pod \"community-operators-4cnqc\" (UID: \"b788a03d-dd18-43b8-a8d8-652c1e498505\") " pod="openshift-marketplace/community-operators-4cnqc" Mar 09 10:00:53 crc kubenswrapper[4861]: I0309 10:00:53.843883 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b788a03d-dd18-43b8-a8d8-652c1e498505-catalog-content\") pod \"community-operators-4cnqc\" (UID: \"b788a03d-dd18-43b8-a8d8-652c1e498505\") " pod="openshift-marketplace/community-operators-4cnqc" Mar 09 10:00:53 crc kubenswrapper[4861]: I0309 10:00:53.844332 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b788a03d-dd18-43b8-a8d8-652c1e498505-utilities\") pod \"community-operators-4cnqc\" (UID: \"b788a03d-dd18-43b8-a8d8-652c1e498505\") " pod="openshift-marketplace/community-operators-4cnqc" Mar 09 10:00:53 crc kubenswrapper[4861]: I0309 10:00:53.844666 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b788a03d-dd18-43b8-a8d8-652c1e498505-catalog-content\") pod \"community-operators-4cnqc\" (UID: \"b788a03d-dd18-43b8-a8d8-652c1e498505\") " pod="openshift-marketplace/community-operators-4cnqc" Mar 09 10:00:53 crc kubenswrapper[4861]: I0309 10:00:53.869717 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6z9p4\" (UniqueName: \"kubernetes.io/projected/b788a03d-dd18-43b8-a8d8-652c1e498505-kube-api-access-6z9p4\") pod \"community-operators-4cnqc\" (UID: \"b788a03d-dd18-43b8-a8d8-652c1e498505\") " pod="openshift-marketplace/community-operators-4cnqc" Mar 09 10:00:53 crc kubenswrapper[4861]: I0309 10:00:53.986450 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-4cnqc" Mar 09 10:00:54 crc kubenswrapper[4861]: I0309 10:00:54.545613 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4cnqc"] Mar 09 10:00:55 crc kubenswrapper[4861]: I0309 10:00:55.070348 4861 generic.go:334] "Generic (PLEG): container finished" podID="b788a03d-dd18-43b8-a8d8-652c1e498505" containerID="fc120cb2f4a11840d67548b410a0a20575db8c6833102f0fefa329fdb5c2aef2" exitCode=0 Mar 09 10:00:55 crc kubenswrapper[4861]: I0309 10:00:55.070403 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4cnqc" event={"ID":"b788a03d-dd18-43b8-a8d8-652c1e498505","Type":"ContainerDied","Data":"fc120cb2f4a11840d67548b410a0a20575db8c6833102f0fefa329fdb5c2aef2"} Mar 09 10:00:55 crc kubenswrapper[4861]: I0309 10:00:55.070694 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4cnqc" event={"ID":"b788a03d-dd18-43b8-a8d8-652c1e498505","Type":"ContainerStarted","Data":"6a9584e7f8abe74a533c0c93257f602d16efda3ec18036677b9bf2cd06041678"} Mar 09 10:00:56 crc kubenswrapper[4861]: I0309 10:00:56.082725 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4cnqc" event={"ID":"b788a03d-dd18-43b8-a8d8-652c1e498505","Type":"ContainerStarted","Data":"f0b2f54d71351ba363f6bac3d61c913c9b83b7fbfd4304390d58b6d6ef928454"} Mar 09 10:00:58 crc kubenswrapper[4861]: I0309 10:00:58.103848 4861 generic.go:334] "Generic (PLEG): container finished" podID="b788a03d-dd18-43b8-a8d8-652c1e498505" containerID="f0b2f54d71351ba363f6bac3d61c913c9b83b7fbfd4304390d58b6d6ef928454" exitCode=0 Mar 09 10:00:58 crc kubenswrapper[4861]: I0309 10:00:58.103942 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4cnqc" 
event={"ID":"b788a03d-dd18-43b8-a8d8-652c1e498505","Type":"ContainerDied","Data":"f0b2f54d71351ba363f6bac3d61c913c9b83b7fbfd4304390d58b6d6ef928454"} Mar 09 10:00:59 crc kubenswrapper[4861]: I0309 10:00:59.118402 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4cnqc" event={"ID":"b788a03d-dd18-43b8-a8d8-652c1e498505","Type":"ContainerStarted","Data":"f62c11936656f3ae04c2b31b081fa159814803d5d671b96e734eef8ad2d82625"} Mar 09 10:00:59 crc kubenswrapper[4861]: I0309 10:00:59.142441 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-4cnqc" podStartSLOduration=2.692501968 podStartE2EDuration="6.142416267s" podCreationTimestamp="2026-03-09 10:00:53 +0000 UTC" firstStartedPulling="2026-03-09 10:00:55.0719036 +0000 UTC m=+3298.156943001" lastFinishedPulling="2026-03-09 10:00:58.521817899 +0000 UTC m=+3301.606857300" observedRunningTime="2026-03-09 10:00:59.134404486 +0000 UTC m=+3302.219443897" watchObservedRunningTime="2026-03-09 10:00:59.142416267 +0000 UTC m=+3302.227455678" Mar 09 10:01:00 crc kubenswrapper[4861]: I0309 10:01:00.161819 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29550841-pc6h2"] Mar 09 10:01:00 crc kubenswrapper[4861]: I0309 10:01:00.163754 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29550841-pc6h2" Mar 09 10:01:00 crc kubenswrapper[4861]: I0309 10:01:00.175523 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29550841-pc6h2"] Mar 09 10:01:00 crc kubenswrapper[4861]: I0309 10:01:00.337152 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2153c5af-92d8-4f6e-b299-8b06d30603f5-config-data\") pod \"keystone-cron-29550841-pc6h2\" (UID: \"2153c5af-92d8-4f6e-b299-8b06d30603f5\") " pod="openstack/keystone-cron-29550841-pc6h2" Mar 09 10:01:00 crc kubenswrapper[4861]: I0309 10:01:00.337519 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bc4k4\" (UniqueName: \"kubernetes.io/projected/2153c5af-92d8-4f6e-b299-8b06d30603f5-kube-api-access-bc4k4\") pod \"keystone-cron-29550841-pc6h2\" (UID: \"2153c5af-92d8-4f6e-b299-8b06d30603f5\") " pod="openstack/keystone-cron-29550841-pc6h2" Mar 09 10:01:00 crc kubenswrapper[4861]: I0309 10:01:00.337559 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2153c5af-92d8-4f6e-b299-8b06d30603f5-combined-ca-bundle\") pod \"keystone-cron-29550841-pc6h2\" (UID: \"2153c5af-92d8-4f6e-b299-8b06d30603f5\") " pod="openstack/keystone-cron-29550841-pc6h2" Mar 09 10:01:00 crc kubenswrapper[4861]: I0309 10:01:00.337637 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2153c5af-92d8-4f6e-b299-8b06d30603f5-fernet-keys\") pod \"keystone-cron-29550841-pc6h2\" (UID: \"2153c5af-92d8-4f6e-b299-8b06d30603f5\") " pod="openstack/keystone-cron-29550841-pc6h2" Mar 09 10:01:00 crc kubenswrapper[4861]: I0309 10:01:00.439030 4861 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-bc4k4\" (UniqueName: \"kubernetes.io/projected/2153c5af-92d8-4f6e-b299-8b06d30603f5-kube-api-access-bc4k4\") pod \"keystone-cron-29550841-pc6h2\" (UID: \"2153c5af-92d8-4f6e-b299-8b06d30603f5\") " pod="openstack/keystone-cron-29550841-pc6h2" Mar 09 10:01:00 crc kubenswrapper[4861]: I0309 10:01:00.439092 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2153c5af-92d8-4f6e-b299-8b06d30603f5-combined-ca-bundle\") pod \"keystone-cron-29550841-pc6h2\" (UID: \"2153c5af-92d8-4f6e-b299-8b06d30603f5\") " pod="openstack/keystone-cron-29550841-pc6h2" Mar 09 10:01:00 crc kubenswrapper[4861]: I0309 10:01:00.439146 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2153c5af-92d8-4f6e-b299-8b06d30603f5-fernet-keys\") pod \"keystone-cron-29550841-pc6h2\" (UID: \"2153c5af-92d8-4f6e-b299-8b06d30603f5\") " pod="openstack/keystone-cron-29550841-pc6h2" Mar 09 10:01:00 crc kubenswrapper[4861]: I0309 10:01:00.439208 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2153c5af-92d8-4f6e-b299-8b06d30603f5-config-data\") pod \"keystone-cron-29550841-pc6h2\" (UID: \"2153c5af-92d8-4f6e-b299-8b06d30603f5\") " pod="openstack/keystone-cron-29550841-pc6h2" Mar 09 10:01:00 crc kubenswrapper[4861]: I0309 10:01:00.445835 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2153c5af-92d8-4f6e-b299-8b06d30603f5-fernet-keys\") pod \"keystone-cron-29550841-pc6h2\" (UID: \"2153c5af-92d8-4f6e-b299-8b06d30603f5\") " pod="openstack/keystone-cron-29550841-pc6h2" Mar 09 10:01:00 crc kubenswrapper[4861]: I0309 10:01:00.446726 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2153c5af-92d8-4f6e-b299-8b06d30603f5-combined-ca-bundle\") pod \"keystone-cron-29550841-pc6h2\" (UID: \"2153c5af-92d8-4f6e-b299-8b06d30603f5\") " pod="openstack/keystone-cron-29550841-pc6h2" Mar 09 10:01:00 crc kubenswrapper[4861]: I0309 10:01:00.446838 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2153c5af-92d8-4f6e-b299-8b06d30603f5-config-data\") pod \"keystone-cron-29550841-pc6h2\" (UID: \"2153c5af-92d8-4f6e-b299-8b06d30603f5\") " pod="openstack/keystone-cron-29550841-pc6h2" Mar 09 10:01:00 crc kubenswrapper[4861]: I0309 10:01:00.462459 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bc4k4\" (UniqueName: \"kubernetes.io/projected/2153c5af-92d8-4f6e-b299-8b06d30603f5-kube-api-access-bc4k4\") pod \"keystone-cron-29550841-pc6h2\" (UID: \"2153c5af-92d8-4f6e-b299-8b06d30603f5\") " pod="openstack/keystone-cron-29550841-pc6h2" Mar 09 10:01:00 crc kubenswrapper[4861]: I0309 10:01:00.488785 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29550841-pc6h2" Mar 09 10:01:00 crc kubenswrapper[4861]: I0309 10:01:00.987123 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29550841-pc6h2"] Mar 09 10:01:01 crc kubenswrapper[4861]: I0309 10:01:01.140076 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29550841-pc6h2" event={"ID":"2153c5af-92d8-4f6e-b299-8b06d30603f5","Type":"ContainerStarted","Data":"9aa6527e46f5ed253620ce169b299869ed5960ec5be8b1e2cac58bba24a092f4"} Mar 09 10:01:02 crc kubenswrapper[4861]: I0309 10:01:02.151102 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29550841-pc6h2" event={"ID":"2153c5af-92d8-4f6e-b299-8b06d30603f5","Type":"ContainerStarted","Data":"8a2dad5e125a74d3cdd27e1c93d6103c508ddc87e313c008c372823d1d40115f"} Mar 09 10:01:02 crc kubenswrapper[4861]: I0309 10:01:02.167809 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29550841-pc6h2" podStartSLOduration=2.167791583 podStartE2EDuration="2.167791583s" podCreationTimestamp="2026-03-09 10:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 10:01:02.164509945 +0000 UTC m=+3305.249549356" watchObservedRunningTime="2026-03-09 10:01:02.167791583 +0000 UTC m=+3305.252830984" Mar 09 10:01:03 crc kubenswrapper[4861]: I0309 10:01:03.986610 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-4cnqc" Mar 09 10:01:03 crc kubenswrapper[4861]: I0309 10:01:03.987044 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-4cnqc" Mar 09 10:01:04 crc kubenswrapper[4861]: I0309 10:01:04.170308 4861 generic.go:334] "Generic (PLEG): container finished" podID="2153c5af-92d8-4f6e-b299-8b06d30603f5" 
containerID="8a2dad5e125a74d3cdd27e1c93d6103c508ddc87e313c008c372823d1d40115f" exitCode=0 Mar 09 10:01:04 crc kubenswrapper[4861]: I0309 10:01:04.170356 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29550841-pc6h2" event={"ID":"2153c5af-92d8-4f6e-b299-8b06d30603f5","Type":"ContainerDied","Data":"8a2dad5e125a74d3cdd27e1c93d6103c508ddc87e313c008c372823d1d40115f"} Mar 09 10:01:05 crc kubenswrapper[4861]: I0309 10:01:05.034656 4861 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-4cnqc" podUID="b788a03d-dd18-43b8-a8d8-652c1e498505" containerName="registry-server" probeResult="failure" output=< Mar 09 10:01:05 crc kubenswrapper[4861]: timeout: failed to connect service ":50051" within 1s Mar 09 10:01:05 crc kubenswrapper[4861]: > Mar 09 10:01:05 crc kubenswrapper[4861]: I0309 10:01:05.585244 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29550841-pc6h2" Mar 09 10:01:05 crc kubenswrapper[4861]: I0309 10:01:05.748513 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2153c5af-92d8-4f6e-b299-8b06d30603f5-config-data\") pod \"2153c5af-92d8-4f6e-b299-8b06d30603f5\" (UID: \"2153c5af-92d8-4f6e-b299-8b06d30603f5\") " Mar 09 10:01:05 crc kubenswrapper[4861]: I0309 10:01:05.748656 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2153c5af-92d8-4f6e-b299-8b06d30603f5-fernet-keys\") pod \"2153c5af-92d8-4f6e-b299-8b06d30603f5\" (UID: \"2153c5af-92d8-4f6e-b299-8b06d30603f5\") " Mar 09 10:01:05 crc kubenswrapper[4861]: I0309 10:01:05.748876 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2153c5af-92d8-4f6e-b299-8b06d30603f5-combined-ca-bundle\") pod 
\"2153c5af-92d8-4f6e-b299-8b06d30603f5\" (UID: \"2153c5af-92d8-4f6e-b299-8b06d30603f5\") "
Mar 09 10:01:05 crc kubenswrapper[4861]: I0309 10:01:05.748950 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bc4k4\" (UniqueName: \"kubernetes.io/projected/2153c5af-92d8-4f6e-b299-8b06d30603f5-kube-api-access-bc4k4\") pod \"2153c5af-92d8-4f6e-b299-8b06d30603f5\" (UID: \"2153c5af-92d8-4f6e-b299-8b06d30603f5\") "
Mar 09 10:01:05 crc kubenswrapper[4861]: I0309 10:01:05.755285 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2153c5af-92d8-4f6e-b299-8b06d30603f5-kube-api-access-bc4k4" (OuterVolumeSpecName: "kube-api-access-bc4k4") pod "2153c5af-92d8-4f6e-b299-8b06d30603f5" (UID: "2153c5af-92d8-4f6e-b299-8b06d30603f5"). InnerVolumeSpecName "kube-api-access-bc4k4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 10:01:05 crc kubenswrapper[4861]: I0309 10:01:05.760514 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2153c5af-92d8-4f6e-b299-8b06d30603f5-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "2153c5af-92d8-4f6e-b299-8b06d30603f5" (UID: "2153c5af-92d8-4f6e-b299-8b06d30603f5"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 10:01:05 crc kubenswrapper[4861]: I0309 10:01:05.787230 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2153c5af-92d8-4f6e-b299-8b06d30603f5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2153c5af-92d8-4f6e-b299-8b06d30603f5" (UID: "2153c5af-92d8-4f6e-b299-8b06d30603f5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 10:01:05 crc kubenswrapper[4861]: I0309 10:01:05.807701 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2153c5af-92d8-4f6e-b299-8b06d30603f5-config-data" (OuterVolumeSpecName: "config-data") pod "2153c5af-92d8-4f6e-b299-8b06d30603f5" (UID: "2153c5af-92d8-4f6e-b299-8b06d30603f5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 10:01:05 crc kubenswrapper[4861]: I0309 10:01:05.851308 4861 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2153c5af-92d8-4f6e-b299-8b06d30603f5-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 09 10:01:05 crc kubenswrapper[4861]: I0309 10:01:05.851350 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bc4k4\" (UniqueName: \"kubernetes.io/projected/2153c5af-92d8-4f6e-b299-8b06d30603f5-kube-api-access-bc4k4\") on node \"crc\" DevicePath \"\""
Mar 09 10:01:05 crc kubenswrapper[4861]: I0309 10:01:05.851379 4861 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2153c5af-92d8-4f6e-b299-8b06d30603f5-config-data\") on node \"crc\" DevicePath \"\""
Mar 09 10:01:05 crc kubenswrapper[4861]: I0309 10:01:05.851391 4861 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2153c5af-92d8-4f6e-b299-8b06d30603f5-fernet-keys\") on node \"crc\" DevicePath \"\""
Mar 09 10:01:06 crc kubenswrapper[4861]: I0309 10:01:06.198251 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29550841-pc6h2" event={"ID":"2153c5af-92d8-4f6e-b299-8b06d30603f5","Type":"ContainerDied","Data":"9aa6527e46f5ed253620ce169b299869ed5960ec5be8b1e2cac58bba24a092f4"}
Mar 09 10:01:06 crc kubenswrapper[4861]: I0309 10:01:06.198505 4861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9aa6527e46f5ed253620ce169b299869ed5960ec5be8b1e2cac58bba24a092f4"
Mar 09 10:01:06 crc kubenswrapper[4861]: I0309 10:01:06.198569 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29550841-pc6h2"
Mar 09 10:01:14 crc kubenswrapper[4861]: I0309 10:01:14.077603 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-4cnqc"
Mar 09 10:01:14 crc kubenswrapper[4861]: I0309 10:01:14.129577 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-4cnqc"
Mar 09 10:01:14 crc kubenswrapper[4861]: I0309 10:01:14.317601 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4cnqc"]
Mar 09 10:01:15 crc kubenswrapper[4861]: I0309 10:01:15.272287 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-4cnqc" podUID="b788a03d-dd18-43b8-a8d8-652c1e498505" containerName="registry-server" containerID="cri-o://f62c11936656f3ae04c2b31b081fa159814803d5d671b96e734eef8ad2d82625" gracePeriod=2
Mar 09 10:01:15 crc kubenswrapper[4861]: I0309 10:01:15.811162 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4cnqc"
Mar 09 10:01:15 crc kubenswrapper[4861]: I0309 10:01:15.965740 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b788a03d-dd18-43b8-a8d8-652c1e498505-catalog-content\") pod \"b788a03d-dd18-43b8-a8d8-652c1e498505\" (UID: \"b788a03d-dd18-43b8-a8d8-652c1e498505\") "
Mar 09 10:01:15 crc kubenswrapper[4861]: I0309 10:01:15.965948 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b788a03d-dd18-43b8-a8d8-652c1e498505-utilities\") pod \"b788a03d-dd18-43b8-a8d8-652c1e498505\" (UID: \"b788a03d-dd18-43b8-a8d8-652c1e498505\") "
Mar 09 10:01:15 crc kubenswrapper[4861]: I0309 10:01:15.966119 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6z9p4\" (UniqueName: \"kubernetes.io/projected/b788a03d-dd18-43b8-a8d8-652c1e498505-kube-api-access-6z9p4\") pod \"b788a03d-dd18-43b8-a8d8-652c1e498505\" (UID: \"b788a03d-dd18-43b8-a8d8-652c1e498505\") "
Mar 09 10:01:15 crc kubenswrapper[4861]: I0309 10:01:15.967469 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b788a03d-dd18-43b8-a8d8-652c1e498505-utilities" (OuterVolumeSpecName: "utilities") pod "b788a03d-dd18-43b8-a8d8-652c1e498505" (UID: "b788a03d-dd18-43b8-a8d8-652c1e498505"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 10:01:15 crc kubenswrapper[4861]: I0309 10:01:15.972634 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b788a03d-dd18-43b8-a8d8-652c1e498505-kube-api-access-6z9p4" (OuterVolumeSpecName: "kube-api-access-6z9p4") pod "b788a03d-dd18-43b8-a8d8-652c1e498505" (UID: "b788a03d-dd18-43b8-a8d8-652c1e498505"). InnerVolumeSpecName "kube-api-access-6z9p4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 10:01:16 crc kubenswrapper[4861]: I0309 10:01:16.043031 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b788a03d-dd18-43b8-a8d8-652c1e498505-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b788a03d-dd18-43b8-a8d8-652c1e498505" (UID: "b788a03d-dd18-43b8-a8d8-652c1e498505"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 10:01:16 crc kubenswrapper[4861]: I0309 10:01:16.068917 4861 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b788a03d-dd18-43b8-a8d8-652c1e498505-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 09 10:01:16 crc kubenswrapper[4861]: I0309 10:01:16.068967 4861 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b788a03d-dd18-43b8-a8d8-652c1e498505-utilities\") on node \"crc\" DevicePath \"\""
Mar 09 10:01:16 crc kubenswrapper[4861]: I0309 10:01:16.068978 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6z9p4\" (UniqueName: \"kubernetes.io/projected/b788a03d-dd18-43b8-a8d8-652c1e498505-kube-api-access-6z9p4\") on node \"crc\" DevicePath \"\""
Mar 09 10:01:16 crc kubenswrapper[4861]: I0309 10:01:16.286647 4861 generic.go:334] "Generic (PLEG): container finished" podID="b788a03d-dd18-43b8-a8d8-652c1e498505" containerID="f62c11936656f3ae04c2b31b081fa159814803d5d671b96e734eef8ad2d82625" exitCode=0
Mar 09 10:01:16 crc kubenswrapper[4861]: I0309 10:01:16.286703 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4cnqc" event={"ID":"b788a03d-dd18-43b8-a8d8-652c1e498505","Type":"ContainerDied","Data":"f62c11936656f3ae04c2b31b081fa159814803d5d671b96e734eef8ad2d82625"}
Mar 09 10:01:16 crc kubenswrapper[4861]: I0309 10:01:16.286745 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4cnqc" event={"ID":"b788a03d-dd18-43b8-a8d8-652c1e498505","Type":"ContainerDied","Data":"6a9584e7f8abe74a533c0c93257f602d16efda3ec18036677b9bf2cd06041678"}
Mar 09 10:01:16 crc kubenswrapper[4861]: I0309 10:01:16.286763 4861 scope.go:117] "RemoveContainer" containerID="f62c11936656f3ae04c2b31b081fa159814803d5d671b96e734eef8ad2d82625"
Mar 09 10:01:16 crc kubenswrapper[4861]: I0309 10:01:16.286774 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4cnqc"
Mar 09 10:01:16 crc kubenswrapper[4861]: I0309 10:01:16.324641 4861 scope.go:117] "RemoveContainer" containerID="f0b2f54d71351ba363f6bac3d61c913c9b83b7fbfd4304390d58b6d6ef928454"
Mar 09 10:01:16 crc kubenswrapper[4861]: I0309 10:01:16.334490 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4cnqc"]
Mar 09 10:01:16 crc kubenswrapper[4861]: I0309 10:01:16.343871 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-4cnqc"]
Mar 09 10:01:16 crc kubenswrapper[4861]: I0309 10:01:16.386926 4861 scope.go:117] "RemoveContainer" containerID="fc120cb2f4a11840d67548b410a0a20575db8c6833102f0fefa329fdb5c2aef2"
Mar 09 10:01:16 crc kubenswrapper[4861]: I0309 10:01:16.409312 4861 scope.go:117] "RemoveContainer" containerID="f62c11936656f3ae04c2b31b081fa159814803d5d671b96e734eef8ad2d82625"
Mar 09 10:01:16 crc kubenswrapper[4861]: E0309 10:01:16.410113 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f62c11936656f3ae04c2b31b081fa159814803d5d671b96e734eef8ad2d82625\": container with ID starting with f62c11936656f3ae04c2b31b081fa159814803d5d671b96e734eef8ad2d82625 not found: ID does not exist" containerID="f62c11936656f3ae04c2b31b081fa159814803d5d671b96e734eef8ad2d82625"
Mar 09 10:01:16 crc kubenswrapper[4861]: I0309 10:01:16.410152 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f62c11936656f3ae04c2b31b081fa159814803d5d671b96e734eef8ad2d82625"} err="failed to get container status \"f62c11936656f3ae04c2b31b081fa159814803d5d671b96e734eef8ad2d82625\": rpc error: code = NotFound desc = could not find container \"f62c11936656f3ae04c2b31b081fa159814803d5d671b96e734eef8ad2d82625\": container with ID starting with f62c11936656f3ae04c2b31b081fa159814803d5d671b96e734eef8ad2d82625 not found: ID does not exist"
Mar 09 10:01:16 crc kubenswrapper[4861]: I0309 10:01:16.410180 4861 scope.go:117] "RemoveContainer" containerID="f0b2f54d71351ba363f6bac3d61c913c9b83b7fbfd4304390d58b6d6ef928454"
Mar 09 10:01:16 crc kubenswrapper[4861]: E0309 10:01:16.410607 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0b2f54d71351ba363f6bac3d61c913c9b83b7fbfd4304390d58b6d6ef928454\": container with ID starting with f0b2f54d71351ba363f6bac3d61c913c9b83b7fbfd4304390d58b6d6ef928454 not found: ID does not exist" containerID="f0b2f54d71351ba363f6bac3d61c913c9b83b7fbfd4304390d58b6d6ef928454"
Mar 09 10:01:16 crc kubenswrapper[4861]: I0309 10:01:16.410663 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0b2f54d71351ba363f6bac3d61c913c9b83b7fbfd4304390d58b6d6ef928454"} err="failed to get container status \"f0b2f54d71351ba363f6bac3d61c913c9b83b7fbfd4304390d58b6d6ef928454\": rpc error: code = NotFound desc = could not find container \"f0b2f54d71351ba363f6bac3d61c913c9b83b7fbfd4304390d58b6d6ef928454\": container with ID starting with f0b2f54d71351ba363f6bac3d61c913c9b83b7fbfd4304390d58b6d6ef928454 not found: ID does not exist"
Mar 09 10:01:16 crc kubenswrapper[4861]: I0309 10:01:16.410699 4861 scope.go:117] "RemoveContainer" containerID="fc120cb2f4a11840d67548b410a0a20575db8c6833102f0fefa329fdb5c2aef2"
Mar 09 10:01:16 crc kubenswrapper[4861]: E0309 10:01:16.411176 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc120cb2f4a11840d67548b410a0a20575db8c6833102f0fefa329fdb5c2aef2\": container with ID starting with fc120cb2f4a11840d67548b410a0a20575db8c6833102f0fefa329fdb5c2aef2 not found: ID does not exist" containerID="fc120cb2f4a11840d67548b410a0a20575db8c6833102f0fefa329fdb5c2aef2"
Mar 09 10:01:16 crc kubenswrapper[4861]: I0309 10:01:16.411256 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc120cb2f4a11840d67548b410a0a20575db8c6833102f0fefa329fdb5c2aef2"} err="failed to get container status \"fc120cb2f4a11840d67548b410a0a20575db8c6833102f0fefa329fdb5c2aef2\": rpc error: code = NotFound desc = could not find container \"fc120cb2f4a11840d67548b410a0a20575db8c6833102f0fefa329fdb5c2aef2\": container with ID starting with fc120cb2f4a11840d67548b410a0a20575db8c6833102f0fefa329fdb5c2aef2 not found: ID does not exist"
Mar 09 10:01:17 crc kubenswrapper[4861]: I0309 10:01:17.669826 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b788a03d-dd18-43b8-a8d8-652c1e498505" path="/var/lib/kubelet/pods/b788a03d-dd18-43b8-a8d8-652c1e498505/volumes"
Mar 09 10:02:00 crc kubenswrapper[4861]: I0309 10:02:00.146427 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550842-gxl6n"]
Mar 09 10:02:00 crc kubenswrapper[4861]: E0309 10:02:00.147467 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b788a03d-dd18-43b8-a8d8-652c1e498505" containerName="extract-utilities"
Mar 09 10:02:00 crc kubenswrapper[4861]: I0309 10:02:00.147481 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="b788a03d-dd18-43b8-a8d8-652c1e498505" containerName="extract-utilities"
Mar 09 10:02:00 crc kubenswrapper[4861]: E0309 10:02:00.147494 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2153c5af-92d8-4f6e-b299-8b06d30603f5" containerName="keystone-cron"
Mar 09 10:02:00 crc kubenswrapper[4861]: I0309 10:02:00.147501 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="2153c5af-92d8-4f6e-b299-8b06d30603f5" containerName="keystone-cron"
Mar 09 10:02:00 crc kubenswrapper[4861]: E0309 10:02:00.147514 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b788a03d-dd18-43b8-a8d8-652c1e498505" containerName="registry-server"
Mar 09 10:02:00 crc kubenswrapper[4861]: I0309 10:02:00.147522 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="b788a03d-dd18-43b8-a8d8-652c1e498505" containerName="registry-server"
Mar 09 10:02:00 crc kubenswrapper[4861]: E0309 10:02:00.147537 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b788a03d-dd18-43b8-a8d8-652c1e498505" containerName="extract-content"
Mar 09 10:02:00 crc kubenswrapper[4861]: I0309 10:02:00.147543 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="b788a03d-dd18-43b8-a8d8-652c1e498505" containerName="extract-content"
Mar 09 10:02:00 crc kubenswrapper[4861]: I0309 10:02:00.147718 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="2153c5af-92d8-4f6e-b299-8b06d30603f5" containerName="keystone-cron"
Mar 09 10:02:00 crc kubenswrapper[4861]: I0309 10:02:00.147793 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="b788a03d-dd18-43b8-a8d8-652c1e498505" containerName="registry-server"
Mar 09 10:02:00 crc kubenswrapper[4861]: I0309 10:02:00.148582 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550842-gxl6n"
Mar 09 10:02:00 crc kubenswrapper[4861]: I0309 10:02:00.151151 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 09 10:02:00 crc kubenswrapper[4861]: I0309 10:02:00.151393 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k86l8"
Mar 09 10:02:00 crc kubenswrapper[4861]: I0309 10:02:00.152622 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 09 10:02:00 crc kubenswrapper[4861]: I0309 10:02:00.155410 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550842-gxl6n"]
Mar 09 10:02:00 crc kubenswrapper[4861]: I0309 10:02:00.240119 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txb4c\" (UniqueName: \"kubernetes.io/projected/35216638-05e0-40bf-b9b8-57924a749838-kube-api-access-txb4c\") pod \"auto-csr-approver-29550842-gxl6n\" (UID: \"35216638-05e0-40bf-b9b8-57924a749838\") " pod="openshift-infra/auto-csr-approver-29550842-gxl6n"
Mar 09 10:02:00 crc kubenswrapper[4861]: I0309 10:02:00.343583 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txb4c\" (UniqueName: \"kubernetes.io/projected/35216638-05e0-40bf-b9b8-57924a749838-kube-api-access-txb4c\") pod \"auto-csr-approver-29550842-gxl6n\" (UID: \"35216638-05e0-40bf-b9b8-57924a749838\") " pod="openshift-infra/auto-csr-approver-29550842-gxl6n"
Mar 09 10:02:00 crc kubenswrapper[4861]: I0309 10:02:00.362890 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txb4c\" (UniqueName: \"kubernetes.io/projected/35216638-05e0-40bf-b9b8-57924a749838-kube-api-access-txb4c\") pod \"auto-csr-approver-29550842-gxl6n\" (UID: \"35216638-05e0-40bf-b9b8-57924a749838\") " pod="openshift-infra/auto-csr-approver-29550842-gxl6n"
Mar 09 10:02:00 crc kubenswrapper[4861]: I0309 10:02:00.470787 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550842-gxl6n"
Mar 09 10:02:00 crc kubenswrapper[4861]: I0309 10:02:00.905649 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550842-gxl6n"]
Mar 09 10:02:00 crc kubenswrapper[4861]: I0309 10:02:00.913187 4861 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 09 10:02:01 crc kubenswrapper[4861]: I0309 10:02:01.676964 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550842-gxl6n" event={"ID":"35216638-05e0-40bf-b9b8-57924a749838","Type":"ContainerStarted","Data":"0aeccffac58df3299eed5f8cad3fb3fcec109a4d28be6b2c4bb3d1720a68922e"}
Mar 09 10:02:02 crc kubenswrapper[4861]: I0309 10:02:02.690869 4861 generic.go:334] "Generic (PLEG): container finished" podID="35216638-05e0-40bf-b9b8-57924a749838" containerID="22736e38955e37a14a35ce5c997c6afbacca015792f9f0409156eb4d6ba8a45b" exitCode=0
Mar 09 10:02:02 crc kubenswrapper[4861]: I0309 10:02:02.691005 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550842-gxl6n" event={"ID":"35216638-05e0-40bf-b9b8-57924a749838","Type":"ContainerDied","Data":"22736e38955e37a14a35ce5c997c6afbacca015792f9f0409156eb4d6ba8a45b"}
Mar 09 10:02:04 crc kubenswrapper[4861]: I0309 10:02:04.069358 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550842-gxl6n"
Mar 09 10:02:04 crc kubenswrapper[4861]: I0309 10:02:04.124972 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-txb4c\" (UniqueName: \"kubernetes.io/projected/35216638-05e0-40bf-b9b8-57924a749838-kube-api-access-txb4c\") pod \"35216638-05e0-40bf-b9b8-57924a749838\" (UID: \"35216638-05e0-40bf-b9b8-57924a749838\") "
Mar 09 10:02:04 crc kubenswrapper[4861]: I0309 10:02:04.138228 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35216638-05e0-40bf-b9b8-57924a749838-kube-api-access-txb4c" (OuterVolumeSpecName: "kube-api-access-txb4c") pod "35216638-05e0-40bf-b9b8-57924a749838" (UID: "35216638-05e0-40bf-b9b8-57924a749838"). InnerVolumeSpecName "kube-api-access-txb4c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 10:02:04 crc kubenswrapper[4861]: I0309 10:02:04.228364 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-txb4c\" (UniqueName: \"kubernetes.io/projected/35216638-05e0-40bf-b9b8-57924a749838-kube-api-access-txb4c\") on node \"crc\" DevicePath \"\""
Mar 09 10:02:04 crc kubenswrapper[4861]: I0309 10:02:04.711885 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550842-gxl6n" event={"ID":"35216638-05e0-40bf-b9b8-57924a749838","Type":"ContainerDied","Data":"0aeccffac58df3299eed5f8cad3fb3fcec109a4d28be6b2c4bb3d1720a68922e"}
Mar 09 10:02:04 crc kubenswrapper[4861]: I0309 10:02:04.711930 4861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0aeccffac58df3299eed5f8cad3fb3fcec109a4d28be6b2c4bb3d1720a68922e"
Mar 09 10:02:04 crc kubenswrapper[4861]: I0309 10:02:04.711975 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550842-gxl6n"
Mar 09 10:02:05 crc kubenswrapper[4861]: I0309 10:02:05.150853 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550836-mbsg5"]
Mar 09 10:02:05 crc kubenswrapper[4861]: I0309 10:02:05.160110 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550836-mbsg5"]
Mar 09 10:02:05 crc kubenswrapper[4861]: I0309 10:02:05.674209 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a189714d-1d52-456d-a04b-a5a8fbee6087" path="/var/lib/kubelet/pods/a189714d-1d52-456d-a04b-a5a8fbee6087/volumes"
Mar 09 10:02:10 crc kubenswrapper[4861]: I0309 10:02:10.763801 4861 generic.go:334] "Generic (PLEG): container finished" podID="f7ed5e40-0dc4-417c-bef9-cbf919777c67" containerID="be13d99d2b9a22aeacccb32a154d4a7177d840c23730e15e6012891c8a5b43dc" exitCode=0
Mar 09 10:02:10 crc kubenswrapper[4861]: I0309 10:02:10.763909 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"f7ed5e40-0dc4-417c-bef9-cbf919777c67","Type":"ContainerDied","Data":"be13d99d2b9a22aeacccb32a154d4a7177d840c23730e15e6012891c8a5b43dc"}
Mar 09 10:02:12 crc kubenswrapper[4861]: I0309 10:02:12.155977 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest"
Mar 09 10:02:12 crc kubenswrapper[4861]: I0309 10:02:12.209202 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"f7ed5e40-0dc4-417c-bef9-cbf919777c67\" (UID: \"f7ed5e40-0dc4-417c-bef9-cbf919777c67\") "
Mar 09 10:02:12 crc kubenswrapper[4861]: I0309 10:02:12.209255 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/f7ed5e40-0dc4-417c-bef9-cbf919777c67-test-operator-ephemeral-workdir\") pod \"f7ed5e40-0dc4-417c-bef9-cbf919777c67\" (UID: \"f7ed5e40-0dc4-417c-bef9-cbf919777c67\") "
Mar 09 10:02:12 crc kubenswrapper[4861]: I0309 10:02:12.209308 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f7ed5e40-0dc4-417c-bef9-cbf919777c67-config-data\") pod \"f7ed5e40-0dc4-417c-bef9-cbf919777c67\" (UID: \"f7ed5e40-0dc4-417c-bef9-cbf919777c67\") "
Mar 09 10:02:12 crc kubenswrapper[4861]: I0309 10:02:12.209342 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/f7ed5e40-0dc4-417c-bef9-cbf919777c67-test-operator-ephemeral-temporary\") pod \"f7ed5e40-0dc4-417c-bef9-cbf919777c67\" (UID: \"f7ed5e40-0dc4-417c-bef9-cbf919777c67\") "
Mar 09 10:02:12 crc kubenswrapper[4861]: I0309 10:02:12.209399 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bghsg\" (UniqueName: \"kubernetes.io/projected/f7ed5e40-0dc4-417c-bef9-cbf919777c67-kube-api-access-bghsg\") pod \"f7ed5e40-0dc4-417c-bef9-cbf919777c67\" (UID: \"f7ed5e40-0dc4-417c-bef9-cbf919777c67\") "
Mar 09 10:02:12 crc kubenswrapper[4861]: I0309 10:02:12.210483 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7ed5e40-0dc4-417c-bef9-cbf919777c67-config-data" (OuterVolumeSpecName: "config-data") pod "f7ed5e40-0dc4-417c-bef9-cbf919777c67" (UID: "f7ed5e40-0dc4-417c-bef9-cbf919777c67"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 10:02:12 crc kubenswrapper[4861]: I0309 10:02:12.211327 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7ed5e40-0dc4-417c-bef9-cbf919777c67-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "f7ed5e40-0dc4-417c-bef9-cbf919777c67" (UID: "f7ed5e40-0dc4-417c-bef9-cbf919777c67"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 10:02:12 crc kubenswrapper[4861]: I0309 10:02:12.213576 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7ed5e40-0dc4-417c-bef9-cbf919777c67-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "f7ed5e40-0dc4-417c-bef9-cbf919777c67" (UID: "f7ed5e40-0dc4-417c-bef9-cbf919777c67"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 10:02:12 crc kubenswrapper[4861]: I0309 10:02:12.216126 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "test-operator-logs") pod "f7ed5e40-0dc4-417c-bef9-cbf919777c67" (UID: "f7ed5e40-0dc4-417c-bef9-cbf919777c67"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Mar 09 10:02:12 crc kubenswrapper[4861]: I0309 10:02:12.216185 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7ed5e40-0dc4-417c-bef9-cbf919777c67-kube-api-access-bghsg" (OuterVolumeSpecName: "kube-api-access-bghsg") pod "f7ed5e40-0dc4-417c-bef9-cbf919777c67" (UID: "f7ed5e40-0dc4-417c-bef9-cbf919777c67"). InnerVolumeSpecName "kube-api-access-bghsg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 10:02:12 crc kubenswrapper[4861]: I0309 10:02:12.310410 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f7ed5e40-0dc4-417c-bef9-cbf919777c67-ssh-key\") pod \"f7ed5e40-0dc4-417c-bef9-cbf919777c67\" (UID: \"f7ed5e40-0dc4-417c-bef9-cbf919777c67\") "
Mar 09 10:02:12 crc kubenswrapper[4861]: I0309 10:02:12.310496 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/f7ed5e40-0dc4-417c-bef9-cbf919777c67-ca-certs\") pod \"f7ed5e40-0dc4-417c-bef9-cbf919777c67\" (UID: \"f7ed5e40-0dc4-417c-bef9-cbf919777c67\") "
Mar 09 10:02:12 crc kubenswrapper[4861]: I0309 10:02:12.310590 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f7ed5e40-0dc4-417c-bef9-cbf919777c67-openstack-config\") pod \"f7ed5e40-0dc4-417c-bef9-cbf919777c67\" (UID: \"f7ed5e40-0dc4-417c-bef9-cbf919777c67\") "
Mar 09 10:02:12 crc kubenswrapper[4861]: I0309 10:02:12.310655 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f7ed5e40-0dc4-417c-bef9-cbf919777c67-openstack-config-secret\") pod \"f7ed5e40-0dc4-417c-bef9-cbf919777c67\" (UID: \"f7ed5e40-0dc4-417c-bef9-cbf919777c67\") "
Mar 09 10:02:12 crc kubenswrapper[4861]: I0309 10:02:12.311168 4861 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f7ed5e40-0dc4-417c-bef9-cbf919777c67-config-data\") on node \"crc\" DevicePath \"\""
Mar 09 10:02:12 crc kubenswrapper[4861]: I0309 10:02:12.311195 4861 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/f7ed5e40-0dc4-417c-bef9-cbf919777c67-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\""
Mar 09 10:02:12 crc kubenswrapper[4861]: I0309 10:02:12.311233 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bghsg\" (UniqueName: \"kubernetes.io/projected/f7ed5e40-0dc4-417c-bef9-cbf919777c67-kube-api-access-bghsg\") on node \"crc\" DevicePath \"\""
Mar 09 10:02:12 crc kubenswrapper[4861]: I0309 10:02:12.311265 4861 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" "
Mar 09 10:02:12 crc kubenswrapper[4861]: I0309 10:02:12.311277 4861 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/f7ed5e40-0dc4-417c-bef9-cbf919777c67-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\""
Mar 09 10:02:12 crc kubenswrapper[4861]: I0309 10:02:12.334414 4861 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc"
Mar 09 10:02:12 crc kubenswrapper[4861]: I0309 10:02:12.335928 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7ed5e40-0dc4-417c-bef9-cbf919777c67-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "f7ed5e40-0dc4-417c-bef9-cbf919777c67" (UID: "f7ed5e40-0dc4-417c-bef9-cbf919777c67"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 10:02:12 crc kubenswrapper[4861]: I0309 10:02:12.340599 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7ed5e40-0dc4-417c-bef9-cbf919777c67-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "f7ed5e40-0dc4-417c-bef9-cbf919777c67" (UID: "f7ed5e40-0dc4-417c-bef9-cbf919777c67"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 10:02:12 crc kubenswrapper[4861]: I0309 10:02:12.342617 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7ed5e40-0dc4-417c-bef9-cbf919777c67-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "f7ed5e40-0dc4-417c-bef9-cbf919777c67" (UID: "f7ed5e40-0dc4-417c-bef9-cbf919777c67"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 10:02:12 crc kubenswrapper[4861]: I0309 10:02:12.359967 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7ed5e40-0dc4-417c-bef9-cbf919777c67-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "f7ed5e40-0dc4-417c-bef9-cbf919777c67" (UID: "f7ed5e40-0dc4-417c-bef9-cbf919777c67"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 10:02:12 crc kubenswrapper[4861]: I0309 10:02:12.412786 4861 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f7ed5e40-0dc4-417c-bef9-cbf919777c67-openstack-config\") on node \"crc\" DevicePath \"\""
Mar 09 10:02:12 crc kubenswrapper[4861]: I0309 10:02:12.412821 4861 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f7ed5e40-0dc4-417c-bef9-cbf919777c67-openstack-config-secret\") on node \"crc\" DevicePath \"\""
Mar 09 10:02:12 crc kubenswrapper[4861]: I0309 10:02:12.412835 4861 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\""
Mar 09 10:02:12 crc kubenswrapper[4861]: I0309 10:02:12.412845 4861 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f7ed5e40-0dc4-417c-bef9-cbf919777c67-ssh-key\") on node \"crc\" DevicePath \"\""
Mar 09 10:02:12 crc kubenswrapper[4861]: I0309 10:02:12.412856 4861 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/f7ed5e40-0dc4-417c-bef9-cbf919777c67-ca-certs\") on node \"crc\" DevicePath \"\""
Mar 09 10:02:12 crc kubenswrapper[4861]: I0309 10:02:12.785186 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"f7ed5e40-0dc4-417c-bef9-cbf919777c67","Type":"ContainerDied","Data":"99742d2827fede0f84ffd50e88e846f6a600874c6d9967389bfb447acab02880"}
Mar 09 10:02:12 crc kubenswrapper[4861]: I0309 10:02:12.785547 4861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="99742d2827fede0f84ffd50e88e846f6a600874c6d9967389bfb447acab02880"
Mar 09 10:02:12 crc kubenswrapper[4861]: I0309 10:02:12.785228 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest"
Mar 09 10:02:16 crc kubenswrapper[4861]: I0309 10:02:16.581799 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"]
Mar 09 10:02:16 crc kubenswrapper[4861]: E0309 10:02:16.582837 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35216638-05e0-40bf-b9b8-57924a749838" containerName="oc"
Mar 09 10:02:16 crc kubenswrapper[4861]: I0309 10:02:16.582856 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="35216638-05e0-40bf-b9b8-57924a749838" containerName="oc"
Mar 09 10:02:16 crc kubenswrapper[4861]: E0309 10:02:16.582887 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7ed5e40-0dc4-417c-bef9-cbf919777c67" containerName="tempest-tests-tempest-tests-runner"
Mar 09 10:02:16 crc kubenswrapper[4861]: I0309 10:02:16.582895 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7ed5e40-0dc4-417c-bef9-cbf919777c67" containerName="tempest-tests-tempest-tests-runner"
Mar 09 10:02:16 crc kubenswrapper[4861]: I0309 10:02:16.583112 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7ed5e40-0dc4-417c-bef9-cbf919777c67" containerName="tempest-tests-tempest-tests-runner"
Mar 09 10:02:16 crc kubenswrapper[4861]: I0309 10:02:16.583129 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="35216638-05e0-40bf-b9b8-57924a749838" containerName="oc"
Mar 09 10:02:16 crc kubenswrapper[4861]: I0309 10:02:16.583830 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Mar 09 10:02:16 crc kubenswrapper[4861]: I0309 10:02:16.588171 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-mkjkn"
Mar 09 10:02:16 crc kubenswrapper[4861]: I0309 10:02:16.595363 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"]
Mar 09 10:02:16 crc kubenswrapper[4861]: I0309 10:02:16.698446 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gx6lc\" (UniqueName: \"kubernetes.io/projected/571b04e8-dc75-4bf7-921c-82ee1f86b023-kube-api-access-gx6lc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"571b04e8-dc75-4bf7-921c-82ee1f86b023\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Mar 09 10:02:16 crc kubenswrapper[4861]: I0309 10:02:16.698614 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"571b04e8-dc75-4bf7-921c-82ee1f86b023\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Mar 09 10:02:16 crc kubenswrapper[4861]: I0309 10:02:16.800445 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"571b04e8-dc75-4bf7-921c-82ee1f86b023\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Mar 09 10:02:16 crc kubenswrapper[4861]: I0309 10:02:16.800655 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gx6lc\" (UniqueName: \"kubernetes.io/projected/571b04e8-dc75-4bf7-921c-82ee1f86b023-kube-api-access-gx6lc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"571b04e8-dc75-4bf7-921c-82ee1f86b023\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Mar 09 10:02:16 crc kubenswrapper[4861]: I0309 10:02:16.800956 4861 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"571b04e8-dc75-4bf7-921c-82ee1f86b023\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Mar 09 10:02:16 crc kubenswrapper[4861]: I0309 10:02:16.825939 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gx6lc\" (UniqueName: \"kubernetes.io/projected/571b04e8-dc75-4bf7-921c-82ee1f86b023-kube-api-access-gx6lc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"571b04e8-dc75-4bf7-921c-82ee1f86b023\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Mar 09 10:02:16 crc kubenswrapper[4861]: I0309 10:02:16.827694 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"571b04e8-dc75-4bf7-921c-82ee1f86b023\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Mar 09 10:02:16 crc kubenswrapper[4861]: I0309 10:02:16.907357 4861 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 09 10:02:17 crc kubenswrapper[4861]: I0309 10:02:17.347768 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 09 10:02:17 crc kubenswrapper[4861]: I0309 10:02:17.841739 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"571b04e8-dc75-4bf7-921c-82ee1f86b023","Type":"ContainerStarted","Data":"17b2d47dabae25fff1c608b7e038e2c8b3e62c3a0e65413e91b53ee13eb3d22f"} Mar 09 10:02:18 crc kubenswrapper[4861]: I0309 10:02:18.851883 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"571b04e8-dc75-4bf7-921c-82ee1f86b023","Type":"ContainerStarted","Data":"04f13447fcdc2a019849705d7d0300c8030e365500c446536a91a830c8120401"} Mar 09 10:02:18 crc kubenswrapper[4861]: I0309 10:02:18.868182 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=1.962027994 podStartE2EDuration="2.868156137s" podCreationTimestamp="2026-03-09 10:02:16 +0000 UTC" firstStartedPulling="2026-03-09 10:02:17.348482455 +0000 UTC m=+3380.433521856" lastFinishedPulling="2026-03-09 10:02:18.254610598 +0000 UTC m=+3381.339649999" observedRunningTime="2026-03-09 10:02:18.862986527 +0000 UTC m=+3381.948025938" watchObservedRunningTime="2026-03-09 10:02:18.868156137 +0000 UTC m=+3381.953195548" Mar 09 10:02:37 crc kubenswrapper[4861]: I0309 10:02:37.952456 4861 scope.go:117] "RemoveContainer" containerID="0037ed70b65b374a3333ec5dbabfe26eb321f8e5020a738b7abfcdd9189053ec" Mar 09 10:02:41 crc kubenswrapper[4861]: I0309 10:02:41.279111 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-qf6cx/must-gather-hdwgq"] Mar 09 10:02:41 crc kubenswrapper[4861]: 
I0309 10:02:41.281162 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qf6cx/must-gather-hdwgq" Mar 09 10:02:41 crc kubenswrapper[4861]: I0309 10:02:41.285168 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-qf6cx"/"openshift-service-ca.crt" Mar 09 10:02:41 crc kubenswrapper[4861]: I0309 10:02:41.285253 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-qf6cx"/"kube-root-ca.crt" Mar 09 10:02:41 crc kubenswrapper[4861]: I0309 10:02:41.303133 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-qf6cx/must-gather-hdwgq"] Mar 09 10:02:41 crc kubenswrapper[4861]: I0309 10:02:41.370566 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbhm9\" (UniqueName: \"kubernetes.io/projected/1a5514de-866f-42de-ab75-f4988a3108a7-kube-api-access-sbhm9\") pod \"must-gather-hdwgq\" (UID: \"1a5514de-866f-42de-ab75-f4988a3108a7\") " pod="openshift-must-gather-qf6cx/must-gather-hdwgq" Mar 09 10:02:41 crc kubenswrapper[4861]: I0309 10:02:41.370675 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1a5514de-866f-42de-ab75-f4988a3108a7-must-gather-output\") pod \"must-gather-hdwgq\" (UID: \"1a5514de-866f-42de-ab75-f4988a3108a7\") " pod="openshift-must-gather-qf6cx/must-gather-hdwgq" Mar 09 10:02:41 crc kubenswrapper[4861]: I0309 10:02:41.472940 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbhm9\" (UniqueName: \"kubernetes.io/projected/1a5514de-866f-42de-ab75-f4988a3108a7-kube-api-access-sbhm9\") pod \"must-gather-hdwgq\" (UID: \"1a5514de-866f-42de-ab75-f4988a3108a7\") " pod="openshift-must-gather-qf6cx/must-gather-hdwgq" Mar 09 10:02:41 crc kubenswrapper[4861]: I0309 10:02:41.473059 4861 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1a5514de-866f-42de-ab75-f4988a3108a7-must-gather-output\") pod \"must-gather-hdwgq\" (UID: \"1a5514de-866f-42de-ab75-f4988a3108a7\") " pod="openshift-must-gather-qf6cx/must-gather-hdwgq" Mar 09 10:02:41 crc kubenswrapper[4861]: I0309 10:02:41.473549 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1a5514de-866f-42de-ab75-f4988a3108a7-must-gather-output\") pod \"must-gather-hdwgq\" (UID: \"1a5514de-866f-42de-ab75-f4988a3108a7\") " pod="openshift-must-gather-qf6cx/must-gather-hdwgq" Mar 09 10:02:41 crc kubenswrapper[4861]: I0309 10:02:41.492092 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbhm9\" (UniqueName: \"kubernetes.io/projected/1a5514de-866f-42de-ab75-f4988a3108a7-kube-api-access-sbhm9\") pod \"must-gather-hdwgq\" (UID: \"1a5514de-866f-42de-ab75-f4988a3108a7\") " pod="openshift-must-gather-qf6cx/must-gather-hdwgq" Mar 09 10:02:41 crc kubenswrapper[4861]: I0309 10:02:41.610937 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-qf6cx/must-gather-hdwgq" Mar 09 10:02:42 crc kubenswrapper[4861]: I0309 10:02:42.114679 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-qf6cx/must-gather-hdwgq"] Mar 09 10:02:43 crc kubenswrapper[4861]: I0309 10:02:43.092646 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qf6cx/must-gather-hdwgq" event={"ID":"1a5514de-866f-42de-ab75-f4988a3108a7","Type":"ContainerStarted","Data":"adebb954dd2d70e987503695b8b5892875f99d7836038bcfc3d4d46212aea582"} Mar 09 10:02:43 crc kubenswrapper[4861]: I0309 10:02:43.280970 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-6r2pj"] Mar 09 10:02:43 crc kubenswrapper[4861]: I0309 10:02:43.286751 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6r2pj" Mar 09 10:02:43 crc kubenswrapper[4861]: I0309 10:02:43.293012 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6r2pj"] Mar 09 10:02:43 crc kubenswrapper[4861]: I0309 10:02:43.416033 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35f29053-3ec0-43ce-bcad-ff373dc9b8c5-utilities\") pod \"redhat-operators-6r2pj\" (UID: \"35f29053-3ec0-43ce-bcad-ff373dc9b8c5\") " pod="openshift-marketplace/redhat-operators-6r2pj" Mar 09 10:02:43 crc kubenswrapper[4861]: I0309 10:02:43.416258 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nqh5\" (UniqueName: \"kubernetes.io/projected/35f29053-3ec0-43ce-bcad-ff373dc9b8c5-kube-api-access-5nqh5\") pod \"redhat-operators-6r2pj\" (UID: \"35f29053-3ec0-43ce-bcad-ff373dc9b8c5\") " pod="openshift-marketplace/redhat-operators-6r2pj" Mar 09 10:02:43 crc kubenswrapper[4861]: I0309 10:02:43.416307 4861 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35f29053-3ec0-43ce-bcad-ff373dc9b8c5-catalog-content\") pod \"redhat-operators-6r2pj\" (UID: \"35f29053-3ec0-43ce-bcad-ff373dc9b8c5\") " pod="openshift-marketplace/redhat-operators-6r2pj" Mar 09 10:02:43 crc kubenswrapper[4861]: I0309 10:02:43.518084 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35f29053-3ec0-43ce-bcad-ff373dc9b8c5-catalog-content\") pod \"redhat-operators-6r2pj\" (UID: \"35f29053-3ec0-43ce-bcad-ff373dc9b8c5\") " pod="openshift-marketplace/redhat-operators-6r2pj" Mar 09 10:02:43 crc kubenswrapper[4861]: I0309 10:02:43.518210 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35f29053-3ec0-43ce-bcad-ff373dc9b8c5-utilities\") pod \"redhat-operators-6r2pj\" (UID: \"35f29053-3ec0-43ce-bcad-ff373dc9b8c5\") " pod="openshift-marketplace/redhat-operators-6r2pj" Mar 09 10:02:43 crc kubenswrapper[4861]: I0309 10:02:43.518480 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5nqh5\" (UniqueName: \"kubernetes.io/projected/35f29053-3ec0-43ce-bcad-ff373dc9b8c5-kube-api-access-5nqh5\") pod \"redhat-operators-6r2pj\" (UID: \"35f29053-3ec0-43ce-bcad-ff373dc9b8c5\") " pod="openshift-marketplace/redhat-operators-6r2pj" Mar 09 10:02:43 crc kubenswrapper[4861]: I0309 10:02:43.518693 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35f29053-3ec0-43ce-bcad-ff373dc9b8c5-catalog-content\") pod \"redhat-operators-6r2pj\" (UID: \"35f29053-3ec0-43ce-bcad-ff373dc9b8c5\") " pod="openshift-marketplace/redhat-operators-6r2pj" Mar 09 10:02:43 crc kubenswrapper[4861]: I0309 10:02:43.518798 4861 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35f29053-3ec0-43ce-bcad-ff373dc9b8c5-utilities\") pod \"redhat-operators-6r2pj\" (UID: \"35f29053-3ec0-43ce-bcad-ff373dc9b8c5\") " pod="openshift-marketplace/redhat-operators-6r2pj" Mar 09 10:02:43 crc kubenswrapper[4861]: I0309 10:02:43.549438 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nqh5\" (UniqueName: \"kubernetes.io/projected/35f29053-3ec0-43ce-bcad-ff373dc9b8c5-kube-api-access-5nqh5\") pod \"redhat-operators-6r2pj\" (UID: \"35f29053-3ec0-43ce-bcad-ff373dc9b8c5\") " pod="openshift-marketplace/redhat-operators-6r2pj" Mar 09 10:02:43 crc kubenswrapper[4861]: I0309 10:02:43.613395 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6r2pj" Mar 09 10:02:44 crc kubenswrapper[4861]: I0309 10:02:44.181565 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6r2pj"] Mar 09 10:02:44 crc kubenswrapper[4861]: W0309 10:02:44.191317 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod35f29053_3ec0_43ce_bcad_ff373dc9b8c5.slice/crio-6f6bf2973929e68ccaeaf9d2db6c8565297247dd4e569851bd0c8d0d164a92ea WatchSource:0}: Error finding container 6f6bf2973929e68ccaeaf9d2db6c8565297247dd4e569851bd0c8d0d164a92ea: Status 404 returned error can't find the container with id 6f6bf2973929e68ccaeaf9d2db6c8565297247dd4e569851bd0c8d0d164a92ea Mar 09 10:02:45 crc kubenswrapper[4861]: I0309 10:02:45.121510 4861 generic.go:334] "Generic (PLEG): container finished" podID="35f29053-3ec0-43ce-bcad-ff373dc9b8c5" containerID="ddcfa625d99d3b6089fbd75e8353f0a4f2857c1b263d394d1e3b1f2acdc31ed5" exitCode=0 Mar 09 10:02:45 crc kubenswrapper[4861]: I0309 10:02:45.121883 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6r2pj" 
event={"ID":"35f29053-3ec0-43ce-bcad-ff373dc9b8c5","Type":"ContainerDied","Data":"ddcfa625d99d3b6089fbd75e8353f0a4f2857c1b263d394d1e3b1f2acdc31ed5"} Mar 09 10:02:45 crc kubenswrapper[4861]: I0309 10:02:45.121914 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6r2pj" event={"ID":"35f29053-3ec0-43ce-bcad-ff373dc9b8c5","Type":"ContainerStarted","Data":"6f6bf2973929e68ccaeaf9d2db6c8565297247dd4e569851bd0c8d0d164a92ea"} Mar 09 10:02:50 crc kubenswrapper[4861]: I0309 10:02:50.219663 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qf6cx/must-gather-hdwgq" event={"ID":"1a5514de-866f-42de-ab75-f4988a3108a7","Type":"ContainerStarted","Data":"e30208b5d25820e86ffdb3840bb40b2d2680a12af7065ddd048dbe67d401afdf"} Mar 09 10:02:50 crc kubenswrapper[4861]: I0309 10:02:50.220273 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qf6cx/must-gather-hdwgq" event={"ID":"1a5514de-866f-42de-ab75-f4988a3108a7","Type":"ContainerStarted","Data":"96a6a23b425a481921ee3e4a1a25e4bbdbca2e8678a0acbf476847e09d83de06"} Mar 09 10:02:50 crc kubenswrapper[4861]: I0309 10:02:50.224691 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6r2pj" event={"ID":"35f29053-3ec0-43ce-bcad-ff373dc9b8c5","Type":"ContainerStarted","Data":"6b27135f7392a2ddf2d974acb363112b990cf8f922d268dddda88fbd2d04383a"} Mar 09 10:02:50 crc kubenswrapper[4861]: I0309 10:02:50.246190 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-qf6cx/must-gather-hdwgq" podStartSLOduration=2.192952503 podStartE2EDuration="9.246170897s" podCreationTimestamp="2026-03-09 10:02:41 +0000 UTC" firstStartedPulling="2026-03-09 10:02:42.133866032 +0000 UTC m=+3405.218905433" lastFinishedPulling="2026-03-09 10:02:49.187084426 +0000 UTC m=+3412.272123827" observedRunningTime="2026-03-09 10:02:50.235839755 +0000 UTC m=+3413.320879156" 
watchObservedRunningTime="2026-03-09 10:02:50.246170897 +0000 UTC m=+3413.331210298" Mar 09 10:02:51 crc kubenswrapper[4861]: I0309 10:02:51.236220 4861 generic.go:334] "Generic (PLEG): container finished" podID="35f29053-3ec0-43ce-bcad-ff373dc9b8c5" containerID="6b27135f7392a2ddf2d974acb363112b990cf8f922d268dddda88fbd2d04383a" exitCode=0 Mar 09 10:02:51 crc kubenswrapper[4861]: I0309 10:02:51.236304 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6r2pj" event={"ID":"35f29053-3ec0-43ce-bcad-ff373dc9b8c5","Type":"ContainerDied","Data":"6b27135f7392a2ddf2d974acb363112b990cf8f922d268dddda88fbd2d04383a"} Mar 09 10:02:52 crc kubenswrapper[4861]: I0309 10:02:52.248260 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6r2pj" event={"ID":"35f29053-3ec0-43ce-bcad-ff373dc9b8c5","Type":"ContainerStarted","Data":"fe2132c6a0c4eca4b0230895168bcbfbd31286f967d6c56752e5593d86ef4aff"} Mar 09 10:02:52 crc kubenswrapper[4861]: I0309 10:02:52.281672 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-6r2pj" podStartSLOduration=4.954846417 podStartE2EDuration="9.281646095s" podCreationTimestamp="2026-03-09 10:02:43 +0000 UTC" firstStartedPulling="2026-03-09 10:02:47.493404992 +0000 UTC m=+3410.578444393" lastFinishedPulling="2026-03-09 10:02:51.82020467 +0000 UTC m=+3414.905244071" observedRunningTime="2026-03-09 10:02:52.27339473 +0000 UTC m=+3415.358434131" watchObservedRunningTime="2026-03-09 10:02:52.281646095 +0000 UTC m=+3415.366685496" Mar 09 10:02:53 crc kubenswrapper[4861]: I0309 10:02:53.364526 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-qf6cx/crc-debug-92k6v"] Mar 09 10:02:53 crc kubenswrapper[4861]: I0309 10:02:53.367077 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-qf6cx/crc-debug-92k6v" Mar 09 10:02:53 crc kubenswrapper[4861]: I0309 10:02:53.371035 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-qf6cx"/"default-dockercfg-2hz22" Mar 09 10:02:53 crc kubenswrapper[4861]: I0309 10:02:53.543053 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4742a106-914a-494d-91c2-c563c9269f93-host\") pod \"crc-debug-92k6v\" (UID: \"4742a106-914a-494d-91c2-c563c9269f93\") " pod="openshift-must-gather-qf6cx/crc-debug-92k6v" Mar 09 10:02:53 crc kubenswrapper[4861]: I0309 10:02:53.543566 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2hzk\" (UniqueName: \"kubernetes.io/projected/4742a106-914a-494d-91c2-c563c9269f93-kube-api-access-j2hzk\") pod \"crc-debug-92k6v\" (UID: \"4742a106-914a-494d-91c2-c563c9269f93\") " pod="openshift-must-gather-qf6cx/crc-debug-92k6v" Mar 09 10:02:53 crc kubenswrapper[4861]: I0309 10:02:53.615432 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-6r2pj" Mar 09 10:02:53 crc kubenswrapper[4861]: I0309 10:02:53.615509 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-6r2pj" Mar 09 10:02:53 crc kubenswrapper[4861]: I0309 10:02:53.645804 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4742a106-914a-494d-91c2-c563c9269f93-host\") pod \"crc-debug-92k6v\" (UID: \"4742a106-914a-494d-91c2-c563c9269f93\") " pod="openshift-must-gather-qf6cx/crc-debug-92k6v" Mar 09 10:02:53 crc kubenswrapper[4861]: I0309 10:02:53.645975 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2hzk\" (UniqueName: 
\"kubernetes.io/projected/4742a106-914a-494d-91c2-c563c9269f93-kube-api-access-j2hzk\") pod \"crc-debug-92k6v\" (UID: \"4742a106-914a-494d-91c2-c563c9269f93\") " pod="openshift-must-gather-qf6cx/crc-debug-92k6v" Mar 09 10:02:53 crc kubenswrapper[4861]: I0309 10:02:53.646047 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4742a106-914a-494d-91c2-c563c9269f93-host\") pod \"crc-debug-92k6v\" (UID: \"4742a106-914a-494d-91c2-c563c9269f93\") " pod="openshift-must-gather-qf6cx/crc-debug-92k6v" Mar 09 10:02:53 crc kubenswrapper[4861]: I0309 10:02:53.680800 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2hzk\" (UniqueName: \"kubernetes.io/projected/4742a106-914a-494d-91c2-c563c9269f93-kube-api-access-j2hzk\") pod \"crc-debug-92k6v\" (UID: \"4742a106-914a-494d-91c2-c563c9269f93\") " pod="openshift-must-gather-qf6cx/crc-debug-92k6v" Mar 09 10:02:53 crc kubenswrapper[4861]: I0309 10:02:53.692035 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-qf6cx/crc-debug-92k6v" Mar 09 10:02:53 crc kubenswrapper[4861]: W0309 10:02:53.750172 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4742a106_914a_494d_91c2_c563c9269f93.slice/crio-bd5307a459d3c910df5845f53ee1490f26204de224ace219c39895d0268f4404 WatchSource:0}: Error finding container bd5307a459d3c910df5845f53ee1490f26204de224ace219c39895d0268f4404: Status 404 returned error can't find the container with id bd5307a459d3c910df5845f53ee1490f26204de224ace219c39895d0268f4404 Mar 09 10:02:54 crc kubenswrapper[4861]: I0309 10:02:54.268494 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qf6cx/crc-debug-92k6v" event={"ID":"4742a106-914a-494d-91c2-c563c9269f93","Type":"ContainerStarted","Data":"bd5307a459d3c910df5845f53ee1490f26204de224ace219c39895d0268f4404"} Mar 09 10:02:54 crc kubenswrapper[4861]: I0309 10:02:54.606862 4861 patch_prober.go:28] interesting pod/machine-config-daemon-5g7gc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 10:02:54 crc kubenswrapper[4861]: I0309 10:02:54.606940 4861 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 10:02:54 crc kubenswrapper[4861]: I0309 10:02:54.675618 4861 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-6r2pj" podUID="35f29053-3ec0-43ce-bcad-ff373dc9b8c5" containerName="registry-server" probeResult="failure" output=< Mar 09 10:02:54 crc 
kubenswrapper[4861]: timeout: failed to connect service ":50051" within 1s Mar 09 10:02:54 crc kubenswrapper[4861]: > Mar 09 10:03:03 crc kubenswrapper[4861]: I0309 10:03:03.676726 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-6r2pj" Mar 09 10:03:03 crc kubenswrapper[4861]: I0309 10:03:03.738971 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-6r2pj" Mar 09 10:03:04 crc kubenswrapper[4861]: I0309 10:03:04.832543 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6r2pj"] Mar 09 10:03:05 crc kubenswrapper[4861]: I0309 10:03:05.413125 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-6r2pj" podUID="35f29053-3ec0-43ce-bcad-ff373dc9b8c5" containerName="registry-server" containerID="cri-o://fe2132c6a0c4eca4b0230895168bcbfbd31286f967d6c56752e5593d86ef4aff" gracePeriod=2 Mar 09 10:03:06 crc kubenswrapper[4861]: I0309 10:03:06.426020 4861 generic.go:334] "Generic (PLEG): container finished" podID="35f29053-3ec0-43ce-bcad-ff373dc9b8c5" containerID="fe2132c6a0c4eca4b0230895168bcbfbd31286f967d6c56752e5593d86ef4aff" exitCode=0 Mar 09 10:03:06 crc kubenswrapper[4861]: I0309 10:03:06.426070 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6r2pj" event={"ID":"35f29053-3ec0-43ce-bcad-ff373dc9b8c5","Type":"ContainerDied","Data":"fe2132c6a0c4eca4b0230895168bcbfbd31286f967d6c56752e5593d86ef4aff"} Mar 09 10:03:07 crc kubenswrapper[4861]: I0309 10:03:07.555001 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6r2pj" Mar 09 10:03:07 crc kubenswrapper[4861]: I0309 10:03:07.648047 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5nqh5\" (UniqueName: \"kubernetes.io/projected/35f29053-3ec0-43ce-bcad-ff373dc9b8c5-kube-api-access-5nqh5\") pod \"35f29053-3ec0-43ce-bcad-ff373dc9b8c5\" (UID: \"35f29053-3ec0-43ce-bcad-ff373dc9b8c5\") " Mar 09 10:03:07 crc kubenswrapper[4861]: I0309 10:03:07.648148 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35f29053-3ec0-43ce-bcad-ff373dc9b8c5-catalog-content\") pod \"35f29053-3ec0-43ce-bcad-ff373dc9b8c5\" (UID: \"35f29053-3ec0-43ce-bcad-ff373dc9b8c5\") " Mar 09 10:03:07 crc kubenswrapper[4861]: I0309 10:03:07.648474 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35f29053-3ec0-43ce-bcad-ff373dc9b8c5-utilities\") pod \"35f29053-3ec0-43ce-bcad-ff373dc9b8c5\" (UID: \"35f29053-3ec0-43ce-bcad-ff373dc9b8c5\") " Mar 09 10:03:07 crc kubenswrapper[4861]: I0309 10:03:07.649974 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35f29053-3ec0-43ce-bcad-ff373dc9b8c5-utilities" (OuterVolumeSpecName: "utilities") pod "35f29053-3ec0-43ce-bcad-ff373dc9b8c5" (UID: "35f29053-3ec0-43ce-bcad-ff373dc9b8c5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 10:03:07 crc kubenswrapper[4861]: I0309 10:03:07.661438 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35f29053-3ec0-43ce-bcad-ff373dc9b8c5-kube-api-access-5nqh5" (OuterVolumeSpecName: "kube-api-access-5nqh5") pod "35f29053-3ec0-43ce-bcad-ff373dc9b8c5" (UID: "35f29053-3ec0-43ce-bcad-ff373dc9b8c5"). InnerVolumeSpecName "kube-api-access-5nqh5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 10:03:07 crc kubenswrapper[4861]: I0309 10:03:07.755207 4861 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35f29053-3ec0-43ce-bcad-ff373dc9b8c5-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 10:03:07 crc kubenswrapper[4861]: I0309 10:03:07.755263 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5nqh5\" (UniqueName: \"kubernetes.io/projected/35f29053-3ec0-43ce-bcad-ff373dc9b8c5-kube-api-access-5nqh5\") on node \"crc\" DevicePath \"\"" Mar 09 10:03:07 crc kubenswrapper[4861]: I0309 10:03:07.850756 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35f29053-3ec0-43ce-bcad-ff373dc9b8c5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "35f29053-3ec0-43ce-bcad-ff373dc9b8c5" (UID: "35f29053-3ec0-43ce-bcad-ff373dc9b8c5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 10:03:07 crc kubenswrapper[4861]: I0309 10:03:07.857050 4861 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35f29053-3ec0-43ce-bcad-ff373dc9b8c5-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 10:03:08 crc kubenswrapper[4861]: I0309 10:03:08.447114 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6r2pj" event={"ID":"35f29053-3ec0-43ce-bcad-ff373dc9b8c5","Type":"ContainerDied","Data":"6f6bf2973929e68ccaeaf9d2db6c8565297247dd4e569851bd0c8d0d164a92ea"} Mar 09 10:03:08 crc kubenswrapper[4861]: I0309 10:03:08.447609 4861 scope.go:117] "RemoveContainer" containerID="fe2132c6a0c4eca4b0230895168bcbfbd31286f967d6c56752e5593d86ef4aff" Mar 09 10:03:08 crc kubenswrapper[4861]: I0309 10:03:08.447165 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6r2pj" Mar 09 10:03:08 crc kubenswrapper[4861]: I0309 10:03:08.449212 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qf6cx/crc-debug-92k6v" event={"ID":"4742a106-914a-494d-91c2-c563c9269f93","Type":"ContainerStarted","Data":"9c3ff4f259657f8189b7fa75b2b33d7656978117585aa35efe1e507abbd6e84f"} Mar 09 10:03:08 crc kubenswrapper[4861]: I0309 10:03:08.479539 4861 scope.go:117] "RemoveContainer" containerID="6b27135f7392a2ddf2d974acb363112b990cf8f922d268dddda88fbd2d04383a" Mar 09 10:03:08 crc kubenswrapper[4861]: I0309 10:03:08.511241 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-qf6cx/crc-debug-92k6v" podStartSLOduration=2.04727893 podStartE2EDuration="15.511216295s" podCreationTimestamp="2026-03-09 10:02:53 +0000 UTC" firstStartedPulling="2026-03-09 10:02:53.752922118 +0000 UTC m=+3416.837961519" lastFinishedPulling="2026-03-09 10:03:07.216859483 +0000 UTC m=+3430.301898884" observedRunningTime="2026-03-09 10:03:08.481017721 +0000 UTC m=+3431.566057122" watchObservedRunningTime="2026-03-09 10:03:08.511216295 +0000 UTC m=+3431.596255696" Mar 09 10:03:08 crc kubenswrapper[4861]: I0309 10:03:08.512307 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6r2pj"] Mar 09 10:03:08 crc kubenswrapper[4861]: I0309 10:03:08.522298 4861 scope.go:117] "RemoveContainer" containerID="ddcfa625d99d3b6089fbd75e8353f0a4f2857c1b263d394d1e3b1f2acdc31ed5" Mar 09 10:03:08 crc kubenswrapper[4861]: I0309 10:03:08.523237 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-6r2pj"] Mar 09 10:03:09 crc kubenswrapper[4861]: I0309 10:03:09.670009 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35f29053-3ec0-43ce-bcad-ff373dc9b8c5" path="/var/lib/kubelet/pods/35f29053-3ec0-43ce-bcad-ff373dc9b8c5/volumes" Mar 09 10:03:24 crc 
kubenswrapper[4861]: I0309 10:03:24.606040 4861 patch_prober.go:28] interesting pod/machine-config-daemon-5g7gc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 10:03:24 crc kubenswrapper[4861]: I0309 10:03:24.606577 4861 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 10:03:46 crc kubenswrapper[4861]: I0309 10:03:46.803205 4861 generic.go:334] "Generic (PLEG): container finished" podID="4742a106-914a-494d-91c2-c563c9269f93" containerID="9c3ff4f259657f8189b7fa75b2b33d7656978117585aa35efe1e507abbd6e84f" exitCode=0 Mar 09 10:03:46 crc kubenswrapper[4861]: I0309 10:03:46.803332 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qf6cx/crc-debug-92k6v" event={"ID":"4742a106-914a-494d-91c2-c563c9269f93","Type":"ContainerDied","Data":"9c3ff4f259657f8189b7fa75b2b33d7656978117585aa35efe1e507abbd6e84f"} Mar 09 10:03:47 crc kubenswrapper[4861]: I0309 10:03:47.938389 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-qf6cx/crc-debug-92k6v" Mar 09 10:03:47 crc kubenswrapper[4861]: I0309 10:03:47.975142 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-qf6cx/crc-debug-92k6v"] Mar 09 10:03:47 crc kubenswrapper[4861]: I0309 10:03:47.985099 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-qf6cx/crc-debug-92k6v"] Mar 09 10:03:48 crc kubenswrapper[4861]: I0309 10:03:48.114797 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4742a106-914a-494d-91c2-c563c9269f93-host\") pod \"4742a106-914a-494d-91c2-c563c9269f93\" (UID: \"4742a106-914a-494d-91c2-c563c9269f93\") " Mar 09 10:03:48 crc kubenswrapper[4861]: I0309 10:03:48.114934 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j2hzk\" (UniqueName: \"kubernetes.io/projected/4742a106-914a-494d-91c2-c563c9269f93-kube-api-access-j2hzk\") pod \"4742a106-914a-494d-91c2-c563c9269f93\" (UID: \"4742a106-914a-494d-91c2-c563c9269f93\") " Mar 09 10:03:48 crc kubenswrapper[4861]: I0309 10:03:48.115084 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4742a106-914a-494d-91c2-c563c9269f93-host" (OuterVolumeSpecName: "host") pod "4742a106-914a-494d-91c2-c563c9269f93" (UID: "4742a106-914a-494d-91c2-c563c9269f93"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 10:03:48 crc kubenswrapper[4861]: I0309 10:03:48.115420 4861 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4742a106-914a-494d-91c2-c563c9269f93-host\") on node \"crc\" DevicePath \"\"" Mar 09 10:03:48 crc kubenswrapper[4861]: I0309 10:03:48.121002 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4742a106-914a-494d-91c2-c563c9269f93-kube-api-access-j2hzk" (OuterVolumeSpecName: "kube-api-access-j2hzk") pod "4742a106-914a-494d-91c2-c563c9269f93" (UID: "4742a106-914a-494d-91c2-c563c9269f93"). InnerVolumeSpecName "kube-api-access-j2hzk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 10:03:48 crc kubenswrapper[4861]: I0309 10:03:48.217226 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j2hzk\" (UniqueName: \"kubernetes.io/projected/4742a106-914a-494d-91c2-c563c9269f93-kube-api-access-j2hzk\") on node \"crc\" DevicePath \"\"" Mar 09 10:03:48 crc kubenswrapper[4861]: I0309 10:03:48.825891 4861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bd5307a459d3c910df5845f53ee1490f26204de224ace219c39895d0268f4404" Mar 09 10:03:48 crc kubenswrapper[4861]: I0309 10:03:48.826425 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-qf6cx/crc-debug-92k6v" Mar 09 10:03:49 crc kubenswrapper[4861]: I0309 10:03:49.182634 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-qf6cx/crc-debug-fbwr5"] Mar 09 10:03:49 crc kubenswrapper[4861]: E0309 10:03:49.183353 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35f29053-3ec0-43ce-bcad-ff373dc9b8c5" containerName="extract-utilities" Mar 09 10:03:49 crc kubenswrapper[4861]: I0309 10:03:49.183433 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="35f29053-3ec0-43ce-bcad-ff373dc9b8c5" containerName="extract-utilities" Mar 09 10:03:49 crc kubenswrapper[4861]: E0309 10:03:49.183460 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4742a106-914a-494d-91c2-c563c9269f93" containerName="container-00" Mar 09 10:03:49 crc kubenswrapper[4861]: I0309 10:03:49.183467 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="4742a106-914a-494d-91c2-c563c9269f93" containerName="container-00" Mar 09 10:03:49 crc kubenswrapper[4861]: E0309 10:03:49.183487 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35f29053-3ec0-43ce-bcad-ff373dc9b8c5" containerName="extract-content" Mar 09 10:03:49 crc kubenswrapper[4861]: I0309 10:03:49.183496 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="35f29053-3ec0-43ce-bcad-ff373dc9b8c5" containerName="extract-content" Mar 09 10:03:49 crc kubenswrapper[4861]: E0309 10:03:49.183512 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35f29053-3ec0-43ce-bcad-ff373dc9b8c5" containerName="registry-server" Mar 09 10:03:49 crc kubenswrapper[4861]: I0309 10:03:49.183519 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="35f29053-3ec0-43ce-bcad-ff373dc9b8c5" containerName="registry-server" Mar 09 10:03:49 crc kubenswrapper[4861]: I0309 10:03:49.183744 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="4742a106-914a-494d-91c2-c563c9269f93" 
containerName="container-00" Mar 09 10:03:49 crc kubenswrapper[4861]: I0309 10:03:49.183759 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="35f29053-3ec0-43ce-bcad-ff373dc9b8c5" containerName="registry-server" Mar 09 10:03:49 crc kubenswrapper[4861]: I0309 10:03:49.184490 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qf6cx/crc-debug-fbwr5" Mar 09 10:03:49 crc kubenswrapper[4861]: I0309 10:03:49.190467 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-qf6cx"/"default-dockercfg-2hz22" Mar 09 10:03:49 crc kubenswrapper[4861]: I0309 10:03:49.338605 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b1092453-0112-487c-9476-469c8275386b-host\") pod \"crc-debug-fbwr5\" (UID: \"b1092453-0112-487c-9476-469c8275386b\") " pod="openshift-must-gather-qf6cx/crc-debug-fbwr5" Mar 09 10:03:49 crc kubenswrapper[4861]: I0309 10:03:49.338716 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tkb9\" (UniqueName: \"kubernetes.io/projected/b1092453-0112-487c-9476-469c8275386b-kube-api-access-7tkb9\") pod \"crc-debug-fbwr5\" (UID: \"b1092453-0112-487c-9476-469c8275386b\") " pod="openshift-must-gather-qf6cx/crc-debug-fbwr5" Mar 09 10:03:49 crc kubenswrapper[4861]: I0309 10:03:49.441250 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b1092453-0112-487c-9476-469c8275386b-host\") pod \"crc-debug-fbwr5\" (UID: \"b1092453-0112-487c-9476-469c8275386b\") " pod="openshift-must-gather-qf6cx/crc-debug-fbwr5" Mar 09 10:03:49 crc kubenswrapper[4861]: I0309 10:03:49.441370 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tkb9\" (UniqueName: 
\"kubernetes.io/projected/b1092453-0112-487c-9476-469c8275386b-kube-api-access-7tkb9\") pod \"crc-debug-fbwr5\" (UID: \"b1092453-0112-487c-9476-469c8275386b\") " pod="openshift-must-gather-qf6cx/crc-debug-fbwr5" Mar 09 10:03:49 crc kubenswrapper[4861]: I0309 10:03:49.441945 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b1092453-0112-487c-9476-469c8275386b-host\") pod \"crc-debug-fbwr5\" (UID: \"b1092453-0112-487c-9476-469c8275386b\") " pod="openshift-must-gather-qf6cx/crc-debug-fbwr5" Mar 09 10:03:49 crc kubenswrapper[4861]: I0309 10:03:49.460475 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tkb9\" (UniqueName: \"kubernetes.io/projected/b1092453-0112-487c-9476-469c8275386b-kube-api-access-7tkb9\") pod \"crc-debug-fbwr5\" (UID: \"b1092453-0112-487c-9476-469c8275386b\") " pod="openshift-must-gather-qf6cx/crc-debug-fbwr5" Mar 09 10:03:49 crc kubenswrapper[4861]: I0309 10:03:49.500860 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-qf6cx/crc-debug-fbwr5" Mar 09 10:03:49 crc kubenswrapper[4861]: I0309 10:03:49.672103 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4742a106-914a-494d-91c2-c563c9269f93" path="/var/lib/kubelet/pods/4742a106-914a-494d-91c2-c563c9269f93/volumes" Mar 09 10:03:49 crc kubenswrapper[4861]: I0309 10:03:49.834833 4861 generic.go:334] "Generic (PLEG): container finished" podID="b1092453-0112-487c-9476-469c8275386b" containerID="58f6b0d2bbb2902a7de5eb5985724de08925d760a4c59da34f46135d9d5a26ad" exitCode=0 Mar 09 10:03:49 crc kubenswrapper[4861]: I0309 10:03:49.834987 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qf6cx/crc-debug-fbwr5" event={"ID":"b1092453-0112-487c-9476-469c8275386b","Type":"ContainerDied","Data":"58f6b0d2bbb2902a7de5eb5985724de08925d760a4c59da34f46135d9d5a26ad"} Mar 09 10:03:49 crc kubenswrapper[4861]: I0309 10:03:49.835197 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qf6cx/crc-debug-fbwr5" event={"ID":"b1092453-0112-487c-9476-469c8275386b","Type":"ContainerStarted","Data":"c3420a47c3331ad26f3f3ad7956f9b328276e9478b118438975873f396c94069"} Mar 09 10:03:50 crc kubenswrapper[4861]: I0309 10:03:50.315849 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-qf6cx/crc-debug-fbwr5"] Mar 09 10:03:50 crc kubenswrapper[4861]: I0309 10:03:50.324896 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-qf6cx/crc-debug-fbwr5"] Mar 09 10:03:50 crc kubenswrapper[4861]: I0309 10:03:50.955200 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-qf6cx/crc-debug-fbwr5" Mar 09 10:03:51 crc kubenswrapper[4861]: I0309 10:03:51.072397 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7tkb9\" (UniqueName: \"kubernetes.io/projected/b1092453-0112-487c-9476-469c8275386b-kube-api-access-7tkb9\") pod \"b1092453-0112-487c-9476-469c8275386b\" (UID: \"b1092453-0112-487c-9476-469c8275386b\") " Mar 09 10:03:51 crc kubenswrapper[4861]: I0309 10:03:51.072762 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b1092453-0112-487c-9476-469c8275386b-host\") pod \"b1092453-0112-487c-9476-469c8275386b\" (UID: \"b1092453-0112-487c-9476-469c8275386b\") " Mar 09 10:03:51 crc kubenswrapper[4861]: I0309 10:03:51.072822 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b1092453-0112-487c-9476-469c8275386b-host" (OuterVolumeSpecName: "host") pod "b1092453-0112-487c-9476-469c8275386b" (UID: "b1092453-0112-487c-9476-469c8275386b"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 10:03:51 crc kubenswrapper[4861]: I0309 10:03:51.073300 4861 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b1092453-0112-487c-9476-469c8275386b-host\") on node \"crc\" DevicePath \"\"" Mar 09 10:03:51 crc kubenswrapper[4861]: I0309 10:03:51.080284 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1092453-0112-487c-9476-469c8275386b-kube-api-access-7tkb9" (OuterVolumeSpecName: "kube-api-access-7tkb9") pod "b1092453-0112-487c-9476-469c8275386b" (UID: "b1092453-0112-487c-9476-469c8275386b"). InnerVolumeSpecName "kube-api-access-7tkb9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 10:03:51 crc kubenswrapper[4861]: I0309 10:03:51.175428 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7tkb9\" (UniqueName: \"kubernetes.io/projected/b1092453-0112-487c-9476-469c8275386b-kube-api-access-7tkb9\") on node \"crc\" DevicePath \"\"" Mar 09 10:03:51 crc kubenswrapper[4861]: I0309 10:03:51.459217 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-qf6cx/crc-debug-6vw7q"] Mar 09 10:03:51 crc kubenswrapper[4861]: E0309 10:03:51.459684 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1092453-0112-487c-9476-469c8275386b" containerName="container-00" Mar 09 10:03:51 crc kubenswrapper[4861]: I0309 10:03:51.459701 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1092453-0112-487c-9476-469c8275386b" containerName="container-00" Mar 09 10:03:51 crc kubenswrapper[4861]: I0309 10:03:51.459930 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1092453-0112-487c-9476-469c8275386b" containerName="container-00" Mar 09 10:03:51 crc kubenswrapper[4861]: I0309 10:03:51.460707 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-qf6cx/crc-debug-6vw7q" Mar 09 10:03:51 crc kubenswrapper[4861]: I0309 10:03:51.582093 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f8576189-973c-4242-81db-3e0800dcf5d4-host\") pod \"crc-debug-6vw7q\" (UID: \"f8576189-973c-4242-81db-3e0800dcf5d4\") " pod="openshift-must-gather-qf6cx/crc-debug-6vw7q" Mar 09 10:03:51 crc kubenswrapper[4861]: I0309 10:03:51.582482 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m22w6\" (UniqueName: \"kubernetes.io/projected/f8576189-973c-4242-81db-3e0800dcf5d4-kube-api-access-m22w6\") pod \"crc-debug-6vw7q\" (UID: \"f8576189-973c-4242-81db-3e0800dcf5d4\") " pod="openshift-must-gather-qf6cx/crc-debug-6vw7q" Mar 09 10:03:51 crc kubenswrapper[4861]: I0309 10:03:51.669235 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1092453-0112-487c-9476-469c8275386b" path="/var/lib/kubelet/pods/b1092453-0112-487c-9476-469c8275386b/volumes" Mar 09 10:03:51 crc kubenswrapper[4861]: I0309 10:03:51.683837 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f8576189-973c-4242-81db-3e0800dcf5d4-host\") pod \"crc-debug-6vw7q\" (UID: \"f8576189-973c-4242-81db-3e0800dcf5d4\") " pod="openshift-must-gather-qf6cx/crc-debug-6vw7q" Mar 09 10:03:51 crc kubenswrapper[4861]: I0309 10:03:51.683943 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m22w6\" (UniqueName: \"kubernetes.io/projected/f8576189-973c-4242-81db-3e0800dcf5d4-kube-api-access-m22w6\") pod \"crc-debug-6vw7q\" (UID: \"f8576189-973c-4242-81db-3e0800dcf5d4\") " pod="openshift-must-gather-qf6cx/crc-debug-6vw7q" Mar 09 10:03:51 crc kubenswrapper[4861]: I0309 10:03:51.683997 4861 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f8576189-973c-4242-81db-3e0800dcf5d4-host\") pod \"crc-debug-6vw7q\" (UID: \"f8576189-973c-4242-81db-3e0800dcf5d4\") " pod="openshift-must-gather-qf6cx/crc-debug-6vw7q" Mar 09 10:03:51 crc kubenswrapper[4861]: I0309 10:03:51.702075 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m22w6\" (UniqueName: \"kubernetes.io/projected/f8576189-973c-4242-81db-3e0800dcf5d4-kube-api-access-m22w6\") pod \"crc-debug-6vw7q\" (UID: \"f8576189-973c-4242-81db-3e0800dcf5d4\") " pod="openshift-must-gather-qf6cx/crc-debug-6vw7q" Mar 09 10:03:51 crc kubenswrapper[4861]: I0309 10:03:51.777837 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qf6cx/crc-debug-6vw7q" Mar 09 10:03:51 crc kubenswrapper[4861]: I0309 10:03:51.862459 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qf6cx/crc-debug-6vw7q" event={"ID":"f8576189-973c-4242-81db-3e0800dcf5d4","Type":"ContainerStarted","Data":"b53a98b52f7c8244556b6833f74fa870b9450b81196f03139b0d28932c5af94f"} Mar 09 10:03:51 crc kubenswrapper[4861]: I0309 10:03:51.865124 4861 scope.go:117] "RemoveContainer" containerID="58f6b0d2bbb2902a7de5eb5985724de08925d760a4c59da34f46135d9d5a26ad" Mar 09 10:03:51 crc kubenswrapper[4861]: I0309 10:03:51.865207 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-qf6cx/crc-debug-fbwr5" Mar 09 10:03:52 crc kubenswrapper[4861]: I0309 10:03:52.877683 4861 generic.go:334] "Generic (PLEG): container finished" podID="f8576189-973c-4242-81db-3e0800dcf5d4" containerID="9a6490f1c2630686b13e11bcd5c8caae564a32037b3c8c18bf38df060f8fbc9b" exitCode=0 Mar 09 10:03:52 crc kubenswrapper[4861]: I0309 10:03:52.877746 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qf6cx/crc-debug-6vw7q" event={"ID":"f8576189-973c-4242-81db-3e0800dcf5d4","Type":"ContainerDied","Data":"9a6490f1c2630686b13e11bcd5c8caae564a32037b3c8c18bf38df060f8fbc9b"} Mar 09 10:03:52 crc kubenswrapper[4861]: I0309 10:03:52.922593 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-qf6cx/crc-debug-6vw7q"] Mar 09 10:03:52 crc kubenswrapper[4861]: I0309 10:03:52.934551 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-qf6cx/crc-debug-6vw7q"] Mar 09 10:03:54 crc kubenswrapper[4861]: I0309 10:03:53.999999 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-qf6cx/crc-debug-6vw7q" Mar 09 10:03:54 crc kubenswrapper[4861]: I0309 10:03:54.132552 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m22w6\" (UniqueName: \"kubernetes.io/projected/f8576189-973c-4242-81db-3e0800dcf5d4-kube-api-access-m22w6\") pod \"f8576189-973c-4242-81db-3e0800dcf5d4\" (UID: \"f8576189-973c-4242-81db-3e0800dcf5d4\") " Mar 09 10:03:54 crc kubenswrapper[4861]: I0309 10:03:54.132648 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f8576189-973c-4242-81db-3e0800dcf5d4-host\") pod \"f8576189-973c-4242-81db-3e0800dcf5d4\" (UID: \"f8576189-973c-4242-81db-3e0800dcf5d4\") " Mar 09 10:03:54 crc kubenswrapper[4861]: I0309 10:03:54.132727 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f8576189-973c-4242-81db-3e0800dcf5d4-host" (OuterVolumeSpecName: "host") pod "f8576189-973c-4242-81db-3e0800dcf5d4" (UID: "f8576189-973c-4242-81db-3e0800dcf5d4"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 10:03:54 crc kubenswrapper[4861]: I0309 10:03:54.133200 4861 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f8576189-973c-4242-81db-3e0800dcf5d4-host\") on node \"crc\" DevicePath \"\"" Mar 09 10:03:54 crc kubenswrapper[4861]: I0309 10:03:54.139789 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8576189-973c-4242-81db-3e0800dcf5d4-kube-api-access-m22w6" (OuterVolumeSpecName: "kube-api-access-m22w6") pod "f8576189-973c-4242-81db-3e0800dcf5d4" (UID: "f8576189-973c-4242-81db-3e0800dcf5d4"). InnerVolumeSpecName "kube-api-access-m22w6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 10:03:54 crc kubenswrapper[4861]: I0309 10:03:54.234816 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m22w6\" (UniqueName: \"kubernetes.io/projected/f8576189-973c-4242-81db-3e0800dcf5d4-kube-api-access-m22w6\") on node \"crc\" DevicePath \"\"" Mar 09 10:03:54 crc kubenswrapper[4861]: I0309 10:03:54.606193 4861 patch_prober.go:28] interesting pod/machine-config-daemon-5g7gc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 10:03:54 crc kubenswrapper[4861]: I0309 10:03:54.606273 4861 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 10:03:54 crc kubenswrapper[4861]: I0309 10:03:54.606330 4861 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" Mar 09 10:03:54 crc kubenswrapper[4861]: I0309 10:03:54.607256 4861 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7e8b0aadd87c38d58bff92f69ce179da04d7d49c2a37109605ba66bad8f23ee6"} pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 09 10:03:54 crc kubenswrapper[4861]: I0309 10:03:54.607331 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" 
containerName="machine-config-daemon" containerID="cri-o://7e8b0aadd87c38d58bff92f69ce179da04d7d49c2a37109605ba66bad8f23ee6" gracePeriod=600 Mar 09 10:03:54 crc kubenswrapper[4861]: I0309 10:03:54.899329 4861 generic.go:334] "Generic (PLEG): container finished" podID="6f7875e3-174f-4c67-8675-d878de74aa4f" containerID="7e8b0aadd87c38d58bff92f69ce179da04d7d49c2a37109605ba66bad8f23ee6" exitCode=0 Mar 09 10:03:54 crc kubenswrapper[4861]: I0309 10:03:54.899387 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" event={"ID":"6f7875e3-174f-4c67-8675-d878de74aa4f","Type":"ContainerDied","Data":"7e8b0aadd87c38d58bff92f69ce179da04d7d49c2a37109605ba66bad8f23ee6"} Mar 09 10:03:54 crc kubenswrapper[4861]: I0309 10:03:54.899728 4861 scope.go:117] "RemoveContainer" containerID="77c857d062282bd5e967d0e73907a96e77dbe37daf546bb7546f15dc1d364592" Mar 09 10:03:54 crc kubenswrapper[4861]: I0309 10:03:54.904176 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-qf6cx/crc-debug-6vw7q" Mar 09 10:03:54 crc kubenswrapper[4861]: I0309 10:03:54.947717 4861 scope.go:117] "RemoveContainer" containerID="9a6490f1c2630686b13e11bcd5c8caae564a32037b3c8c18bf38df060f8fbc9b" Mar 09 10:03:55 crc kubenswrapper[4861]: I0309 10:03:55.668936 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8576189-973c-4242-81db-3e0800dcf5d4" path="/var/lib/kubelet/pods/f8576189-973c-4242-81db-3e0800dcf5d4/volumes" Mar 09 10:03:55 crc kubenswrapper[4861]: I0309 10:03:55.913934 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" event={"ID":"6f7875e3-174f-4c67-8675-d878de74aa4f","Type":"ContainerStarted","Data":"7bbbec7e8a5f7da112767a5599dcb0b362a1472008cac65425c981acbf405224"} Mar 09 10:04:00 crc kubenswrapper[4861]: I0309 10:04:00.146366 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550844-84kmk"] Mar 09 10:04:00 crc kubenswrapper[4861]: E0309 10:04:00.147275 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8576189-973c-4242-81db-3e0800dcf5d4" containerName="container-00" Mar 09 10:04:00 crc kubenswrapper[4861]: I0309 10:04:00.147293 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8576189-973c-4242-81db-3e0800dcf5d4" containerName="container-00" Mar 09 10:04:00 crc kubenswrapper[4861]: I0309 10:04:00.147584 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8576189-973c-4242-81db-3e0800dcf5d4" containerName="container-00" Mar 09 10:04:00 crc kubenswrapper[4861]: I0309 10:04:00.148470 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550844-84kmk" Mar 09 10:04:00 crc kubenswrapper[4861]: I0309 10:04:00.151138 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 10:04:00 crc kubenswrapper[4861]: I0309 10:04:00.151222 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 10:04:00 crc kubenswrapper[4861]: I0309 10:04:00.151500 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k86l8" Mar 09 10:04:00 crc kubenswrapper[4861]: I0309 10:04:00.158544 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550844-84kmk"] Mar 09 10:04:00 crc kubenswrapper[4861]: I0309 10:04:00.253193 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7qvn\" (UniqueName: \"kubernetes.io/projected/64419f08-ab44-4dbf-ab6f-3ef60a1f1c3b-kube-api-access-l7qvn\") pod \"auto-csr-approver-29550844-84kmk\" (UID: \"64419f08-ab44-4dbf-ab6f-3ef60a1f1c3b\") " pod="openshift-infra/auto-csr-approver-29550844-84kmk" Mar 09 10:04:00 crc kubenswrapper[4861]: I0309 10:04:00.354947 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7qvn\" (UniqueName: \"kubernetes.io/projected/64419f08-ab44-4dbf-ab6f-3ef60a1f1c3b-kube-api-access-l7qvn\") pod \"auto-csr-approver-29550844-84kmk\" (UID: \"64419f08-ab44-4dbf-ab6f-3ef60a1f1c3b\") " pod="openshift-infra/auto-csr-approver-29550844-84kmk" Mar 09 10:04:00 crc kubenswrapper[4861]: I0309 10:04:00.381082 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7qvn\" (UniqueName: \"kubernetes.io/projected/64419f08-ab44-4dbf-ab6f-3ef60a1f1c3b-kube-api-access-l7qvn\") pod \"auto-csr-approver-29550844-84kmk\" (UID: \"64419f08-ab44-4dbf-ab6f-3ef60a1f1c3b\") " 
pod="openshift-infra/auto-csr-approver-29550844-84kmk" Mar 09 10:04:00 crc kubenswrapper[4861]: I0309 10:04:00.468389 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550844-84kmk" Mar 09 10:04:01 crc kubenswrapper[4861]: I0309 10:04:01.003048 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550844-84kmk"] Mar 09 10:04:01 crc kubenswrapper[4861]: W0309 10:04:01.008549 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod64419f08_ab44_4dbf_ab6f_3ef60a1f1c3b.slice/crio-3a4e8ad57eb7a1bc2a783134615a827cdebbaff94eb097375b68abe97dadece8 WatchSource:0}: Error finding container 3a4e8ad57eb7a1bc2a783134615a827cdebbaff94eb097375b68abe97dadece8: Status 404 returned error can't find the container with id 3a4e8ad57eb7a1bc2a783134615a827cdebbaff94eb097375b68abe97dadece8 Mar 09 10:04:01 crc kubenswrapper[4861]: I0309 10:04:01.967459 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550844-84kmk" event={"ID":"64419f08-ab44-4dbf-ab6f-3ef60a1f1c3b","Type":"ContainerStarted","Data":"3a4e8ad57eb7a1bc2a783134615a827cdebbaff94eb097375b68abe97dadece8"} Mar 09 10:04:02 crc kubenswrapper[4861]: I0309 10:04:02.978185 4861 generic.go:334] "Generic (PLEG): container finished" podID="64419f08-ab44-4dbf-ab6f-3ef60a1f1c3b" containerID="fc2c6afa73b4227406e48803bf40c6c5d86f0c387b3d847cb91eb1b35bf15333" exitCode=0 Mar 09 10:04:02 crc kubenswrapper[4861]: I0309 10:04:02.978421 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550844-84kmk" event={"ID":"64419f08-ab44-4dbf-ab6f-3ef60a1f1c3b","Type":"ContainerDied","Data":"fc2c6afa73b4227406e48803bf40c6c5d86f0c387b3d847cb91eb1b35bf15333"} Mar 09 10:04:04 crc kubenswrapper[4861]: I0309 10:04:04.310506 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550844-84kmk" Mar 09 10:04:04 crc kubenswrapper[4861]: I0309 10:04:04.444875 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l7qvn\" (UniqueName: \"kubernetes.io/projected/64419f08-ab44-4dbf-ab6f-3ef60a1f1c3b-kube-api-access-l7qvn\") pod \"64419f08-ab44-4dbf-ab6f-3ef60a1f1c3b\" (UID: \"64419f08-ab44-4dbf-ab6f-3ef60a1f1c3b\") " Mar 09 10:04:04 crc kubenswrapper[4861]: I0309 10:04:04.450876 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64419f08-ab44-4dbf-ab6f-3ef60a1f1c3b-kube-api-access-l7qvn" (OuterVolumeSpecName: "kube-api-access-l7qvn") pod "64419f08-ab44-4dbf-ab6f-3ef60a1f1c3b" (UID: "64419f08-ab44-4dbf-ab6f-3ef60a1f1c3b"). InnerVolumeSpecName "kube-api-access-l7qvn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 10:04:04 crc kubenswrapper[4861]: I0309 10:04:04.547436 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l7qvn\" (UniqueName: \"kubernetes.io/projected/64419f08-ab44-4dbf-ab6f-3ef60a1f1c3b-kube-api-access-l7qvn\") on node \"crc\" DevicePath \"\"" Mar 09 10:04:05 crc kubenswrapper[4861]: I0309 10:04:05.001703 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550844-84kmk" event={"ID":"64419f08-ab44-4dbf-ab6f-3ef60a1f1c3b","Type":"ContainerDied","Data":"3a4e8ad57eb7a1bc2a783134615a827cdebbaff94eb097375b68abe97dadece8"} Mar 09 10:04:05 crc kubenswrapper[4861]: I0309 10:04:05.001749 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550844-84kmk" Mar 09 10:04:05 crc kubenswrapper[4861]: I0309 10:04:05.001754 4861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3a4e8ad57eb7a1bc2a783134615a827cdebbaff94eb097375b68abe97dadece8" Mar 09 10:04:05 crc kubenswrapper[4861]: I0309 10:04:05.401499 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550838-g4gxr"] Mar 09 10:04:05 crc kubenswrapper[4861]: I0309 10:04:05.409013 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550838-g4gxr"] Mar 09 10:04:05 crc kubenswrapper[4861]: I0309 10:04:05.669485 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e005695-9c62-4320-a9f8-192525751618" path="/var/lib/kubelet/pods/2e005695-9c62-4320-a9f8-192525751618/volumes" Mar 09 10:04:08 crc kubenswrapper[4861]: I0309 10:04:08.975278 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-5b9d58d97d-h7pvq_23b061c3-2bd5-4b7c-bdf6-76da2791cc8e/barbican-api/0.log" Mar 09 10:04:09 crc kubenswrapper[4861]: I0309 10:04:09.146764 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-5b9d58d97d-h7pvq_23b061c3-2bd5-4b7c-bdf6-76da2791cc8e/barbican-api-log/0.log" Mar 09 10:04:09 crc kubenswrapper[4861]: I0309 10:04:09.191284 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7f66785d8-vkcmq_8bc3e378-d567-4ba4-b135-1393faa1dbc6/barbican-keystone-listener/0.log" Mar 09 10:04:09 crc kubenswrapper[4861]: I0309 10:04:09.199842 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7f66785d8-vkcmq_8bc3e378-d567-4ba4-b135-1393faa1dbc6/barbican-keystone-listener-log/0.log" Mar 09 10:04:09 crc kubenswrapper[4861]: I0309 10:04:09.338762 4861 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-worker-7b6689bdbc-t6phd_82a35d2d-6934-4c56-a62d-db22ac36a6be/barbican-worker/0.log" Mar 09 10:04:09 crc kubenswrapper[4861]: I0309 10:04:09.375133 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-7b6689bdbc-t6phd_82a35d2d-6934-4c56-a62d-db22ac36a6be/barbican-worker-log/0.log" Mar 09 10:04:09 crc kubenswrapper[4861]: I0309 10:04:09.555020 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-6k9cp_b1f90870-1ef5-46d6-b495-f41e2d14a888/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Mar 09 10:04:09 crc kubenswrapper[4861]: I0309 10:04:09.572672 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_75432149-8e10-4aae-8ad4-fbf3b5a10063/ceilometer-central-agent/0.log" Mar 09 10:04:09 crc kubenswrapper[4861]: I0309 10:04:09.653792 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_75432149-8e10-4aae-8ad4-fbf3b5a10063/ceilometer-notification-agent/0.log" Mar 09 10:04:09 crc kubenswrapper[4861]: I0309 10:04:09.766550 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_75432149-8e10-4aae-8ad4-fbf3b5a10063/proxy-httpd/0.log" Mar 09 10:04:09 crc kubenswrapper[4861]: I0309 10:04:09.785158 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_75432149-8e10-4aae-8ad4-fbf3b5a10063/sg-core/0.log" Mar 09 10:04:09 crc kubenswrapper[4861]: I0309 10:04:09.899961 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_c9fa67cc-6a0f-485d-b064-cd14971058db/cinder-api/0.log" Mar 09 10:04:09 crc kubenswrapper[4861]: I0309 10:04:09.986490 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_c9fa67cc-6a0f-485d-b064-cd14971058db/cinder-api-log/0.log" Mar 09 10:04:10 crc kubenswrapper[4861]: I0309 10:04:10.116672 4861 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack_cinder-scheduler-0_072cabf9-18cb-4562-a6a2-7f2b46a4f9ec/cinder-scheduler/0.log" Mar 09 10:04:10 crc kubenswrapper[4861]: I0309 10:04:10.161973 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_072cabf9-18cb-4562-a6a2-7f2b46a4f9ec/probe/0.log" Mar 09 10:04:10 crc kubenswrapper[4861]: I0309 10:04:10.232072 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-ww9c7_b6f88b43-ae35-4f74-b14a-96332076ed1f/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 09 10:04:10 crc kubenswrapper[4861]: I0309 10:04:10.488474 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-79fcc958f9-jb5bq_780ba45c-97cb-4382-9d7a-268051c773d1/init/0.log" Mar 09 10:04:10 crc kubenswrapper[4861]: I0309 10:04:10.496733 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-dgmz7_7bbe9e42-4b9b-42e7-bfed-ff93ff905164/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 09 10:04:10 crc kubenswrapper[4861]: I0309 10:04:10.635763 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-79fcc958f9-jb5bq_780ba45c-97cb-4382-9d7a-268051c773d1/init/0.log" Mar 09 10:04:10 crc kubenswrapper[4861]: I0309 10:04:10.736417 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-79fcc958f9-jb5bq_780ba45c-97cb-4382-9d7a-268051c773d1/dnsmasq-dns/0.log" Mar 09 10:04:10 crc kubenswrapper[4861]: I0309 10:04:10.759770 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-2gml2_28e43d3e-921e-4f6c-be2f-f37e5625374a/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Mar 09 10:04:10 crc kubenswrapper[4861]: I0309 10:04:10.949443 4861 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-external-api-0_d237bf3c-da06-48d8-aef3-91be47f05320/glance-log/0.log" Mar 09 10:04:10 crc kubenswrapper[4861]: I0309 10:04:10.972609 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_d237bf3c-da06-48d8-aef3-91be47f05320/glance-httpd/0.log" Mar 09 10:04:11 crc kubenswrapper[4861]: I0309 10:04:11.127346 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_6611f0ac-3406-4da9-b81a-1515dddfafcd/glance-log/0.log" Mar 09 10:04:11 crc kubenswrapper[4861]: I0309 10:04:11.136764 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_6611f0ac-3406-4da9-b81a-1515dddfafcd/glance-httpd/0.log" Mar 09 10:04:11 crc kubenswrapper[4861]: I0309 10:04:11.299849 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-5487f4d458-lnthc_9049886d-2460-47fe-ac82-2dfde4858bd0/horizon/0.log" Mar 09 10:04:11 crc kubenswrapper[4861]: I0309 10:04:11.462447 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-m4ds9_2420e9c4-faed-48f0-857d-4aba72c5cab2/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Mar 09 10:04:11 crc kubenswrapper[4861]: I0309 10:04:11.637684 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-5487f4d458-lnthc_9049886d-2460-47fe-ac82-2dfde4858bd0/horizon-log/0.log" Mar 09 10:04:11 crc kubenswrapper[4861]: I0309 10:04:11.650821 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-j8d2h_0bb63c32-5f67-4912-b238-893dc92107b9/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 09 10:04:11 crc kubenswrapper[4861]: I0309 10:04:11.909319 4861 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_keystone-cron-29550841-pc6h2_2153c5af-92d8-4f6e-b299-8b06d30603f5/keystone-cron/0.log" Mar 09 10:04:11 crc kubenswrapper[4861]: I0309 10:04:11.951099 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-5cc5bc567f-7k86v_596fb22d-649e-4e00-b847-71b506786832/keystone-api/0.log" Mar 09 10:04:12 crc kubenswrapper[4861]: I0309 10:04:12.097783 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_5a9682bd-f0fc-47d6-9a66-e35fb0630f44/kube-state-metrics/0.log" Mar 09 10:04:12 crc kubenswrapper[4861]: I0309 10:04:12.153604 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-ngps7_d783d4c7-dfa9-4783-a80c-2938d2a5841d/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Mar 09 10:04:12 crc kubenswrapper[4861]: I0309 10:04:12.483875 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-665f4b6689-tfdk9_db5464b8-011f-4569-a47e-36766fa6c72e/neutron-api/0.log" Mar 09 10:04:12 crc kubenswrapper[4861]: I0309 10:04:12.562411 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-665f4b6689-tfdk9_db5464b8-011f-4569-a47e-36766fa6c72e/neutron-httpd/0.log" Mar 09 10:04:12 crc kubenswrapper[4861]: I0309 10:04:12.718353 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-d9zgg_9e525f15-c77e-4a1c-a161-4db82064bf70/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Mar 09 10:04:13 crc kubenswrapper[4861]: I0309 10:04:13.281500 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_a7f9f9a6-c593-4015-833b-ef237f492b70/nova-api-log/0.log" Mar 09 10:04:13 crc kubenswrapper[4861]: I0309 10:04:13.512559 4861 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-cell0-conductor-0_fc47e276-b337-4696-ac08-1aa31c4b6864/nova-cell0-conductor-conductor/0.log" Mar 09 10:04:13 crc kubenswrapper[4861]: I0309 10:04:13.550270 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_a7f9f9a6-c593-4015-833b-ef237f492b70/nova-api-api/0.log" Mar 09 10:04:13 crc kubenswrapper[4861]: I0309 10:04:13.676512 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_c5b057f4-8239-4e46-b205-81552d6cd5e6/nova-cell1-conductor-conductor/0.log" Mar 09 10:04:14 crc kubenswrapper[4861]: I0309 10:04:14.006162 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_dd2e86d2-700f-4fd8-b89b-84bd5a09069d/nova-cell1-novncproxy-novncproxy/0.log" Mar 09 10:04:14 crc kubenswrapper[4861]: I0309 10:04:14.232841 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-mkzpx_23236f7d-915f-4619-b5ba-611375aef594/nova-edpm-deployment-openstack-edpm-ipam/0.log" Mar 09 10:04:14 crc kubenswrapper[4861]: I0309 10:04:14.349111 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_9d5c1be9-8604-4565-9175-703ff865c6eb/nova-metadata-log/0.log" Mar 09 10:04:14 crc kubenswrapper[4861]: I0309 10:04:14.734032 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_f095ca7b-1959-4cda-bde8-40ca6446e34d/mysql-bootstrap/0.log" Mar 09 10:04:14 crc kubenswrapper[4861]: I0309 10:04:14.764478 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_0a9f9492-a68d-4f37-bffc-4f13ebe23db7/nova-scheduler-scheduler/0.log" Mar 09 10:04:14 crc kubenswrapper[4861]: I0309 10:04:14.901518 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_f095ca7b-1959-4cda-bde8-40ca6446e34d/mysql-bootstrap/0.log" Mar 09 10:04:14 crc kubenswrapper[4861]: 
I0309 10:04:14.976535 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_f095ca7b-1959-4cda-bde8-40ca6446e34d/galera/0.log" Mar 09 10:04:15 crc kubenswrapper[4861]: I0309 10:04:15.114544 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_0ab732e3-1122-4f45-a9af-b36eaa88c19e/mysql-bootstrap/0.log" Mar 09 10:04:15 crc kubenswrapper[4861]: I0309 10:04:15.339075 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_0ab732e3-1122-4f45-a9af-b36eaa88c19e/mysql-bootstrap/0.log" Mar 09 10:04:15 crc kubenswrapper[4861]: I0309 10:04:15.360921 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_0ab732e3-1122-4f45-a9af-b36eaa88c19e/galera/0.log" Mar 09 10:04:15 crc kubenswrapper[4861]: I0309 10:04:15.443234 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_9d5c1be9-8604-4565-9175-703ff865c6eb/nova-metadata-metadata/0.log" Mar 09 10:04:15 crc kubenswrapper[4861]: I0309 10:04:15.519019 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_62f873db-0b4f-4a99-bc1d-7cdff56989a2/openstackclient/0.log" Mar 09 10:04:15 crc kubenswrapper[4861]: I0309 10:04:15.640148 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-xvlmj_82770fbe-3052-4367-9c2d-a19a11d3a695/openstack-network-exporter/0.log" Mar 09 10:04:15 crc kubenswrapper[4861]: I0309 10:04:15.795821 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-hmb5w_cafa7cbf-ac96-4bb5-a33e-90d69df5d797/ovsdb-server-init/0.log" Mar 09 10:04:15 crc kubenswrapper[4861]: I0309 10:04:15.884324 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-hmb5w_cafa7cbf-ac96-4bb5-a33e-90d69df5d797/ovsdb-server-init/0.log" Mar 09 10:04:15 crc kubenswrapper[4861]: I0309 10:04:15.929400 4861 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-hmb5w_cafa7cbf-ac96-4bb5-a33e-90d69df5d797/ovs-vswitchd/0.log" Mar 09 10:04:15 crc kubenswrapper[4861]: I0309 10:04:15.932935 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-hmb5w_cafa7cbf-ac96-4bb5-a33e-90d69df5d797/ovsdb-server/0.log" Mar 09 10:04:16 crc kubenswrapper[4861]: I0309 10:04:16.103192 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-s7nq5_4e354b06-2ae2-41af-b5d7-2909bca8cff6/ovn-controller/0.log" Mar 09 10:04:16 crc kubenswrapper[4861]: I0309 10:04:16.181039 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-9rkkl_47423b67-9acf-48b2-b8b5-d47b822ad425/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Mar 09 10:04:16 crc kubenswrapper[4861]: I0309 10:04:16.346697 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_dea6c69a-803b-498c-b7e2-7d76629de3dc/openstack-network-exporter/0.log" Mar 09 10:04:16 crc kubenswrapper[4861]: I0309 10:04:16.433561 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_dea6c69a-803b-498c-b7e2-7d76629de3dc/ovn-northd/0.log" Mar 09 10:04:16 crc kubenswrapper[4861]: I0309 10:04:16.539758 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_1d597158-3a33-4518-a0b9-37cf5b309a28/openstack-network-exporter/0.log" Mar 09 10:04:16 crc kubenswrapper[4861]: I0309 10:04:16.561472 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_1d597158-3a33-4518-a0b9-37cf5b309a28/ovsdbserver-nb/0.log" Mar 09 10:04:16 crc kubenswrapper[4861]: I0309 10:04:16.689046 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_fc895133-add5-4388-8e97-1b0d16306648/openstack-network-exporter/0.log" Mar 09 10:04:16 crc kubenswrapper[4861]: I0309 
10:04:16.721155 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_fc895133-add5-4388-8e97-1b0d16306648/ovsdbserver-sb/0.log" Mar 09 10:04:16 crc kubenswrapper[4861]: I0309 10:04:16.953686 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-678fb94c4b-9x5d2_80b2797f-285c-4a23-9385-b4845acb2820/placement-api/0.log" Mar 09 10:04:16 crc kubenswrapper[4861]: I0309 10:04:16.988469 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-678fb94c4b-9x5d2_80b2797f-285c-4a23-9385-b4845acb2820/placement-log/0.log" Mar 09 10:04:17 crc kubenswrapper[4861]: I0309 10:04:17.016276 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_36ab59d0-e730-43a5-a7f1-99f136e5f9d3/setup-container/0.log" Mar 09 10:04:17 crc kubenswrapper[4861]: I0309 10:04:17.301295 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_36ab59d0-e730-43a5-a7f1-99f136e5f9d3/setup-container/0.log" Mar 09 10:04:17 crc kubenswrapper[4861]: I0309 10:04:17.375287 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_36ab59d0-e730-43a5-a7f1-99f136e5f9d3/rabbitmq/0.log" Mar 09 10:04:17 crc kubenswrapper[4861]: I0309 10:04:17.424327 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_2c3f8770-f9a3-49ae-81e0-caad7b40ac46/setup-container/0.log" Mar 09 10:04:17 crc kubenswrapper[4861]: I0309 10:04:17.548843 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_2c3f8770-f9a3-49ae-81e0-caad7b40ac46/setup-container/0.log" Mar 09 10:04:17 crc kubenswrapper[4861]: I0309 10:04:17.632966 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_2c3f8770-f9a3-49ae-81e0-caad7b40ac46/rabbitmq/0.log" Mar 09 10:04:17 crc kubenswrapper[4861]: I0309 10:04:17.699382 4861 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-62xxn_e3b0d4f8-537e-4894-bcf9-0cfa00a145ec/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 09 10:04:17 crc kubenswrapper[4861]: I0309 10:04:17.869959 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-gvccp_cd678163-1379-40da-be83-c4ace8b0cf0d/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Mar 09 10:04:17 crc kubenswrapper[4861]: I0309 10:04:17.921200 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-kfh6m_438a18ff-fdc3-44f3-9c51-df15a691c389/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Mar 09 10:04:18 crc kubenswrapper[4861]: I0309 10:04:18.102132 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-5mlpf_df0bad47-fa01-426d-af7b-e09057048052/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 09 10:04:18 crc kubenswrapper[4861]: I0309 10:04:18.146313 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-xq9tq_e8f222a3-04ea-475e-aab7-97cf0ba5021c/ssh-known-hosts-edpm-deployment/0.log" Mar 09 10:04:18 crc kubenswrapper[4861]: I0309 10:04:18.396023 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-6f4f458d55-lxkls_1ed378c6-5773-4dd7-9889-52bcf62216e5/proxy-server/0.log" Mar 09 10:04:18 crc kubenswrapper[4861]: I0309 10:04:18.483201 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-6f4f458d55-lxkls_1ed378c6-5773-4dd7-9889-52bcf62216e5/proxy-httpd/0.log" Mar 09 10:04:18 crc kubenswrapper[4861]: I0309 10:04:18.572523 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-qjxtd_6f0d289d-af18-4534-a0c6-c90f51e93fd8/swift-ring-rebalance/0.log" Mar 09 10:04:18 crc kubenswrapper[4861]: I0309 
10:04:18.705677 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d1ad2fa7-36fc-4cd0-98ac-07b48c42e794/account-auditor/0.log" Mar 09 10:04:18 crc kubenswrapper[4861]: I0309 10:04:18.710968 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d1ad2fa7-36fc-4cd0-98ac-07b48c42e794/account-reaper/0.log" Mar 09 10:04:18 crc kubenswrapper[4861]: I0309 10:04:18.839820 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d1ad2fa7-36fc-4cd0-98ac-07b48c42e794/account-replicator/0.log" Mar 09 10:04:18 crc kubenswrapper[4861]: I0309 10:04:18.867327 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d1ad2fa7-36fc-4cd0-98ac-07b48c42e794/account-server/0.log" Mar 09 10:04:18 crc kubenswrapper[4861]: I0309 10:04:18.942219 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d1ad2fa7-36fc-4cd0-98ac-07b48c42e794/container-replicator/0.log" Mar 09 10:04:18 crc kubenswrapper[4861]: I0309 10:04:18.967557 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d1ad2fa7-36fc-4cd0-98ac-07b48c42e794/container-auditor/0.log" Mar 09 10:04:19 crc kubenswrapper[4861]: I0309 10:04:19.080978 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d1ad2fa7-36fc-4cd0-98ac-07b48c42e794/container-server/0.log" Mar 09 10:04:19 crc kubenswrapper[4861]: I0309 10:04:19.118872 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d1ad2fa7-36fc-4cd0-98ac-07b48c42e794/container-updater/0.log" Mar 09 10:04:19 crc kubenswrapper[4861]: I0309 10:04:19.179877 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d1ad2fa7-36fc-4cd0-98ac-07b48c42e794/object-auditor/0.log" Mar 09 10:04:19 crc kubenswrapper[4861]: I0309 10:04:19.181445 4861 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_d1ad2fa7-36fc-4cd0-98ac-07b48c42e794/object-expirer/0.log" Mar 09 10:04:19 crc kubenswrapper[4861]: I0309 10:04:19.351092 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d1ad2fa7-36fc-4cd0-98ac-07b48c42e794/object-server/0.log" Mar 09 10:04:19 crc kubenswrapper[4861]: I0309 10:04:19.372436 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d1ad2fa7-36fc-4cd0-98ac-07b48c42e794/object-updater/0.log" Mar 09 10:04:19 crc kubenswrapper[4861]: I0309 10:04:19.378644 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d1ad2fa7-36fc-4cd0-98ac-07b48c42e794/object-replicator/0.log" Mar 09 10:04:19 crc kubenswrapper[4861]: I0309 10:04:19.457361 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d1ad2fa7-36fc-4cd0-98ac-07b48c42e794/rsync/0.log" Mar 09 10:04:19 crc kubenswrapper[4861]: I0309 10:04:19.595540 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d1ad2fa7-36fc-4cd0-98ac-07b48c42e794/swift-recon-cron/0.log" Mar 09 10:04:19 crc kubenswrapper[4861]: I0309 10:04:19.728706 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-jbbn4_7c47d068-c590-40cb-aeb0-1cc5132d40dd/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Mar 09 10:04:19 crc kubenswrapper[4861]: I0309 10:04:19.834268 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_f7ed5e40-0dc4-417c-bef9-cbf919777c67/tempest-tests-tempest-tests-runner/0.log" Mar 09 10:04:19 crc kubenswrapper[4861]: I0309 10:04:19.959265 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_571b04e8-dc75-4bf7-921c-82ee1f86b023/test-operator-logs-container/0.log" Mar 09 10:04:20 crc kubenswrapper[4861]: I0309 
10:04:20.100100 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-7n4l5_ec37a9fa-7555-4e81-af4a-dad48b85942c/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 09 10:04:29 crc kubenswrapper[4861]: I0309 10:04:29.172667 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_a3671a10-52be-44e3-9c3d-11ba14e8e449/memcached/0.log" Mar 09 10:04:38 crc kubenswrapper[4861]: I0309 10:04:38.071540 4861 scope.go:117] "RemoveContainer" containerID="0dae7640fa970823283fa042152987d7e0ece6b44431e37ea15241134f184d51" Mar 09 10:04:45 crc kubenswrapper[4861]: I0309 10:04:45.289294 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-5d87c9d997-2wbrm_5cccaa46-1901-457b-b093-9edfb512b68f/manager/0.log" Mar 09 10:04:45 crc kubenswrapper[4861]: I0309 10:04:45.559586 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ff3fd74506e7f9e2c6e8ba7181b4867be2b110d904ba1aef106c523cfa6w5lp_2530cbbb-c2de-41ce-b6d2-a9593ed9226d/util/0.log" Mar 09 10:04:45 crc kubenswrapper[4861]: I0309 10:04:45.780881 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ff3fd74506e7f9e2c6e8ba7181b4867be2b110d904ba1aef106c523cfa6w5lp_2530cbbb-c2de-41ce-b6d2-a9593ed9226d/pull/0.log" Mar 09 10:04:45 crc kubenswrapper[4861]: I0309 10:04:45.817529 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ff3fd74506e7f9e2c6e8ba7181b4867be2b110d904ba1aef106c523cfa6w5lp_2530cbbb-c2de-41ce-b6d2-a9593ed9226d/util/0.log" Mar 09 10:04:46 crc kubenswrapper[4861]: I0309 10:04:46.200306 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ff3fd74506e7f9e2c6e8ba7181b4867be2b110d904ba1aef106c523cfa6w5lp_2530cbbb-c2de-41ce-b6d2-a9593ed9226d/pull/0.log" Mar 09 10:04:46 crc kubenswrapper[4861]: I0309 10:04:46.349902 4861 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ff3fd74506e7f9e2c6e8ba7181b4867be2b110d904ba1aef106c523cfa6w5lp_2530cbbb-c2de-41ce-b6d2-a9593ed9226d/util/0.log" Mar 09 10:04:46 crc kubenswrapper[4861]: I0309 10:04:46.437360 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ff3fd74506e7f9e2c6e8ba7181b4867be2b110d904ba1aef106c523cfa6w5lp_2530cbbb-c2de-41ce-b6d2-a9593ed9226d/pull/0.log" Mar 09 10:04:46 crc kubenswrapper[4861]: I0309 10:04:46.586834 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ff3fd74506e7f9e2c6e8ba7181b4867be2b110d904ba1aef106c523cfa6w5lp_2530cbbb-c2de-41ce-b6d2-a9593ed9226d/extract/0.log" Mar 09 10:04:46 crc kubenswrapper[4861]: I0309 10:04:46.605586 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-55d77d7b5c-89f2d_861a14c0-5dcd-4126-b386-65467726a9dd/manager/0.log" Mar 09 10:04:46 crc kubenswrapper[4861]: I0309 10:04:46.946272 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-64db6967f8-pdxgs_8fb737b1-978f-4f1e-98db-f1c542ef77d9/manager/0.log" Mar 09 10:04:46 crc kubenswrapper[4861]: I0309 10:04:46.970920 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-cf99c678f-jw5nb_23566122-1654-40c1-8dd6-577280d0dcec/manager/0.log" Mar 09 10:04:47 crc kubenswrapper[4861]: I0309 10:04:47.222184 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-78bc7f9bd9-bwxj2_32ddb619-584a-4ff4-a988-63d565043353/manager/0.log" Mar 09 10:04:47 crc kubenswrapper[4861]: I0309 10:04:47.523954 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-545456dc4-blz6f_e9e766bf-fdea-451e-a58c-a8818fccf4b4/manager/0.log" Mar 09 10:04:47 crc 
kubenswrapper[4861]: I0309 10:04:47.839912 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7c789f89c6-pktcs_fbbd2d76-31fb-46d7-a422-af5f3e51baaf/manager/0.log" Mar 09 10:04:47 crc kubenswrapper[4861]: I0309 10:04:47.891895 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-f7fcc58b9-srw9z_1b8226be-5eb4-4156-a168-f843edac34ce/manager/0.log" Mar 09 10:04:48 crc kubenswrapper[4861]: I0309 10:04:48.169704 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-67d996989d-8vnvb_78b36bea-6c3d-4794-b38a-6b4a5b3e9f5d/manager/0.log" Mar 09 10:04:48 crc kubenswrapper[4861]: I0309 10:04:48.287588 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-7b6bfb6475-lbhm2_12c3b94d-baff-4b5d-864e-371f5b3857f5/manager/0.log" Mar 09 10:04:48 crc kubenswrapper[4861]: I0309 10:04:48.516022 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-54688575f-9dhrv_091caccf-659b-42dd-b9cb-05aeea2548ce/manager/0.log" Mar 09 10:04:48 crc kubenswrapper[4861]: I0309 10:04:48.748112 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-74b6b5dc96-lnm2d_511b2722-0227-4a4f-931c-e69ad12e60de/manager/0.log" Mar 09 10:04:48 crc kubenswrapper[4861]: I0309 10:04:48.790337 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5d86c7ddb7-2cb86_eec501ad-33c8-4195-8817-3078202db97a/manager/0.log" Mar 09 10:04:49 crc kubenswrapper[4861]: I0309 10:04:49.000424 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6f64bd8c75j8xx4_72b49679-1f56-42df-bafc-a899cd2da3cf/manager/0.log" 
Mar 09 10:04:49 crc kubenswrapper[4861]: I0309 10:04:49.487955 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-568b7cf6db-zpgv9_4f92d232-a96d-4774-8bfb-ece261f9b9d4/operator/0.log" Mar 09 10:04:49 crc kubenswrapper[4861]: I0309 10:04:49.866237 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-cqfnf_793c9771-3185-4264-b109-a94fcc50a305/registry-server/0.log" Mar 09 10:04:50 crc kubenswrapper[4861]: I0309 10:04:50.086035 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-75684d597f-qqwld_73ae93db-3260-4af6-9724-52e8b97a0245/manager/0.log" Mar 09 10:04:50 crc kubenswrapper[4861]: I0309 10:04:50.270504 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-648564c9fc-2bqxz_b386f8ad-7867-4d35-83f8-382a379e3c1e/manager/0.log" Mar 09 10:04:50 crc kubenswrapper[4861]: I0309 10:04:50.433439 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-xnfpv_e649bda4-59a3-47e6-92e2-910c01b2f7c2/operator/0.log" Mar 09 10:04:50 crc kubenswrapper[4861]: I0309 10:04:50.539885 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-9b9ff9f4d-hrnq8_925518c8-3714-4180-ad1f-9bee534dd0dc/manager/0.log" Mar 09 10:04:50 crc kubenswrapper[4861]: I0309 10:04:50.838731 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5fdb694969-vlxrh_fdd188d8-f434-493e-a8f5-3506031b0f83/manager/0.log" Mar 09 10:04:50 crc kubenswrapper[4861]: I0309 10:04:50.859344 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-55b5ff4dbb-ztmwj_175e8d6d-930a-484b-b0b3-d45f37da4239/manager/0.log" Mar 
09 10:04:51 crc kubenswrapper[4861]: I0309 10:04:51.111291 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-bccc79885-pmwxp_4cbd2609-0983-4da2-a0c7-fa66387e36ae/manager/0.log" Mar 09 10:04:51 crc kubenswrapper[4861]: I0309 10:04:51.164560 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-59b6c9788f-lfprc_ec436429-c762-4e15-8f82-19a10cdc7941/manager/0.log" Mar 09 10:04:52 crc kubenswrapper[4861]: I0309 10:04:52.002160 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6db6876945-s4tc6_eaaa08cd-22f8-40a3-9cac-7e29137ea358/manager/0.log" Mar 09 10:05:11 crc kubenswrapper[4861]: I0309 10:05:11.057300 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-trcr9_d8afafe8-bf56-46a7-bab9-c5a1c221a740/control-plane-machine-set-operator/0.log" Mar 09 10:05:11 crc kubenswrapper[4861]: I0309 10:05:11.203082 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-gqnxr_da4004a6-c6fd-41d6-a651-b4aaec2d6454/kube-rbac-proxy/0.log" Mar 09 10:05:11 crc kubenswrapper[4861]: I0309 10:05:11.213692 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-gqnxr_da4004a6-c6fd-41d6-a651-b4aaec2d6454/machine-api-operator/0.log" Mar 09 10:05:23 crc kubenswrapper[4861]: I0309 10:05:23.526283 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-nrqjp_07ac624a-3ef3-4179-96d7-aa49ff085d5e/cert-manager-controller/0.log" Mar 09 10:05:23 crc kubenswrapper[4861]: I0309 10:05:23.833086 4861 log.go:25] "Finished parsing log file" 
path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-6wl44_6f869345-5b73-43d1-9617-bf883a753bb8/cert-manager-cainjector/0.log" Mar 09 10:05:23 crc kubenswrapper[4861]: I0309 10:05:23.944613 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-g724f_55473545-bf70-472a-96e5-18cc3bfac07d/cert-manager-webhook/0.log" Mar 09 10:05:37 crc kubenswrapper[4861]: I0309 10:05:37.608491 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-5dcbbd79cf-mgp5h_d29132f4-b735-40f3-94da-033f1174963f/nmstate-console-plugin/0.log" Mar 09 10:05:37 crc kubenswrapper[4861]: I0309 10:05:37.804672 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-qvk5q_09f364f8-e2b6-4ffe-b51a-37af17081bf8/nmstate-handler/0.log" Mar 09 10:05:37 crc kubenswrapper[4861]: I0309 10:05:37.870482 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-69594cc75-wz5b2_194a66ef-2b30-40b7-bbfd-5a2c3a51ad55/kube-rbac-proxy/0.log" Mar 09 10:05:37 crc kubenswrapper[4861]: I0309 10:05:37.926942 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-69594cc75-wz5b2_194a66ef-2b30-40b7-bbfd-5a2c3a51ad55/nmstate-metrics/0.log" Mar 09 10:05:38 crc kubenswrapper[4861]: I0309 10:05:38.066281 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-75c5dccd6c-bgvbh_ec451b1d-d99e-48c4-a550-83bac053d5dc/nmstate-operator/0.log" Mar 09 10:05:38 crc kubenswrapper[4861]: I0309 10:05:38.159181 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-786f45cff4-27zgp_f313fc1b-02bd-4bbd-bfdb-a18a300ed8bb/nmstate-webhook/0.log" Mar 09 10:05:54 crc kubenswrapper[4861]: I0309 10:05:54.605823 4861 patch_prober.go:28] interesting pod/machine-config-daemon-5g7gc container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 10:05:54 crc kubenswrapper[4861]: I0309 10:05:54.606352 4861 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 10:06:00 crc kubenswrapper[4861]: I0309 10:06:00.148744 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550846-ffzgt"] Mar 09 10:06:00 crc kubenswrapper[4861]: E0309 10:06:00.149568 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64419f08-ab44-4dbf-ab6f-3ef60a1f1c3b" containerName="oc" Mar 09 10:06:00 crc kubenswrapper[4861]: I0309 10:06:00.149580 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="64419f08-ab44-4dbf-ab6f-3ef60a1f1c3b" containerName="oc" Mar 09 10:06:00 crc kubenswrapper[4861]: I0309 10:06:00.149753 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="64419f08-ab44-4dbf-ab6f-3ef60a1f1c3b" containerName="oc" Mar 09 10:06:00 crc kubenswrapper[4861]: I0309 10:06:00.150346 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550846-ffzgt" Mar 09 10:06:00 crc kubenswrapper[4861]: I0309 10:06:00.158236 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 10:06:00 crc kubenswrapper[4861]: I0309 10:06:00.158275 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 10:06:00 crc kubenswrapper[4861]: I0309 10:06:00.159114 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k86l8" Mar 09 10:06:00 crc kubenswrapper[4861]: I0309 10:06:00.166369 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550846-ffzgt"] Mar 09 10:06:00 crc kubenswrapper[4861]: I0309 10:06:00.283396 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4zpp\" (UniqueName: \"kubernetes.io/projected/f05e1178-3781-49c1-a69f-8247af50a922-kube-api-access-z4zpp\") pod \"auto-csr-approver-29550846-ffzgt\" (UID: \"f05e1178-3781-49c1-a69f-8247af50a922\") " pod="openshift-infra/auto-csr-approver-29550846-ffzgt" Mar 09 10:06:00 crc kubenswrapper[4861]: I0309 10:06:00.385707 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4zpp\" (UniqueName: \"kubernetes.io/projected/f05e1178-3781-49c1-a69f-8247af50a922-kube-api-access-z4zpp\") pod \"auto-csr-approver-29550846-ffzgt\" (UID: \"f05e1178-3781-49c1-a69f-8247af50a922\") " pod="openshift-infra/auto-csr-approver-29550846-ffzgt" Mar 09 10:06:00 crc kubenswrapper[4861]: I0309 10:06:00.414221 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4zpp\" (UniqueName: \"kubernetes.io/projected/f05e1178-3781-49c1-a69f-8247af50a922-kube-api-access-z4zpp\") pod \"auto-csr-approver-29550846-ffzgt\" (UID: \"f05e1178-3781-49c1-a69f-8247af50a922\") " 
pod="openshift-infra/auto-csr-approver-29550846-ffzgt" Mar 09 10:06:00 crc kubenswrapper[4861]: I0309 10:06:00.471138 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550846-ffzgt" Mar 09 10:06:00 crc kubenswrapper[4861]: I0309 10:06:00.999300 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550846-ffzgt"] Mar 09 10:06:01 crc kubenswrapper[4861]: I0309 10:06:01.073815 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550846-ffzgt" event={"ID":"f05e1178-3781-49c1-a69f-8247af50a922","Type":"ContainerStarted","Data":"8f0f41eeec3f05d4378c1c7b247e992d4e9bfc3b749c444428f3d9d8e714dde9"} Mar 09 10:06:03 crc kubenswrapper[4861]: I0309 10:06:03.093945 4861 generic.go:334] "Generic (PLEG): container finished" podID="f05e1178-3781-49c1-a69f-8247af50a922" containerID="542882b525db1989b92af9b58fe0999c0683a62bbd7717d146555b4991e800b5" exitCode=0 Mar 09 10:06:03 crc kubenswrapper[4861]: I0309 10:06:03.094012 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550846-ffzgt" event={"ID":"f05e1178-3781-49c1-a69f-8247af50a922","Type":"ContainerDied","Data":"542882b525db1989b92af9b58fe0999c0683a62bbd7717d146555b4991e800b5"} Mar 09 10:06:04 crc kubenswrapper[4861]: I0309 10:06:04.430603 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550846-ffzgt" Mar 09 10:06:04 crc kubenswrapper[4861]: I0309 10:06:04.458599 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z4zpp\" (UniqueName: \"kubernetes.io/projected/f05e1178-3781-49c1-a69f-8247af50a922-kube-api-access-z4zpp\") pod \"f05e1178-3781-49c1-a69f-8247af50a922\" (UID: \"f05e1178-3781-49c1-a69f-8247af50a922\") " Mar 09 10:06:04 crc kubenswrapper[4861]: I0309 10:06:04.465027 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f05e1178-3781-49c1-a69f-8247af50a922-kube-api-access-z4zpp" (OuterVolumeSpecName: "kube-api-access-z4zpp") pod "f05e1178-3781-49c1-a69f-8247af50a922" (UID: "f05e1178-3781-49c1-a69f-8247af50a922"). InnerVolumeSpecName "kube-api-access-z4zpp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 10:06:04 crc kubenswrapper[4861]: I0309 10:06:04.561912 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z4zpp\" (UniqueName: \"kubernetes.io/projected/f05e1178-3781-49c1-a69f-8247af50a922-kube-api-access-z4zpp\") on node \"crc\" DevicePath \"\"" Mar 09 10:06:05 crc kubenswrapper[4861]: I0309 10:06:05.110805 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550846-ffzgt" event={"ID":"f05e1178-3781-49c1-a69f-8247af50a922","Type":"ContainerDied","Data":"8f0f41eeec3f05d4378c1c7b247e992d4e9bfc3b749c444428f3d9d8e714dde9"} Mar 09 10:06:05 crc kubenswrapper[4861]: I0309 10:06:05.111160 4861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8f0f41eeec3f05d4378c1c7b247e992d4e9bfc3b749c444428f3d9d8e714dde9" Mar 09 10:06:05 crc kubenswrapper[4861]: I0309 10:06:05.110858 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550846-ffzgt" Mar 09 10:06:05 crc kubenswrapper[4861]: I0309 10:06:05.507940 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550840-nlgn5"] Mar 09 10:06:05 crc kubenswrapper[4861]: I0309 10:06:05.517017 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550840-nlgn5"] Mar 09 10:06:05 crc kubenswrapper[4861]: I0309 10:06:05.669031 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97334747-9900-462d-b915-910d721ee722" path="/var/lib/kubelet/pods/97334747-9900-462d-b915-910d721ee722/volumes" Mar 09 10:06:07 crc kubenswrapper[4861]: I0309 10:06:07.629891 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-hxljm_2e2a9a00-e47a-4d97-9b06-58dd635a7a55/kube-rbac-proxy/0.log" Mar 09 10:06:07 crc kubenswrapper[4861]: I0309 10:06:07.840549 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p5vjx_416fe11c-136c-4b38-9f85-2ba8df311664/cp-frr-files/0.log" Mar 09 10:06:07 crc kubenswrapper[4861]: I0309 10:06:07.874512 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-hxljm_2e2a9a00-e47a-4d97-9b06-58dd635a7a55/controller/0.log" Mar 09 10:06:08 crc kubenswrapper[4861]: I0309 10:06:08.100855 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p5vjx_416fe11c-136c-4b38-9f85-2ba8df311664/cp-reloader/0.log" Mar 09 10:06:08 crc kubenswrapper[4861]: I0309 10:06:08.133060 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p5vjx_416fe11c-136c-4b38-9f85-2ba8df311664/cp-metrics/0.log" Mar 09 10:06:08 crc kubenswrapper[4861]: I0309 10:06:08.145819 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p5vjx_416fe11c-136c-4b38-9f85-2ba8df311664/cp-reloader/0.log" Mar 09 10:06:08 
crc kubenswrapper[4861]: I0309 10:06:08.151171 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p5vjx_416fe11c-136c-4b38-9f85-2ba8df311664/cp-frr-files/0.log" Mar 09 10:06:08 crc kubenswrapper[4861]: I0309 10:06:08.391971 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p5vjx_416fe11c-136c-4b38-9f85-2ba8df311664/cp-frr-files/0.log" Mar 09 10:06:08 crc kubenswrapper[4861]: I0309 10:06:08.414342 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p5vjx_416fe11c-136c-4b38-9f85-2ba8df311664/cp-metrics/0.log" Mar 09 10:06:08 crc kubenswrapper[4861]: I0309 10:06:08.416272 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p5vjx_416fe11c-136c-4b38-9f85-2ba8df311664/cp-metrics/0.log" Mar 09 10:06:08 crc kubenswrapper[4861]: I0309 10:06:08.434541 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p5vjx_416fe11c-136c-4b38-9f85-2ba8df311664/cp-reloader/0.log" Mar 09 10:06:08 crc kubenswrapper[4861]: I0309 10:06:08.645080 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p5vjx_416fe11c-136c-4b38-9f85-2ba8df311664/controller/0.log" Mar 09 10:06:08 crc kubenswrapper[4861]: I0309 10:06:08.666992 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p5vjx_416fe11c-136c-4b38-9f85-2ba8df311664/cp-metrics/0.log" Mar 09 10:06:08 crc kubenswrapper[4861]: I0309 10:06:08.679672 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p5vjx_416fe11c-136c-4b38-9f85-2ba8df311664/cp-frr-files/0.log" Mar 09 10:06:08 crc kubenswrapper[4861]: I0309 10:06:08.686306 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p5vjx_416fe11c-136c-4b38-9f85-2ba8df311664/cp-reloader/0.log" Mar 09 10:06:08 crc kubenswrapper[4861]: I0309 10:06:08.888494 4861 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-p5vjx_416fe11c-136c-4b38-9f85-2ba8df311664/frr-metrics/0.log" Mar 09 10:06:08 crc kubenswrapper[4861]: I0309 10:06:08.906272 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p5vjx_416fe11c-136c-4b38-9f85-2ba8df311664/kube-rbac-proxy-frr/0.log" Mar 09 10:06:08 crc kubenswrapper[4861]: I0309 10:06:08.948226 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p5vjx_416fe11c-136c-4b38-9f85-2ba8df311664/kube-rbac-proxy/0.log" Mar 09 10:06:09 crc kubenswrapper[4861]: I0309 10:06:09.111574 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p5vjx_416fe11c-136c-4b38-9f85-2ba8df311664/reloader/0.log" Mar 09 10:06:09 crc kubenswrapper[4861]: I0309 10:06:09.144996 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7f989f654f-2qds9_b5b4c14f-550c-483f-8a1d-5b596130b713/frr-k8s-webhook-server/0.log" Mar 09 10:06:09 crc kubenswrapper[4861]: I0309 10:06:09.412271 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-57f84dc5b8-mv5gg_bcfd9e4a-11a5-40dc-aa44-e6348ac2069b/manager/0.log" Mar 09 10:06:09 crc kubenswrapper[4861]: I0309 10:06:09.518575 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-7569c9dcdc-v9vw2_9eb5d549-165a-4d97-8526-e082c80ed71b/webhook-server/0.log" Mar 09 10:06:09 crc kubenswrapper[4861]: I0309 10:06:09.755653 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-knlc5_48e2ee3b-9d66-4ffb-a9e7-f30dbe52d03b/kube-rbac-proxy/0.log" Mar 09 10:06:10 crc kubenswrapper[4861]: I0309 10:06:10.285214 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-knlc5_48e2ee3b-9d66-4ffb-a9e7-f30dbe52d03b/speaker/0.log" Mar 09 10:06:10 crc kubenswrapper[4861]: I0309 10:06:10.513611 4861 log.go:25] 
"Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p5vjx_416fe11c-136c-4b38-9f85-2ba8df311664/frr/0.log" Mar 09 10:06:21 crc kubenswrapper[4861]: I0309 10:06:21.916114 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82ggwqp_8668713c-12cf-457c-a09c-5302f11d19cc/util/0.log" Mar 09 10:06:22 crc kubenswrapper[4861]: I0309 10:06:22.053883 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82ggwqp_8668713c-12cf-457c-a09c-5302f11d19cc/pull/0.log" Mar 09 10:06:22 crc kubenswrapper[4861]: I0309 10:06:22.088188 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82ggwqp_8668713c-12cf-457c-a09c-5302f11d19cc/util/0.log" Mar 09 10:06:22 crc kubenswrapper[4861]: I0309 10:06:22.089055 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82ggwqp_8668713c-12cf-457c-a09c-5302f11d19cc/pull/0.log" Mar 09 10:06:22 crc kubenswrapper[4861]: I0309 10:06:22.298505 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82ggwqp_8668713c-12cf-457c-a09c-5302f11d19cc/util/0.log" Mar 09 10:06:22 crc kubenswrapper[4861]: I0309 10:06:22.323933 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82ggwqp_8668713c-12cf-457c-a09c-5302f11d19cc/extract/0.log" Mar 09 10:06:22 crc kubenswrapper[4861]: I0309 10:06:22.330056 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82ggwqp_8668713c-12cf-457c-a09c-5302f11d19cc/pull/0.log" Mar 09 10:06:22 crc 
kubenswrapper[4861]: I0309 10:06:22.469708 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-gcl57_87e64062-b33e-43aa-b264-7a26f9a3e0a0/extract-utilities/0.log" Mar 09 10:06:22 crc kubenswrapper[4861]: I0309 10:06:22.644098 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-gcl57_87e64062-b33e-43aa-b264-7a26f9a3e0a0/extract-content/0.log" Mar 09 10:06:22 crc kubenswrapper[4861]: I0309 10:06:22.644324 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-gcl57_87e64062-b33e-43aa-b264-7a26f9a3e0a0/extract-utilities/0.log" Mar 09 10:06:22 crc kubenswrapper[4861]: I0309 10:06:22.653183 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-gcl57_87e64062-b33e-43aa-b264-7a26f9a3e0a0/extract-content/0.log" Mar 09 10:06:22 crc kubenswrapper[4861]: I0309 10:06:22.836316 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-gcl57_87e64062-b33e-43aa-b264-7a26f9a3e0a0/extract-utilities/0.log" Mar 09 10:06:22 crc kubenswrapper[4861]: I0309 10:06:22.906261 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-gcl57_87e64062-b33e-43aa-b264-7a26f9a3e0a0/extract-content/0.log" Mar 09 10:06:23 crc kubenswrapper[4861]: I0309 10:06:23.088076 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-qjh8b_d0523999-9c2d-4335-8c1e-249abc1099b9/extract-utilities/0.log" Mar 09 10:06:23 crc kubenswrapper[4861]: I0309 10:06:23.280198 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-qjh8b_d0523999-9c2d-4335-8c1e-249abc1099b9/extract-utilities/0.log" Mar 09 10:06:23 crc kubenswrapper[4861]: I0309 10:06:23.280986 4861 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-qjh8b_d0523999-9c2d-4335-8c1e-249abc1099b9/extract-content/0.log" Mar 09 10:06:23 crc kubenswrapper[4861]: I0309 10:06:23.390330 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-qjh8b_d0523999-9c2d-4335-8c1e-249abc1099b9/extract-content/0.log" Mar 09 10:06:23 crc kubenswrapper[4861]: I0309 10:06:23.531604 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-gcl57_87e64062-b33e-43aa-b264-7a26f9a3e0a0/registry-server/0.log" Mar 09 10:06:23 crc kubenswrapper[4861]: I0309 10:06:23.536155 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-qjh8b_d0523999-9c2d-4335-8c1e-249abc1099b9/extract-content/0.log" Mar 09 10:06:23 crc kubenswrapper[4861]: I0309 10:06:23.579218 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-qjh8b_d0523999-9c2d-4335-8c1e-249abc1099b9/extract-utilities/0.log" Mar 09 10:06:23 crc kubenswrapper[4861]: I0309 10:06:23.738979 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4s7b95_4410072b-3e11-4357-9ce8-2c754f336515/util/0.log" Mar 09 10:06:23 crc kubenswrapper[4861]: I0309 10:06:23.925696 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-qjh8b_d0523999-9c2d-4335-8c1e-249abc1099b9/registry-server/0.log" Mar 09 10:06:24 crc kubenswrapper[4861]: I0309 10:06:24.009656 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4s7b95_4410072b-3e11-4357-9ce8-2c754f336515/pull/0.log" Mar 09 10:06:24 crc kubenswrapper[4861]: I0309 10:06:24.015534 4861 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4s7b95_4410072b-3e11-4357-9ce8-2c754f336515/util/0.log" Mar 09 10:06:24 crc kubenswrapper[4861]: I0309 10:06:24.031057 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4s7b95_4410072b-3e11-4357-9ce8-2c754f336515/pull/0.log" Mar 09 10:06:24 crc kubenswrapper[4861]: I0309 10:06:24.200476 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4s7b95_4410072b-3e11-4357-9ce8-2c754f336515/util/0.log" Mar 09 10:06:24 crc kubenswrapper[4861]: I0309 10:06:24.228250 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4s7b95_4410072b-3e11-4357-9ce8-2c754f336515/pull/0.log" Mar 09 10:06:24 crc kubenswrapper[4861]: I0309 10:06:24.239906 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4s7b95_4410072b-3e11-4357-9ce8-2c754f336515/extract/0.log" Mar 09 10:06:24 crc kubenswrapper[4861]: I0309 10:06:24.408154 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-9l4hx_5c398209-0537-461f-a2a8-b626abd10525/marketplace-operator/0.log" Mar 09 10:06:24 crc kubenswrapper[4861]: I0309 10:06:24.470093 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-f2cbr_a5118375-51c4-460f-a7e4-4b0dc454daa2/extract-utilities/0.log" Mar 09 10:06:24 crc kubenswrapper[4861]: I0309 10:06:24.605824 4861 patch_prober.go:28] interesting pod/machine-config-daemon-5g7gc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 10:06:24 crc kubenswrapper[4861]: I0309 10:06:24.605897 4861 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 10:06:24 crc kubenswrapper[4861]: I0309 10:06:24.664842 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-f2cbr_a5118375-51c4-460f-a7e4-4b0dc454daa2/extract-content/0.log" Mar 09 10:06:24 crc kubenswrapper[4861]: I0309 10:06:24.679939 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-f2cbr_a5118375-51c4-460f-a7e4-4b0dc454daa2/extract-utilities/0.log" Mar 09 10:06:24 crc kubenswrapper[4861]: I0309 10:06:24.683003 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-f2cbr_a5118375-51c4-460f-a7e4-4b0dc454daa2/extract-content/0.log" Mar 09 10:06:24 crc kubenswrapper[4861]: I0309 10:06:24.825706 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-f2cbr_a5118375-51c4-460f-a7e4-4b0dc454daa2/extract-utilities/0.log" Mar 09 10:06:24 crc kubenswrapper[4861]: I0309 10:06:24.878808 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-f2cbr_a5118375-51c4-460f-a7e4-4b0dc454daa2/extract-content/0.log" Mar 09 10:06:25 crc kubenswrapper[4861]: I0309 10:06:25.010059 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-f2cbr_a5118375-51c4-460f-a7e4-4b0dc454daa2/registry-server/0.log" Mar 09 10:06:25 crc kubenswrapper[4861]: I0309 10:06:25.066591 4861 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-ktkhz_f16d7e47-0abb-4811-9e99-3f68d1fd64ab/extract-utilities/0.log" Mar 09 10:06:25 crc kubenswrapper[4861]: I0309 10:06:25.207420 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-ktkhz_f16d7e47-0abb-4811-9e99-3f68d1fd64ab/extract-content/0.log" Mar 09 10:06:25 crc kubenswrapper[4861]: I0309 10:06:25.218886 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-ktkhz_f16d7e47-0abb-4811-9e99-3f68d1fd64ab/extract-utilities/0.log" Mar 09 10:06:25 crc kubenswrapper[4861]: I0309 10:06:25.237715 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-ktkhz_f16d7e47-0abb-4811-9e99-3f68d1fd64ab/extract-content/0.log" Mar 09 10:06:25 crc kubenswrapper[4861]: I0309 10:06:25.429654 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-ktkhz_f16d7e47-0abb-4811-9e99-3f68d1fd64ab/extract-utilities/0.log" Mar 09 10:06:25 crc kubenswrapper[4861]: I0309 10:06:25.433461 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-ktkhz_f16d7e47-0abb-4811-9e99-3f68d1fd64ab/extract-content/0.log" Mar 09 10:06:26 crc kubenswrapper[4861]: I0309 10:06:26.049091 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-ktkhz_f16d7e47-0abb-4811-9e99-3f68d1fd64ab/registry-server/0.log" Mar 09 10:06:38 crc kubenswrapper[4861]: I0309 10:06:38.189755 4861 scope.go:117] "RemoveContainer" containerID="e0779ff71ef1d50137bdf8113f7509f5ee9b094ac47c1e7095475b66e02d3aab" Mar 09 10:06:54 crc kubenswrapper[4861]: I0309 10:06:54.605565 4861 patch_prober.go:28] interesting pod/machine-config-daemon-5g7gc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 10:06:54 crc kubenswrapper[4861]: I0309 10:06:54.606031 4861 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 10:06:54 crc kubenswrapper[4861]: I0309 10:06:54.606071 4861 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" Mar 09 10:06:54 crc kubenswrapper[4861]: I0309 10:06:54.606766 4861 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7bbbec7e8a5f7da112767a5599dcb0b362a1472008cac65425c981acbf405224"} pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 09 10:06:54 crc kubenswrapper[4861]: I0309 10:06:54.606814 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" containerName="machine-config-daemon" containerID="cri-o://7bbbec7e8a5f7da112767a5599dcb0b362a1472008cac65425c981acbf405224" gracePeriod=600 Mar 09 10:06:54 crc kubenswrapper[4861]: E0309 10:06:54.734724 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5g7gc_openshift-machine-config-operator(6f7875e3-174f-4c67-8675-d878de74aa4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" Mar 09 10:06:55 crc 
kubenswrapper[4861]: E0309 10:06:55.443619 4861 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.163:45880->38.102.83.163:39285: write tcp 38.102.83.163:45880->38.102.83.163:39285: write: broken pipe Mar 09 10:06:55 crc kubenswrapper[4861]: I0309 10:06:55.564635 4861 generic.go:334] "Generic (PLEG): container finished" podID="6f7875e3-174f-4c67-8675-d878de74aa4f" containerID="7bbbec7e8a5f7da112767a5599dcb0b362a1472008cac65425c981acbf405224" exitCode=0 Mar 09 10:06:55 crc kubenswrapper[4861]: I0309 10:06:55.564690 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" event={"ID":"6f7875e3-174f-4c67-8675-d878de74aa4f","Type":"ContainerDied","Data":"7bbbec7e8a5f7da112767a5599dcb0b362a1472008cac65425c981acbf405224"} Mar 09 10:06:55 crc kubenswrapper[4861]: I0309 10:06:55.564722 4861 scope.go:117] "RemoveContainer" containerID="7e8b0aadd87c38d58bff92f69ce179da04d7d49c2a37109605ba66bad8f23ee6" Mar 09 10:06:55 crc kubenswrapper[4861]: I0309 10:06:55.565326 4861 scope.go:117] "RemoveContainer" containerID="7bbbec7e8a5f7da112767a5599dcb0b362a1472008cac65425c981acbf405224" Mar 09 10:06:55 crc kubenswrapper[4861]: E0309 10:06:55.565693 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5g7gc_openshift-machine-config-operator(6f7875e3-174f-4c67-8675-d878de74aa4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" Mar 09 10:07:09 crc kubenswrapper[4861]: I0309 10:07:09.660268 4861 scope.go:117] "RemoveContainer" containerID="7bbbec7e8a5f7da112767a5599dcb0b362a1472008cac65425c981acbf405224" Mar 09 10:07:09 crc kubenswrapper[4861]: E0309 10:07:09.661283 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5g7gc_openshift-machine-config-operator(6f7875e3-174f-4c67-8675-d878de74aa4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" Mar 09 10:07:24 crc kubenswrapper[4861]: I0309 10:07:24.657956 4861 scope.go:117] "RemoveContainer" containerID="7bbbec7e8a5f7da112767a5599dcb0b362a1472008cac65425c981acbf405224" Mar 09 10:07:24 crc kubenswrapper[4861]: E0309 10:07:24.658635 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5g7gc_openshift-machine-config-operator(6f7875e3-174f-4c67-8675-d878de74aa4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" Mar 09 10:07:37 crc kubenswrapper[4861]: I0309 10:07:37.669810 4861 scope.go:117] "RemoveContainer" containerID="7bbbec7e8a5f7da112767a5599dcb0b362a1472008cac65425c981acbf405224" Mar 09 10:07:37 crc kubenswrapper[4861]: E0309 10:07:37.670551 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5g7gc_openshift-machine-config-operator(6f7875e3-174f-4c67-8675-d878de74aa4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" Mar 09 10:07:50 crc kubenswrapper[4861]: I0309 10:07:50.658186 4861 scope.go:117] "RemoveContainer" containerID="7bbbec7e8a5f7da112767a5599dcb0b362a1472008cac65425c981acbf405224" Mar 09 10:07:50 crc kubenswrapper[4861]: E0309 10:07:50.659039 4861 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5g7gc_openshift-machine-config-operator(6f7875e3-174f-4c67-8675-d878de74aa4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" Mar 09 10:08:00 crc kubenswrapper[4861]: I0309 10:08:00.143947 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550848-x9fsv"] Mar 09 10:08:00 crc kubenswrapper[4861]: E0309 10:08:00.145002 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f05e1178-3781-49c1-a69f-8247af50a922" containerName="oc" Mar 09 10:08:00 crc kubenswrapper[4861]: I0309 10:08:00.145019 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="f05e1178-3781-49c1-a69f-8247af50a922" containerName="oc" Mar 09 10:08:00 crc kubenswrapper[4861]: I0309 10:08:00.145296 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="f05e1178-3781-49c1-a69f-8247af50a922" containerName="oc" Mar 09 10:08:00 crc kubenswrapper[4861]: I0309 10:08:00.146070 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550848-x9fsv" Mar 09 10:08:00 crc kubenswrapper[4861]: I0309 10:08:00.154823 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 10:08:00 crc kubenswrapper[4861]: I0309 10:08:00.154939 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 10:08:00 crc kubenswrapper[4861]: I0309 10:08:00.155397 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k86l8" Mar 09 10:08:00 crc kubenswrapper[4861]: I0309 10:08:00.161952 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550848-x9fsv"] Mar 09 10:08:00 crc kubenswrapper[4861]: I0309 10:08:00.218114 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dghdd\" (UniqueName: \"kubernetes.io/projected/7aa3697f-7f16-47f8-8c14-c39f309602d2-kube-api-access-dghdd\") pod \"auto-csr-approver-29550848-x9fsv\" (UID: \"7aa3697f-7f16-47f8-8c14-c39f309602d2\") " pod="openshift-infra/auto-csr-approver-29550848-x9fsv" Mar 09 10:08:00 crc kubenswrapper[4861]: I0309 10:08:00.320898 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dghdd\" (UniqueName: \"kubernetes.io/projected/7aa3697f-7f16-47f8-8c14-c39f309602d2-kube-api-access-dghdd\") pod \"auto-csr-approver-29550848-x9fsv\" (UID: \"7aa3697f-7f16-47f8-8c14-c39f309602d2\") " pod="openshift-infra/auto-csr-approver-29550848-x9fsv" Mar 09 10:08:00 crc kubenswrapper[4861]: I0309 10:08:00.340986 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dghdd\" (UniqueName: \"kubernetes.io/projected/7aa3697f-7f16-47f8-8c14-c39f309602d2-kube-api-access-dghdd\") pod \"auto-csr-approver-29550848-x9fsv\" (UID: \"7aa3697f-7f16-47f8-8c14-c39f309602d2\") " 
pod="openshift-infra/auto-csr-approver-29550848-x9fsv" Mar 09 10:08:00 crc kubenswrapper[4861]: I0309 10:08:00.473402 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550848-x9fsv" Mar 09 10:08:00 crc kubenswrapper[4861]: I0309 10:08:00.927039 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550848-x9fsv"] Mar 09 10:08:00 crc kubenswrapper[4861]: I0309 10:08:00.933713 4861 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 09 10:08:01 crc kubenswrapper[4861]: I0309 10:08:01.165466 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550848-x9fsv" event={"ID":"7aa3697f-7f16-47f8-8c14-c39f309602d2","Type":"ContainerStarted","Data":"d949460f7fc7ec998f635122714c649dcb63beab6187fcbc63743d99f686b0f2"} Mar 09 10:08:03 crc kubenswrapper[4861]: I0309 10:08:03.187624 4861 generic.go:334] "Generic (PLEG): container finished" podID="7aa3697f-7f16-47f8-8c14-c39f309602d2" containerID="14acbca4a79b72e30e8cc1f45a779deec009fccc1e0216175acaaa5f8b817002" exitCode=0 Mar 09 10:08:03 crc kubenswrapper[4861]: I0309 10:08:03.187710 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550848-x9fsv" event={"ID":"7aa3697f-7f16-47f8-8c14-c39f309602d2","Type":"ContainerDied","Data":"14acbca4a79b72e30e8cc1f45a779deec009fccc1e0216175acaaa5f8b817002"} Mar 09 10:08:04 crc kubenswrapper[4861]: I0309 10:08:04.517331 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550848-x9fsv" Mar 09 10:08:04 crc kubenswrapper[4861]: I0309 10:08:04.618226 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dghdd\" (UniqueName: \"kubernetes.io/projected/7aa3697f-7f16-47f8-8c14-c39f309602d2-kube-api-access-dghdd\") pod \"7aa3697f-7f16-47f8-8c14-c39f309602d2\" (UID: \"7aa3697f-7f16-47f8-8c14-c39f309602d2\") " Mar 09 10:08:04 crc kubenswrapper[4861]: I0309 10:08:04.625311 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7aa3697f-7f16-47f8-8c14-c39f309602d2-kube-api-access-dghdd" (OuterVolumeSpecName: "kube-api-access-dghdd") pod "7aa3697f-7f16-47f8-8c14-c39f309602d2" (UID: "7aa3697f-7f16-47f8-8c14-c39f309602d2"). InnerVolumeSpecName "kube-api-access-dghdd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 10:08:04 crc kubenswrapper[4861]: I0309 10:08:04.721763 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dghdd\" (UniqueName: \"kubernetes.io/projected/7aa3697f-7f16-47f8-8c14-c39f309602d2-kube-api-access-dghdd\") on node \"crc\" DevicePath \"\"" Mar 09 10:08:05 crc kubenswrapper[4861]: I0309 10:08:05.208170 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550848-x9fsv" event={"ID":"7aa3697f-7f16-47f8-8c14-c39f309602d2","Type":"ContainerDied","Data":"d949460f7fc7ec998f635122714c649dcb63beab6187fcbc63743d99f686b0f2"} Mar 09 10:08:05 crc kubenswrapper[4861]: I0309 10:08:05.208231 4861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d949460f7fc7ec998f635122714c649dcb63beab6187fcbc63743d99f686b0f2" Mar 09 10:08:05 crc kubenswrapper[4861]: I0309 10:08:05.208301 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550848-x9fsv" Mar 09 10:08:05 crc kubenswrapper[4861]: I0309 10:08:05.599712 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550842-gxl6n"] Mar 09 10:08:05 crc kubenswrapper[4861]: I0309 10:08:05.608834 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550842-gxl6n"] Mar 09 10:08:05 crc kubenswrapper[4861]: I0309 10:08:05.659245 4861 scope.go:117] "RemoveContainer" containerID="7bbbec7e8a5f7da112767a5599dcb0b362a1472008cac65425c981acbf405224" Mar 09 10:08:05 crc kubenswrapper[4861]: E0309 10:08:05.660046 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5g7gc_openshift-machine-config-operator(6f7875e3-174f-4c67-8675-d878de74aa4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" Mar 09 10:08:05 crc kubenswrapper[4861]: I0309 10:08:05.672120 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35216638-05e0-40bf-b9b8-57924a749838" path="/var/lib/kubelet/pods/35216638-05e0-40bf-b9b8-57924a749838/volumes" Mar 09 10:08:16 crc kubenswrapper[4861]: I0309 10:08:16.659081 4861 scope.go:117] "RemoveContainer" containerID="7bbbec7e8a5f7da112767a5599dcb0b362a1472008cac65425c981acbf405224" Mar 09 10:08:16 crc kubenswrapper[4861]: E0309 10:08:16.659952 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5g7gc_openshift-machine-config-operator(6f7875e3-174f-4c67-8675-d878de74aa4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" 
podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" Mar 09 10:08:18 crc kubenswrapper[4861]: I0309 10:08:18.319759 4861 generic.go:334] "Generic (PLEG): container finished" podID="1a5514de-866f-42de-ab75-f4988a3108a7" containerID="96a6a23b425a481921ee3e4a1a25e4bbdbca2e8678a0acbf476847e09d83de06" exitCode=0 Mar 09 10:08:18 crc kubenswrapper[4861]: I0309 10:08:18.319888 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qf6cx/must-gather-hdwgq" event={"ID":"1a5514de-866f-42de-ab75-f4988a3108a7","Type":"ContainerDied","Data":"96a6a23b425a481921ee3e4a1a25e4bbdbca2e8678a0acbf476847e09d83de06"} Mar 09 10:08:18 crc kubenswrapper[4861]: I0309 10:08:18.321653 4861 scope.go:117] "RemoveContainer" containerID="96a6a23b425a481921ee3e4a1a25e4bbdbca2e8678a0acbf476847e09d83de06" Mar 09 10:08:18 crc kubenswrapper[4861]: I0309 10:08:18.765314 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-qf6cx_must-gather-hdwgq_1a5514de-866f-42de-ab75-f4988a3108a7/gather/0.log" Mar 09 10:08:20 crc kubenswrapper[4861]: E0309 10:08:20.849219 4861 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.163:41130->38.102.83.163:39285: write tcp 38.102.83.163:41130->38.102.83.163:39285: write: connection reset by peer Mar 09 10:08:26 crc kubenswrapper[4861]: I0309 10:08:26.584318 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-qf6cx/must-gather-hdwgq"] Mar 09 10:08:26 crc kubenswrapper[4861]: I0309 10:08:26.585056 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-qf6cx/must-gather-hdwgq" podUID="1a5514de-866f-42de-ab75-f4988a3108a7" containerName="copy" containerID="cri-o://e30208b5d25820e86ffdb3840bb40b2d2680a12af7065ddd048dbe67d401afdf" gracePeriod=2 Mar 09 10:08:26 crc kubenswrapper[4861]: I0309 10:08:26.593333 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-qf6cx/must-gather-hdwgq"] Mar 
09 10:08:27 crc kubenswrapper[4861]: I0309 10:08:27.004729 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-qf6cx_must-gather-hdwgq_1a5514de-866f-42de-ab75-f4988a3108a7/copy/0.log" Mar 09 10:08:27 crc kubenswrapper[4861]: I0309 10:08:27.005595 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qf6cx/must-gather-hdwgq" Mar 09 10:08:27 crc kubenswrapper[4861]: I0309 10:08:27.076559 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1a5514de-866f-42de-ab75-f4988a3108a7-must-gather-output\") pod \"1a5514de-866f-42de-ab75-f4988a3108a7\" (UID: \"1a5514de-866f-42de-ab75-f4988a3108a7\") " Mar 09 10:08:27 crc kubenswrapper[4861]: I0309 10:08:27.076813 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sbhm9\" (UniqueName: \"kubernetes.io/projected/1a5514de-866f-42de-ab75-f4988a3108a7-kube-api-access-sbhm9\") pod \"1a5514de-866f-42de-ab75-f4988a3108a7\" (UID: \"1a5514de-866f-42de-ab75-f4988a3108a7\") " Mar 09 10:08:27 crc kubenswrapper[4861]: I0309 10:08:27.082064 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a5514de-866f-42de-ab75-f4988a3108a7-kube-api-access-sbhm9" (OuterVolumeSpecName: "kube-api-access-sbhm9") pod "1a5514de-866f-42de-ab75-f4988a3108a7" (UID: "1a5514de-866f-42de-ab75-f4988a3108a7"). InnerVolumeSpecName "kube-api-access-sbhm9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 10:08:27 crc kubenswrapper[4861]: I0309 10:08:27.179296 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sbhm9\" (UniqueName: \"kubernetes.io/projected/1a5514de-866f-42de-ab75-f4988a3108a7-kube-api-access-sbhm9\") on node \"crc\" DevicePath \"\"" Mar 09 10:08:27 crc kubenswrapper[4861]: I0309 10:08:27.252069 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1a5514de-866f-42de-ab75-f4988a3108a7-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "1a5514de-866f-42de-ab75-f4988a3108a7" (UID: "1a5514de-866f-42de-ab75-f4988a3108a7"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 10:08:27 crc kubenswrapper[4861]: I0309 10:08:27.281293 4861 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1a5514de-866f-42de-ab75-f4988a3108a7-must-gather-output\") on node \"crc\" DevicePath \"\"" Mar 09 10:08:27 crc kubenswrapper[4861]: I0309 10:08:27.408950 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-qf6cx_must-gather-hdwgq_1a5514de-866f-42de-ab75-f4988a3108a7/copy/0.log" Mar 09 10:08:27 crc kubenswrapper[4861]: I0309 10:08:27.409740 4861 generic.go:334] "Generic (PLEG): container finished" podID="1a5514de-866f-42de-ab75-f4988a3108a7" containerID="e30208b5d25820e86ffdb3840bb40b2d2680a12af7065ddd048dbe67d401afdf" exitCode=143 Mar 09 10:08:27 crc kubenswrapper[4861]: I0309 10:08:27.409769 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-qf6cx/must-gather-hdwgq" Mar 09 10:08:27 crc kubenswrapper[4861]: I0309 10:08:27.409811 4861 scope.go:117] "RemoveContainer" containerID="e30208b5d25820e86ffdb3840bb40b2d2680a12af7065ddd048dbe67d401afdf" Mar 09 10:08:27 crc kubenswrapper[4861]: I0309 10:08:27.447571 4861 scope.go:117] "RemoveContainer" containerID="96a6a23b425a481921ee3e4a1a25e4bbdbca2e8678a0acbf476847e09d83de06" Mar 09 10:08:27 crc kubenswrapper[4861]: I0309 10:08:27.552575 4861 scope.go:117] "RemoveContainer" containerID="e30208b5d25820e86ffdb3840bb40b2d2680a12af7065ddd048dbe67d401afdf" Mar 09 10:08:27 crc kubenswrapper[4861]: E0309 10:08:27.556227 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e30208b5d25820e86ffdb3840bb40b2d2680a12af7065ddd048dbe67d401afdf\": container with ID starting with e30208b5d25820e86ffdb3840bb40b2d2680a12af7065ddd048dbe67d401afdf not found: ID does not exist" containerID="e30208b5d25820e86ffdb3840bb40b2d2680a12af7065ddd048dbe67d401afdf" Mar 09 10:08:27 crc kubenswrapper[4861]: I0309 10:08:27.556285 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e30208b5d25820e86ffdb3840bb40b2d2680a12af7065ddd048dbe67d401afdf"} err="failed to get container status \"e30208b5d25820e86ffdb3840bb40b2d2680a12af7065ddd048dbe67d401afdf\": rpc error: code = NotFound desc = could not find container \"e30208b5d25820e86ffdb3840bb40b2d2680a12af7065ddd048dbe67d401afdf\": container with ID starting with e30208b5d25820e86ffdb3840bb40b2d2680a12af7065ddd048dbe67d401afdf not found: ID does not exist" Mar 09 10:08:27 crc kubenswrapper[4861]: I0309 10:08:27.556314 4861 scope.go:117] "RemoveContainer" containerID="96a6a23b425a481921ee3e4a1a25e4bbdbca2e8678a0acbf476847e09d83de06" Mar 09 10:08:27 crc kubenswrapper[4861]: E0309 10:08:27.557171 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"96a6a23b425a481921ee3e4a1a25e4bbdbca2e8678a0acbf476847e09d83de06\": container with ID starting with 96a6a23b425a481921ee3e4a1a25e4bbdbca2e8678a0acbf476847e09d83de06 not found: ID does not exist" containerID="96a6a23b425a481921ee3e4a1a25e4bbdbca2e8678a0acbf476847e09d83de06" Mar 09 10:08:27 crc kubenswrapper[4861]: I0309 10:08:27.557212 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96a6a23b425a481921ee3e4a1a25e4bbdbca2e8678a0acbf476847e09d83de06"} err="failed to get container status \"96a6a23b425a481921ee3e4a1a25e4bbdbca2e8678a0acbf476847e09d83de06\": rpc error: code = NotFound desc = could not find container \"96a6a23b425a481921ee3e4a1a25e4bbdbca2e8678a0acbf476847e09d83de06\": container with ID starting with 96a6a23b425a481921ee3e4a1a25e4bbdbca2e8678a0acbf476847e09d83de06 not found: ID does not exist" Mar 09 10:08:27 crc kubenswrapper[4861]: I0309 10:08:27.669454 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a5514de-866f-42de-ab75-f4988a3108a7" path="/var/lib/kubelet/pods/1a5514de-866f-42de-ab75-f4988a3108a7/volumes" Mar 09 10:08:27 crc kubenswrapper[4861]: E0309 10:08:27.710546 4861 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1a5514de_866f_42de_ab75_f4988a3108a7.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1a5514de_866f_42de_ab75_f4988a3108a7.slice/crio-adebb954dd2d70e987503695b8b5892875f99d7836038bcfc3d4d46212aea582\": RecentStats: unable to find data in memory cache]" Mar 09 10:08:31 crc kubenswrapper[4861]: I0309 10:08:31.658197 4861 scope.go:117] "RemoveContainer" containerID="7bbbec7e8a5f7da112767a5599dcb0b362a1472008cac65425c981acbf405224" Mar 09 10:08:31 crc kubenswrapper[4861]: E0309 10:08:31.659068 4861 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5g7gc_openshift-machine-config-operator(6f7875e3-174f-4c67-8675-d878de74aa4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" Mar 09 10:08:38 crc kubenswrapper[4861]: I0309 10:08:38.295518 4861 scope.go:117] "RemoveContainer" containerID="22736e38955e37a14a35ce5c997c6afbacca015792f9f0409156eb4d6ba8a45b" Mar 09 10:08:45 crc kubenswrapper[4861]: I0309 10:08:45.658622 4861 scope.go:117] "RemoveContainer" containerID="7bbbec7e8a5f7da112767a5599dcb0b362a1472008cac65425c981acbf405224" Mar 09 10:08:45 crc kubenswrapper[4861]: E0309 10:08:45.659611 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5g7gc_openshift-machine-config-operator(6f7875e3-174f-4c67-8675-d878de74aa4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" Mar 09 10:09:00 crc kubenswrapper[4861]: I0309 10:09:00.151038 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-t8l6z"] Mar 09 10:09:00 crc kubenswrapper[4861]: E0309 10:09:00.153217 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7aa3697f-7f16-47f8-8c14-c39f309602d2" containerName="oc" Mar 09 10:09:00 crc kubenswrapper[4861]: I0309 10:09:00.153237 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="7aa3697f-7f16-47f8-8c14-c39f309602d2" containerName="oc" Mar 09 10:09:00 crc kubenswrapper[4861]: E0309 10:09:00.153252 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a5514de-866f-42de-ab75-f4988a3108a7" 
containerName="gather" Mar 09 10:09:00 crc kubenswrapper[4861]: I0309 10:09:00.153260 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a5514de-866f-42de-ab75-f4988a3108a7" containerName="gather" Mar 09 10:09:00 crc kubenswrapper[4861]: E0309 10:09:00.153277 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a5514de-866f-42de-ab75-f4988a3108a7" containerName="copy" Mar 09 10:09:00 crc kubenswrapper[4861]: I0309 10:09:00.153284 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a5514de-866f-42de-ab75-f4988a3108a7" containerName="copy" Mar 09 10:09:00 crc kubenswrapper[4861]: I0309 10:09:00.153490 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a5514de-866f-42de-ab75-f4988a3108a7" containerName="gather" Mar 09 10:09:00 crc kubenswrapper[4861]: I0309 10:09:00.153507 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a5514de-866f-42de-ab75-f4988a3108a7" containerName="copy" Mar 09 10:09:00 crc kubenswrapper[4861]: I0309 10:09:00.153518 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="7aa3697f-7f16-47f8-8c14-c39f309602d2" containerName="oc" Mar 09 10:09:00 crc kubenswrapper[4861]: I0309 10:09:00.155016 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-t8l6z" Mar 09 10:09:00 crc kubenswrapper[4861]: I0309 10:09:00.173629 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-t8l6z"] Mar 09 10:09:00 crc kubenswrapper[4861]: I0309 10:09:00.298923 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqnkx\" (UniqueName: \"kubernetes.io/projected/7d247dbf-11cb-47d5-b668-d1f5e43f7024-kube-api-access-mqnkx\") pod \"certified-operators-t8l6z\" (UID: \"7d247dbf-11cb-47d5-b668-d1f5e43f7024\") " pod="openshift-marketplace/certified-operators-t8l6z" Mar 09 10:09:00 crc kubenswrapper[4861]: I0309 10:09:00.299002 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d247dbf-11cb-47d5-b668-d1f5e43f7024-utilities\") pod \"certified-operators-t8l6z\" (UID: \"7d247dbf-11cb-47d5-b668-d1f5e43f7024\") " pod="openshift-marketplace/certified-operators-t8l6z" Mar 09 10:09:00 crc kubenswrapper[4861]: I0309 10:09:00.299121 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d247dbf-11cb-47d5-b668-d1f5e43f7024-catalog-content\") pod \"certified-operators-t8l6z\" (UID: \"7d247dbf-11cb-47d5-b668-d1f5e43f7024\") " pod="openshift-marketplace/certified-operators-t8l6z" Mar 09 10:09:00 crc kubenswrapper[4861]: I0309 10:09:00.401465 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqnkx\" (UniqueName: \"kubernetes.io/projected/7d247dbf-11cb-47d5-b668-d1f5e43f7024-kube-api-access-mqnkx\") pod \"certified-operators-t8l6z\" (UID: \"7d247dbf-11cb-47d5-b668-d1f5e43f7024\") " pod="openshift-marketplace/certified-operators-t8l6z" Mar 09 10:09:00 crc kubenswrapper[4861]: I0309 10:09:00.401563 4861 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d247dbf-11cb-47d5-b668-d1f5e43f7024-utilities\") pod \"certified-operators-t8l6z\" (UID: \"7d247dbf-11cb-47d5-b668-d1f5e43f7024\") " pod="openshift-marketplace/certified-operators-t8l6z" Mar 09 10:09:00 crc kubenswrapper[4861]: I0309 10:09:00.401651 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d247dbf-11cb-47d5-b668-d1f5e43f7024-catalog-content\") pod \"certified-operators-t8l6z\" (UID: \"7d247dbf-11cb-47d5-b668-d1f5e43f7024\") " pod="openshift-marketplace/certified-operators-t8l6z" Mar 09 10:09:00 crc kubenswrapper[4861]: I0309 10:09:00.402179 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d247dbf-11cb-47d5-b668-d1f5e43f7024-utilities\") pod \"certified-operators-t8l6z\" (UID: \"7d247dbf-11cb-47d5-b668-d1f5e43f7024\") " pod="openshift-marketplace/certified-operators-t8l6z" Mar 09 10:09:00 crc kubenswrapper[4861]: I0309 10:09:00.402258 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d247dbf-11cb-47d5-b668-d1f5e43f7024-catalog-content\") pod \"certified-operators-t8l6z\" (UID: \"7d247dbf-11cb-47d5-b668-d1f5e43f7024\") " pod="openshift-marketplace/certified-operators-t8l6z" Mar 09 10:09:00 crc kubenswrapper[4861]: I0309 10:09:00.424990 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqnkx\" (UniqueName: \"kubernetes.io/projected/7d247dbf-11cb-47d5-b668-d1f5e43f7024-kube-api-access-mqnkx\") pod \"certified-operators-t8l6z\" (UID: \"7d247dbf-11cb-47d5-b668-d1f5e43f7024\") " pod="openshift-marketplace/certified-operators-t8l6z" Mar 09 10:09:00 crc kubenswrapper[4861]: I0309 10:09:00.477337 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-t8l6z" Mar 09 10:09:00 crc kubenswrapper[4861]: I0309 10:09:00.659020 4861 scope.go:117] "RemoveContainer" containerID="7bbbec7e8a5f7da112767a5599dcb0b362a1472008cac65425c981acbf405224" Mar 09 10:09:00 crc kubenswrapper[4861]: E0309 10:09:00.659579 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5g7gc_openshift-machine-config-operator(6f7875e3-174f-4c67-8675-d878de74aa4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" Mar 09 10:09:00 crc kubenswrapper[4861]: I0309 10:09:00.966730 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-t8l6z"] Mar 09 10:09:01 crc kubenswrapper[4861]: I0309 10:09:01.723423 4861 generic.go:334] "Generic (PLEG): container finished" podID="7d247dbf-11cb-47d5-b668-d1f5e43f7024" containerID="a5773fc5906456cc57f63a0bafdaaf2984932c8beec5f83467cb5c31b5de1832" exitCode=0 Mar 09 10:09:01 crc kubenswrapper[4861]: I0309 10:09:01.723495 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t8l6z" event={"ID":"7d247dbf-11cb-47d5-b668-d1f5e43f7024","Type":"ContainerDied","Data":"a5773fc5906456cc57f63a0bafdaaf2984932c8beec5f83467cb5c31b5de1832"} Mar 09 10:09:01 crc kubenswrapper[4861]: I0309 10:09:01.723562 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t8l6z" event={"ID":"7d247dbf-11cb-47d5-b668-d1f5e43f7024","Type":"ContainerStarted","Data":"f63c15216878f2c81b1548134b506c8b9bde3003631aded4e2dc15f024ea68f1"} Mar 09 10:09:03 crc kubenswrapper[4861]: I0309 10:09:03.746018 4861 generic.go:334] "Generic (PLEG): container finished" podID="7d247dbf-11cb-47d5-b668-d1f5e43f7024" 
containerID="49ed0e3f39c16d54c66254e72dcb47dc7f6c45a5cd5f815ae6f5ac0731a011fb" exitCode=0 Mar 09 10:09:03 crc kubenswrapper[4861]: I0309 10:09:03.746110 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t8l6z" event={"ID":"7d247dbf-11cb-47d5-b668-d1f5e43f7024","Type":"ContainerDied","Data":"49ed0e3f39c16d54c66254e72dcb47dc7f6c45a5cd5f815ae6f5ac0731a011fb"} Mar 09 10:09:04 crc kubenswrapper[4861]: I0309 10:09:04.758613 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t8l6z" event={"ID":"7d247dbf-11cb-47d5-b668-d1f5e43f7024","Type":"ContainerStarted","Data":"e461bfcc796a0331aaeba01094ddd73d3b46cdb1651f8b838d23896e0b0778f9"} Mar 09 10:09:04 crc kubenswrapper[4861]: I0309 10:09:04.777837 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-t8l6z" podStartSLOduration=2.2776126789999998 podStartE2EDuration="4.777810597s" podCreationTimestamp="2026-03-09 10:09:00 +0000 UTC" firstStartedPulling="2026-03-09 10:09:01.725640744 +0000 UTC m=+3784.810680145" lastFinishedPulling="2026-03-09 10:09:04.225838662 +0000 UTC m=+3787.310878063" observedRunningTime="2026-03-09 10:09:04.772450853 +0000 UTC m=+3787.857490254" watchObservedRunningTime="2026-03-09 10:09:04.777810597 +0000 UTC m=+3787.862849998" Mar 09 10:09:10 crc kubenswrapper[4861]: I0309 10:09:10.477634 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-t8l6z" Mar 09 10:09:10 crc kubenswrapper[4861]: I0309 10:09:10.478020 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-t8l6z" Mar 09 10:09:10 crc kubenswrapper[4861]: I0309 10:09:10.531915 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-t8l6z" Mar 09 10:09:10 crc kubenswrapper[4861]: I0309 
10:09:10.862631 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-t8l6z" Mar 09 10:09:10 crc kubenswrapper[4861]: I0309 10:09:10.916274 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-t8l6z"] Mar 09 10:09:12 crc kubenswrapper[4861]: I0309 10:09:12.842960 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-t8l6z" podUID="7d247dbf-11cb-47d5-b668-d1f5e43f7024" containerName="registry-server" containerID="cri-o://e461bfcc796a0331aaeba01094ddd73d3b46cdb1651f8b838d23896e0b0778f9" gracePeriod=2 Mar 09 10:09:13 crc kubenswrapper[4861]: I0309 10:09:13.852411 4861 generic.go:334] "Generic (PLEG): container finished" podID="7d247dbf-11cb-47d5-b668-d1f5e43f7024" containerID="e461bfcc796a0331aaeba01094ddd73d3b46cdb1651f8b838d23896e0b0778f9" exitCode=0 Mar 09 10:09:13 crc kubenswrapper[4861]: I0309 10:09:13.852495 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t8l6z" event={"ID":"7d247dbf-11cb-47d5-b668-d1f5e43f7024","Type":"ContainerDied","Data":"e461bfcc796a0331aaeba01094ddd73d3b46cdb1651f8b838d23896e0b0778f9"} Mar 09 10:09:14 crc kubenswrapper[4861]: I0309 10:09:14.208643 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-t8l6z" Mar 09 10:09:14 crc kubenswrapper[4861]: I0309 10:09:14.330023 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d247dbf-11cb-47d5-b668-d1f5e43f7024-utilities\") pod \"7d247dbf-11cb-47d5-b668-d1f5e43f7024\" (UID: \"7d247dbf-11cb-47d5-b668-d1f5e43f7024\") " Mar 09 10:09:14 crc kubenswrapper[4861]: I0309 10:09:14.330420 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d247dbf-11cb-47d5-b668-d1f5e43f7024-catalog-content\") pod \"7d247dbf-11cb-47d5-b668-d1f5e43f7024\" (UID: \"7d247dbf-11cb-47d5-b668-d1f5e43f7024\") " Mar 09 10:09:14 crc kubenswrapper[4861]: I0309 10:09:14.330531 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mqnkx\" (UniqueName: \"kubernetes.io/projected/7d247dbf-11cb-47d5-b668-d1f5e43f7024-kube-api-access-mqnkx\") pod \"7d247dbf-11cb-47d5-b668-d1f5e43f7024\" (UID: \"7d247dbf-11cb-47d5-b668-d1f5e43f7024\") " Mar 09 10:09:14 crc kubenswrapper[4861]: I0309 10:09:14.331064 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d247dbf-11cb-47d5-b668-d1f5e43f7024-utilities" (OuterVolumeSpecName: "utilities") pod "7d247dbf-11cb-47d5-b668-d1f5e43f7024" (UID: "7d247dbf-11cb-47d5-b668-d1f5e43f7024"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 10:09:14 crc kubenswrapper[4861]: I0309 10:09:14.337169 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d247dbf-11cb-47d5-b668-d1f5e43f7024-kube-api-access-mqnkx" (OuterVolumeSpecName: "kube-api-access-mqnkx") pod "7d247dbf-11cb-47d5-b668-d1f5e43f7024" (UID: "7d247dbf-11cb-47d5-b668-d1f5e43f7024"). InnerVolumeSpecName "kube-api-access-mqnkx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 10:09:14 crc kubenswrapper[4861]: I0309 10:09:14.433007 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mqnkx\" (UniqueName: \"kubernetes.io/projected/7d247dbf-11cb-47d5-b668-d1f5e43f7024-kube-api-access-mqnkx\") on node \"crc\" DevicePath \"\"" Mar 09 10:09:14 crc kubenswrapper[4861]: I0309 10:09:14.433049 4861 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d247dbf-11cb-47d5-b668-d1f5e43f7024-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 10:09:14 crc kubenswrapper[4861]: I0309 10:09:14.867012 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t8l6z" event={"ID":"7d247dbf-11cb-47d5-b668-d1f5e43f7024","Type":"ContainerDied","Data":"f63c15216878f2c81b1548134b506c8b9bde3003631aded4e2dc15f024ea68f1"} Mar 09 10:09:14 crc kubenswrapper[4861]: I0309 10:09:14.867076 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-t8l6z" Mar 09 10:09:14 crc kubenswrapper[4861]: I0309 10:09:14.867084 4861 scope.go:117] "RemoveContainer" containerID="e461bfcc796a0331aaeba01094ddd73d3b46cdb1651f8b838d23896e0b0778f9" Mar 09 10:09:14 crc kubenswrapper[4861]: I0309 10:09:14.897632 4861 scope.go:117] "RemoveContainer" containerID="49ed0e3f39c16d54c66254e72dcb47dc7f6c45a5cd5f815ae6f5ac0731a011fb" Mar 09 10:09:14 crc kubenswrapper[4861]: I0309 10:09:14.918434 4861 scope.go:117] "RemoveContainer" containerID="a5773fc5906456cc57f63a0bafdaaf2984932c8beec5f83467cb5c31b5de1832" Mar 09 10:09:14 crc kubenswrapper[4861]: I0309 10:09:14.990026 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d247dbf-11cb-47d5-b668-d1f5e43f7024-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7d247dbf-11cb-47d5-b668-d1f5e43f7024" (UID: "7d247dbf-11cb-47d5-b668-d1f5e43f7024"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 10:09:15 crc kubenswrapper[4861]: I0309 10:09:15.044554 4861 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d247dbf-11cb-47d5-b668-d1f5e43f7024-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 10:09:15 crc kubenswrapper[4861]: I0309 10:09:15.202137 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-t8l6z"] Mar 09 10:09:15 crc kubenswrapper[4861]: I0309 10:09:15.209427 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-t8l6z"] Mar 09 10:09:15 crc kubenswrapper[4861]: I0309 10:09:15.659390 4861 scope.go:117] "RemoveContainer" containerID="7bbbec7e8a5f7da112767a5599dcb0b362a1472008cac65425c981acbf405224" Mar 09 10:09:15 crc kubenswrapper[4861]: E0309 10:09:15.659663 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5g7gc_openshift-machine-config-operator(6f7875e3-174f-4c67-8675-d878de74aa4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" Mar 09 10:09:15 crc kubenswrapper[4861]: I0309 10:09:15.668119 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d247dbf-11cb-47d5-b668-d1f5e43f7024" path="/var/lib/kubelet/pods/7d247dbf-11cb-47d5-b668-d1f5e43f7024/volumes" Mar 09 10:09:28 crc kubenswrapper[4861]: I0309 10:09:28.658418 4861 scope.go:117] "RemoveContainer" containerID="7bbbec7e8a5f7da112767a5599dcb0b362a1472008cac65425c981acbf405224" Mar 09 10:09:28 crc kubenswrapper[4861]: E0309 10:09:28.660102 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-5g7gc_openshift-machine-config-operator(6f7875e3-174f-4c67-8675-d878de74aa4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" Mar 09 10:09:38 crc kubenswrapper[4861]: I0309 10:09:38.452654 4861 scope.go:117] "RemoveContainer" containerID="9c3ff4f259657f8189b7fa75b2b33d7656978117585aa35efe1e507abbd6e84f" Mar 09 10:09:39 crc kubenswrapper[4861]: I0309 10:09:39.658316 4861 scope.go:117] "RemoveContainer" containerID="7bbbec7e8a5f7da112767a5599dcb0b362a1472008cac65425c981acbf405224" Mar 09 10:09:39 crc kubenswrapper[4861]: E0309 10:09:39.658855 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5g7gc_openshift-machine-config-operator(6f7875e3-174f-4c67-8675-d878de74aa4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" Mar 09 10:09:50 crc kubenswrapper[4861]: I0309 10:09:50.657882 4861 scope.go:117] "RemoveContainer" containerID="7bbbec7e8a5f7da112767a5599dcb0b362a1472008cac65425c981acbf405224" Mar 09 10:09:50 crc kubenswrapper[4861]: E0309 10:09:50.658786 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5g7gc_openshift-machine-config-operator(6f7875e3-174f-4c67-8675-d878de74aa4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" Mar 09 10:10:00 crc kubenswrapper[4861]: I0309 10:10:00.146272 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550850-4rt8d"] Mar 09 10:10:00 crc 
kubenswrapper[4861]: E0309 10:10:00.147177 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d247dbf-11cb-47d5-b668-d1f5e43f7024" containerName="extract-content" Mar 09 10:10:00 crc kubenswrapper[4861]: I0309 10:10:00.147196 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d247dbf-11cb-47d5-b668-d1f5e43f7024" containerName="extract-content" Mar 09 10:10:00 crc kubenswrapper[4861]: E0309 10:10:00.147237 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d247dbf-11cb-47d5-b668-d1f5e43f7024" containerName="extract-utilities" Mar 09 10:10:00 crc kubenswrapper[4861]: I0309 10:10:00.147245 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d247dbf-11cb-47d5-b668-d1f5e43f7024" containerName="extract-utilities" Mar 09 10:10:00 crc kubenswrapper[4861]: E0309 10:10:00.147254 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d247dbf-11cb-47d5-b668-d1f5e43f7024" containerName="registry-server" Mar 09 10:10:00 crc kubenswrapper[4861]: I0309 10:10:00.147261 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d247dbf-11cb-47d5-b668-d1f5e43f7024" containerName="registry-server" Mar 09 10:10:00 crc kubenswrapper[4861]: I0309 10:10:00.147552 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d247dbf-11cb-47d5-b668-d1f5e43f7024" containerName="registry-server" Mar 09 10:10:00 crc kubenswrapper[4861]: I0309 10:10:00.148311 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550850-4rt8d" Mar 09 10:10:00 crc kubenswrapper[4861]: I0309 10:10:00.153044 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k86l8" Mar 09 10:10:00 crc kubenswrapper[4861]: I0309 10:10:00.153327 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 10:10:00 crc kubenswrapper[4861]: I0309 10:10:00.154284 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 10:10:00 crc kubenswrapper[4861]: I0309 10:10:00.163699 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550850-4rt8d"] Mar 09 10:10:00 crc kubenswrapper[4861]: I0309 10:10:00.320308 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxqk7\" (UniqueName: \"kubernetes.io/projected/19172363-d5d0-4123-af55-e3c88be4c003-kube-api-access-jxqk7\") pod \"auto-csr-approver-29550850-4rt8d\" (UID: \"19172363-d5d0-4123-af55-e3c88be4c003\") " pod="openshift-infra/auto-csr-approver-29550850-4rt8d" Mar 09 10:10:00 crc kubenswrapper[4861]: I0309 10:10:00.422241 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxqk7\" (UniqueName: \"kubernetes.io/projected/19172363-d5d0-4123-af55-e3c88be4c003-kube-api-access-jxqk7\") pod \"auto-csr-approver-29550850-4rt8d\" (UID: \"19172363-d5d0-4123-af55-e3c88be4c003\") " pod="openshift-infra/auto-csr-approver-29550850-4rt8d" Mar 09 10:10:00 crc kubenswrapper[4861]: I0309 10:10:00.441284 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxqk7\" (UniqueName: \"kubernetes.io/projected/19172363-d5d0-4123-af55-e3c88be4c003-kube-api-access-jxqk7\") pod \"auto-csr-approver-29550850-4rt8d\" (UID: \"19172363-d5d0-4123-af55-e3c88be4c003\") " 
pod="openshift-infra/auto-csr-approver-29550850-4rt8d" Mar 09 10:10:00 crc kubenswrapper[4861]: I0309 10:10:00.474297 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550850-4rt8d" Mar 09 10:10:00 crc kubenswrapper[4861]: I0309 10:10:00.705131 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-x62ks"] Mar 09 10:10:00 crc kubenswrapper[4861]: I0309 10:10:00.707460 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x62ks" Mar 09 10:10:00 crc kubenswrapper[4861]: I0309 10:10:00.714101 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-x62ks"] Mar 09 10:10:00 crc kubenswrapper[4861]: I0309 10:10:00.833247 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5a24f08-e9ba-4913-8e33-decbadf7a5e1-catalog-content\") pod \"redhat-marketplace-x62ks\" (UID: \"d5a24f08-e9ba-4913-8e33-decbadf7a5e1\") " pod="openshift-marketplace/redhat-marketplace-x62ks" Mar 09 10:10:00 crc kubenswrapper[4861]: I0309 10:10:00.833425 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5a24f08-e9ba-4913-8e33-decbadf7a5e1-utilities\") pod \"redhat-marketplace-x62ks\" (UID: \"d5a24f08-e9ba-4913-8e33-decbadf7a5e1\") " pod="openshift-marketplace/redhat-marketplace-x62ks" Mar 09 10:10:00 crc kubenswrapper[4861]: I0309 10:10:00.833528 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjf9c\" (UniqueName: \"kubernetes.io/projected/d5a24f08-e9ba-4913-8e33-decbadf7a5e1-kube-api-access-zjf9c\") pod \"redhat-marketplace-x62ks\" (UID: \"d5a24f08-e9ba-4913-8e33-decbadf7a5e1\") " 
pod="openshift-marketplace/redhat-marketplace-x62ks" Mar 09 10:10:00 crc kubenswrapper[4861]: I0309 10:10:00.932594 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550850-4rt8d"] Mar 09 10:10:00 crc kubenswrapper[4861]: I0309 10:10:00.935133 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5a24f08-e9ba-4913-8e33-decbadf7a5e1-catalog-content\") pod \"redhat-marketplace-x62ks\" (UID: \"d5a24f08-e9ba-4913-8e33-decbadf7a5e1\") " pod="openshift-marketplace/redhat-marketplace-x62ks" Mar 09 10:10:00 crc kubenswrapper[4861]: I0309 10:10:00.935213 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5a24f08-e9ba-4913-8e33-decbadf7a5e1-utilities\") pod \"redhat-marketplace-x62ks\" (UID: \"d5a24f08-e9ba-4913-8e33-decbadf7a5e1\") " pod="openshift-marketplace/redhat-marketplace-x62ks" Mar 09 10:10:00 crc kubenswrapper[4861]: I0309 10:10:00.935239 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjf9c\" (UniqueName: \"kubernetes.io/projected/d5a24f08-e9ba-4913-8e33-decbadf7a5e1-kube-api-access-zjf9c\") pod \"redhat-marketplace-x62ks\" (UID: \"d5a24f08-e9ba-4913-8e33-decbadf7a5e1\") " pod="openshift-marketplace/redhat-marketplace-x62ks" Mar 09 10:10:00 crc kubenswrapper[4861]: I0309 10:10:00.935934 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5a24f08-e9ba-4913-8e33-decbadf7a5e1-utilities\") pod \"redhat-marketplace-x62ks\" (UID: \"d5a24f08-e9ba-4913-8e33-decbadf7a5e1\") " pod="openshift-marketplace/redhat-marketplace-x62ks" Mar 09 10:10:00 crc kubenswrapper[4861]: I0309 10:10:00.935971 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/d5a24f08-e9ba-4913-8e33-decbadf7a5e1-catalog-content\") pod \"redhat-marketplace-x62ks\" (UID: \"d5a24f08-e9ba-4913-8e33-decbadf7a5e1\") " pod="openshift-marketplace/redhat-marketplace-x62ks" Mar 09 10:10:00 crc kubenswrapper[4861]: I0309 10:10:00.956834 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjf9c\" (UniqueName: \"kubernetes.io/projected/d5a24f08-e9ba-4913-8e33-decbadf7a5e1-kube-api-access-zjf9c\") pod \"redhat-marketplace-x62ks\" (UID: \"d5a24f08-e9ba-4913-8e33-decbadf7a5e1\") " pod="openshift-marketplace/redhat-marketplace-x62ks" Mar 09 10:10:01 crc kubenswrapper[4861]: I0309 10:10:01.037230 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x62ks" Mar 09 10:10:01 crc kubenswrapper[4861]: I0309 10:10:01.279655 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550850-4rt8d" event={"ID":"19172363-d5d0-4123-af55-e3c88be4c003","Type":"ContainerStarted","Data":"16eff71bef0d0a4f6c34d392fb8aeddb2bc83ffd4a12afc3e04a3f97ee556e70"} Mar 09 10:10:01 crc kubenswrapper[4861]: I0309 10:10:01.511535 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-x62ks"] Mar 09 10:10:01 crc kubenswrapper[4861]: W0309 10:10:01.514974 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd5a24f08_e9ba_4913_8e33_decbadf7a5e1.slice/crio-dc26ec138c14241cb55c679aabf09576fd033e22bebb783044c7a997e8e90e55 WatchSource:0}: Error finding container dc26ec138c14241cb55c679aabf09576fd033e22bebb783044c7a997e8e90e55: Status 404 returned error can't find the container with id dc26ec138c14241cb55c679aabf09576fd033e22bebb783044c7a997e8e90e55 Mar 09 10:10:01 crc kubenswrapper[4861]: I0309 10:10:01.658095 4861 scope.go:117] "RemoveContainer" 
containerID="7bbbec7e8a5f7da112767a5599dcb0b362a1472008cac65425c981acbf405224" Mar 09 10:10:01 crc kubenswrapper[4861]: E0309 10:10:01.658420 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5g7gc_openshift-machine-config-operator(6f7875e3-174f-4c67-8675-d878de74aa4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" Mar 09 10:10:02 crc kubenswrapper[4861]: I0309 10:10:02.295358 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x62ks" event={"ID":"d5a24f08-e9ba-4913-8e33-decbadf7a5e1","Type":"ContainerDied","Data":"40f78f30795384702452035ca0d7a49c5250f5255f1cb31d0c5ee3e6b7cd1e35"} Mar 09 10:10:02 crc kubenswrapper[4861]: I0309 10:10:02.295208 4861 generic.go:334] "Generic (PLEG): container finished" podID="d5a24f08-e9ba-4913-8e33-decbadf7a5e1" containerID="40f78f30795384702452035ca0d7a49c5250f5255f1cb31d0c5ee3e6b7cd1e35" exitCode=0 Mar 09 10:10:02 crc kubenswrapper[4861]: I0309 10:10:02.297079 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x62ks" event={"ID":"d5a24f08-e9ba-4913-8e33-decbadf7a5e1","Type":"ContainerStarted","Data":"dc26ec138c14241cb55c679aabf09576fd033e22bebb783044c7a997e8e90e55"} Mar 09 10:10:03 crc kubenswrapper[4861]: I0309 10:10:03.308257 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x62ks" event={"ID":"d5a24f08-e9ba-4913-8e33-decbadf7a5e1","Type":"ContainerStarted","Data":"b8d819ec9769fac5ae3f7baa4e27a4e26f928dc589f265aaeb0bc708dacc856c"} Mar 09 10:10:03 crc kubenswrapper[4861]: I0309 10:10:03.310201 4861 generic.go:334] "Generic (PLEG): container finished" podID="19172363-d5d0-4123-af55-e3c88be4c003" 
containerID="406f31b9548ff94dc29c9f306cb7699fd28f6cb51aebac0b5d0913dd27ddff64" exitCode=0 Mar 09 10:10:03 crc kubenswrapper[4861]: I0309 10:10:03.310315 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550850-4rt8d" event={"ID":"19172363-d5d0-4123-af55-e3c88be4c003","Type":"ContainerDied","Data":"406f31b9548ff94dc29c9f306cb7699fd28f6cb51aebac0b5d0913dd27ddff64"} Mar 09 10:10:04 crc kubenswrapper[4861]: I0309 10:10:04.327075 4861 generic.go:334] "Generic (PLEG): container finished" podID="d5a24f08-e9ba-4913-8e33-decbadf7a5e1" containerID="b8d819ec9769fac5ae3f7baa4e27a4e26f928dc589f265aaeb0bc708dacc856c" exitCode=0 Mar 09 10:10:04 crc kubenswrapper[4861]: I0309 10:10:04.327169 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x62ks" event={"ID":"d5a24f08-e9ba-4913-8e33-decbadf7a5e1","Type":"ContainerDied","Data":"b8d819ec9769fac5ae3f7baa4e27a4e26f928dc589f265aaeb0bc708dacc856c"} Mar 09 10:10:04 crc kubenswrapper[4861]: I0309 10:10:04.688202 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550850-4rt8d" Mar 09 10:10:04 crc kubenswrapper[4861]: I0309 10:10:04.806504 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jxqk7\" (UniqueName: \"kubernetes.io/projected/19172363-d5d0-4123-af55-e3c88be4c003-kube-api-access-jxqk7\") pod \"19172363-d5d0-4123-af55-e3c88be4c003\" (UID: \"19172363-d5d0-4123-af55-e3c88be4c003\") " Mar 09 10:10:04 crc kubenswrapper[4861]: I0309 10:10:04.812097 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19172363-d5d0-4123-af55-e3c88be4c003-kube-api-access-jxqk7" (OuterVolumeSpecName: "kube-api-access-jxqk7") pod "19172363-d5d0-4123-af55-e3c88be4c003" (UID: "19172363-d5d0-4123-af55-e3c88be4c003"). InnerVolumeSpecName "kube-api-access-jxqk7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 10:10:04 crc kubenswrapper[4861]: I0309 10:10:04.908630 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jxqk7\" (UniqueName: \"kubernetes.io/projected/19172363-d5d0-4123-af55-e3c88be4c003-kube-api-access-jxqk7\") on node \"crc\" DevicePath \"\"" Mar 09 10:10:05 crc kubenswrapper[4861]: I0309 10:10:05.340534 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x62ks" event={"ID":"d5a24f08-e9ba-4913-8e33-decbadf7a5e1","Type":"ContainerStarted","Data":"9dfb09fd022097f9d21cfee352bd9ec0d6e36b2d6dde4322d0910c9b46be1cfa"} Mar 09 10:10:05 crc kubenswrapper[4861]: I0309 10:10:05.343647 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550850-4rt8d" event={"ID":"19172363-d5d0-4123-af55-e3c88be4c003","Type":"ContainerDied","Data":"16eff71bef0d0a4f6c34d392fb8aeddb2bc83ffd4a12afc3e04a3f97ee556e70"} Mar 09 10:10:05 crc kubenswrapper[4861]: I0309 10:10:05.343698 4861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="16eff71bef0d0a4f6c34d392fb8aeddb2bc83ffd4a12afc3e04a3f97ee556e70" Mar 09 10:10:05 crc kubenswrapper[4861]: I0309 10:10:05.343712 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550850-4rt8d" Mar 09 10:10:05 crc kubenswrapper[4861]: I0309 10:10:05.369875 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-x62ks" podStartSLOduration=2.915066435 podStartE2EDuration="5.36985314s" podCreationTimestamp="2026-03-09 10:10:00 +0000 UTC" firstStartedPulling="2026-03-09 10:10:02.297978036 +0000 UTC m=+3845.383017437" lastFinishedPulling="2026-03-09 10:10:04.752764741 +0000 UTC m=+3847.837804142" observedRunningTime="2026-03-09 10:10:05.366360657 +0000 UTC m=+3848.451400068" watchObservedRunningTime="2026-03-09 10:10:05.36985314 +0000 UTC m=+3848.454892561" Mar 09 10:10:05 crc kubenswrapper[4861]: I0309 10:10:05.781825 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550844-84kmk"] Mar 09 10:10:05 crc kubenswrapper[4861]: I0309 10:10:05.791821 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550844-84kmk"] Mar 09 10:10:07 crc kubenswrapper[4861]: I0309 10:10:07.670139 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64419f08-ab44-4dbf-ab6f-3ef60a1f1c3b" path="/var/lib/kubelet/pods/64419f08-ab44-4dbf-ab6f-3ef60a1f1c3b/volumes" Mar 09 10:10:11 crc kubenswrapper[4861]: I0309 10:10:11.037955 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-x62ks" Mar 09 10:10:11 crc kubenswrapper[4861]: I0309 10:10:11.038608 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-x62ks" Mar 09 10:10:11 crc kubenswrapper[4861]: I0309 10:10:11.090793 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-x62ks" Mar 09 10:10:11 crc kubenswrapper[4861]: I0309 10:10:11.447052 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-marketplace/redhat-marketplace-x62ks" Mar 09 10:10:11 crc kubenswrapper[4861]: I0309 10:10:11.499320 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-x62ks"] Mar 09 10:10:12 crc kubenswrapper[4861]: I0309 10:10:12.657663 4861 scope.go:117] "RemoveContainer" containerID="7bbbec7e8a5f7da112767a5599dcb0b362a1472008cac65425c981acbf405224" Mar 09 10:10:12 crc kubenswrapper[4861]: E0309 10:10:12.658243 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5g7gc_openshift-machine-config-operator(6f7875e3-174f-4c67-8675-d878de74aa4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" Mar 09 10:10:13 crc kubenswrapper[4861]: I0309 10:10:13.433440 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-x62ks" podUID="d5a24f08-e9ba-4913-8e33-decbadf7a5e1" containerName="registry-server" containerID="cri-o://9dfb09fd022097f9d21cfee352bd9ec0d6e36b2d6dde4322d0910c9b46be1cfa" gracePeriod=2 Mar 09 10:10:13 crc kubenswrapper[4861]: I0309 10:10:13.959742 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x62ks" Mar 09 10:10:14 crc kubenswrapper[4861]: I0309 10:10:14.082944 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5a24f08-e9ba-4913-8e33-decbadf7a5e1-catalog-content\") pod \"d5a24f08-e9ba-4913-8e33-decbadf7a5e1\" (UID: \"d5a24f08-e9ba-4913-8e33-decbadf7a5e1\") " Mar 09 10:10:14 crc kubenswrapper[4861]: I0309 10:10:14.083434 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zjf9c\" (UniqueName: \"kubernetes.io/projected/d5a24f08-e9ba-4913-8e33-decbadf7a5e1-kube-api-access-zjf9c\") pod \"d5a24f08-e9ba-4913-8e33-decbadf7a5e1\" (UID: \"d5a24f08-e9ba-4913-8e33-decbadf7a5e1\") " Mar 09 10:10:14 crc kubenswrapper[4861]: I0309 10:10:14.083506 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5a24f08-e9ba-4913-8e33-decbadf7a5e1-utilities\") pod \"d5a24f08-e9ba-4913-8e33-decbadf7a5e1\" (UID: \"d5a24f08-e9ba-4913-8e33-decbadf7a5e1\") " Mar 09 10:10:14 crc kubenswrapper[4861]: I0309 10:10:14.084910 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5a24f08-e9ba-4913-8e33-decbadf7a5e1-utilities" (OuterVolumeSpecName: "utilities") pod "d5a24f08-e9ba-4913-8e33-decbadf7a5e1" (UID: "d5a24f08-e9ba-4913-8e33-decbadf7a5e1"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 10:10:14 crc kubenswrapper[4861]: I0309 10:10:14.085445 4861 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d5a24f08-e9ba-4913-8e33-decbadf7a5e1-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 10:10:14 crc kubenswrapper[4861]: I0309 10:10:14.090979 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5a24f08-e9ba-4913-8e33-decbadf7a5e1-kube-api-access-zjf9c" (OuterVolumeSpecName: "kube-api-access-zjf9c") pod "d5a24f08-e9ba-4913-8e33-decbadf7a5e1" (UID: "d5a24f08-e9ba-4913-8e33-decbadf7a5e1"). InnerVolumeSpecName "kube-api-access-zjf9c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 10:10:14 crc kubenswrapper[4861]: I0309 10:10:14.115827 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5a24f08-e9ba-4913-8e33-decbadf7a5e1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d5a24f08-e9ba-4913-8e33-decbadf7a5e1" (UID: "d5a24f08-e9ba-4913-8e33-decbadf7a5e1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 10:10:14 crc kubenswrapper[4861]: I0309 10:10:14.187349 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zjf9c\" (UniqueName: \"kubernetes.io/projected/d5a24f08-e9ba-4913-8e33-decbadf7a5e1-kube-api-access-zjf9c\") on node \"crc\" DevicePath \"\"" Mar 09 10:10:14 crc kubenswrapper[4861]: I0309 10:10:14.187403 4861 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d5a24f08-e9ba-4913-8e33-decbadf7a5e1-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 10:10:14 crc kubenswrapper[4861]: I0309 10:10:14.445791 4861 generic.go:334] "Generic (PLEG): container finished" podID="d5a24f08-e9ba-4913-8e33-decbadf7a5e1" containerID="9dfb09fd022097f9d21cfee352bd9ec0d6e36b2d6dde4322d0910c9b46be1cfa" exitCode=0 Mar 09 10:10:14 crc kubenswrapper[4861]: I0309 10:10:14.445847 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x62ks" event={"ID":"d5a24f08-e9ba-4913-8e33-decbadf7a5e1","Type":"ContainerDied","Data":"9dfb09fd022097f9d21cfee352bd9ec0d6e36b2d6dde4322d0910c9b46be1cfa"} Mar 09 10:10:14 crc kubenswrapper[4861]: I0309 10:10:14.445879 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-x62ks" event={"ID":"d5a24f08-e9ba-4913-8e33-decbadf7a5e1","Type":"ContainerDied","Data":"dc26ec138c14241cb55c679aabf09576fd033e22bebb783044c7a997e8e90e55"} Mar 09 10:10:14 crc kubenswrapper[4861]: I0309 10:10:14.445901 4861 scope.go:117] "RemoveContainer" containerID="9dfb09fd022097f9d21cfee352bd9ec0d6e36b2d6dde4322d0910c9b46be1cfa" Mar 09 10:10:14 crc kubenswrapper[4861]: I0309 10:10:14.446647 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-x62ks" Mar 09 10:10:14 crc kubenswrapper[4861]: I0309 10:10:14.468067 4861 scope.go:117] "RemoveContainer" containerID="b8d819ec9769fac5ae3f7baa4e27a4e26f928dc589f265aaeb0bc708dacc856c" Mar 09 10:10:14 crc kubenswrapper[4861]: I0309 10:10:14.486554 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-x62ks"] Mar 09 10:10:14 crc kubenswrapper[4861]: I0309 10:10:14.495522 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-x62ks"] Mar 09 10:10:14 crc kubenswrapper[4861]: I0309 10:10:14.517660 4861 scope.go:117] "RemoveContainer" containerID="40f78f30795384702452035ca0d7a49c5250f5255f1cb31d0c5ee3e6b7cd1e35" Mar 09 10:10:14 crc kubenswrapper[4861]: I0309 10:10:14.542560 4861 scope.go:117] "RemoveContainer" containerID="9dfb09fd022097f9d21cfee352bd9ec0d6e36b2d6dde4322d0910c9b46be1cfa" Mar 09 10:10:14 crc kubenswrapper[4861]: E0309 10:10:14.550584 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9dfb09fd022097f9d21cfee352bd9ec0d6e36b2d6dde4322d0910c9b46be1cfa\": container with ID starting with 9dfb09fd022097f9d21cfee352bd9ec0d6e36b2d6dde4322d0910c9b46be1cfa not found: ID does not exist" containerID="9dfb09fd022097f9d21cfee352bd9ec0d6e36b2d6dde4322d0910c9b46be1cfa" Mar 09 10:10:14 crc kubenswrapper[4861]: I0309 10:10:14.550694 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9dfb09fd022097f9d21cfee352bd9ec0d6e36b2d6dde4322d0910c9b46be1cfa"} err="failed to get container status \"9dfb09fd022097f9d21cfee352bd9ec0d6e36b2d6dde4322d0910c9b46be1cfa\": rpc error: code = NotFound desc = could not find container \"9dfb09fd022097f9d21cfee352bd9ec0d6e36b2d6dde4322d0910c9b46be1cfa\": container with ID starting with 9dfb09fd022097f9d21cfee352bd9ec0d6e36b2d6dde4322d0910c9b46be1cfa not found: 
ID does not exist" Mar 09 10:10:14 crc kubenswrapper[4861]: I0309 10:10:14.550736 4861 scope.go:117] "RemoveContainer" containerID="b8d819ec9769fac5ae3f7baa4e27a4e26f928dc589f265aaeb0bc708dacc856c" Mar 09 10:10:14 crc kubenswrapper[4861]: E0309 10:10:14.551895 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b8d819ec9769fac5ae3f7baa4e27a4e26f928dc589f265aaeb0bc708dacc856c\": container with ID starting with b8d819ec9769fac5ae3f7baa4e27a4e26f928dc589f265aaeb0bc708dacc856c not found: ID does not exist" containerID="b8d819ec9769fac5ae3f7baa4e27a4e26f928dc589f265aaeb0bc708dacc856c" Mar 09 10:10:14 crc kubenswrapper[4861]: I0309 10:10:14.551974 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8d819ec9769fac5ae3f7baa4e27a4e26f928dc589f265aaeb0bc708dacc856c"} err="failed to get container status \"b8d819ec9769fac5ae3f7baa4e27a4e26f928dc589f265aaeb0bc708dacc856c\": rpc error: code = NotFound desc = could not find container \"b8d819ec9769fac5ae3f7baa4e27a4e26f928dc589f265aaeb0bc708dacc856c\": container with ID starting with b8d819ec9769fac5ae3f7baa4e27a4e26f928dc589f265aaeb0bc708dacc856c not found: ID does not exist" Mar 09 10:10:14 crc kubenswrapper[4861]: I0309 10:10:14.552010 4861 scope.go:117] "RemoveContainer" containerID="40f78f30795384702452035ca0d7a49c5250f5255f1cb31d0c5ee3e6b7cd1e35" Mar 09 10:10:14 crc kubenswrapper[4861]: E0309 10:10:14.552305 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"40f78f30795384702452035ca0d7a49c5250f5255f1cb31d0c5ee3e6b7cd1e35\": container with ID starting with 40f78f30795384702452035ca0d7a49c5250f5255f1cb31d0c5ee3e6b7cd1e35 not found: ID does not exist" containerID="40f78f30795384702452035ca0d7a49c5250f5255f1cb31d0c5ee3e6b7cd1e35" Mar 09 10:10:14 crc kubenswrapper[4861]: I0309 10:10:14.552357 4861 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40f78f30795384702452035ca0d7a49c5250f5255f1cb31d0c5ee3e6b7cd1e35"} err="failed to get container status \"40f78f30795384702452035ca0d7a49c5250f5255f1cb31d0c5ee3e6b7cd1e35\": rpc error: code = NotFound desc = could not find container \"40f78f30795384702452035ca0d7a49c5250f5255f1cb31d0c5ee3e6b7cd1e35\": container with ID starting with 40f78f30795384702452035ca0d7a49c5250f5255f1cb31d0c5ee3e6b7cd1e35 not found: ID does not exist" Mar 09 10:10:15 crc kubenswrapper[4861]: I0309 10:10:15.670202 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5a24f08-e9ba-4913-8e33-decbadf7a5e1" path="/var/lib/kubelet/pods/d5a24f08-e9ba-4913-8e33-decbadf7a5e1/volumes" Mar 09 10:10:23 crc kubenswrapper[4861]: I0309 10:10:23.660268 4861 scope.go:117] "RemoveContainer" containerID="7bbbec7e8a5f7da112767a5599dcb0b362a1472008cac65425c981acbf405224" Mar 09 10:10:23 crc kubenswrapper[4861]: E0309 10:10:23.665886 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5g7gc_openshift-machine-config-operator(6f7875e3-174f-4c67-8675-d878de74aa4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" Mar 09 10:10:35 crc kubenswrapper[4861]: I0309 10:10:35.657958 4861 scope.go:117] "RemoveContainer" containerID="7bbbec7e8a5f7da112767a5599dcb0b362a1472008cac65425c981acbf405224" Mar 09 10:10:35 crc kubenswrapper[4861]: E0309 10:10:35.658797 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5g7gc_openshift-machine-config-operator(6f7875e3-174f-4c67-8675-d878de74aa4f)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f"
Mar 09 10:10:38 crc kubenswrapper[4861]: I0309 10:10:38.527123 4861 scope.go:117] "RemoveContainer" containerID="fc2c6afa73b4227406e48803bf40c6c5d86f0c387b3d847cb91eb1b35bf15333"
Mar 09 10:10:50 crc kubenswrapper[4861]: I0309 10:10:50.657558 4861 scope.go:117] "RemoveContainer" containerID="7bbbec7e8a5f7da112767a5599dcb0b362a1472008cac65425c981acbf405224"
Mar 09 10:10:50 crc kubenswrapper[4861]: E0309 10:10:50.658323 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5g7gc_openshift-machine-config-operator(6f7875e3-174f-4c67-8675-d878de74aa4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f"
Mar 09 10:11:01 crc kubenswrapper[4861]: I0309 10:11:01.657845 4861 scope.go:117] "RemoveContainer" containerID="7bbbec7e8a5f7da112767a5599dcb0b362a1472008cac65425c981acbf405224"
Mar 09 10:11:01 crc kubenswrapper[4861]: E0309 10:11:01.658836 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5g7gc_openshift-machine-config-operator(6f7875e3-174f-4c67-8675-d878de74aa4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f"
Mar 09 10:11:15 crc kubenswrapper[4861]: I0309 10:11:15.664399 4861 scope.go:117] "RemoveContainer" containerID="7bbbec7e8a5f7da112767a5599dcb0b362a1472008cac65425c981acbf405224"
Mar 09 10:11:15 crc kubenswrapper[4861]: E0309 10:11:15.665221 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5g7gc_openshift-machine-config-operator(6f7875e3-174f-4c67-8675-d878de74aa4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f"
Mar 09 10:11:30 crc kubenswrapper[4861]: I0309 10:11:30.658706 4861 scope.go:117] "RemoveContainer" containerID="7bbbec7e8a5f7da112767a5599dcb0b362a1472008cac65425c981acbf405224"
Mar 09 10:11:30 crc kubenswrapper[4861]: E0309 10:11:30.659603 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5g7gc_openshift-machine-config-operator(6f7875e3-174f-4c67-8675-d878de74aa4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f"
Mar 09 10:11:39 crc kubenswrapper[4861]: I0309 10:11:39.676416 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-wqw8f/must-gather-nz6cl"]
Mar 09 10:11:39 crc kubenswrapper[4861]: E0309 10:11:39.677452 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19172363-d5d0-4123-af55-e3c88be4c003" containerName="oc"
Mar 09 10:11:39 crc kubenswrapper[4861]: I0309 10:11:39.677472 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="19172363-d5d0-4123-af55-e3c88be4c003" containerName="oc"
Mar 09 10:11:39 crc kubenswrapper[4861]: E0309 10:11:39.677498 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5a24f08-e9ba-4913-8e33-decbadf7a5e1" containerName="extract-content"
Mar 09 10:11:39 crc kubenswrapper[4861]: I0309 10:11:39.677506 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5a24f08-e9ba-4913-8e33-decbadf7a5e1" containerName="extract-content"
Mar 09 10:11:39 crc kubenswrapper[4861]: E0309 10:11:39.677525 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5a24f08-e9ba-4913-8e33-decbadf7a5e1" containerName="registry-server"
Mar 09 10:11:39 crc kubenswrapper[4861]: I0309 10:11:39.677534 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5a24f08-e9ba-4913-8e33-decbadf7a5e1" containerName="registry-server"
Mar 09 10:11:39 crc kubenswrapper[4861]: E0309 10:11:39.677575 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5a24f08-e9ba-4913-8e33-decbadf7a5e1" containerName="extract-utilities"
Mar 09 10:11:39 crc kubenswrapper[4861]: I0309 10:11:39.677585 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5a24f08-e9ba-4913-8e33-decbadf7a5e1" containerName="extract-utilities"
Mar 09 10:11:39 crc kubenswrapper[4861]: I0309 10:11:39.677795 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5a24f08-e9ba-4913-8e33-decbadf7a5e1" containerName="registry-server"
Mar 09 10:11:39 crc kubenswrapper[4861]: I0309 10:11:39.677822 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="19172363-d5d0-4123-af55-e3c88be4c003" containerName="oc"
Mar 09 10:11:39 crc kubenswrapper[4861]: I0309 10:11:39.679423 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wqw8f/must-gather-nz6cl"
Mar 09 10:11:39 crc kubenswrapper[4861]: I0309 10:11:39.685231 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-wqw8f"/"openshift-service-ca.crt"
Mar 09 10:11:39 crc kubenswrapper[4861]: I0309 10:11:39.685276 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-wqw8f"/"kube-root-ca.crt"
Mar 09 10:11:39 crc kubenswrapper[4861]: I0309 10:11:39.712187 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/13cd8779-33ad-4eb1-93f2-2d1dec868cb9-must-gather-output\") pod \"must-gather-nz6cl\" (UID: \"13cd8779-33ad-4eb1-93f2-2d1dec868cb9\") " pod="openshift-must-gather-wqw8f/must-gather-nz6cl"
Mar 09 10:11:39 crc kubenswrapper[4861]: I0309 10:11:39.712354 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v48wx\" (UniqueName: \"kubernetes.io/projected/13cd8779-33ad-4eb1-93f2-2d1dec868cb9-kube-api-access-v48wx\") pod \"must-gather-nz6cl\" (UID: \"13cd8779-33ad-4eb1-93f2-2d1dec868cb9\") " pod="openshift-must-gather-wqw8f/must-gather-nz6cl"
Mar 09 10:11:39 crc kubenswrapper[4861]: I0309 10:11:39.771886 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-wqw8f/must-gather-nz6cl"]
Mar 09 10:11:39 crc kubenswrapper[4861]: I0309 10:11:39.813589 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v48wx\" (UniqueName: \"kubernetes.io/projected/13cd8779-33ad-4eb1-93f2-2d1dec868cb9-kube-api-access-v48wx\") pod \"must-gather-nz6cl\" (UID: \"13cd8779-33ad-4eb1-93f2-2d1dec868cb9\") " pod="openshift-must-gather-wqw8f/must-gather-nz6cl"
Mar 09 10:11:39 crc kubenswrapper[4861]: I0309 10:11:39.813685 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/13cd8779-33ad-4eb1-93f2-2d1dec868cb9-must-gather-output\") pod \"must-gather-nz6cl\" (UID: \"13cd8779-33ad-4eb1-93f2-2d1dec868cb9\") " pod="openshift-must-gather-wqw8f/must-gather-nz6cl"
Mar 09 10:11:39 crc kubenswrapper[4861]: I0309 10:11:39.814150 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/13cd8779-33ad-4eb1-93f2-2d1dec868cb9-must-gather-output\") pod \"must-gather-nz6cl\" (UID: \"13cd8779-33ad-4eb1-93f2-2d1dec868cb9\") " pod="openshift-must-gather-wqw8f/must-gather-nz6cl"
Mar 09 10:11:39 crc kubenswrapper[4861]: I0309 10:11:39.831437 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v48wx\" (UniqueName: \"kubernetes.io/projected/13cd8779-33ad-4eb1-93f2-2d1dec868cb9-kube-api-access-v48wx\") pod \"must-gather-nz6cl\" (UID: \"13cd8779-33ad-4eb1-93f2-2d1dec868cb9\") " pod="openshift-must-gather-wqw8f/must-gather-nz6cl"
Mar 09 10:11:40 crc kubenswrapper[4861]: I0309 10:11:40.007387 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wqw8f/must-gather-nz6cl"
Mar 09 10:11:40 crc kubenswrapper[4861]: I0309 10:11:40.452714 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-wqw8f/must-gather-nz6cl"]
Mar 09 10:11:41 crc kubenswrapper[4861]: I0309 10:11:41.146620 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wqw8f/must-gather-nz6cl" event={"ID":"13cd8779-33ad-4eb1-93f2-2d1dec868cb9","Type":"ContainerStarted","Data":"abed4f95404b849d5539b361a3af5603da617586feb32afb7f2419909a73754c"}
Mar 09 10:11:41 crc kubenswrapper[4861]: I0309 10:11:41.147248 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wqw8f/must-gather-nz6cl" event={"ID":"13cd8779-33ad-4eb1-93f2-2d1dec868cb9","Type":"ContainerStarted","Data":"1ae5f47fb5b55e0c37696da9f6e90ba678bd3f553f434d7e332a7e53b00b566d"}
Mar 09 10:11:41 crc kubenswrapper[4861]: I0309 10:11:41.147265 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wqw8f/must-gather-nz6cl" event={"ID":"13cd8779-33ad-4eb1-93f2-2d1dec868cb9","Type":"ContainerStarted","Data":"f9239b072ea4b045aac7e33359af4372aeeee6cd8ea4425ea0c48ef6ca7b9a47"}
Mar 09 10:11:41 crc kubenswrapper[4861]: I0309 10:11:41.168724 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-wqw8f/must-gather-nz6cl" podStartSLOduration=2.16870561 podStartE2EDuration="2.16870561s" podCreationTimestamp="2026-03-09 10:11:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 10:11:41.164539429 +0000 UTC m=+3944.249578830" watchObservedRunningTime="2026-03-09 10:11:41.16870561 +0000 UTC m=+3944.253745011"
Mar 09 10:11:44 crc kubenswrapper[4861]: I0309 10:11:44.609431 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-wqw8f/crc-debug-fshsr"]
Mar 09 10:11:44 crc kubenswrapper[4861]: I0309 10:11:44.613848 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wqw8f/crc-debug-fshsr"
Mar 09 10:11:44 crc kubenswrapper[4861]: I0309 10:11:44.622803 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-wqw8f"/"default-dockercfg-r4n5s"
Mar 09 10:11:44 crc kubenswrapper[4861]: I0309 10:11:44.658017 4861 scope.go:117] "RemoveContainer" containerID="7bbbec7e8a5f7da112767a5599dcb0b362a1472008cac65425c981acbf405224"
Mar 09 10:11:44 crc kubenswrapper[4861]: E0309 10:11:44.658464 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5g7gc_openshift-machine-config-operator(6f7875e3-174f-4c67-8675-d878de74aa4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f"
Mar 09 10:11:44 crc kubenswrapper[4861]: I0309 10:11:44.719335 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v974q\" (UniqueName: \"kubernetes.io/projected/b75097e6-51ad-4fb9-ba08-7accb1d1d866-kube-api-access-v974q\") pod \"crc-debug-fshsr\" (UID: \"b75097e6-51ad-4fb9-ba08-7accb1d1d866\") " pod="openshift-must-gather-wqw8f/crc-debug-fshsr"
Mar 09 10:11:44 crc kubenswrapper[4861]: I0309 10:11:44.719561 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b75097e6-51ad-4fb9-ba08-7accb1d1d866-host\") pod \"crc-debug-fshsr\" (UID: \"b75097e6-51ad-4fb9-ba08-7accb1d1d866\") " pod="openshift-must-gather-wqw8f/crc-debug-fshsr"
Mar 09 10:11:44 crc kubenswrapper[4861]: I0309 10:11:44.821418 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b75097e6-51ad-4fb9-ba08-7accb1d1d866-host\") pod \"crc-debug-fshsr\" (UID: \"b75097e6-51ad-4fb9-ba08-7accb1d1d866\") " pod="openshift-must-gather-wqw8f/crc-debug-fshsr"
Mar 09 10:11:44 crc kubenswrapper[4861]: I0309 10:11:44.821510 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v974q\" (UniqueName: \"kubernetes.io/projected/b75097e6-51ad-4fb9-ba08-7accb1d1d866-kube-api-access-v974q\") pod \"crc-debug-fshsr\" (UID: \"b75097e6-51ad-4fb9-ba08-7accb1d1d866\") " pod="openshift-must-gather-wqw8f/crc-debug-fshsr"
Mar 09 10:11:44 crc kubenswrapper[4861]: I0309 10:11:44.821593 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b75097e6-51ad-4fb9-ba08-7accb1d1d866-host\") pod \"crc-debug-fshsr\" (UID: \"b75097e6-51ad-4fb9-ba08-7accb1d1d866\") " pod="openshift-must-gather-wqw8f/crc-debug-fshsr"
Mar 09 10:11:44 crc kubenswrapper[4861]: I0309 10:11:44.840842 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v974q\" (UniqueName: \"kubernetes.io/projected/b75097e6-51ad-4fb9-ba08-7accb1d1d866-kube-api-access-v974q\") pod \"crc-debug-fshsr\" (UID: \"b75097e6-51ad-4fb9-ba08-7accb1d1d866\") " pod="openshift-must-gather-wqw8f/crc-debug-fshsr"
Mar 09 10:11:44 crc kubenswrapper[4861]: I0309 10:11:44.960838 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wqw8f/crc-debug-fshsr"
Mar 09 10:11:45 crc kubenswrapper[4861]: W0309 10:11:45.024201 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb75097e6_51ad_4fb9_ba08_7accb1d1d866.slice/crio-e3d1c81da30d0f3baffd84c091b6ee79d76350d21bfe0a35280b439e3ae14c74 WatchSource:0}: Error finding container e3d1c81da30d0f3baffd84c091b6ee79d76350d21bfe0a35280b439e3ae14c74: Status 404 returned error can't find the container with id e3d1c81da30d0f3baffd84c091b6ee79d76350d21bfe0a35280b439e3ae14c74
Mar 09 10:11:45 crc kubenswrapper[4861]: I0309 10:11:45.184096 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wqw8f/crc-debug-fshsr" event={"ID":"b75097e6-51ad-4fb9-ba08-7accb1d1d866","Type":"ContainerStarted","Data":"e3d1c81da30d0f3baffd84c091b6ee79d76350d21bfe0a35280b439e3ae14c74"}
Mar 09 10:11:46 crc kubenswrapper[4861]: I0309 10:11:46.195519 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wqw8f/crc-debug-fshsr" event={"ID":"b75097e6-51ad-4fb9-ba08-7accb1d1d866","Type":"ContainerStarted","Data":"162595f9247d08dd3b707d0360e464245380ea9d519c544096833e89f7ea5340"}
Mar 09 10:11:46 crc kubenswrapper[4861]: I0309 10:11:46.223782 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-wqw8f/crc-debug-fshsr" podStartSLOduration=2.223763409 podStartE2EDuration="2.223763409s" podCreationTimestamp="2026-03-09 10:11:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 10:11:46.209942669 +0000 UTC m=+3949.294982070" watchObservedRunningTime="2026-03-09 10:11:46.223763409 +0000 UTC m=+3949.308802810"
Mar 09 10:11:55 crc kubenswrapper[4861]: I0309 10:11:55.657658 4861 scope.go:117] "RemoveContainer" containerID="7bbbec7e8a5f7da112767a5599dcb0b362a1472008cac65425c981acbf405224"
Mar 09 10:11:56 crc kubenswrapper[4861]: I0309 10:11:56.280648 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" event={"ID":"6f7875e3-174f-4c67-8675-d878de74aa4f","Type":"ContainerStarted","Data":"8fd7c92f085870944d490eb7e486a6ca47dfe81622ad08419017e4cba450475b"}
Mar 09 10:12:00 crc kubenswrapper[4861]: I0309 10:12:00.144317 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550852-hspj2"]
Mar 09 10:12:00 crc kubenswrapper[4861]: I0309 10:12:00.148226 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550852-hspj2"
Mar 09 10:12:00 crc kubenswrapper[4861]: I0309 10:12:00.150875 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 09 10:12:00 crc kubenswrapper[4861]: I0309 10:12:00.150985 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 09 10:12:00 crc kubenswrapper[4861]: I0309 10:12:00.151288 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k86l8"
Mar 09 10:12:00 crc kubenswrapper[4861]: I0309 10:12:00.160650 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550852-hspj2"]
Mar 09 10:12:00 crc kubenswrapper[4861]: I0309 10:12:00.298067 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plcgx\" (UniqueName: \"kubernetes.io/projected/a6f55d8b-22d1-4b1a-99a4-9ae7ce5e6366-kube-api-access-plcgx\") pod \"auto-csr-approver-29550852-hspj2\" (UID: \"a6f55d8b-22d1-4b1a-99a4-9ae7ce5e6366\") " pod="openshift-infra/auto-csr-approver-29550852-hspj2"
Mar 09 10:12:00 crc kubenswrapper[4861]: I0309 10:12:00.400173 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-plcgx\" (UniqueName: \"kubernetes.io/projected/a6f55d8b-22d1-4b1a-99a4-9ae7ce5e6366-kube-api-access-plcgx\") pod \"auto-csr-approver-29550852-hspj2\" (UID: \"a6f55d8b-22d1-4b1a-99a4-9ae7ce5e6366\") " pod="openshift-infra/auto-csr-approver-29550852-hspj2"
Mar 09 10:12:00 crc kubenswrapper[4861]: I0309 10:12:00.421636 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-plcgx\" (UniqueName: \"kubernetes.io/projected/a6f55d8b-22d1-4b1a-99a4-9ae7ce5e6366-kube-api-access-plcgx\") pod \"auto-csr-approver-29550852-hspj2\" (UID: \"a6f55d8b-22d1-4b1a-99a4-9ae7ce5e6366\") " pod="openshift-infra/auto-csr-approver-29550852-hspj2"
Mar 09 10:12:00 crc kubenswrapper[4861]: I0309 10:12:00.475643 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550852-hspj2"
Mar 09 10:12:00 crc kubenswrapper[4861]: I0309 10:12:00.966194 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550852-hspj2"]
Mar 09 10:12:00 crc kubenswrapper[4861]: W0309 10:12:00.967079 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda6f55d8b_22d1_4b1a_99a4_9ae7ce5e6366.slice/crio-2b68d9395f92da5797d37d710c457995bc34fab427e5d6d045b2087c27bd441f WatchSource:0}: Error finding container 2b68d9395f92da5797d37d710c457995bc34fab427e5d6d045b2087c27bd441f: Status 404 returned error can't find the container with id 2b68d9395f92da5797d37d710c457995bc34fab427e5d6d045b2087c27bd441f
Mar 09 10:12:01 crc kubenswrapper[4861]: I0309 10:12:01.324524 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550852-hspj2" event={"ID":"a6f55d8b-22d1-4b1a-99a4-9ae7ce5e6366","Type":"ContainerStarted","Data":"2b68d9395f92da5797d37d710c457995bc34fab427e5d6d045b2087c27bd441f"}
Mar 09 10:12:02 crc kubenswrapper[4861]: I0309 10:12:02.336749 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550852-hspj2" event={"ID":"a6f55d8b-22d1-4b1a-99a4-9ae7ce5e6366","Type":"ContainerStarted","Data":"bcff50287b12a70d33f0ccd64336df345f79598f2f3d1d475b8a0083a67ec703"}
Mar 09 10:12:02 crc kubenswrapper[4861]: I0309 10:12:02.352122 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29550852-hspj2" podStartSLOduration=1.420664797 podStartE2EDuration="2.352101225s" podCreationTimestamp="2026-03-09 10:12:00 +0000 UTC" firstStartedPulling="2026-03-09 10:12:00.969775739 +0000 UTC m=+3964.054815140" lastFinishedPulling="2026-03-09 10:12:01.901212167 +0000 UTC m=+3964.986251568" observedRunningTime="2026-03-09 10:12:02.352029312 +0000 UTC m=+3965.437068733" watchObservedRunningTime="2026-03-09 10:12:02.352101225 +0000 UTC m=+3965.437140666"
Mar 09 10:12:02 crc kubenswrapper[4861]: E0309 10:12:02.637157 4861 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda6f55d8b_22d1_4b1a_99a4_9ae7ce5e6366.slice/crio-bcff50287b12a70d33f0ccd64336df345f79598f2f3d1d475b8a0083a67ec703.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda6f55d8b_22d1_4b1a_99a4_9ae7ce5e6366.slice/crio-conmon-bcff50287b12a70d33f0ccd64336df345f79598f2f3d1d475b8a0083a67ec703.scope\": RecentStats: unable to find data in memory cache]"
Mar 09 10:12:03 crc kubenswrapper[4861]: I0309 10:12:03.347168 4861 generic.go:334] "Generic (PLEG): container finished" podID="a6f55d8b-22d1-4b1a-99a4-9ae7ce5e6366" containerID="bcff50287b12a70d33f0ccd64336df345f79598f2f3d1d475b8a0083a67ec703" exitCode=0
Mar 09 10:12:03 crc kubenswrapper[4861]: I0309 10:12:03.347404 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550852-hspj2" event={"ID":"a6f55d8b-22d1-4b1a-99a4-9ae7ce5e6366","Type":"ContainerDied","Data":"bcff50287b12a70d33f0ccd64336df345f79598f2f3d1d475b8a0083a67ec703"}
Mar 09 10:12:04 crc kubenswrapper[4861]: I0309 10:12:04.720699 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550852-hspj2"
Mar 09 10:12:04 crc kubenswrapper[4861]: I0309 10:12:04.789511 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-plcgx\" (UniqueName: \"kubernetes.io/projected/a6f55d8b-22d1-4b1a-99a4-9ae7ce5e6366-kube-api-access-plcgx\") pod \"a6f55d8b-22d1-4b1a-99a4-9ae7ce5e6366\" (UID: \"a6f55d8b-22d1-4b1a-99a4-9ae7ce5e6366\") "
Mar 09 10:12:04 crc kubenswrapper[4861]: I0309 10:12:04.798982 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6f55d8b-22d1-4b1a-99a4-9ae7ce5e6366-kube-api-access-plcgx" (OuterVolumeSpecName: "kube-api-access-plcgx") pod "a6f55d8b-22d1-4b1a-99a4-9ae7ce5e6366" (UID: "a6f55d8b-22d1-4b1a-99a4-9ae7ce5e6366"). InnerVolumeSpecName "kube-api-access-plcgx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 10:12:04 crc kubenswrapper[4861]: I0309 10:12:04.892767 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-plcgx\" (UniqueName: \"kubernetes.io/projected/a6f55d8b-22d1-4b1a-99a4-9ae7ce5e6366-kube-api-access-plcgx\") on node \"crc\" DevicePath \"\""
Mar 09 10:12:05 crc kubenswrapper[4861]: I0309 10:12:05.365595 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550852-hspj2" event={"ID":"a6f55d8b-22d1-4b1a-99a4-9ae7ce5e6366","Type":"ContainerDied","Data":"2b68d9395f92da5797d37d710c457995bc34fab427e5d6d045b2087c27bd441f"}
Mar 09 10:12:05 crc kubenswrapper[4861]: I0309 10:12:05.365652 4861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2b68d9395f92da5797d37d710c457995bc34fab427e5d6d045b2087c27bd441f"
Mar 09 10:12:05 crc kubenswrapper[4861]: I0309 10:12:05.365987 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550852-hspj2"
Mar 09 10:12:05 crc kubenswrapper[4861]: I0309 10:12:05.421004 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550846-ffzgt"]
Mar 09 10:12:05 crc kubenswrapper[4861]: I0309 10:12:05.429937 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550846-ffzgt"]
Mar 09 10:12:05 crc kubenswrapper[4861]: I0309 10:12:05.668771 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f05e1178-3781-49c1-a69f-8247af50a922" path="/var/lib/kubelet/pods/f05e1178-3781-49c1-a69f-8247af50a922/volumes"
Mar 09 10:12:19 crc kubenswrapper[4861]: I0309 10:12:19.486945 4861 generic.go:334] "Generic (PLEG): container finished" podID="b75097e6-51ad-4fb9-ba08-7accb1d1d866" containerID="162595f9247d08dd3b707d0360e464245380ea9d519c544096833e89f7ea5340" exitCode=0
Mar 09 10:12:19 crc kubenswrapper[4861]: I0309 10:12:19.487029 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wqw8f/crc-debug-fshsr" event={"ID":"b75097e6-51ad-4fb9-ba08-7accb1d1d866","Type":"ContainerDied","Data":"162595f9247d08dd3b707d0360e464245380ea9d519c544096833e89f7ea5340"}
Mar 09 10:12:20 crc kubenswrapper[4861]: I0309 10:12:20.594664 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wqw8f/crc-debug-fshsr"
Mar 09 10:12:20 crc kubenswrapper[4861]: I0309 10:12:20.630332 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-wqw8f/crc-debug-fshsr"]
Mar 09 10:12:20 crc kubenswrapper[4861]: I0309 10:12:20.639686 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-wqw8f/crc-debug-fshsr"]
Mar 09 10:12:20 crc kubenswrapper[4861]: I0309 10:12:20.707706 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v974q\" (UniqueName: \"kubernetes.io/projected/b75097e6-51ad-4fb9-ba08-7accb1d1d866-kube-api-access-v974q\") pod \"b75097e6-51ad-4fb9-ba08-7accb1d1d866\" (UID: \"b75097e6-51ad-4fb9-ba08-7accb1d1d866\") "
Mar 09 10:12:20 crc kubenswrapper[4861]: I0309 10:12:20.707826 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b75097e6-51ad-4fb9-ba08-7accb1d1d866-host\") pod \"b75097e6-51ad-4fb9-ba08-7accb1d1d866\" (UID: \"b75097e6-51ad-4fb9-ba08-7accb1d1d866\") "
Mar 09 10:12:20 crc kubenswrapper[4861]: I0309 10:12:20.707960 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b75097e6-51ad-4fb9-ba08-7accb1d1d866-host" (OuterVolumeSpecName: "host") pod "b75097e6-51ad-4fb9-ba08-7accb1d1d866" (UID: "b75097e6-51ad-4fb9-ba08-7accb1d1d866"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 09 10:12:20 crc kubenswrapper[4861]: I0309 10:12:20.708327 4861 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b75097e6-51ad-4fb9-ba08-7accb1d1d866-host\") on node \"crc\" DevicePath \"\""
Mar 09 10:12:20 crc kubenswrapper[4861]: I0309 10:12:20.724230 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b75097e6-51ad-4fb9-ba08-7accb1d1d866-kube-api-access-v974q" (OuterVolumeSpecName: "kube-api-access-v974q") pod "b75097e6-51ad-4fb9-ba08-7accb1d1d866" (UID: "b75097e6-51ad-4fb9-ba08-7accb1d1d866"). InnerVolumeSpecName "kube-api-access-v974q". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 10:12:20 crc kubenswrapper[4861]: I0309 10:12:20.810756 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v974q\" (UniqueName: \"kubernetes.io/projected/b75097e6-51ad-4fb9-ba08-7accb1d1d866-kube-api-access-v974q\") on node \"crc\" DevicePath \"\""
Mar 09 10:12:21 crc kubenswrapper[4861]: I0309 10:12:21.506099 4861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e3d1c81da30d0f3baffd84c091b6ee79d76350d21bfe0a35280b439e3ae14c74"
Mar 09 10:12:21 crc kubenswrapper[4861]: I0309 10:12:21.506186 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wqw8f/crc-debug-fshsr"
Mar 09 10:12:21 crc kubenswrapper[4861]: I0309 10:12:21.668745 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b75097e6-51ad-4fb9-ba08-7accb1d1d866" path="/var/lib/kubelet/pods/b75097e6-51ad-4fb9-ba08-7accb1d1d866/volumes"
Mar 09 10:12:21 crc kubenswrapper[4861]: I0309 10:12:21.819520 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-wqw8f/crc-debug-559dz"]
Mar 09 10:12:21 crc kubenswrapper[4861]: E0309 10:12:21.820011 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b75097e6-51ad-4fb9-ba08-7accb1d1d866" containerName="container-00"
Mar 09 10:12:21 crc kubenswrapper[4861]: I0309 10:12:21.820036 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="b75097e6-51ad-4fb9-ba08-7accb1d1d866" containerName="container-00"
Mar 09 10:12:21 crc kubenswrapper[4861]: E0309 10:12:21.820074 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6f55d8b-22d1-4b1a-99a4-9ae7ce5e6366" containerName="oc"
Mar 09 10:12:21 crc kubenswrapper[4861]: I0309 10:12:21.820084 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6f55d8b-22d1-4b1a-99a4-9ae7ce5e6366" containerName="oc"
Mar 09 10:12:21 crc kubenswrapper[4861]: I0309 10:12:21.820325 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6f55d8b-22d1-4b1a-99a4-9ae7ce5e6366" containerName="oc"
Mar 09 10:12:21 crc kubenswrapper[4861]: I0309 10:12:21.820360 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="b75097e6-51ad-4fb9-ba08-7accb1d1d866" containerName="container-00"
Mar 09 10:12:21 crc kubenswrapper[4861]: I0309 10:12:21.821237 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wqw8f/crc-debug-559dz"
Mar 09 10:12:21 crc kubenswrapper[4861]: I0309 10:12:21.823601 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-wqw8f"/"default-dockercfg-r4n5s"
Mar 09 10:12:21 crc kubenswrapper[4861]: I0309 10:12:21.930951 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dc58dba0-5093-4545-8962-f80724f213f6-host\") pod \"crc-debug-559dz\" (UID: \"dc58dba0-5093-4545-8962-f80724f213f6\") " pod="openshift-must-gather-wqw8f/crc-debug-559dz"
Mar 09 10:12:21 crc kubenswrapper[4861]: I0309 10:12:21.931015 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hl4lb\" (UniqueName: \"kubernetes.io/projected/dc58dba0-5093-4545-8962-f80724f213f6-kube-api-access-hl4lb\") pod \"crc-debug-559dz\" (UID: \"dc58dba0-5093-4545-8962-f80724f213f6\") " pod="openshift-must-gather-wqw8f/crc-debug-559dz"
Mar 09 10:12:22 crc kubenswrapper[4861]: I0309 10:12:22.032500 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dc58dba0-5093-4545-8962-f80724f213f6-host\") pod \"crc-debug-559dz\" (UID: \"dc58dba0-5093-4545-8962-f80724f213f6\") " pod="openshift-must-gather-wqw8f/crc-debug-559dz"
Mar 09 10:12:22 crc kubenswrapper[4861]: I0309 10:12:22.032564 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hl4lb\" (UniqueName: \"kubernetes.io/projected/dc58dba0-5093-4545-8962-f80724f213f6-kube-api-access-hl4lb\") pod \"crc-debug-559dz\" (UID: \"dc58dba0-5093-4545-8962-f80724f213f6\") " pod="openshift-must-gather-wqw8f/crc-debug-559dz"
Mar 09 10:12:22 crc kubenswrapper[4861]: I0309 10:12:22.032716 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dc58dba0-5093-4545-8962-f80724f213f6-host\") pod \"crc-debug-559dz\" (UID: \"dc58dba0-5093-4545-8962-f80724f213f6\") " pod="openshift-must-gather-wqw8f/crc-debug-559dz"
Mar 09 10:12:22 crc kubenswrapper[4861]: I0309 10:12:22.050684 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hl4lb\" (UniqueName: \"kubernetes.io/projected/dc58dba0-5093-4545-8962-f80724f213f6-kube-api-access-hl4lb\") pod \"crc-debug-559dz\" (UID: \"dc58dba0-5093-4545-8962-f80724f213f6\") " pod="openshift-must-gather-wqw8f/crc-debug-559dz"
Mar 09 10:12:22 crc kubenswrapper[4861]: I0309 10:12:22.139267 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wqw8f/crc-debug-559dz"
Mar 09 10:12:22 crc kubenswrapper[4861]: I0309 10:12:22.520124 4861 generic.go:334] "Generic (PLEG): container finished" podID="dc58dba0-5093-4545-8962-f80724f213f6" containerID="0bed95835d789d58d0c0715d28da59a92410e8efeaf86b0c98e5a467fb529fe6" exitCode=0
Mar 09 10:12:22 crc kubenswrapper[4861]: I0309 10:12:22.520470 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wqw8f/crc-debug-559dz" event={"ID":"dc58dba0-5093-4545-8962-f80724f213f6","Type":"ContainerDied","Data":"0bed95835d789d58d0c0715d28da59a92410e8efeaf86b0c98e5a467fb529fe6"}
Mar 09 10:12:22 crc kubenswrapper[4861]: I0309 10:12:22.520557 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wqw8f/crc-debug-559dz" event={"ID":"dc58dba0-5093-4545-8962-f80724f213f6","Type":"ContainerStarted","Data":"ef6b62cce50d1b853a95ac988a63face0843ee272de87206c6eb3f230fd0c7ac"}
Mar 09 10:12:22 crc kubenswrapper[4861]: I0309 10:12:22.915651 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-wqw8f/crc-debug-559dz"]
Mar 09 10:12:22 crc kubenswrapper[4861]: I0309 10:12:22.926040 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-wqw8f/crc-debug-559dz"]
Mar 09 10:12:23 crc kubenswrapper[4861]: I0309 10:12:23.691150 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wqw8f/crc-debug-559dz"
Mar 09 10:12:23 crc kubenswrapper[4861]: I0309 10:12:23.759622 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dc58dba0-5093-4545-8962-f80724f213f6-host\") pod \"dc58dba0-5093-4545-8962-f80724f213f6\" (UID: \"dc58dba0-5093-4545-8962-f80724f213f6\") "
Mar 09 10:12:23 crc kubenswrapper[4861]: I0309 10:12:23.759701 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hl4lb\" (UniqueName: \"kubernetes.io/projected/dc58dba0-5093-4545-8962-f80724f213f6-kube-api-access-hl4lb\") pod \"dc58dba0-5093-4545-8962-f80724f213f6\" (UID: \"dc58dba0-5093-4545-8962-f80724f213f6\") "
Mar 09 10:12:23 crc kubenswrapper[4861]: I0309 10:12:23.759727 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dc58dba0-5093-4545-8962-f80724f213f6-host" (OuterVolumeSpecName: "host") pod "dc58dba0-5093-4545-8962-f80724f213f6" (UID: "dc58dba0-5093-4545-8962-f80724f213f6"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 09 10:12:23 crc kubenswrapper[4861]: I0309 10:12:23.760534 4861 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dc58dba0-5093-4545-8962-f80724f213f6-host\") on node \"crc\" DevicePath \"\""
Mar 09 10:12:23 crc kubenswrapper[4861]: I0309 10:12:23.765065 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc58dba0-5093-4545-8962-f80724f213f6-kube-api-access-hl4lb" (OuterVolumeSpecName: "kube-api-access-hl4lb") pod "dc58dba0-5093-4545-8962-f80724f213f6" (UID: "dc58dba0-5093-4545-8962-f80724f213f6"). InnerVolumeSpecName "kube-api-access-hl4lb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 10:12:23 crc kubenswrapper[4861]: I0309 10:12:23.862459 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hl4lb\" (UniqueName: \"kubernetes.io/projected/dc58dba0-5093-4545-8962-f80724f213f6-kube-api-access-hl4lb\") on node \"crc\" DevicePath \"\""
Mar 09 10:12:24 crc kubenswrapper[4861]: I0309 10:12:24.086566 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-wqw8f/crc-debug-7dj9m"]
Mar 09 10:12:24 crc kubenswrapper[4861]: E0309 10:12:24.086943 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc58dba0-5093-4545-8962-f80724f213f6" containerName="container-00"
Mar 09 10:12:24 crc kubenswrapper[4861]: I0309 10:12:24.086955 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc58dba0-5093-4545-8962-f80724f213f6" containerName="container-00"
Mar 09 10:12:24 crc kubenswrapper[4861]: I0309 10:12:24.087182 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc58dba0-5093-4545-8962-f80724f213f6" containerName="container-00"
Mar 09 10:12:24 crc kubenswrapper[4861]: I0309 10:12:24.087773 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wqw8f/crc-debug-7dj9m"
Mar 09 10:12:24 crc kubenswrapper[4861]: I0309 10:12:24.168476 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ccwjq\" (UniqueName: \"kubernetes.io/projected/2cfa30e7-6438-48c0-890d-42a0d6405a83-kube-api-access-ccwjq\") pod \"crc-debug-7dj9m\" (UID: \"2cfa30e7-6438-48c0-890d-42a0d6405a83\") " pod="openshift-must-gather-wqw8f/crc-debug-7dj9m"
Mar 09 10:12:24 crc kubenswrapper[4861]: I0309 10:12:24.168521 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2cfa30e7-6438-48c0-890d-42a0d6405a83-host\") pod \"crc-debug-7dj9m\" (UID: \"2cfa30e7-6438-48c0-890d-42a0d6405a83\") " pod="openshift-must-gather-wqw8f/crc-debug-7dj9m"
Mar 09 10:12:24 crc kubenswrapper[4861]: I0309 10:12:24.270861 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ccwjq\" (UniqueName: \"kubernetes.io/projected/2cfa30e7-6438-48c0-890d-42a0d6405a83-kube-api-access-ccwjq\") pod \"crc-debug-7dj9m\" (UID: \"2cfa30e7-6438-48c0-890d-42a0d6405a83\") " pod="openshift-must-gather-wqw8f/crc-debug-7dj9m"
Mar 09 10:12:24 crc kubenswrapper[4861]: I0309 10:12:24.271112 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2cfa30e7-6438-48c0-890d-42a0d6405a83-host\") pod \"crc-debug-7dj9m\" (UID: \"2cfa30e7-6438-48c0-890d-42a0d6405a83\") " pod="openshift-must-gather-wqw8f/crc-debug-7dj9m"
Mar 09 10:12:24 crc kubenswrapper[4861]: I0309 10:12:24.271183 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2cfa30e7-6438-48c0-890d-42a0d6405a83-host\") pod \"crc-debug-7dj9m\" (UID: \"2cfa30e7-6438-48c0-890d-42a0d6405a83\") " pod="openshift-must-gather-wqw8f/crc-debug-7dj9m"
Mar 09 10:12:24 crc 
kubenswrapper[4861]: I0309 10:12:24.289307 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ccwjq\" (UniqueName: \"kubernetes.io/projected/2cfa30e7-6438-48c0-890d-42a0d6405a83-kube-api-access-ccwjq\") pod \"crc-debug-7dj9m\" (UID: \"2cfa30e7-6438-48c0-890d-42a0d6405a83\") " pod="openshift-must-gather-wqw8f/crc-debug-7dj9m" Mar 09 10:12:24 crc kubenswrapper[4861]: I0309 10:12:24.404240 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wqw8f/crc-debug-7dj9m" Mar 09 10:12:24 crc kubenswrapper[4861]: I0309 10:12:24.553147 4861 scope.go:117] "RemoveContainer" containerID="0bed95835d789d58d0c0715d28da59a92410e8efeaf86b0c98e5a467fb529fe6" Mar 09 10:12:24 crc kubenswrapper[4861]: I0309 10:12:24.553267 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wqw8f/crc-debug-559dz" Mar 09 10:12:24 crc kubenswrapper[4861]: I0309 10:12:24.564643 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wqw8f/crc-debug-7dj9m" event={"ID":"2cfa30e7-6438-48c0-890d-42a0d6405a83","Type":"ContainerStarted","Data":"ae2eb398ca8493af83d0a1f2576f3b158379db33d1e07544adf55229bd5b38b5"} Mar 09 10:12:25 crc kubenswrapper[4861]: I0309 10:12:25.574694 4861 generic.go:334] "Generic (PLEG): container finished" podID="2cfa30e7-6438-48c0-890d-42a0d6405a83" containerID="ba4b89d3ecce548d7ad6c8f5bf022cc468e768a187dbe927d9e724babd31a110" exitCode=0 Mar 09 10:12:25 crc kubenswrapper[4861]: I0309 10:12:25.574758 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wqw8f/crc-debug-7dj9m" event={"ID":"2cfa30e7-6438-48c0-890d-42a0d6405a83","Type":"ContainerDied","Data":"ba4b89d3ecce548d7ad6c8f5bf022cc468e768a187dbe927d9e724babd31a110"} Mar 09 10:12:25 crc kubenswrapper[4861]: I0309 10:12:25.627662 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-wqw8f/crc-debug-7dj9m"] Mar 09 
10:12:25 crc kubenswrapper[4861]: I0309 10:12:25.635985 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-wqw8f/crc-debug-7dj9m"] Mar 09 10:12:25 crc kubenswrapper[4861]: I0309 10:12:25.668233 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc58dba0-5093-4545-8962-f80724f213f6" path="/var/lib/kubelet/pods/dc58dba0-5093-4545-8962-f80724f213f6/volumes" Mar 09 10:12:26 crc kubenswrapper[4861]: I0309 10:12:26.704080 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wqw8f/crc-debug-7dj9m" Mar 09 10:12:26 crc kubenswrapper[4861]: I0309 10:12:26.824620 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2cfa30e7-6438-48c0-890d-42a0d6405a83-host\") pod \"2cfa30e7-6438-48c0-890d-42a0d6405a83\" (UID: \"2cfa30e7-6438-48c0-890d-42a0d6405a83\") " Mar 09 10:12:26 crc kubenswrapper[4861]: I0309 10:12:26.824729 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2cfa30e7-6438-48c0-890d-42a0d6405a83-host" (OuterVolumeSpecName: "host") pod "2cfa30e7-6438-48c0-890d-42a0d6405a83" (UID: "2cfa30e7-6438-48c0-890d-42a0d6405a83"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 10:12:26 crc kubenswrapper[4861]: I0309 10:12:26.824749 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ccwjq\" (UniqueName: \"kubernetes.io/projected/2cfa30e7-6438-48c0-890d-42a0d6405a83-kube-api-access-ccwjq\") pod \"2cfa30e7-6438-48c0-890d-42a0d6405a83\" (UID: \"2cfa30e7-6438-48c0-890d-42a0d6405a83\") " Mar 09 10:12:26 crc kubenswrapper[4861]: I0309 10:12:26.825802 4861 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2cfa30e7-6438-48c0-890d-42a0d6405a83-host\") on node \"crc\" DevicePath \"\"" Mar 09 10:12:26 crc kubenswrapper[4861]: I0309 10:12:26.831618 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2cfa30e7-6438-48c0-890d-42a0d6405a83-kube-api-access-ccwjq" (OuterVolumeSpecName: "kube-api-access-ccwjq") pod "2cfa30e7-6438-48c0-890d-42a0d6405a83" (UID: "2cfa30e7-6438-48c0-890d-42a0d6405a83"). InnerVolumeSpecName "kube-api-access-ccwjq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 10:12:26 crc kubenswrapper[4861]: I0309 10:12:26.927310 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ccwjq\" (UniqueName: \"kubernetes.io/projected/2cfa30e7-6438-48c0-890d-42a0d6405a83-kube-api-access-ccwjq\") on node \"crc\" DevicePath \"\"" Mar 09 10:12:27 crc kubenswrapper[4861]: I0309 10:12:27.595297 4861 scope.go:117] "RemoveContainer" containerID="ba4b89d3ecce548d7ad6c8f5bf022cc468e768a187dbe927d9e724babd31a110" Mar 09 10:12:27 crc kubenswrapper[4861]: I0309 10:12:27.595353 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-wqw8f/crc-debug-7dj9m" Mar 09 10:12:27 crc kubenswrapper[4861]: I0309 10:12:27.669515 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2cfa30e7-6438-48c0-890d-42a0d6405a83" path="/var/lib/kubelet/pods/2cfa30e7-6438-48c0-890d-42a0d6405a83/volumes" Mar 09 10:12:38 crc kubenswrapper[4861]: I0309 10:12:38.634570 4861 scope.go:117] "RemoveContainer" containerID="542882b525db1989b92af9b58fe0999c0683a62bbd7717d146555b4991e800b5" Mar 09 10:12:51 crc kubenswrapper[4861]: I0309 10:12:51.460936 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-9qrfh"] Mar 09 10:12:51 crc kubenswrapper[4861]: E0309 10:12:51.461958 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cfa30e7-6438-48c0-890d-42a0d6405a83" containerName="container-00" Mar 09 10:12:51 crc kubenswrapper[4861]: I0309 10:12:51.461974 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cfa30e7-6438-48c0-890d-42a0d6405a83" containerName="container-00" Mar 09 10:12:51 crc kubenswrapper[4861]: I0309 10:12:51.462186 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="2cfa30e7-6438-48c0-890d-42a0d6405a83" containerName="container-00" Mar 09 10:12:51 crc kubenswrapper[4861]: I0309 10:12:51.463946 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9qrfh" Mar 09 10:12:51 crc kubenswrapper[4861]: I0309 10:12:51.483955 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd6fb393-541e-443b-8a05-9b2bd02aa078-utilities\") pod \"redhat-operators-9qrfh\" (UID: \"dd6fb393-541e-443b-8a05-9b2bd02aa078\") " pod="openshift-marketplace/redhat-operators-9qrfh" Mar 09 10:12:51 crc kubenswrapper[4861]: I0309 10:12:51.484335 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd6fb393-541e-443b-8a05-9b2bd02aa078-catalog-content\") pod \"redhat-operators-9qrfh\" (UID: \"dd6fb393-541e-443b-8a05-9b2bd02aa078\") " pod="openshift-marketplace/redhat-operators-9qrfh" Mar 09 10:12:51 crc kubenswrapper[4861]: I0309 10:12:51.484850 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8wr2\" (UniqueName: \"kubernetes.io/projected/dd6fb393-541e-443b-8a05-9b2bd02aa078-kube-api-access-b8wr2\") pod \"redhat-operators-9qrfh\" (UID: \"dd6fb393-541e-443b-8a05-9b2bd02aa078\") " pod="openshift-marketplace/redhat-operators-9qrfh" Mar 09 10:12:51 crc kubenswrapper[4861]: I0309 10:12:51.495713 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9qrfh"] Mar 09 10:12:51 crc kubenswrapper[4861]: I0309 10:12:51.586440 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8wr2\" (UniqueName: \"kubernetes.io/projected/dd6fb393-541e-443b-8a05-9b2bd02aa078-kube-api-access-b8wr2\") pod \"redhat-operators-9qrfh\" (UID: \"dd6fb393-541e-443b-8a05-9b2bd02aa078\") " pod="openshift-marketplace/redhat-operators-9qrfh" Mar 09 10:12:51 crc kubenswrapper[4861]: I0309 10:12:51.586548 4861 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd6fb393-541e-443b-8a05-9b2bd02aa078-utilities\") pod \"redhat-operators-9qrfh\" (UID: \"dd6fb393-541e-443b-8a05-9b2bd02aa078\") " pod="openshift-marketplace/redhat-operators-9qrfh" Mar 09 10:12:51 crc kubenswrapper[4861]: I0309 10:12:51.586571 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd6fb393-541e-443b-8a05-9b2bd02aa078-catalog-content\") pod \"redhat-operators-9qrfh\" (UID: \"dd6fb393-541e-443b-8a05-9b2bd02aa078\") " pod="openshift-marketplace/redhat-operators-9qrfh" Mar 09 10:12:51 crc kubenswrapper[4861]: I0309 10:12:51.587105 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd6fb393-541e-443b-8a05-9b2bd02aa078-catalog-content\") pod \"redhat-operators-9qrfh\" (UID: \"dd6fb393-541e-443b-8a05-9b2bd02aa078\") " pod="openshift-marketplace/redhat-operators-9qrfh" Mar 09 10:12:51 crc kubenswrapper[4861]: I0309 10:12:51.587581 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd6fb393-541e-443b-8a05-9b2bd02aa078-utilities\") pod \"redhat-operators-9qrfh\" (UID: \"dd6fb393-541e-443b-8a05-9b2bd02aa078\") " pod="openshift-marketplace/redhat-operators-9qrfh" Mar 09 10:12:51 crc kubenswrapper[4861]: I0309 10:12:51.605826 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8wr2\" (UniqueName: \"kubernetes.io/projected/dd6fb393-541e-443b-8a05-9b2bd02aa078-kube-api-access-b8wr2\") pod \"redhat-operators-9qrfh\" (UID: \"dd6fb393-541e-443b-8a05-9b2bd02aa078\") " pod="openshift-marketplace/redhat-operators-9qrfh" Mar 09 10:12:51 crc kubenswrapper[4861]: I0309 10:12:51.785770 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9qrfh" Mar 09 10:12:52 crc kubenswrapper[4861]: I0309 10:12:52.232659 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9qrfh"] Mar 09 10:12:52 crc kubenswrapper[4861]: I0309 10:12:52.852279 4861 generic.go:334] "Generic (PLEG): container finished" podID="dd6fb393-541e-443b-8a05-9b2bd02aa078" containerID="64ed2924118c58be019a6de8a2c4addae7221e7c123b28de632d785f73e05032" exitCode=0 Mar 09 10:12:52 crc kubenswrapper[4861]: I0309 10:12:52.852394 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9qrfh" event={"ID":"dd6fb393-541e-443b-8a05-9b2bd02aa078","Type":"ContainerDied","Data":"64ed2924118c58be019a6de8a2c4addae7221e7c123b28de632d785f73e05032"} Mar 09 10:12:52 crc kubenswrapper[4861]: I0309 10:12:52.852681 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9qrfh" event={"ID":"dd6fb393-541e-443b-8a05-9b2bd02aa078","Type":"ContainerStarted","Data":"3370b6eb7752962dac5919045c622d63e955c4e8345bd85c567e06ae57494058"} Mar 09 10:12:53 crc kubenswrapper[4861]: I0309 10:12:53.861772 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9qrfh" event={"ID":"dd6fb393-541e-443b-8a05-9b2bd02aa078","Type":"ContainerStarted","Data":"6ff7396c6d990c3813e20eb787f999d70df3b7c40863a53b4587a5341e6d733d"} Mar 09 10:12:54 crc kubenswrapper[4861]: I0309 10:12:54.872323 4861 generic.go:334] "Generic (PLEG): container finished" podID="dd6fb393-541e-443b-8a05-9b2bd02aa078" containerID="6ff7396c6d990c3813e20eb787f999d70df3b7c40863a53b4587a5341e6d733d" exitCode=0 Mar 09 10:12:54 crc kubenswrapper[4861]: I0309 10:12:54.872409 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9qrfh" 
event={"ID":"dd6fb393-541e-443b-8a05-9b2bd02aa078","Type":"ContainerDied","Data":"6ff7396c6d990c3813e20eb787f999d70df3b7c40863a53b4587a5341e6d733d"} Mar 09 10:12:55 crc kubenswrapper[4861]: I0309 10:12:55.418847 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-5b9d58d97d-h7pvq_23b061c3-2bd5-4b7c-bdf6-76da2791cc8e/barbican-api/0.log" Mar 09 10:12:55 crc kubenswrapper[4861]: I0309 10:12:55.641948 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-5b9d58d97d-h7pvq_23b061c3-2bd5-4b7c-bdf6-76da2791cc8e/barbican-api-log/0.log" Mar 09 10:12:55 crc kubenswrapper[4861]: I0309 10:12:55.682573 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7f66785d8-vkcmq_8bc3e378-d567-4ba4-b135-1393faa1dbc6/barbican-keystone-listener/0.log" Mar 09 10:12:55 crc kubenswrapper[4861]: I0309 10:12:55.720229 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7f66785d8-vkcmq_8bc3e378-d567-4ba4-b135-1393faa1dbc6/barbican-keystone-listener-log/0.log" Mar 09 10:12:55 crc kubenswrapper[4861]: I0309 10:12:55.865186 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-7b6689bdbc-t6phd_82a35d2d-6934-4c56-a62d-db22ac36a6be/barbican-worker/0.log" Mar 09 10:12:55 crc kubenswrapper[4861]: I0309 10:12:55.883798 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9qrfh" event={"ID":"dd6fb393-541e-443b-8a05-9b2bd02aa078","Type":"ContainerStarted","Data":"809f3c8eff195f77c5959a8b59e9237b2d7b2cdd3abc5a968976b0b2e64a68ea"} Mar 09 10:12:55 crc kubenswrapper[4861]: I0309 10:12:55.905724 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-7b6689bdbc-t6phd_82a35d2d-6934-4c56-a62d-db22ac36a6be/barbican-worker-log/0.log" Mar 09 10:12:55 crc kubenswrapper[4861]: I0309 10:12:55.911708 4861 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-marketplace/redhat-operators-9qrfh" podStartSLOduration=2.43915757 podStartE2EDuration="4.91168437s" podCreationTimestamp="2026-03-09 10:12:51 +0000 UTC" firstStartedPulling="2026-03-09 10:12:52.854522848 +0000 UTC m=+4015.939562249" lastFinishedPulling="2026-03-09 10:12:55.327049648 +0000 UTC m=+4018.412089049" observedRunningTime="2026-03-09 10:12:55.903086689 +0000 UTC m=+4018.988126100" watchObservedRunningTime="2026-03-09 10:12:55.91168437 +0000 UTC m=+4018.996723781" Mar 09 10:12:56 crc kubenswrapper[4861]: I0309 10:12:56.146490 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-6k9cp_b1f90870-1ef5-46d6-b495-f41e2d14a888/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Mar 09 10:12:56 crc kubenswrapper[4861]: I0309 10:12:56.201409 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_75432149-8e10-4aae-8ad4-fbf3b5a10063/ceilometer-central-agent/0.log" Mar 09 10:12:56 crc kubenswrapper[4861]: I0309 10:12:56.311036 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_75432149-8e10-4aae-8ad4-fbf3b5a10063/ceilometer-notification-agent/0.log" Mar 09 10:12:56 crc kubenswrapper[4861]: I0309 10:12:56.342677 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_75432149-8e10-4aae-8ad4-fbf3b5a10063/proxy-httpd/0.log" Mar 09 10:12:56 crc kubenswrapper[4861]: I0309 10:12:56.382329 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_75432149-8e10-4aae-8ad4-fbf3b5a10063/sg-core/0.log" Mar 09 10:12:56 crc kubenswrapper[4861]: I0309 10:12:56.589064 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_c9fa67cc-6a0f-485d-b064-cd14971058db/cinder-api/0.log" Mar 09 10:12:56 crc kubenswrapper[4861]: I0309 10:12:56.612995 4861 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_cinder-api-0_c9fa67cc-6a0f-485d-b064-cd14971058db/cinder-api-log/0.log" Mar 09 10:12:56 crc kubenswrapper[4861]: I0309 10:12:56.814613 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_072cabf9-18cb-4562-a6a2-7f2b46a4f9ec/cinder-scheduler/0.log" Mar 09 10:12:56 crc kubenswrapper[4861]: I0309 10:12:56.883000 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_072cabf9-18cb-4562-a6a2-7f2b46a4f9ec/probe/0.log" Mar 09 10:12:57 crc kubenswrapper[4861]: I0309 10:12:57.109049 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-ww9c7_b6f88b43-ae35-4f74-b14a-96332076ed1f/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 09 10:12:57 crc kubenswrapper[4861]: I0309 10:12:57.339724 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-dgmz7_7bbe9e42-4b9b-42e7-bfed-ff93ff905164/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 09 10:12:57 crc kubenswrapper[4861]: I0309 10:12:57.434060 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-79fcc958f9-jb5bq_780ba45c-97cb-4382-9d7a-268051c773d1/init/0.log" Mar 09 10:12:57 crc kubenswrapper[4861]: I0309 10:12:57.738459 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-79fcc958f9-jb5bq_780ba45c-97cb-4382-9d7a-268051c773d1/dnsmasq-dns/0.log" Mar 09 10:12:57 crc kubenswrapper[4861]: I0309 10:12:57.783455 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-79fcc958f9-jb5bq_780ba45c-97cb-4382-9d7a-268051c773d1/init/0.log" Mar 09 10:12:57 crc kubenswrapper[4861]: I0309 10:12:57.801692 4861 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-2gml2_28e43d3e-921e-4f6c-be2f-f37e5625374a/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Mar 09 10:12:58 crc kubenswrapper[4861]: I0309 10:12:58.032333 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_d237bf3c-da06-48d8-aef3-91be47f05320/glance-httpd/0.log" Mar 09 10:12:58 crc kubenswrapper[4861]: I0309 10:12:58.141029 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_d237bf3c-da06-48d8-aef3-91be47f05320/glance-log/0.log" Mar 09 10:12:58 crc kubenswrapper[4861]: I0309 10:12:58.288329 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_6611f0ac-3406-4da9-b81a-1515dddfafcd/glance-httpd/0.log" Mar 09 10:12:58 crc kubenswrapper[4861]: I0309 10:12:58.298151 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_6611f0ac-3406-4da9-b81a-1515dddfafcd/glance-log/0.log" Mar 09 10:12:58 crc kubenswrapper[4861]: I0309 10:12:58.413278 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-5487f4d458-lnthc_9049886d-2460-47fe-ac82-2dfde4858bd0/horizon/0.log" Mar 09 10:12:58 crc kubenswrapper[4861]: I0309 10:12:58.669281 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-m4ds9_2420e9c4-faed-48f0-857d-4aba72c5cab2/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Mar 09 10:12:58 crc kubenswrapper[4861]: I0309 10:12:58.898480 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-5487f4d458-lnthc_9049886d-2460-47fe-ac82-2dfde4858bd0/horizon-log/0.log" Mar 09 10:12:58 crc kubenswrapper[4861]: I0309 10:12:58.938395 4861 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-j8d2h_0bb63c32-5f67-4912-b238-893dc92107b9/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 09 10:12:59 crc kubenswrapper[4861]: I0309 10:12:59.232454 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29550841-pc6h2_2153c5af-92d8-4f6e-b299-8b06d30603f5/keystone-cron/0.log" Mar 09 10:12:59 crc kubenswrapper[4861]: I0309 10:12:59.344267 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-5cc5bc567f-7k86v_596fb22d-649e-4e00-b847-71b506786832/keystone-api/0.log" Mar 09 10:12:59 crc kubenswrapper[4861]: I0309 10:12:59.546864 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_5a9682bd-f0fc-47d6-9a66-e35fb0630f44/kube-state-metrics/0.log" Mar 09 10:12:59 crc kubenswrapper[4861]: I0309 10:12:59.650400 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-ngps7_d783d4c7-dfa9-4783-a80c-2938d2a5841d/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Mar 09 10:13:00 crc kubenswrapper[4861]: I0309 10:13:00.246900 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-665f4b6689-tfdk9_db5464b8-011f-4569-a47e-36766fa6c72e/neutron-api/0.log" Mar 09 10:13:00 crc kubenswrapper[4861]: I0309 10:13:00.265179 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-665f4b6689-tfdk9_db5464b8-011f-4569-a47e-36766fa6c72e/neutron-httpd/0.log" Mar 09 10:13:00 crc kubenswrapper[4861]: I0309 10:13:00.366413 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-d9zgg_9e525f15-c77e-4a1c-a161-4db82064bf70/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Mar 09 10:13:01 crc kubenswrapper[4861]: I0309 10:13:01.062859 4861 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-api-0_a7f9f9a6-c593-4015-833b-ef237f492b70/nova-api-log/0.log" Mar 09 10:13:01 crc kubenswrapper[4861]: I0309 10:13:01.468669 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_c5b057f4-8239-4e46-b205-81552d6cd5e6/nova-cell1-conductor-conductor/0.log" Mar 09 10:13:01 crc kubenswrapper[4861]: I0309 10:13:01.547807 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_fc47e276-b337-4696-ac08-1aa31c4b6864/nova-cell0-conductor-conductor/0.log" Mar 09 10:13:01 crc kubenswrapper[4861]: I0309 10:13:01.597577 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_a7f9f9a6-c593-4015-833b-ef237f492b70/nova-api-api/0.log" Mar 09 10:13:01 crc kubenswrapper[4861]: I0309 10:13:01.703507 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_dd2e86d2-700f-4fd8-b89b-84bd5a09069d/nova-cell1-novncproxy-novncproxy/0.log" Mar 09 10:13:01 crc kubenswrapper[4861]: I0309 10:13:01.786526 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-9qrfh" Mar 09 10:13:01 crc kubenswrapper[4861]: I0309 10:13:01.786596 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-9qrfh" Mar 09 10:13:01 crc kubenswrapper[4861]: I0309 10:13:01.843146 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-mkzpx_23236f7d-915f-4619-b5ba-611375aef594/nova-edpm-deployment-openstack-edpm-ipam/0.log" Mar 09 10:13:02 crc kubenswrapper[4861]: I0309 10:13:02.080304 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_9d5c1be9-8604-4565-9175-703ff865c6eb/nova-metadata-log/0.log" Mar 09 10:13:02 crc kubenswrapper[4861]: I0309 10:13:02.410355 4861 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-scheduler-0_0a9f9492-a68d-4f37-bffc-4f13ebe23db7/nova-scheduler-scheduler/0.log" Mar 09 10:13:02 crc kubenswrapper[4861]: I0309 10:13:02.427464 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_f095ca7b-1959-4cda-bde8-40ca6446e34d/mysql-bootstrap/0.log" Mar 09 10:13:02 crc kubenswrapper[4861]: I0309 10:13:02.708134 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_f095ca7b-1959-4cda-bde8-40ca6446e34d/mysql-bootstrap/0.log" Mar 09 10:13:02 crc kubenswrapper[4861]: I0309 10:13:02.708435 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_f095ca7b-1959-4cda-bde8-40ca6446e34d/galera/0.log" Mar 09 10:13:02 crc kubenswrapper[4861]: I0309 10:13:02.844794 4861 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-9qrfh" podUID="dd6fb393-541e-443b-8a05-9b2bd02aa078" containerName="registry-server" probeResult="failure" output=< Mar 09 10:13:02 crc kubenswrapper[4861]: timeout: failed to connect service ":50051" within 1s Mar 09 10:13:02 crc kubenswrapper[4861]: > Mar 09 10:13:02 crc kubenswrapper[4861]: I0309 10:13:02.928564 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_0ab732e3-1122-4f45-a9af-b36eaa88c19e/mysql-bootstrap/0.log" Mar 09 10:13:03 crc kubenswrapper[4861]: I0309 10:13:03.154393 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_0ab732e3-1122-4f45-a9af-b36eaa88c19e/mysql-bootstrap/0.log" Mar 09 10:13:03 crc kubenswrapper[4861]: I0309 10:13:03.212573 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_0ab732e3-1122-4f45-a9af-b36eaa88c19e/galera/0.log" Mar 09 10:13:03 crc kubenswrapper[4861]: I0309 10:13:03.561520 4861 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstackclient_62f873db-0b4f-4a99-bc1d-7cdff56989a2/openstackclient/0.log" Mar 09 10:13:03 crc kubenswrapper[4861]: I0309 10:13:03.582894 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-xvlmj_82770fbe-3052-4367-9c2d-a19a11d3a695/openstack-network-exporter/0.log" Mar 09 10:13:03 crc kubenswrapper[4861]: I0309 10:13:03.618246 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_9d5c1be9-8604-4565-9175-703ff865c6eb/nova-metadata-metadata/0.log" Mar 09 10:13:03 crc kubenswrapper[4861]: I0309 10:13:03.819062 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-hmb5w_cafa7cbf-ac96-4bb5-a33e-90d69df5d797/ovsdb-server-init/0.log" Mar 09 10:13:04 crc kubenswrapper[4861]: I0309 10:13:04.027825 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-hmb5w_cafa7cbf-ac96-4bb5-a33e-90d69df5d797/ovsdb-server-init/0.log" Mar 09 10:13:04 crc kubenswrapper[4861]: I0309 10:13:04.068805 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-hmb5w_cafa7cbf-ac96-4bb5-a33e-90d69df5d797/ovsdb-server/0.log" Mar 09 10:13:04 crc kubenswrapper[4861]: I0309 10:13:04.073460 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-hmb5w_cafa7cbf-ac96-4bb5-a33e-90d69df5d797/ovs-vswitchd/0.log" Mar 09 10:13:04 crc kubenswrapper[4861]: I0309 10:13:04.251403 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-s7nq5_4e354b06-2ae2-41af-b5d7-2909bca8cff6/ovn-controller/0.log" Mar 09 10:13:04 crc kubenswrapper[4861]: I0309 10:13:04.392248 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-9rkkl_47423b67-9acf-48b2-b8b5-d47b822ad425/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Mar 09 10:13:04 crc kubenswrapper[4861]: I0309 10:13:04.481530 
4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_dea6c69a-803b-498c-b7e2-7d76629de3dc/openstack-network-exporter/0.log" Mar 09 10:13:05 crc kubenswrapper[4861]: I0309 10:13:05.107359 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_dea6c69a-803b-498c-b7e2-7d76629de3dc/ovn-northd/0.log" Mar 09 10:13:05 crc kubenswrapper[4861]: I0309 10:13:05.128326 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_1d597158-3a33-4518-a0b9-37cf5b309a28/ovsdbserver-nb/0.log" Mar 09 10:13:05 crc kubenswrapper[4861]: I0309 10:13:05.154441 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_1d597158-3a33-4518-a0b9-37cf5b309a28/openstack-network-exporter/0.log" Mar 09 10:13:05 crc kubenswrapper[4861]: I0309 10:13:05.350294 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_fc895133-add5-4388-8e97-1b0d16306648/openstack-network-exporter/0.log" Mar 09 10:13:05 crc kubenswrapper[4861]: I0309 10:13:05.423517 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_fc895133-add5-4388-8e97-1b0d16306648/ovsdbserver-sb/0.log" Mar 09 10:13:05 crc kubenswrapper[4861]: I0309 10:13:05.719190 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-678fb94c4b-9x5d2_80b2797f-285c-4a23-9385-b4845acb2820/placement-log/0.log" Mar 09 10:13:05 crc kubenswrapper[4861]: I0309 10:13:05.745069 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-678fb94c4b-9x5d2_80b2797f-285c-4a23-9385-b4845acb2820/placement-api/0.log" Mar 09 10:13:05 crc kubenswrapper[4861]: I0309 10:13:05.786637 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_36ab59d0-e730-43a5-a7f1-99f136e5f9d3/setup-container/0.log" Mar 09 10:13:06 crc kubenswrapper[4861]: I0309 10:13:06.005612 4861 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_2c3f8770-f9a3-49ae-81e0-caad7b40ac46/setup-container/0.log" Mar 09 10:13:06 crc kubenswrapper[4861]: I0309 10:13:06.039106 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_36ab59d0-e730-43a5-a7f1-99f136e5f9d3/rabbitmq/0.log" Mar 09 10:13:06 crc kubenswrapper[4861]: I0309 10:13:06.067554 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_36ab59d0-e730-43a5-a7f1-99f136e5f9d3/setup-container/0.log" Mar 09 10:13:06 crc kubenswrapper[4861]: I0309 10:13:06.246863 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_2c3f8770-f9a3-49ae-81e0-caad7b40ac46/setup-container/0.log" Mar 09 10:13:06 crc kubenswrapper[4861]: I0309 10:13:06.281566 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_2c3f8770-f9a3-49ae-81e0-caad7b40ac46/rabbitmq/0.log" Mar 09 10:13:06 crc kubenswrapper[4861]: I0309 10:13:06.351906 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-62xxn_e3b0d4f8-537e-4894-bcf9-0cfa00a145ec/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 09 10:13:07 crc kubenswrapper[4861]: I0309 10:13:07.317941 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-kfh6m_438a18ff-fdc3-44f3-9c51-df15a691c389/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Mar 09 10:13:07 crc kubenswrapper[4861]: I0309 10:13:07.387636 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-gvccp_cd678163-1379-40da-be83-c4ace8b0cf0d/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Mar 09 10:13:07 crc kubenswrapper[4861]: I0309 10:13:07.582494 4861 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-5mlpf_df0bad47-fa01-426d-af7b-e09057048052/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 09 10:13:07 crc kubenswrapper[4861]: I0309 10:13:07.691039 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-xq9tq_e8f222a3-04ea-475e-aab7-97cf0ba5021c/ssh-known-hosts-edpm-deployment/0.log" Mar 09 10:13:07 crc kubenswrapper[4861]: I0309 10:13:07.933940 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-6f4f458d55-lxkls_1ed378c6-5773-4dd7-9889-52bcf62216e5/proxy-server/0.log" Mar 09 10:13:08 crc kubenswrapper[4861]: I0309 10:13:08.062819 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-6f4f458d55-lxkls_1ed378c6-5773-4dd7-9889-52bcf62216e5/proxy-httpd/0.log" Mar 09 10:13:08 crc kubenswrapper[4861]: I0309 10:13:08.197535 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-qjxtd_6f0d289d-af18-4534-a0c6-c90f51e93fd8/swift-ring-rebalance/0.log" Mar 09 10:13:08 crc kubenswrapper[4861]: I0309 10:13:08.272193 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d1ad2fa7-36fc-4cd0-98ac-07b48c42e794/account-auditor/0.log" Mar 09 10:13:08 crc kubenswrapper[4861]: I0309 10:13:08.311266 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d1ad2fa7-36fc-4cd0-98ac-07b48c42e794/account-reaper/0.log" Mar 09 10:13:08 crc kubenswrapper[4861]: I0309 10:13:08.462042 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d1ad2fa7-36fc-4cd0-98ac-07b48c42e794/account-server/0.log" Mar 09 10:13:08 crc kubenswrapper[4861]: I0309 10:13:08.472477 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d1ad2fa7-36fc-4cd0-98ac-07b48c42e794/account-replicator/0.log" Mar 09 10:13:08 crc kubenswrapper[4861]: I0309 
10:13:08.546597 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d1ad2fa7-36fc-4cd0-98ac-07b48c42e794/container-auditor/0.log" Mar 09 10:13:08 crc kubenswrapper[4861]: I0309 10:13:08.560829 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d1ad2fa7-36fc-4cd0-98ac-07b48c42e794/container-replicator/0.log" Mar 09 10:13:08 crc kubenswrapper[4861]: I0309 10:13:08.679463 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d1ad2fa7-36fc-4cd0-98ac-07b48c42e794/container-updater/0.log" Mar 09 10:13:08 crc kubenswrapper[4861]: I0309 10:13:08.743145 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d1ad2fa7-36fc-4cd0-98ac-07b48c42e794/container-server/0.log" Mar 09 10:13:08 crc kubenswrapper[4861]: I0309 10:13:08.774378 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d1ad2fa7-36fc-4cd0-98ac-07b48c42e794/object-auditor/0.log" Mar 09 10:13:08 crc kubenswrapper[4861]: I0309 10:13:08.804068 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d1ad2fa7-36fc-4cd0-98ac-07b48c42e794/object-expirer/0.log" Mar 09 10:13:08 crc kubenswrapper[4861]: I0309 10:13:08.952879 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d1ad2fa7-36fc-4cd0-98ac-07b48c42e794/object-replicator/0.log" Mar 09 10:13:08 crc kubenswrapper[4861]: I0309 10:13:08.969227 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d1ad2fa7-36fc-4cd0-98ac-07b48c42e794/object-server/0.log" Mar 09 10:13:09 crc kubenswrapper[4861]: I0309 10:13:09.045316 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d1ad2fa7-36fc-4cd0-98ac-07b48c42e794/object-updater/0.log" Mar 09 10:13:09 crc kubenswrapper[4861]: I0309 10:13:09.124808 4861 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_d1ad2fa7-36fc-4cd0-98ac-07b48c42e794/rsync/0.log" Mar 09 10:13:09 crc kubenswrapper[4861]: I0309 10:13:09.169675 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d1ad2fa7-36fc-4cd0-98ac-07b48c42e794/swift-recon-cron/0.log" Mar 09 10:13:09 crc kubenswrapper[4861]: I0309 10:13:09.373677 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-jbbn4_7c47d068-c590-40cb-aeb0-1cc5132d40dd/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Mar 09 10:13:09 crc kubenswrapper[4861]: I0309 10:13:09.443740 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_f7ed5e40-0dc4-417c-bef9-cbf919777c67/tempest-tests-tempest-tests-runner/0.log" Mar 09 10:13:09 crc kubenswrapper[4861]: I0309 10:13:09.591976 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_571b04e8-dc75-4bf7-921c-82ee1f86b023/test-operator-logs-container/0.log" Mar 09 10:13:09 crc kubenswrapper[4861]: I0309 10:13:09.663689 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-7n4l5_ec37a9fa-7555-4e81-af4a-dad48b85942c/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 09 10:13:11 crc kubenswrapper[4861]: I0309 10:13:11.833508 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-9qrfh" Mar 09 10:13:11 crc kubenswrapper[4861]: I0309 10:13:11.889977 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-9qrfh" Mar 09 10:13:12 crc kubenswrapper[4861]: I0309 10:13:12.079345 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9qrfh"] Mar 09 10:13:13 crc kubenswrapper[4861]: I0309 10:13:13.048406 4861 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-9qrfh" podUID="dd6fb393-541e-443b-8a05-9b2bd02aa078" containerName="registry-server" containerID="cri-o://809f3c8eff195f77c5959a8b59e9237b2d7b2cdd3abc5a968976b0b2e64a68ea" gracePeriod=2 Mar 09 10:13:13 crc kubenswrapper[4861]: I0309 10:13:13.510463 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9qrfh" Mar 09 10:13:13 crc kubenswrapper[4861]: I0309 10:13:13.649828 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd6fb393-541e-443b-8a05-9b2bd02aa078-catalog-content\") pod \"dd6fb393-541e-443b-8a05-9b2bd02aa078\" (UID: \"dd6fb393-541e-443b-8a05-9b2bd02aa078\") " Mar 09 10:13:13 crc kubenswrapper[4861]: I0309 10:13:13.649966 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd6fb393-541e-443b-8a05-9b2bd02aa078-utilities\") pod \"dd6fb393-541e-443b-8a05-9b2bd02aa078\" (UID: \"dd6fb393-541e-443b-8a05-9b2bd02aa078\") " Mar 09 10:13:13 crc kubenswrapper[4861]: I0309 10:13:13.650129 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b8wr2\" (UniqueName: \"kubernetes.io/projected/dd6fb393-541e-443b-8a05-9b2bd02aa078-kube-api-access-b8wr2\") pod \"dd6fb393-541e-443b-8a05-9b2bd02aa078\" (UID: \"dd6fb393-541e-443b-8a05-9b2bd02aa078\") " Mar 09 10:13:13 crc kubenswrapper[4861]: I0309 10:13:13.651552 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd6fb393-541e-443b-8a05-9b2bd02aa078-utilities" (OuterVolumeSpecName: "utilities") pod "dd6fb393-541e-443b-8a05-9b2bd02aa078" (UID: "dd6fb393-541e-443b-8a05-9b2bd02aa078"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 10:13:13 crc kubenswrapper[4861]: I0309 10:13:13.661937 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd6fb393-541e-443b-8a05-9b2bd02aa078-kube-api-access-b8wr2" (OuterVolumeSpecName: "kube-api-access-b8wr2") pod "dd6fb393-541e-443b-8a05-9b2bd02aa078" (UID: "dd6fb393-541e-443b-8a05-9b2bd02aa078"). InnerVolumeSpecName "kube-api-access-b8wr2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 10:13:13 crc kubenswrapper[4861]: I0309 10:13:13.753095 4861 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd6fb393-541e-443b-8a05-9b2bd02aa078-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 10:13:13 crc kubenswrapper[4861]: I0309 10:13:13.753208 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b8wr2\" (UniqueName: \"kubernetes.io/projected/dd6fb393-541e-443b-8a05-9b2bd02aa078-kube-api-access-b8wr2\") on node \"crc\" DevicePath \"\"" Mar 09 10:13:13 crc kubenswrapper[4861]: I0309 10:13:13.835130 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd6fb393-541e-443b-8a05-9b2bd02aa078-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dd6fb393-541e-443b-8a05-9b2bd02aa078" (UID: "dd6fb393-541e-443b-8a05-9b2bd02aa078"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 10:13:13 crc kubenswrapper[4861]: I0309 10:13:13.854479 4861 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd6fb393-541e-443b-8a05-9b2bd02aa078-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 10:13:14 crc kubenswrapper[4861]: I0309 10:13:14.064268 4861 generic.go:334] "Generic (PLEG): container finished" podID="dd6fb393-541e-443b-8a05-9b2bd02aa078" containerID="809f3c8eff195f77c5959a8b59e9237b2d7b2cdd3abc5a968976b0b2e64a68ea" exitCode=0 Mar 09 10:13:14 crc kubenswrapper[4861]: I0309 10:13:14.064361 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9qrfh" event={"ID":"dd6fb393-541e-443b-8a05-9b2bd02aa078","Type":"ContainerDied","Data":"809f3c8eff195f77c5959a8b59e9237b2d7b2cdd3abc5a968976b0b2e64a68ea"} Mar 09 10:13:14 crc kubenswrapper[4861]: I0309 10:13:14.064388 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9qrfh" Mar 09 10:13:14 crc kubenswrapper[4861]: I0309 10:13:14.064686 4861 scope.go:117] "RemoveContainer" containerID="809f3c8eff195f77c5959a8b59e9237b2d7b2cdd3abc5a968976b0b2e64a68ea" Mar 09 10:13:14 crc kubenswrapper[4861]: I0309 10:13:14.064666 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9qrfh" event={"ID":"dd6fb393-541e-443b-8a05-9b2bd02aa078","Type":"ContainerDied","Data":"3370b6eb7752962dac5919045c622d63e955c4e8345bd85c567e06ae57494058"} Mar 09 10:13:14 crc kubenswrapper[4861]: I0309 10:13:14.091806 4861 scope.go:117] "RemoveContainer" containerID="6ff7396c6d990c3813e20eb787f999d70df3b7c40863a53b4587a5341e6d733d" Mar 09 10:13:14 crc kubenswrapper[4861]: I0309 10:13:14.118325 4861 scope.go:117] "RemoveContainer" containerID="64ed2924118c58be019a6de8a2c4addae7221e7c123b28de632d785f73e05032" Mar 09 10:13:14 crc kubenswrapper[4861]: I0309 10:13:14.123603 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9qrfh"] Mar 09 10:13:14 crc kubenswrapper[4861]: I0309 10:13:14.136163 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-9qrfh"] Mar 09 10:13:14 crc kubenswrapper[4861]: I0309 10:13:14.166466 4861 scope.go:117] "RemoveContainer" containerID="809f3c8eff195f77c5959a8b59e9237b2d7b2cdd3abc5a968976b0b2e64a68ea" Mar 09 10:13:14 crc kubenswrapper[4861]: E0309 10:13:14.167879 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"809f3c8eff195f77c5959a8b59e9237b2d7b2cdd3abc5a968976b0b2e64a68ea\": container with ID starting with 809f3c8eff195f77c5959a8b59e9237b2d7b2cdd3abc5a968976b0b2e64a68ea not found: ID does not exist" containerID="809f3c8eff195f77c5959a8b59e9237b2d7b2cdd3abc5a968976b0b2e64a68ea" Mar 09 10:13:14 crc kubenswrapper[4861]: I0309 10:13:14.167933 4861 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"809f3c8eff195f77c5959a8b59e9237b2d7b2cdd3abc5a968976b0b2e64a68ea"} err="failed to get container status \"809f3c8eff195f77c5959a8b59e9237b2d7b2cdd3abc5a968976b0b2e64a68ea\": rpc error: code = NotFound desc = could not find container \"809f3c8eff195f77c5959a8b59e9237b2d7b2cdd3abc5a968976b0b2e64a68ea\": container with ID starting with 809f3c8eff195f77c5959a8b59e9237b2d7b2cdd3abc5a968976b0b2e64a68ea not found: ID does not exist" Mar 09 10:13:14 crc kubenswrapper[4861]: I0309 10:13:14.167964 4861 scope.go:117] "RemoveContainer" containerID="6ff7396c6d990c3813e20eb787f999d70df3b7c40863a53b4587a5341e6d733d" Mar 09 10:13:14 crc kubenswrapper[4861]: E0309 10:13:14.168858 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ff7396c6d990c3813e20eb787f999d70df3b7c40863a53b4587a5341e6d733d\": container with ID starting with 6ff7396c6d990c3813e20eb787f999d70df3b7c40863a53b4587a5341e6d733d not found: ID does not exist" containerID="6ff7396c6d990c3813e20eb787f999d70df3b7c40863a53b4587a5341e6d733d" Mar 09 10:13:14 crc kubenswrapper[4861]: I0309 10:13:14.168894 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ff7396c6d990c3813e20eb787f999d70df3b7c40863a53b4587a5341e6d733d"} err="failed to get container status \"6ff7396c6d990c3813e20eb787f999d70df3b7c40863a53b4587a5341e6d733d\": rpc error: code = NotFound desc = could not find container \"6ff7396c6d990c3813e20eb787f999d70df3b7c40863a53b4587a5341e6d733d\": container with ID starting with 6ff7396c6d990c3813e20eb787f999d70df3b7c40863a53b4587a5341e6d733d not found: ID does not exist" Mar 09 10:13:14 crc kubenswrapper[4861]: I0309 10:13:14.168915 4861 scope.go:117] "RemoveContainer" containerID="64ed2924118c58be019a6de8a2c4addae7221e7c123b28de632d785f73e05032" Mar 09 10:13:14 crc kubenswrapper[4861]: E0309 
10:13:14.169261 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64ed2924118c58be019a6de8a2c4addae7221e7c123b28de632d785f73e05032\": container with ID starting with 64ed2924118c58be019a6de8a2c4addae7221e7c123b28de632d785f73e05032 not found: ID does not exist" containerID="64ed2924118c58be019a6de8a2c4addae7221e7c123b28de632d785f73e05032" Mar 09 10:13:14 crc kubenswrapper[4861]: I0309 10:13:14.169300 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64ed2924118c58be019a6de8a2c4addae7221e7c123b28de632d785f73e05032"} err="failed to get container status \"64ed2924118c58be019a6de8a2c4addae7221e7c123b28de632d785f73e05032\": rpc error: code = NotFound desc = could not find container \"64ed2924118c58be019a6de8a2c4addae7221e7c123b28de632d785f73e05032\": container with ID starting with 64ed2924118c58be019a6de8a2c4addae7221e7c123b28de632d785f73e05032 not found: ID does not exist" Mar 09 10:13:15 crc kubenswrapper[4861]: I0309 10:13:15.673988 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd6fb393-541e-443b-8a05-9b2bd02aa078" path="/var/lib/kubelet/pods/dd6fb393-541e-443b-8a05-9b2bd02aa078/volumes" Mar 09 10:13:18 crc kubenswrapper[4861]: I0309 10:13:18.014452 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_a3671a10-52be-44e3-9c3d-11ba14e8e449/memcached/0.log" Mar 09 10:13:37 crc kubenswrapper[4861]: I0309 10:13:37.502357 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-5d87c9d997-2wbrm_5cccaa46-1901-457b-b093-9edfb512b68f/manager/0.log" Mar 09 10:13:37 crc kubenswrapper[4861]: I0309 10:13:37.740889 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ff3fd74506e7f9e2c6e8ba7181b4867be2b110d904ba1aef106c523cfa6w5lp_2530cbbb-c2de-41ce-b6d2-a9593ed9226d/util/0.log" Mar 09 10:13:38 crc 
kubenswrapper[4861]: I0309 10:13:38.034050 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ff3fd74506e7f9e2c6e8ba7181b4867be2b110d904ba1aef106c523cfa6w5lp_2530cbbb-c2de-41ce-b6d2-a9593ed9226d/pull/0.log" Mar 09 10:13:38 crc kubenswrapper[4861]: I0309 10:13:38.076864 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ff3fd74506e7f9e2c6e8ba7181b4867be2b110d904ba1aef106c523cfa6w5lp_2530cbbb-c2de-41ce-b6d2-a9593ed9226d/util/0.log" Mar 09 10:13:38 crc kubenswrapper[4861]: I0309 10:13:38.629202 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ff3fd74506e7f9e2c6e8ba7181b4867be2b110d904ba1aef106c523cfa6w5lp_2530cbbb-c2de-41ce-b6d2-a9593ed9226d/pull/0.log" Mar 09 10:13:38 crc kubenswrapper[4861]: I0309 10:13:38.903264 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ff3fd74506e7f9e2c6e8ba7181b4867be2b110d904ba1aef106c523cfa6w5lp_2530cbbb-c2de-41ce-b6d2-a9593ed9226d/util/0.log" Mar 09 10:13:38 crc kubenswrapper[4861]: I0309 10:13:38.947276 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ff3fd74506e7f9e2c6e8ba7181b4867be2b110d904ba1aef106c523cfa6w5lp_2530cbbb-c2de-41ce-b6d2-a9593ed9226d/pull/0.log" Mar 09 10:13:39 crc kubenswrapper[4861]: I0309 10:13:39.105542 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ff3fd74506e7f9e2c6e8ba7181b4867be2b110d904ba1aef106c523cfa6w5lp_2530cbbb-c2de-41ce-b6d2-a9593ed9226d/extract/0.log" Mar 09 10:13:39 crc kubenswrapper[4861]: I0309 10:13:39.445460 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-64db6967f8-pdxgs_8fb737b1-978f-4f1e-98db-f1c542ef77d9/manager/0.log" Mar 09 10:13:39 crc kubenswrapper[4861]: I0309 10:13:39.456199 4861 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-55d77d7b5c-89f2d_861a14c0-5dcd-4126-b386-65467726a9dd/manager/0.log" Mar 09 10:13:39 crc kubenswrapper[4861]: I0309 10:13:39.706577 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-cf99c678f-jw5nb_23566122-1654-40c1-8dd6-577280d0dcec/manager/0.log" Mar 09 10:13:39 crc kubenswrapper[4861]: I0309 10:13:39.729426 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-78bc7f9bd9-bwxj2_32ddb619-584a-4ff4-a988-63d565043353/manager/0.log" Mar 09 10:13:40 crc kubenswrapper[4861]: I0309 10:13:40.182572 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-545456dc4-blz6f_e9e766bf-fdea-451e-a58c-a8818fccf4b4/manager/0.log" Mar 09 10:13:40 crc kubenswrapper[4861]: I0309 10:13:40.399549 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-f7fcc58b9-srw9z_1b8226be-5eb4-4156-a168-f843edac34ce/manager/0.log" Mar 09 10:13:40 crc kubenswrapper[4861]: I0309 10:13:40.515478 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7c789f89c6-pktcs_fbbd2d76-31fb-46d7-a422-af5f3e51baaf/manager/0.log" Mar 09 10:13:40 crc kubenswrapper[4861]: I0309 10:13:40.598067 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-67d996989d-8vnvb_78b36bea-6c3d-4794-b38a-6b4a5b3e9f5d/manager/0.log" Mar 09 10:13:40 crc kubenswrapper[4861]: I0309 10:13:40.813268 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-7b6bfb6475-lbhm2_12c3b94d-baff-4b5d-864e-371f5b3857f5/manager/0.log" Mar 09 10:13:41 crc kubenswrapper[4861]: I0309 10:13:41.008987 4861 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-54688575f-9dhrv_091caccf-659b-42dd-b9cb-05aeea2548ce/manager/0.log" Mar 09 10:13:41 crc kubenswrapper[4861]: I0309 10:13:41.131262 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-74b6b5dc96-lnm2d_511b2722-0227-4a4f-931c-e69ad12e60de/manager/0.log" Mar 09 10:13:41 crc kubenswrapper[4861]: I0309 10:13:41.210950 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5d86c7ddb7-2cb86_eec501ad-33c8-4195-8817-3078202db97a/manager/0.log" Mar 09 10:13:41 crc kubenswrapper[4861]: I0309 10:13:41.376672 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6f64bd8c75j8xx4_72b49679-1f56-42df-bafc-a899cd2da3cf/manager/0.log" Mar 09 10:13:41 crc kubenswrapper[4861]: I0309 10:13:41.795248 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-568b7cf6db-zpgv9_4f92d232-a96d-4774-8bfb-ece261f9b9d4/operator/0.log" Mar 09 10:13:41 crc kubenswrapper[4861]: I0309 10:13:41.836966 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-cqfnf_793c9771-3185-4264-b109-a94fcc50a305/registry-server/0.log" Mar 09 10:13:42 crc kubenswrapper[4861]: I0309 10:13:42.064297 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-75684d597f-qqwld_73ae93db-3260-4af6-9724-52e8b97a0245/manager/0.log" Mar 09 10:13:42 crc kubenswrapper[4861]: I0309 10:13:42.119879 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-648564c9fc-2bqxz_b386f8ad-7867-4d35-83f8-382a379e3c1e/manager/0.log" Mar 09 10:13:42 crc kubenswrapper[4861]: I0309 10:13:42.336317 4861 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-xnfpv_e649bda4-59a3-47e6-92e2-910c01b2f7c2/operator/0.log" Mar 09 10:13:42 crc kubenswrapper[4861]: I0309 10:13:42.556711 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-9b9ff9f4d-hrnq8_925518c8-3714-4180-ad1f-9bee534dd0dc/manager/0.log" Mar 09 10:13:42 crc kubenswrapper[4861]: I0309 10:13:42.789264 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5fdb694969-vlxrh_fdd188d8-f434-493e-a8f5-3506031b0f83/manager/0.log" Mar 09 10:13:42 crc kubenswrapper[4861]: I0309 10:13:42.851401 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-55b5ff4dbb-ztmwj_175e8d6d-930a-484b-b0b3-d45f37da4239/manager/0.log" Mar 09 10:13:43 crc kubenswrapper[4861]: I0309 10:13:43.107227 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-bccc79885-pmwxp_4cbd2609-0983-4da2-a0c7-fa66387e36ae/manager/0.log" Mar 09 10:13:43 crc kubenswrapper[4861]: I0309 10:13:43.581613 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-59b6c9788f-lfprc_ec436429-c762-4e15-8f82-19a10cdc7941/manager/0.log" Mar 09 10:13:47 crc kubenswrapper[4861]: I0309 10:13:47.864499 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6db6876945-s4tc6_eaaa08cd-22f8-40a3-9cac-7e29137ea358/manager/0.log" Mar 09 10:14:00 crc kubenswrapper[4861]: I0309 10:14:00.147671 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550854-hsrtq"] Mar 09 10:14:00 crc kubenswrapper[4861]: E0309 10:14:00.148781 4861 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="dd6fb393-541e-443b-8a05-9b2bd02aa078" containerName="extract-content" Mar 09 10:14:00 crc kubenswrapper[4861]: I0309 10:14:00.148800 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd6fb393-541e-443b-8a05-9b2bd02aa078" containerName="extract-content" Mar 09 10:14:00 crc kubenswrapper[4861]: E0309 10:14:00.148820 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd6fb393-541e-443b-8a05-9b2bd02aa078" containerName="extract-utilities" Mar 09 10:14:00 crc kubenswrapper[4861]: I0309 10:14:00.148829 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd6fb393-541e-443b-8a05-9b2bd02aa078" containerName="extract-utilities" Mar 09 10:14:00 crc kubenswrapper[4861]: E0309 10:14:00.148847 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd6fb393-541e-443b-8a05-9b2bd02aa078" containerName="registry-server" Mar 09 10:14:00 crc kubenswrapper[4861]: I0309 10:14:00.148854 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd6fb393-541e-443b-8a05-9b2bd02aa078" containerName="registry-server" Mar 09 10:14:00 crc kubenswrapper[4861]: I0309 10:14:00.149065 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd6fb393-541e-443b-8a05-9b2bd02aa078" containerName="registry-server" Mar 09 10:14:00 crc kubenswrapper[4861]: I0309 10:14:00.149863 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550854-hsrtq" Mar 09 10:14:00 crc kubenswrapper[4861]: I0309 10:14:00.151892 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 10:14:00 crc kubenswrapper[4861]: I0309 10:14:00.152107 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k86l8" Mar 09 10:14:00 crc kubenswrapper[4861]: I0309 10:14:00.153190 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 10:14:00 crc kubenswrapper[4861]: I0309 10:14:00.157528 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550854-hsrtq"] Mar 09 10:14:00 crc kubenswrapper[4861]: I0309 10:14:00.167752 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcs8j\" (UniqueName: \"kubernetes.io/projected/2af42883-8efc-4d48-920f-255783b3fe87-kube-api-access-fcs8j\") pod \"auto-csr-approver-29550854-hsrtq\" (UID: \"2af42883-8efc-4d48-920f-255783b3fe87\") " pod="openshift-infra/auto-csr-approver-29550854-hsrtq" Mar 09 10:14:00 crc kubenswrapper[4861]: I0309 10:14:00.269713 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fcs8j\" (UniqueName: \"kubernetes.io/projected/2af42883-8efc-4d48-920f-255783b3fe87-kube-api-access-fcs8j\") pod \"auto-csr-approver-29550854-hsrtq\" (UID: \"2af42883-8efc-4d48-920f-255783b3fe87\") " pod="openshift-infra/auto-csr-approver-29550854-hsrtq" Mar 09 10:14:00 crc kubenswrapper[4861]: I0309 10:14:00.292582 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcs8j\" (UniqueName: \"kubernetes.io/projected/2af42883-8efc-4d48-920f-255783b3fe87-kube-api-access-fcs8j\") pod \"auto-csr-approver-29550854-hsrtq\" (UID: \"2af42883-8efc-4d48-920f-255783b3fe87\") " 
pod="openshift-infra/auto-csr-approver-29550854-hsrtq" Mar 09 10:14:00 crc kubenswrapper[4861]: I0309 10:14:00.474961 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550854-hsrtq" Mar 09 10:14:00 crc kubenswrapper[4861]: I0309 10:14:00.976196 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550854-hsrtq"] Mar 09 10:14:00 crc kubenswrapper[4861]: I0309 10:14:00.982422 4861 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 09 10:14:01 crc kubenswrapper[4861]: I0309 10:14:01.488008 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550854-hsrtq" event={"ID":"2af42883-8efc-4d48-920f-255783b3fe87","Type":"ContainerStarted","Data":"aae4a7e6aca008723f599d0d4dd65196d5bac93d50dcc0f859e0c58750e55ac1"} Mar 09 10:14:02 crc kubenswrapper[4861]: I0309 10:14:02.498396 4861 generic.go:334] "Generic (PLEG): container finished" podID="2af42883-8efc-4d48-920f-255783b3fe87" containerID="1a9ecce6c458a9e7e278205d91d96832b49da37f5d18c2c356f0a230b7b8b468" exitCode=0 Mar 09 10:14:02 crc kubenswrapper[4861]: I0309 10:14:02.498455 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550854-hsrtq" event={"ID":"2af42883-8efc-4d48-920f-255783b3fe87","Type":"ContainerDied","Data":"1a9ecce6c458a9e7e278205d91d96832b49da37f5d18c2c356f0a230b7b8b468"} Mar 09 10:14:03 crc kubenswrapper[4861]: I0309 10:14:03.937756 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550854-hsrtq" Mar 09 10:14:03 crc kubenswrapper[4861]: I0309 10:14:03.948885 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcs8j\" (UniqueName: \"kubernetes.io/projected/2af42883-8efc-4d48-920f-255783b3fe87-kube-api-access-fcs8j\") pod \"2af42883-8efc-4d48-920f-255783b3fe87\" (UID: \"2af42883-8efc-4d48-920f-255783b3fe87\") " Mar 09 10:14:03 crc kubenswrapper[4861]: I0309 10:14:03.957605 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2af42883-8efc-4d48-920f-255783b3fe87-kube-api-access-fcs8j" (OuterVolumeSpecName: "kube-api-access-fcs8j") pod "2af42883-8efc-4d48-920f-255783b3fe87" (UID: "2af42883-8efc-4d48-920f-255783b3fe87"). InnerVolumeSpecName "kube-api-access-fcs8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 10:14:04 crc kubenswrapper[4861]: I0309 10:14:04.050827 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcs8j\" (UniqueName: \"kubernetes.io/projected/2af42883-8efc-4d48-920f-255783b3fe87-kube-api-access-fcs8j\") on node \"crc\" DevicePath \"\"" Mar 09 10:14:04 crc kubenswrapper[4861]: I0309 10:14:04.522987 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550854-hsrtq" event={"ID":"2af42883-8efc-4d48-920f-255783b3fe87","Type":"ContainerDied","Data":"aae4a7e6aca008723f599d0d4dd65196d5bac93d50dcc0f859e0c58750e55ac1"} Mar 09 10:14:04 crc kubenswrapper[4861]: I0309 10:14:04.523024 4861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aae4a7e6aca008723f599d0d4dd65196d5bac93d50dcc0f859e0c58750e55ac1" Mar 09 10:14:04 crc kubenswrapper[4861]: I0309 10:14:04.523082 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550854-hsrtq" Mar 09 10:14:05 crc kubenswrapper[4861]: I0309 10:14:05.036522 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550848-x9fsv"] Mar 09 10:14:05 crc kubenswrapper[4861]: I0309 10:14:05.049466 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550848-x9fsv"] Mar 09 10:14:05 crc kubenswrapper[4861]: I0309 10:14:05.308867 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-trcr9_d8afafe8-bf56-46a7-bab9-c5a1c221a740/control-plane-machine-set-operator/0.log" Mar 09 10:14:05 crc kubenswrapper[4861]: I0309 10:14:05.500142 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-gqnxr_da4004a6-c6fd-41d6-a651-b4aaec2d6454/machine-api-operator/0.log" Mar 09 10:14:05 crc kubenswrapper[4861]: I0309 10:14:05.512768 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-gqnxr_da4004a6-c6fd-41d6-a651-b4aaec2d6454/kube-rbac-proxy/0.log" Mar 09 10:14:05 crc kubenswrapper[4861]: I0309 10:14:05.672074 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7aa3697f-7f16-47f8-8c14-c39f309602d2" path="/var/lib/kubelet/pods/7aa3697f-7f16-47f8-8c14-c39f309602d2/volumes" Mar 09 10:14:18 crc kubenswrapper[4861]: I0309 10:14:18.798929 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-nrqjp_07ac624a-3ef3-4179-96d7-aa49ff085d5e/cert-manager-controller/0.log" Mar 09 10:14:19 crc kubenswrapper[4861]: I0309 10:14:19.047696 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-6wl44_6f869345-5b73-43d1-9617-bf883a753bb8/cert-manager-cainjector/0.log" Mar 09 10:14:19 crc kubenswrapper[4861]: I0309 
10:14:19.059785 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-g724f_55473545-bf70-472a-96e5-18cc3bfac07d/cert-manager-webhook/0.log" Mar 09 10:14:24 crc kubenswrapper[4861]: I0309 10:14:24.605881 4861 patch_prober.go:28] interesting pod/machine-config-daemon-5g7gc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 10:14:24 crc kubenswrapper[4861]: I0309 10:14:24.606469 4861 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 10:14:31 crc kubenswrapper[4861]: I0309 10:14:31.461081 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-5dcbbd79cf-mgp5h_d29132f4-b735-40f3-94da-033f1174963f/nmstate-console-plugin/0.log" Mar 09 10:14:31 crc kubenswrapper[4861]: I0309 10:14:31.833656 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-qvk5q_09f364f8-e2b6-4ffe-b51a-37af17081bf8/nmstate-handler/0.log" Mar 09 10:14:31 crc kubenswrapper[4861]: I0309 10:14:31.948260 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-69594cc75-wz5b2_194a66ef-2b30-40b7-bbfd-5a2c3a51ad55/kube-rbac-proxy/0.log" Mar 09 10:14:31 crc kubenswrapper[4861]: I0309 10:14:31.989306 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-69594cc75-wz5b2_194a66ef-2b30-40b7-bbfd-5a2c3a51ad55/nmstate-metrics/0.log" Mar 09 10:14:32 crc kubenswrapper[4861]: I0309 10:14:32.076709 4861 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-nmstate_nmstate-operator-75c5dccd6c-bgvbh_ec451b1d-d99e-48c4-a550-83bac053d5dc/nmstate-operator/0.log" Mar 09 10:14:32 crc kubenswrapper[4861]: I0309 10:14:32.198216 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-786f45cff4-27zgp_f313fc1b-02bd-4bbd-bfdb-a18a300ed8bb/nmstate-webhook/0.log" Mar 09 10:14:38 crc kubenswrapper[4861]: I0309 10:14:38.761088 4861 scope.go:117] "RemoveContainer" containerID="14acbca4a79b72e30e8cc1f45a779deec009fccc1e0216175acaaa5f8b817002" Mar 09 10:14:49 crc kubenswrapper[4861]: I0309 10:14:49.161135 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-72xwj"] Mar 09 10:14:49 crc kubenswrapper[4861]: E0309 10:14:49.162222 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2af42883-8efc-4d48-920f-255783b3fe87" containerName="oc" Mar 09 10:14:49 crc kubenswrapper[4861]: I0309 10:14:49.162240 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="2af42883-8efc-4d48-920f-255783b3fe87" containerName="oc" Mar 09 10:14:49 crc kubenswrapper[4861]: I0309 10:14:49.162536 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="2af42883-8efc-4d48-920f-255783b3fe87" containerName="oc" Mar 09 10:14:49 crc kubenswrapper[4861]: I0309 10:14:49.164343 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-72xwj" Mar 09 10:14:49 crc kubenswrapper[4861]: I0309 10:14:49.173102 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-72xwj"] Mar 09 10:14:49 crc kubenswrapper[4861]: I0309 10:14:49.262736 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b008521-717f-4b98-bfa4-b1b1469a1bd7-utilities\") pod \"community-operators-72xwj\" (UID: \"0b008521-717f-4b98-bfa4-b1b1469a1bd7\") " pod="openshift-marketplace/community-operators-72xwj" Mar 09 10:14:49 crc kubenswrapper[4861]: I0309 10:14:49.263293 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b008521-717f-4b98-bfa4-b1b1469a1bd7-catalog-content\") pod \"community-operators-72xwj\" (UID: \"0b008521-717f-4b98-bfa4-b1b1469a1bd7\") " pod="openshift-marketplace/community-operators-72xwj" Mar 09 10:14:49 crc kubenswrapper[4861]: I0309 10:14:49.263655 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lp68v\" (UniqueName: \"kubernetes.io/projected/0b008521-717f-4b98-bfa4-b1b1469a1bd7-kube-api-access-lp68v\") pod \"community-operators-72xwj\" (UID: \"0b008521-717f-4b98-bfa4-b1b1469a1bd7\") " pod="openshift-marketplace/community-operators-72xwj" Mar 09 10:14:49 crc kubenswrapper[4861]: I0309 10:14:49.365769 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b008521-717f-4b98-bfa4-b1b1469a1bd7-catalog-content\") pod \"community-operators-72xwj\" (UID: \"0b008521-717f-4b98-bfa4-b1b1469a1bd7\") " pod="openshift-marketplace/community-operators-72xwj" Mar 09 10:14:49 crc kubenswrapper[4861]: I0309 10:14:49.365859 4861 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-lp68v\" (UniqueName: \"kubernetes.io/projected/0b008521-717f-4b98-bfa4-b1b1469a1bd7-kube-api-access-lp68v\") pod \"community-operators-72xwj\" (UID: \"0b008521-717f-4b98-bfa4-b1b1469a1bd7\") " pod="openshift-marketplace/community-operators-72xwj" Mar 09 10:14:49 crc kubenswrapper[4861]: I0309 10:14:49.365905 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b008521-717f-4b98-bfa4-b1b1469a1bd7-utilities\") pod \"community-operators-72xwj\" (UID: \"0b008521-717f-4b98-bfa4-b1b1469a1bd7\") " pod="openshift-marketplace/community-operators-72xwj" Mar 09 10:14:49 crc kubenswrapper[4861]: I0309 10:14:49.366472 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b008521-717f-4b98-bfa4-b1b1469a1bd7-catalog-content\") pod \"community-operators-72xwj\" (UID: \"0b008521-717f-4b98-bfa4-b1b1469a1bd7\") " pod="openshift-marketplace/community-operators-72xwj" Mar 09 10:14:49 crc kubenswrapper[4861]: I0309 10:14:49.366526 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b008521-717f-4b98-bfa4-b1b1469a1bd7-utilities\") pod \"community-operators-72xwj\" (UID: \"0b008521-717f-4b98-bfa4-b1b1469a1bd7\") " pod="openshift-marketplace/community-operators-72xwj" Mar 09 10:14:49 crc kubenswrapper[4861]: I0309 10:14:49.387588 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lp68v\" (UniqueName: \"kubernetes.io/projected/0b008521-717f-4b98-bfa4-b1b1469a1bd7-kube-api-access-lp68v\") pod \"community-operators-72xwj\" (UID: \"0b008521-717f-4b98-bfa4-b1b1469a1bd7\") " pod="openshift-marketplace/community-operators-72xwj" Mar 09 10:14:49 crc kubenswrapper[4861]: I0309 10:14:49.499149 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-72xwj" Mar 09 10:14:50 crc kubenswrapper[4861]: I0309 10:14:50.044927 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-72xwj"] Mar 09 10:14:50 crc kubenswrapper[4861]: I0309 10:14:50.943465 4861 generic.go:334] "Generic (PLEG): container finished" podID="0b008521-717f-4b98-bfa4-b1b1469a1bd7" containerID="a9112c4aaa7fb492d1ccd46ef508d67afd91cf4451c21ffc6ef9d4a7707126bb" exitCode=0 Mar 09 10:14:50 crc kubenswrapper[4861]: I0309 10:14:50.943579 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-72xwj" event={"ID":"0b008521-717f-4b98-bfa4-b1b1469a1bd7","Type":"ContainerDied","Data":"a9112c4aaa7fb492d1ccd46ef508d67afd91cf4451c21ffc6ef9d4a7707126bb"} Mar 09 10:14:50 crc kubenswrapper[4861]: I0309 10:14:50.943955 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-72xwj" event={"ID":"0b008521-717f-4b98-bfa4-b1b1469a1bd7","Type":"ContainerStarted","Data":"37072e3c0209565c43706aefedce4c731b3d9aee7066e2fbfc7a1c68961cd4ea"} Mar 09 10:14:54 crc kubenswrapper[4861]: I0309 10:14:54.606305 4861 patch_prober.go:28] interesting pod/machine-config-daemon-5g7gc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 10:14:54 crc kubenswrapper[4861]: I0309 10:14:54.606961 4861 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 10:14:55 crc kubenswrapper[4861]: I0309 10:14:55.998043 4861 generic.go:334] "Generic 
(PLEG): container finished" podID="0b008521-717f-4b98-bfa4-b1b1469a1bd7" containerID="8646a993423d70cf81212e95b48ef162e539221c00403f36cfc47489f975c541" exitCode=0 Mar 09 10:14:55 crc kubenswrapper[4861]: I0309 10:14:55.998171 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-72xwj" event={"ID":"0b008521-717f-4b98-bfa4-b1b1469a1bd7","Type":"ContainerDied","Data":"8646a993423d70cf81212e95b48ef162e539221c00403f36cfc47489f975c541"} Mar 09 10:14:57 crc kubenswrapper[4861]: I0309 10:14:57.008723 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-72xwj" event={"ID":"0b008521-717f-4b98-bfa4-b1b1469a1bd7","Type":"ContainerStarted","Data":"4b811a7f46b33b47df23c7f2e4075152a90e0b4358985cc48392b365613489cb"} Mar 09 10:14:57 crc kubenswrapper[4861]: I0309 10:14:57.055018 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-72xwj" podStartSLOduration=2.573534543 podStartE2EDuration="8.054996392s" podCreationTimestamp="2026-03-09 10:14:49 +0000 UTC" firstStartedPulling="2026-03-09 10:14:50.945425816 +0000 UTC m=+4134.030465217" lastFinishedPulling="2026-03-09 10:14:56.426887635 +0000 UTC m=+4139.511927066" observedRunningTime="2026-03-09 10:14:57.033318101 +0000 UTC m=+4140.118357502" watchObservedRunningTime="2026-03-09 10:14:57.054996392 +0000 UTC m=+4140.140035793" Mar 09 10:14:59 crc kubenswrapper[4861]: I0309 10:14:59.485657 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-hxljm_2e2a9a00-e47a-4d97-9b06-58dd635a7a55/kube-rbac-proxy/0.log" Mar 09 10:14:59 crc kubenswrapper[4861]: I0309 10:14:59.499248 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-72xwj" Mar 09 10:14:59 crc kubenswrapper[4861]: I0309 10:14:59.500321 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/community-operators-72xwj" Mar 09 10:14:59 crc kubenswrapper[4861]: I0309 10:14:59.525798 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-hxljm_2e2a9a00-e47a-4d97-9b06-58dd635a7a55/controller/0.log" Mar 09 10:14:59 crc kubenswrapper[4861]: I0309 10:14:59.555298 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-72xwj" Mar 09 10:14:59 crc kubenswrapper[4861]: I0309 10:14:59.654347 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p5vjx_416fe11c-136c-4b38-9f85-2ba8df311664/cp-frr-files/0.log" Mar 09 10:14:59 crc kubenswrapper[4861]: I0309 10:14:59.776256 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p5vjx_416fe11c-136c-4b38-9f85-2ba8df311664/cp-frr-files/0.log" Mar 09 10:14:59 crc kubenswrapper[4861]: I0309 10:14:59.792715 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p5vjx_416fe11c-136c-4b38-9f85-2ba8df311664/cp-metrics/0.log" Mar 09 10:14:59 crc kubenswrapper[4861]: I0309 10:14:59.827031 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p5vjx_416fe11c-136c-4b38-9f85-2ba8df311664/cp-reloader/0.log" Mar 09 10:14:59 crc kubenswrapper[4861]: I0309 10:14:59.890869 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p5vjx_416fe11c-136c-4b38-9f85-2ba8df311664/cp-reloader/0.log" Mar 09 10:15:00 crc kubenswrapper[4861]: I0309 10:15:00.056298 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p5vjx_416fe11c-136c-4b38-9f85-2ba8df311664/cp-frr-files/0.log" Mar 09 10:15:00 crc kubenswrapper[4861]: I0309 10:15:00.058962 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p5vjx_416fe11c-136c-4b38-9f85-2ba8df311664/cp-metrics/0.log" Mar 09 10:15:00 crc kubenswrapper[4861]: 
I0309 10:15:00.071271 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p5vjx_416fe11c-136c-4b38-9f85-2ba8df311664/cp-metrics/0.log" Mar 09 10:15:00 crc kubenswrapper[4861]: I0309 10:15:00.080359 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p5vjx_416fe11c-136c-4b38-9f85-2ba8df311664/cp-reloader/0.log" Mar 09 10:15:00 crc kubenswrapper[4861]: I0309 10:15:00.146354 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550855-4w8wq"] Mar 09 10:15:00 crc kubenswrapper[4861]: I0309 10:15:00.148707 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550855-4w8wq" Mar 09 10:15:00 crc kubenswrapper[4861]: I0309 10:15:00.151320 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 09 10:15:00 crc kubenswrapper[4861]: I0309 10:15:00.152335 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 09 10:15:00 crc kubenswrapper[4861]: I0309 10:15:00.157694 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550855-4w8wq"] Mar 09 10:15:00 crc kubenswrapper[4861]: I0309 10:15:00.285386 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/28de8914-20a6-4801-9a0d-6d90d585385c-config-volume\") pod \"collect-profiles-29550855-4w8wq\" (UID: \"28de8914-20a6-4801-9a0d-6d90d585385c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550855-4w8wq" Mar 09 10:15:00 crc kubenswrapper[4861]: I0309 10:15:00.286120 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"secret-volume\" (UniqueName: \"kubernetes.io/secret/28de8914-20a6-4801-9a0d-6d90d585385c-secret-volume\") pod \"collect-profiles-29550855-4w8wq\" (UID: \"28de8914-20a6-4801-9a0d-6d90d585385c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550855-4w8wq" Mar 09 10:15:00 crc kubenswrapper[4861]: I0309 10:15:00.286360 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpdkb\" (UniqueName: \"kubernetes.io/projected/28de8914-20a6-4801-9a0d-6d90d585385c-kube-api-access-hpdkb\") pod \"collect-profiles-29550855-4w8wq\" (UID: \"28de8914-20a6-4801-9a0d-6d90d585385c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550855-4w8wq" Mar 09 10:15:00 crc kubenswrapper[4861]: I0309 10:15:00.324705 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p5vjx_416fe11c-136c-4b38-9f85-2ba8df311664/cp-metrics/0.log" Mar 09 10:15:00 crc kubenswrapper[4861]: I0309 10:15:00.331571 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p5vjx_416fe11c-136c-4b38-9f85-2ba8df311664/cp-reloader/0.log" Mar 09 10:15:00 crc kubenswrapper[4861]: I0309 10:15:00.334062 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p5vjx_416fe11c-136c-4b38-9f85-2ba8df311664/controller/0.log" Mar 09 10:15:00 crc kubenswrapper[4861]: I0309 10:15:00.349455 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p5vjx_416fe11c-136c-4b38-9f85-2ba8df311664/cp-frr-files/0.log" Mar 09 10:15:00 crc kubenswrapper[4861]: I0309 10:15:00.388500 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/28de8914-20a6-4801-9a0d-6d90d585385c-secret-volume\") pod \"collect-profiles-29550855-4w8wq\" (UID: \"28de8914-20a6-4801-9a0d-6d90d585385c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550855-4w8wq" Mar 09 
10:15:00 crc kubenswrapper[4861]: I0309 10:15:00.388593 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hpdkb\" (UniqueName: \"kubernetes.io/projected/28de8914-20a6-4801-9a0d-6d90d585385c-kube-api-access-hpdkb\") pod \"collect-profiles-29550855-4w8wq\" (UID: \"28de8914-20a6-4801-9a0d-6d90d585385c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550855-4w8wq" Mar 09 10:15:00 crc kubenswrapper[4861]: I0309 10:15:00.388675 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/28de8914-20a6-4801-9a0d-6d90d585385c-config-volume\") pod \"collect-profiles-29550855-4w8wq\" (UID: \"28de8914-20a6-4801-9a0d-6d90d585385c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550855-4w8wq" Mar 09 10:15:00 crc kubenswrapper[4861]: I0309 10:15:00.389767 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/28de8914-20a6-4801-9a0d-6d90d585385c-config-volume\") pod \"collect-profiles-29550855-4w8wq\" (UID: \"28de8914-20a6-4801-9a0d-6d90d585385c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550855-4w8wq" Mar 09 10:15:00 crc kubenswrapper[4861]: I0309 10:15:00.413472 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/28de8914-20a6-4801-9a0d-6d90d585385c-secret-volume\") pod \"collect-profiles-29550855-4w8wq\" (UID: \"28de8914-20a6-4801-9a0d-6d90d585385c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550855-4w8wq" Mar 09 10:15:00 crc kubenswrapper[4861]: I0309 10:15:00.413582 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpdkb\" (UniqueName: \"kubernetes.io/projected/28de8914-20a6-4801-9a0d-6d90d585385c-kube-api-access-hpdkb\") pod \"collect-profiles-29550855-4w8wq\" (UID: 
\"28de8914-20a6-4801-9a0d-6d90d585385c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550855-4w8wq" Mar 09 10:15:00 crc kubenswrapper[4861]: I0309 10:15:00.482355 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550855-4w8wq" Mar 09 10:15:00 crc kubenswrapper[4861]: I0309 10:15:00.521921 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p5vjx_416fe11c-136c-4b38-9f85-2ba8df311664/frr-metrics/0.log" Mar 09 10:15:00 crc kubenswrapper[4861]: I0309 10:15:00.587852 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p5vjx_416fe11c-136c-4b38-9f85-2ba8df311664/kube-rbac-proxy/0.log" Mar 09 10:15:00 crc kubenswrapper[4861]: I0309 10:15:00.606604 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p5vjx_416fe11c-136c-4b38-9f85-2ba8df311664/kube-rbac-proxy-frr/0.log" Mar 09 10:15:00 crc kubenswrapper[4861]: I0309 10:15:00.793065 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p5vjx_416fe11c-136c-4b38-9f85-2ba8df311664/reloader/0.log" Mar 09 10:15:00 crc kubenswrapper[4861]: I0309 10:15:00.848832 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7f989f654f-2qds9_b5b4c14f-550c-483f-8a1d-5b596130b713/frr-k8s-webhook-server/0.log" Mar 09 10:15:00 crc kubenswrapper[4861]: I0309 10:15:00.969718 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550855-4w8wq"] Mar 09 10:15:00 crc kubenswrapper[4861]: W0309 10:15:00.973164 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod28de8914_20a6_4801_9a0d_6d90d585385c.slice/crio-5ccc9f58bfc8ec2adb2a0b0d57ee5ba59ff429736cd9bdc9b0ea8f0aa43e06fc WatchSource:0}: Error finding container 
5ccc9f58bfc8ec2adb2a0b0d57ee5ba59ff429736cd9bdc9b0ea8f0aa43e06fc: Status 404 returned error can't find the container with id 5ccc9f58bfc8ec2adb2a0b0d57ee5ba59ff429736cd9bdc9b0ea8f0aa43e06fc Mar 09 10:15:01 crc kubenswrapper[4861]: I0309 10:15:01.045223 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550855-4w8wq" event={"ID":"28de8914-20a6-4801-9a0d-6d90d585385c","Type":"ContainerStarted","Data":"5ccc9f58bfc8ec2adb2a0b0d57ee5ba59ff429736cd9bdc9b0ea8f0aa43e06fc"} Mar 09 10:15:01 crc kubenswrapper[4861]: I0309 10:15:01.109313 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-72xwj" Mar 09 10:15:01 crc kubenswrapper[4861]: I0309 10:15:01.155216 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-57f84dc5b8-mv5gg_bcfd9e4a-11a5-40dc-aa44-e6348ac2069b/manager/0.log" Mar 09 10:15:01 crc kubenswrapper[4861]: I0309 10:15:01.185675 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-72xwj"] Mar 09 10:15:01 crc kubenswrapper[4861]: I0309 10:15:01.240513 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qjh8b"] Mar 09 10:15:01 crc kubenswrapper[4861]: I0309 10:15:01.244236 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-qjh8b" podUID="d0523999-9c2d-4335-8c1e-249abc1099b9" containerName="registry-server" containerID="cri-o://0df84067cc98849729fc39e4d221ab3a8f3f17455530f3286215af520cb110f7" gracePeriod=2 Mar 09 10:15:01 crc kubenswrapper[4861]: I0309 10:15:01.273836 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-7569c9dcdc-v9vw2_9eb5d549-165a-4d97-8526-e082c80ed71b/webhook-server/0.log" Mar 09 10:15:01 crc kubenswrapper[4861]: I0309 
10:15:01.514317 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-knlc5_48e2ee3b-9d66-4ffb-a9e7-f30dbe52d03b/kube-rbac-proxy/0.log" Mar 09 10:15:02 crc kubenswrapper[4861]: I0309 10:15:02.027657 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-knlc5_48e2ee3b-9d66-4ffb-a9e7-f30dbe52d03b/speaker/0.log" Mar 09 10:15:02 crc kubenswrapper[4861]: I0309 10:15:02.113855 4861 generic.go:334] "Generic (PLEG): container finished" podID="28de8914-20a6-4801-9a0d-6d90d585385c" containerID="346229ff8ae9ebc4c16782e7d48a81eb921e698bf01a1c51c4202d534380cccd" exitCode=0 Mar 09 10:15:02 crc kubenswrapper[4861]: I0309 10:15:02.113973 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550855-4w8wq" event={"ID":"28de8914-20a6-4801-9a0d-6d90d585385c","Type":"ContainerDied","Data":"346229ff8ae9ebc4c16782e7d48a81eb921e698bf01a1c51c4202d534380cccd"} Mar 09 10:15:02 crc kubenswrapper[4861]: I0309 10:15:02.141160 4861 generic.go:334] "Generic (PLEG): container finished" podID="d0523999-9c2d-4335-8c1e-249abc1099b9" containerID="0df84067cc98849729fc39e4d221ab3a8f3f17455530f3286215af520cb110f7" exitCode=0 Mar 09 10:15:02 crc kubenswrapper[4861]: I0309 10:15:02.142522 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qjh8b" event={"ID":"d0523999-9c2d-4335-8c1e-249abc1099b9","Type":"ContainerDied","Data":"0df84067cc98849729fc39e4d221ab3a8f3f17455530f3286215af520cb110f7"} Mar 09 10:15:02 crc kubenswrapper[4861]: I0309 10:15:02.556974 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qjh8b" Mar 09 10:15:02 crc kubenswrapper[4861]: I0309 10:15:02.557771 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p5vjx_416fe11c-136c-4b38-9f85-2ba8df311664/frr/0.log" Mar 09 10:15:02 crc kubenswrapper[4861]: I0309 10:15:02.742986 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0523999-9c2d-4335-8c1e-249abc1099b9-utilities\") pod \"d0523999-9c2d-4335-8c1e-249abc1099b9\" (UID: \"d0523999-9c2d-4335-8c1e-249abc1099b9\") " Mar 09 10:15:02 crc kubenswrapper[4861]: I0309 10:15:02.743044 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0523999-9c2d-4335-8c1e-249abc1099b9-catalog-content\") pod \"d0523999-9c2d-4335-8c1e-249abc1099b9\" (UID: \"d0523999-9c2d-4335-8c1e-249abc1099b9\") " Mar 09 10:15:02 crc kubenswrapper[4861]: I0309 10:15:02.743085 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gsq4j\" (UniqueName: \"kubernetes.io/projected/d0523999-9c2d-4335-8c1e-249abc1099b9-kube-api-access-gsq4j\") pod \"d0523999-9c2d-4335-8c1e-249abc1099b9\" (UID: \"d0523999-9c2d-4335-8c1e-249abc1099b9\") " Mar 09 10:15:02 crc kubenswrapper[4861]: I0309 10:15:02.743752 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d0523999-9c2d-4335-8c1e-249abc1099b9-utilities" (OuterVolumeSpecName: "utilities") pod "d0523999-9c2d-4335-8c1e-249abc1099b9" (UID: "d0523999-9c2d-4335-8c1e-249abc1099b9"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 10:15:02 crc kubenswrapper[4861]: I0309 10:15:02.744440 4861 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0523999-9c2d-4335-8c1e-249abc1099b9-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 10:15:02 crc kubenswrapper[4861]: I0309 10:15:02.757484 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0523999-9c2d-4335-8c1e-249abc1099b9-kube-api-access-gsq4j" (OuterVolumeSpecName: "kube-api-access-gsq4j") pod "d0523999-9c2d-4335-8c1e-249abc1099b9" (UID: "d0523999-9c2d-4335-8c1e-249abc1099b9"). InnerVolumeSpecName "kube-api-access-gsq4j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 10:15:02 crc kubenswrapper[4861]: I0309 10:15:02.805120 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d0523999-9c2d-4335-8c1e-249abc1099b9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d0523999-9c2d-4335-8c1e-249abc1099b9" (UID: "d0523999-9c2d-4335-8c1e-249abc1099b9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 10:15:02 crc kubenswrapper[4861]: I0309 10:15:02.846945 4861 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0523999-9c2d-4335-8c1e-249abc1099b9-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 10:15:02 crc kubenswrapper[4861]: I0309 10:15:02.846976 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gsq4j\" (UniqueName: \"kubernetes.io/projected/d0523999-9c2d-4335-8c1e-249abc1099b9-kube-api-access-gsq4j\") on node \"crc\" DevicePath \"\"" Mar 09 10:15:03 crc kubenswrapper[4861]: I0309 10:15:03.151016 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qjh8b" Mar 09 10:15:03 crc kubenswrapper[4861]: I0309 10:15:03.151027 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qjh8b" event={"ID":"d0523999-9c2d-4335-8c1e-249abc1099b9","Type":"ContainerDied","Data":"e9fa7b58ea53a76f5cf3afb21b30566280e43e7e5f11cfd1f96365846f1d0f59"} Mar 09 10:15:03 crc kubenswrapper[4861]: I0309 10:15:03.151090 4861 scope.go:117] "RemoveContainer" containerID="0df84067cc98849729fc39e4d221ab3a8f3f17455530f3286215af520cb110f7" Mar 09 10:15:03 crc kubenswrapper[4861]: I0309 10:15:03.171013 4861 scope.go:117] "RemoveContainer" containerID="9c00705657843f8a6366840a6ac70eaac2b613ad1f8766ce1e063a422529d8d5" Mar 09 10:15:03 crc kubenswrapper[4861]: I0309 10:15:03.189965 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qjh8b"] Mar 09 10:15:03 crc kubenswrapper[4861]: I0309 10:15:03.201399 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-qjh8b"] Mar 09 10:15:03 crc kubenswrapper[4861]: I0309 10:15:03.210582 4861 scope.go:117] "RemoveContainer" containerID="536a65f3794ddab69cd20b3ff11ae82e2e93cf196dede9489d1db44acd8c348f" Mar 09 10:15:03 crc kubenswrapper[4861]: I0309 10:15:03.568349 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550855-4w8wq" Mar 09 10:15:03 crc kubenswrapper[4861]: I0309 10:15:03.669708 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hpdkb\" (UniqueName: \"kubernetes.io/projected/28de8914-20a6-4801-9a0d-6d90d585385c-kube-api-access-hpdkb\") pod \"28de8914-20a6-4801-9a0d-6d90d585385c\" (UID: \"28de8914-20a6-4801-9a0d-6d90d585385c\") " Mar 09 10:15:03 crc kubenswrapper[4861]: I0309 10:15:03.669752 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/28de8914-20a6-4801-9a0d-6d90d585385c-config-volume\") pod \"28de8914-20a6-4801-9a0d-6d90d585385c\" (UID: \"28de8914-20a6-4801-9a0d-6d90d585385c\") " Mar 09 10:15:03 crc kubenswrapper[4861]: I0309 10:15:03.669873 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/28de8914-20a6-4801-9a0d-6d90d585385c-secret-volume\") pod \"28de8914-20a6-4801-9a0d-6d90d585385c\" (UID: \"28de8914-20a6-4801-9a0d-6d90d585385c\") " Mar 09 10:15:03 crc kubenswrapper[4861]: I0309 10:15:03.670672 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28de8914-20a6-4801-9a0d-6d90d585385c-config-volume" (OuterVolumeSpecName: "config-volume") pod "28de8914-20a6-4801-9a0d-6d90d585385c" (UID: "28de8914-20a6-4801-9a0d-6d90d585385c"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 10:15:03 crc kubenswrapper[4861]: I0309 10:15:03.671749 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0523999-9c2d-4335-8c1e-249abc1099b9" path="/var/lib/kubelet/pods/d0523999-9c2d-4335-8c1e-249abc1099b9/volumes" Mar 09 10:15:03 crc kubenswrapper[4861]: I0309 10:15:03.677063 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28de8914-20a6-4801-9a0d-6d90d585385c-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "28de8914-20a6-4801-9a0d-6d90d585385c" (UID: "28de8914-20a6-4801-9a0d-6d90d585385c"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 10:15:03 crc kubenswrapper[4861]: I0309 10:15:03.684540 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28de8914-20a6-4801-9a0d-6d90d585385c-kube-api-access-hpdkb" (OuterVolumeSpecName: "kube-api-access-hpdkb") pod "28de8914-20a6-4801-9a0d-6d90d585385c" (UID: "28de8914-20a6-4801-9a0d-6d90d585385c"). InnerVolumeSpecName "kube-api-access-hpdkb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 10:15:03 crc kubenswrapper[4861]: I0309 10:15:03.771953 4861 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/28de8914-20a6-4801-9a0d-6d90d585385c-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 09 10:15:03 crc kubenswrapper[4861]: I0309 10:15:03.771994 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hpdkb\" (UniqueName: \"kubernetes.io/projected/28de8914-20a6-4801-9a0d-6d90d585385c-kube-api-access-hpdkb\") on node \"crc\" DevicePath \"\"" Mar 09 10:15:03 crc kubenswrapper[4861]: I0309 10:15:03.772005 4861 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/28de8914-20a6-4801-9a0d-6d90d585385c-config-volume\") on node \"crc\" DevicePath \"\"" Mar 09 10:15:04 crc kubenswrapper[4861]: I0309 10:15:04.160888 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550855-4w8wq" event={"ID":"28de8914-20a6-4801-9a0d-6d90d585385c","Type":"ContainerDied","Data":"5ccc9f58bfc8ec2adb2a0b0d57ee5ba59ff429736cd9bdc9b0ea8f0aa43e06fc"} Mar 09 10:15:04 crc kubenswrapper[4861]: I0309 10:15:04.161195 4861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5ccc9f58bfc8ec2adb2a0b0d57ee5ba59ff429736cd9bdc9b0ea8f0aa43e06fc" Mar 09 10:15:04 crc kubenswrapper[4861]: I0309 10:15:04.161253 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550855-4w8wq" Mar 09 10:15:04 crc kubenswrapper[4861]: I0309 10:15:04.642744 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550810-s4vkz"] Mar 09 10:15:04 crc kubenswrapper[4861]: I0309 10:15:04.650938 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550810-s4vkz"] Mar 09 10:15:05 crc kubenswrapper[4861]: I0309 10:15:05.679045 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50a3028b-48fa-43a1-ac7d-12e409d83703" path="/var/lib/kubelet/pods/50a3028b-48fa-43a1-ac7d-12e409d83703/volumes" Mar 09 10:15:16 crc kubenswrapper[4861]: I0309 10:15:16.693476 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82ggwqp_8668713c-12cf-457c-a09c-5302f11d19cc/util/0.log" Mar 09 10:15:16 crc kubenswrapper[4861]: I0309 10:15:16.854677 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82ggwqp_8668713c-12cf-457c-a09c-5302f11d19cc/util/0.log" Mar 09 10:15:16 crc kubenswrapper[4861]: I0309 10:15:16.857880 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82ggwqp_8668713c-12cf-457c-a09c-5302f11d19cc/pull/0.log" Mar 09 10:15:17 crc kubenswrapper[4861]: I0309 10:15:17.271278 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82ggwqp_8668713c-12cf-457c-a09c-5302f11d19cc/pull/0.log" Mar 09 10:15:17 crc kubenswrapper[4861]: I0309 10:15:17.431125 4861 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82ggwqp_8668713c-12cf-457c-a09c-5302f11d19cc/extract/0.log" Mar 09 10:15:17 crc kubenswrapper[4861]: I0309 10:15:17.456593 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82ggwqp_8668713c-12cf-457c-a09c-5302f11d19cc/pull/0.log" Mar 09 10:15:17 crc kubenswrapper[4861]: I0309 10:15:17.491354 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82ggwqp_8668713c-12cf-457c-a09c-5302f11d19cc/util/0.log" Mar 09 10:15:17 crc kubenswrapper[4861]: I0309 10:15:17.625041 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-gcl57_87e64062-b33e-43aa-b264-7a26f9a3e0a0/extract-utilities/0.log" Mar 09 10:15:17 crc kubenswrapper[4861]: I0309 10:15:17.794586 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-gcl57_87e64062-b33e-43aa-b264-7a26f9a3e0a0/extract-utilities/0.log" Mar 09 10:15:17 crc kubenswrapper[4861]: I0309 10:15:17.802549 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-gcl57_87e64062-b33e-43aa-b264-7a26f9a3e0a0/extract-content/0.log" Mar 09 10:15:17 crc kubenswrapper[4861]: I0309 10:15:17.844680 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-gcl57_87e64062-b33e-43aa-b264-7a26f9a3e0a0/extract-content/0.log" Mar 09 10:15:18 crc kubenswrapper[4861]: I0309 10:15:18.014874 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-gcl57_87e64062-b33e-43aa-b264-7a26f9a3e0a0/extract-utilities/0.log" Mar 09 10:15:18 crc kubenswrapper[4861]: I0309 10:15:18.026474 4861 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-gcl57_87e64062-b33e-43aa-b264-7a26f9a3e0a0/extract-content/0.log" Mar 09 10:15:18 crc kubenswrapper[4861]: I0309 10:15:18.293188 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-72xwj_0b008521-717f-4b98-bfa4-b1b1469a1bd7/extract-utilities/0.log" Mar 09 10:15:18 crc kubenswrapper[4861]: I0309 10:15:18.470954 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-72xwj_0b008521-717f-4b98-bfa4-b1b1469a1bd7/extract-utilities/0.log" Mar 09 10:15:18 crc kubenswrapper[4861]: I0309 10:15:18.536666 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-72xwj_0b008521-717f-4b98-bfa4-b1b1469a1bd7/extract-content/0.log" Mar 09 10:15:18 crc kubenswrapper[4861]: I0309 10:15:18.536679 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-72xwj_0b008521-717f-4b98-bfa4-b1b1469a1bd7/extract-content/0.log" Mar 09 10:15:18 crc kubenswrapper[4861]: I0309 10:15:18.741415 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-72xwj_0b008521-717f-4b98-bfa4-b1b1469a1bd7/extract-content/0.log" Mar 09 10:15:18 crc kubenswrapper[4861]: I0309 10:15:18.846305 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-gcl57_87e64062-b33e-43aa-b264-7a26f9a3e0a0/registry-server/0.log" Mar 09 10:15:18 crc kubenswrapper[4861]: I0309 10:15:18.856137 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-72xwj_0b008521-717f-4b98-bfa4-b1b1469a1bd7/extract-utilities/0.log" Mar 09 10:15:19 crc kubenswrapper[4861]: I0309 10:15:19.056095 4861 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-72xwj_0b008521-717f-4b98-bfa4-b1b1469a1bd7/registry-server/0.log" Mar 09 10:15:19 crc kubenswrapper[4861]: I0309 10:15:19.386341 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4s7b95_4410072b-3e11-4357-9ce8-2c754f336515/util/0.log" Mar 09 10:15:19 crc kubenswrapper[4861]: I0309 10:15:19.572698 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4s7b95_4410072b-3e11-4357-9ce8-2c754f336515/pull/0.log" Mar 09 10:15:19 crc kubenswrapper[4861]: I0309 10:15:19.586002 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4s7b95_4410072b-3e11-4357-9ce8-2c754f336515/pull/0.log" Mar 09 10:15:19 crc kubenswrapper[4861]: I0309 10:15:19.589656 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4s7b95_4410072b-3e11-4357-9ce8-2c754f336515/util/0.log" Mar 09 10:15:19 crc kubenswrapper[4861]: I0309 10:15:19.755165 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4s7b95_4410072b-3e11-4357-9ce8-2c754f336515/pull/0.log" Mar 09 10:15:19 crc kubenswrapper[4861]: I0309 10:15:19.767324 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4s7b95_4410072b-3e11-4357-9ce8-2c754f336515/util/0.log" Mar 09 10:15:19 crc kubenswrapper[4861]: I0309 10:15:19.768530 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4s7b95_4410072b-3e11-4357-9ce8-2c754f336515/extract/0.log" Mar 09 10:15:19 crc 
kubenswrapper[4861]: I0309 10:15:19.978278 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-f2cbr_a5118375-51c4-460f-a7e4-4b0dc454daa2/extract-utilities/0.log" Mar 09 10:15:19 crc kubenswrapper[4861]: I0309 10:15:19.994948 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-9l4hx_5c398209-0537-461f-a2a8-b626abd10525/marketplace-operator/0.log" Mar 09 10:15:20 crc kubenswrapper[4861]: I0309 10:15:20.145349 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-f2cbr_a5118375-51c4-460f-a7e4-4b0dc454daa2/extract-content/0.log" Mar 09 10:15:20 crc kubenswrapper[4861]: I0309 10:15:20.148797 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-f2cbr_a5118375-51c4-460f-a7e4-4b0dc454daa2/extract-content/0.log" Mar 09 10:15:20 crc kubenswrapper[4861]: I0309 10:15:20.157796 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-f2cbr_a5118375-51c4-460f-a7e4-4b0dc454daa2/extract-utilities/0.log" Mar 09 10:15:20 crc kubenswrapper[4861]: I0309 10:15:20.314011 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-f2cbr_a5118375-51c4-460f-a7e4-4b0dc454daa2/extract-content/0.log" Mar 09 10:15:20 crc kubenswrapper[4861]: I0309 10:15:20.334970 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-f2cbr_a5118375-51c4-460f-a7e4-4b0dc454daa2/extract-utilities/0.log" Mar 09 10:15:20 crc kubenswrapper[4861]: I0309 10:15:20.535836 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-ktkhz_f16d7e47-0abb-4811-9e99-3f68d1fd64ab/extract-utilities/0.log" Mar 09 10:15:20 crc kubenswrapper[4861]: I0309 10:15:20.540682 4861 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-f2cbr_a5118375-51c4-460f-a7e4-4b0dc454daa2/registry-server/0.log" Mar 09 10:15:20 crc kubenswrapper[4861]: I0309 10:15:20.703944 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-ktkhz_f16d7e47-0abb-4811-9e99-3f68d1fd64ab/extract-content/0.log" Mar 09 10:15:20 crc kubenswrapper[4861]: I0309 10:15:20.704416 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-ktkhz_f16d7e47-0abb-4811-9e99-3f68d1fd64ab/extract-utilities/0.log" Mar 09 10:15:20 crc kubenswrapper[4861]: I0309 10:15:20.716776 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-ktkhz_f16d7e47-0abb-4811-9e99-3f68d1fd64ab/extract-content/0.log" Mar 09 10:15:20 crc kubenswrapper[4861]: I0309 10:15:20.902175 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-ktkhz_f16d7e47-0abb-4811-9e99-3f68d1fd64ab/extract-content/0.log" Mar 09 10:15:20 crc kubenswrapper[4861]: I0309 10:15:20.954516 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-ktkhz_f16d7e47-0abb-4811-9e99-3f68d1fd64ab/extract-utilities/0.log" Mar 09 10:15:21 crc kubenswrapper[4861]: I0309 10:15:21.661699 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-ktkhz_f16d7e47-0abb-4811-9e99-3f68d1fd64ab/registry-server/0.log" Mar 09 10:15:24 crc kubenswrapper[4861]: I0309 10:15:24.606130 4861 patch_prober.go:28] interesting pod/machine-config-daemon-5g7gc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 10:15:24 crc kubenswrapper[4861]: I0309 10:15:24.606484 4861 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 10:15:24 crc kubenswrapper[4861]: I0309 10:15:24.606569 4861 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" Mar 09 10:15:24 crc kubenswrapper[4861]: I0309 10:15:24.607354 4861 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8fd7c92f085870944d490eb7e486a6ca47dfe81622ad08419017e4cba450475b"} pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 09 10:15:24 crc kubenswrapper[4861]: I0309 10:15:24.607428 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" containerName="machine-config-daemon" containerID="cri-o://8fd7c92f085870944d490eb7e486a6ca47dfe81622ad08419017e4cba450475b" gracePeriod=600 Mar 09 10:15:25 crc kubenswrapper[4861]: I0309 10:15:25.355536 4861 generic.go:334] "Generic (PLEG): container finished" podID="6f7875e3-174f-4c67-8675-d878de74aa4f" containerID="8fd7c92f085870944d490eb7e486a6ca47dfe81622ad08419017e4cba450475b" exitCode=0 Mar 09 10:15:25 crc kubenswrapper[4861]: I0309 10:15:25.355671 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" event={"ID":"6f7875e3-174f-4c67-8675-d878de74aa4f","Type":"ContainerDied","Data":"8fd7c92f085870944d490eb7e486a6ca47dfe81622ad08419017e4cba450475b"} Mar 09 10:15:25 crc kubenswrapper[4861]: I0309 10:15:25.356218 4861 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" event={"ID":"6f7875e3-174f-4c67-8675-d878de74aa4f","Type":"ContainerStarted","Data":"c600e1aa8a7ba1a360c635c31a448b7027fd4628a8566a50e85d312351e1d37c"} Mar 09 10:15:25 crc kubenswrapper[4861]: I0309 10:15:25.356255 4861 scope.go:117] "RemoveContainer" containerID="7bbbec7e8a5f7da112767a5599dcb0b362a1472008cac65425c981acbf405224" Mar 09 10:15:38 crc kubenswrapper[4861]: I0309 10:15:38.992063 4861 scope.go:117] "RemoveContainer" containerID="c26847120e43444813aa069a4121a676c0450177f1ebc4fc00ce050eb622d965" Mar 09 10:15:40 crc kubenswrapper[4861]: I0309 10:15:40.456201 4861 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-665f4b6689-tfdk9" podUID="db5464b8-011f-4569-a47e-36766fa6c72e" containerName="neutron-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 502" Mar 09 10:16:00 crc kubenswrapper[4861]: I0309 10:16:00.143394 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550856-f6c8f"] Mar 09 10:16:00 crc kubenswrapper[4861]: E0309 10:16:00.144553 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0523999-9c2d-4335-8c1e-249abc1099b9" containerName="extract-content" Mar 09 10:16:00 crc kubenswrapper[4861]: I0309 10:16:00.144568 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0523999-9c2d-4335-8c1e-249abc1099b9" containerName="extract-content" Mar 09 10:16:00 crc kubenswrapper[4861]: E0309 10:16:00.144583 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0523999-9c2d-4335-8c1e-249abc1099b9" containerName="registry-server" Mar 09 10:16:00 crc kubenswrapper[4861]: I0309 10:16:00.144589 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0523999-9c2d-4335-8c1e-249abc1099b9" containerName="registry-server" Mar 09 10:16:00 crc kubenswrapper[4861]: E0309 10:16:00.144599 4861 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="28de8914-20a6-4801-9a0d-6d90d585385c" containerName="collect-profiles" Mar 09 10:16:00 crc kubenswrapper[4861]: I0309 10:16:00.144606 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="28de8914-20a6-4801-9a0d-6d90d585385c" containerName="collect-profiles" Mar 09 10:16:00 crc kubenswrapper[4861]: E0309 10:16:00.144657 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0523999-9c2d-4335-8c1e-249abc1099b9" containerName="extract-utilities" Mar 09 10:16:00 crc kubenswrapper[4861]: I0309 10:16:00.144671 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0523999-9c2d-4335-8c1e-249abc1099b9" containerName="extract-utilities" Mar 09 10:16:00 crc kubenswrapper[4861]: I0309 10:16:00.144957 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0523999-9c2d-4335-8c1e-249abc1099b9" containerName="registry-server" Mar 09 10:16:00 crc kubenswrapper[4861]: I0309 10:16:00.144974 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="28de8914-20a6-4801-9a0d-6d90d585385c" containerName="collect-profiles" Mar 09 10:16:00 crc kubenswrapper[4861]: I0309 10:16:00.145848 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550856-f6c8f" Mar 09 10:16:00 crc kubenswrapper[4861]: I0309 10:16:00.150071 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 10:16:00 crc kubenswrapper[4861]: I0309 10:16:00.150236 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k86l8" Mar 09 10:16:00 crc kubenswrapper[4861]: I0309 10:16:00.150615 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 10:16:00 crc kubenswrapper[4861]: I0309 10:16:00.158557 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550856-f6c8f"] Mar 09 10:16:00 crc kubenswrapper[4861]: I0309 10:16:00.319497 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5fb6\" (UniqueName: \"kubernetes.io/projected/411b28cc-1d68-43de-aab3-f97bc785c97e-kube-api-access-h5fb6\") pod \"auto-csr-approver-29550856-f6c8f\" (UID: \"411b28cc-1d68-43de-aab3-f97bc785c97e\") " pod="openshift-infra/auto-csr-approver-29550856-f6c8f" Mar 09 10:16:00 crc kubenswrapper[4861]: I0309 10:16:00.422103 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h5fb6\" (UniqueName: \"kubernetes.io/projected/411b28cc-1d68-43de-aab3-f97bc785c97e-kube-api-access-h5fb6\") pod \"auto-csr-approver-29550856-f6c8f\" (UID: \"411b28cc-1d68-43de-aab3-f97bc785c97e\") " pod="openshift-infra/auto-csr-approver-29550856-f6c8f" Mar 09 10:16:00 crc kubenswrapper[4861]: I0309 10:16:00.443953 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5fb6\" (UniqueName: \"kubernetes.io/projected/411b28cc-1d68-43de-aab3-f97bc785c97e-kube-api-access-h5fb6\") pod \"auto-csr-approver-29550856-f6c8f\" (UID: \"411b28cc-1d68-43de-aab3-f97bc785c97e\") " 
pod="openshift-infra/auto-csr-approver-29550856-f6c8f" Mar 09 10:16:00 crc kubenswrapper[4861]: I0309 10:16:00.467747 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550856-f6c8f" Mar 09 10:16:00 crc kubenswrapper[4861]: I0309 10:16:00.918684 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550856-f6c8f"] Mar 09 10:16:01 crc kubenswrapper[4861]: I0309 10:16:01.687122 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550856-f6c8f" event={"ID":"411b28cc-1d68-43de-aab3-f97bc785c97e","Type":"ContainerStarted","Data":"32505c702ea28dd22cf56f13c0108d1a13d203369f69843e256c8ae25520df7e"} Mar 09 10:16:02 crc kubenswrapper[4861]: I0309 10:16:02.696201 4861 generic.go:334] "Generic (PLEG): container finished" podID="411b28cc-1d68-43de-aab3-f97bc785c97e" containerID="1df8482be4ff633209cbcd2cc0ce78b24b2e71b2784e35adad797cf30491aeb5" exitCode=0 Mar 09 10:16:02 crc kubenswrapper[4861]: I0309 10:16:02.696252 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550856-f6c8f" event={"ID":"411b28cc-1d68-43de-aab3-f97bc785c97e","Type":"ContainerDied","Data":"1df8482be4ff633209cbcd2cc0ce78b24b2e71b2784e35adad797cf30491aeb5"} Mar 09 10:16:04 crc kubenswrapper[4861]: I0309 10:16:04.109202 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550856-f6c8f" Mar 09 10:16:04 crc kubenswrapper[4861]: I0309 10:16:04.134885 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h5fb6\" (UniqueName: \"kubernetes.io/projected/411b28cc-1d68-43de-aab3-f97bc785c97e-kube-api-access-h5fb6\") pod \"411b28cc-1d68-43de-aab3-f97bc785c97e\" (UID: \"411b28cc-1d68-43de-aab3-f97bc785c97e\") " Mar 09 10:16:04 crc kubenswrapper[4861]: I0309 10:16:04.140685 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/411b28cc-1d68-43de-aab3-f97bc785c97e-kube-api-access-h5fb6" (OuterVolumeSpecName: "kube-api-access-h5fb6") pod "411b28cc-1d68-43de-aab3-f97bc785c97e" (UID: "411b28cc-1d68-43de-aab3-f97bc785c97e"). InnerVolumeSpecName "kube-api-access-h5fb6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 10:16:04 crc kubenswrapper[4861]: I0309 10:16:04.236672 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h5fb6\" (UniqueName: \"kubernetes.io/projected/411b28cc-1d68-43de-aab3-f97bc785c97e-kube-api-access-h5fb6\") on node \"crc\" DevicePath \"\"" Mar 09 10:16:04 crc kubenswrapper[4861]: I0309 10:16:04.739836 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550856-f6c8f" event={"ID":"411b28cc-1d68-43de-aab3-f97bc785c97e","Type":"ContainerDied","Data":"32505c702ea28dd22cf56f13c0108d1a13d203369f69843e256c8ae25520df7e"} Mar 09 10:16:04 crc kubenswrapper[4861]: I0309 10:16:04.739880 4861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="32505c702ea28dd22cf56f13c0108d1a13d203369f69843e256c8ae25520df7e" Mar 09 10:16:04 crc kubenswrapper[4861]: I0309 10:16:04.739945 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550856-f6c8f" Mar 09 10:16:05 crc kubenswrapper[4861]: I0309 10:16:05.177429 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550850-4rt8d"] Mar 09 10:16:05 crc kubenswrapper[4861]: I0309 10:16:05.186276 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550850-4rt8d"] Mar 09 10:16:05 crc kubenswrapper[4861]: I0309 10:16:05.670003 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19172363-d5d0-4123-af55-e3c88be4c003" path="/var/lib/kubelet/pods/19172363-d5d0-4123-af55-e3c88be4c003/volumes" Mar 09 10:16:39 crc kubenswrapper[4861]: I0309 10:16:39.112546 4861 scope.go:117] "RemoveContainer" containerID="406f31b9548ff94dc29c9f306cb7699fd28f6cb51aebac0b5d0913dd27ddff64" Mar 09 10:17:17 crc kubenswrapper[4861]: I0309 10:17:17.121304 4861 generic.go:334] "Generic (PLEG): container finished" podID="13cd8779-33ad-4eb1-93f2-2d1dec868cb9" containerID="1ae5f47fb5b55e0c37696da9f6e90ba678bd3f553f434d7e332a7e53b00b566d" exitCode=0 Mar 09 10:17:17 crc kubenswrapper[4861]: I0309 10:17:17.121384 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wqw8f/must-gather-nz6cl" event={"ID":"13cd8779-33ad-4eb1-93f2-2d1dec868cb9","Type":"ContainerDied","Data":"1ae5f47fb5b55e0c37696da9f6e90ba678bd3f553f434d7e332a7e53b00b566d"} Mar 09 10:17:17 crc kubenswrapper[4861]: I0309 10:17:17.122825 4861 scope.go:117] "RemoveContainer" containerID="1ae5f47fb5b55e0c37696da9f6e90ba678bd3f553f434d7e332a7e53b00b566d" Mar 09 10:17:17 crc kubenswrapper[4861]: I0309 10:17:17.839838 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-wqw8f_must-gather-nz6cl_13cd8779-33ad-4eb1-93f2-2d1dec868cb9/gather/0.log" Mar 09 10:17:24 crc kubenswrapper[4861]: I0309 10:17:24.606514 4861 patch_prober.go:28] interesting pod/machine-config-daemon-5g7gc container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 10:17:24 crc kubenswrapper[4861]: I0309 10:17:24.607146 4861 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 10:17:29 crc kubenswrapper[4861]: I0309 10:17:29.979167 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-wqw8f/must-gather-nz6cl"] Mar 09 10:17:29 crc kubenswrapper[4861]: I0309 10:17:29.980245 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-wqw8f/must-gather-nz6cl" podUID="13cd8779-33ad-4eb1-93f2-2d1dec868cb9" containerName="copy" containerID="cri-o://abed4f95404b849d5539b361a3af5603da617586feb32afb7f2419909a73754c" gracePeriod=2 Mar 09 10:17:29 crc kubenswrapper[4861]: I0309 10:17:29.989549 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-wqw8f/must-gather-nz6cl"] Mar 09 10:17:30 crc kubenswrapper[4861]: I0309 10:17:30.262181 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-wqw8f_must-gather-nz6cl_13cd8779-33ad-4eb1-93f2-2d1dec868cb9/copy/0.log" Mar 09 10:17:30 crc kubenswrapper[4861]: I0309 10:17:30.262960 4861 generic.go:334] "Generic (PLEG): container finished" podID="13cd8779-33ad-4eb1-93f2-2d1dec868cb9" containerID="abed4f95404b849d5539b361a3af5603da617586feb32afb7f2419909a73754c" exitCode=143 Mar 09 10:17:30 crc kubenswrapper[4861]: I0309 10:17:30.424621 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-wqw8f_must-gather-nz6cl_13cd8779-33ad-4eb1-93f2-2d1dec868cb9/copy/0.log" Mar 09 
10:17:30 crc kubenswrapper[4861]: I0309 10:17:30.424981 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wqw8f/must-gather-nz6cl" Mar 09 10:17:30 crc kubenswrapper[4861]: I0309 10:17:30.498222 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v48wx\" (UniqueName: \"kubernetes.io/projected/13cd8779-33ad-4eb1-93f2-2d1dec868cb9-kube-api-access-v48wx\") pod \"13cd8779-33ad-4eb1-93f2-2d1dec868cb9\" (UID: \"13cd8779-33ad-4eb1-93f2-2d1dec868cb9\") " Mar 09 10:17:30 crc kubenswrapper[4861]: I0309 10:17:30.498519 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/13cd8779-33ad-4eb1-93f2-2d1dec868cb9-must-gather-output\") pod \"13cd8779-33ad-4eb1-93f2-2d1dec868cb9\" (UID: \"13cd8779-33ad-4eb1-93f2-2d1dec868cb9\") " Mar 09 10:17:30 crc kubenswrapper[4861]: I0309 10:17:30.511633 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13cd8779-33ad-4eb1-93f2-2d1dec868cb9-kube-api-access-v48wx" (OuterVolumeSpecName: "kube-api-access-v48wx") pod "13cd8779-33ad-4eb1-93f2-2d1dec868cb9" (UID: "13cd8779-33ad-4eb1-93f2-2d1dec868cb9"). InnerVolumeSpecName "kube-api-access-v48wx". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 10:17:30 crc kubenswrapper[4861]: I0309 10:17:30.601311 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v48wx\" (UniqueName: \"kubernetes.io/projected/13cd8779-33ad-4eb1-93f2-2d1dec868cb9-kube-api-access-v48wx\") on node \"crc\" DevicePath \"\""
Mar 09 10:17:30 crc kubenswrapper[4861]: I0309 10:17:30.703636 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13cd8779-33ad-4eb1-93f2-2d1dec868cb9-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "13cd8779-33ad-4eb1-93f2-2d1dec868cb9" (UID: "13cd8779-33ad-4eb1-93f2-2d1dec868cb9"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 10:17:30 crc kubenswrapper[4861]: I0309 10:17:30.704169 4861 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/13cd8779-33ad-4eb1-93f2-2d1dec868cb9-must-gather-output\") on node \"crc\" DevicePath \"\""
Mar 09 10:17:31 crc kubenswrapper[4861]: I0309 10:17:31.272914 4861 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-wqw8f_must-gather-nz6cl_13cd8779-33ad-4eb1-93f2-2d1dec868cb9/copy/0.log"
Mar 09 10:17:31 crc kubenswrapper[4861]: I0309 10:17:31.273494 4861 scope.go:117] "RemoveContainer" containerID="abed4f95404b849d5539b361a3af5603da617586feb32afb7f2419909a73754c"
Mar 09 10:17:31 crc kubenswrapper[4861]: I0309 10:17:31.273671 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wqw8f/must-gather-nz6cl"
Mar 09 10:17:31 crc kubenswrapper[4861]: I0309 10:17:31.305348 4861 scope.go:117] "RemoveContainer" containerID="1ae5f47fb5b55e0c37696da9f6e90ba678bd3f553f434d7e332a7e53b00b566d"
Mar 09 10:17:31 crc kubenswrapper[4861]: I0309 10:17:31.670310 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13cd8779-33ad-4eb1-93f2-2d1dec868cb9" path="/var/lib/kubelet/pods/13cd8779-33ad-4eb1-93f2-2d1dec868cb9/volumes"
Mar 09 10:17:54 crc kubenswrapper[4861]: I0309 10:17:54.606300 4861 patch_prober.go:28] interesting pod/machine-config-daemon-5g7gc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 09 10:17:54 crc kubenswrapper[4861]: I0309 10:17:54.607538 4861 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 09 10:18:00 crc kubenswrapper[4861]: I0309 10:18:00.147185 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550858-wpb7f"]
Mar 09 10:18:00 crc kubenswrapper[4861]: E0309 10:18:00.148223 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="411b28cc-1d68-43de-aab3-f97bc785c97e" containerName="oc"
Mar 09 10:18:00 crc kubenswrapper[4861]: I0309 10:18:00.148238 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="411b28cc-1d68-43de-aab3-f97bc785c97e" containerName="oc"
Mar 09 10:18:00 crc kubenswrapper[4861]: E0309 10:18:00.148258 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13cd8779-33ad-4eb1-93f2-2d1dec868cb9" containerName="gather"
Mar 09 10:18:00 crc kubenswrapper[4861]: I0309 10:18:00.148265 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="13cd8779-33ad-4eb1-93f2-2d1dec868cb9" containerName="gather"
Mar 09 10:18:00 crc kubenswrapper[4861]: E0309 10:18:00.148299 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13cd8779-33ad-4eb1-93f2-2d1dec868cb9" containerName="copy"
Mar 09 10:18:00 crc kubenswrapper[4861]: I0309 10:18:00.148309 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="13cd8779-33ad-4eb1-93f2-2d1dec868cb9" containerName="copy"
Mar 09 10:18:00 crc kubenswrapper[4861]: I0309 10:18:00.148543 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="411b28cc-1d68-43de-aab3-f97bc785c97e" containerName="oc"
Mar 09 10:18:00 crc kubenswrapper[4861]: I0309 10:18:00.148562 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="13cd8779-33ad-4eb1-93f2-2d1dec868cb9" containerName="copy"
Mar 09 10:18:00 crc kubenswrapper[4861]: I0309 10:18:00.148572 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="13cd8779-33ad-4eb1-93f2-2d1dec868cb9" containerName="gather"
Mar 09 10:18:00 crc kubenswrapper[4861]: I0309 10:18:00.149306 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550858-wpb7f"
Mar 09 10:18:00 crc kubenswrapper[4861]: I0309 10:18:00.152337 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 09 10:18:00 crc kubenswrapper[4861]: I0309 10:18:00.152339 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k86l8"
Mar 09 10:18:00 crc kubenswrapper[4861]: I0309 10:18:00.157584 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 09 10:18:00 crc kubenswrapper[4861]: I0309 10:18:00.164292 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550858-wpb7f"]
Mar 09 10:18:00 crc kubenswrapper[4861]: I0309 10:18:00.254950 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rpx6j\" (UniqueName: \"kubernetes.io/projected/10447f6b-7b9a-4031-84b6-22514a21e220-kube-api-access-rpx6j\") pod \"auto-csr-approver-29550858-wpb7f\" (UID: \"10447f6b-7b9a-4031-84b6-22514a21e220\") " pod="openshift-infra/auto-csr-approver-29550858-wpb7f"
Mar 09 10:18:00 crc kubenswrapper[4861]: I0309 10:18:00.357258 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rpx6j\" (UniqueName: \"kubernetes.io/projected/10447f6b-7b9a-4031-84b6-22514a21e220-kube-api-access-rpx6j\") pod \"auto-csr-approver-29550858-wpb7f\" (UID: \"10447f6b-7b9a-4031-84b6-22514a21e220\") " pod="openshift-infra/auto-csr-approver-29550858-wpb7f"
Mar 09 10:18:00 crc kubenswrapper[4861]: I0309 10:18:00.386147 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rpx6j\" (UniqueName: \"kubernetes.io/projected/10447f6b-7b9a-4031-84b6-22514a21e220-kube-api-access-rpx6j\") pod \"auto-csr-approver-29550858-wpb7f\" (UID: \"10447f6b-7b9a-4031-84b6-22514a21e220\") " pod="openshift-infra/auto-csr-approver-29550858-wpb7f"
Mar 09 10:18:00 crc kubenswrapper[4861]: I0309 10:18:00.471408 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550858-wpb7f"
Mar 09 10:18:00 crc kubenswrapper[4861]: I0309 10:18:00.915129 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550858-wpb7f"]
Mar 09 10:18:01 crc kubenswrapper[4861]: I0309 10:18:01.562163 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550858-wpb7f" event={"ID":"10447f6b-7b9a-4031-84b6-22514a21e220","Type":"ContainerStarted","Data":"e93a037da6c5519e4a010a79ff3eb075c7a2465d5902acd332fcd72fd9c85bf9"}
Mar 09 10:18:02 crc kubenswrapper[4861]: I0309 10:18:02.583644 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550858-wpb7f" event={"ID":"10447f6b-7b9a-4031-84b6-22514a21e220","Type":"ContainerStarted","Data":"391ef43b783aff88fc5929b2aa6c649d06e949288a70f71bb0cb9e260e7c0548"}
Mar 09 10:18:02 crc kubenswrapper[4861]: I0309 10:18:02.609992 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29550858-wpb7f" podStartSLOduration=1.365850818 podStartE2EDuration="2.609972369s" podCreationTimestamp="2026-03-09 10:18:00 +0000 UTC" firstStartedPulling="2026-03-09 10:18:00.918952089 +0000 UTC m=+4324.003991490" lastFinishedPulling="2026-03-09 10:18:02.16307364 +0000 UTC m=+4325.248113041" observedRunningTime="2026-03-09 10:18:02.596488897 +0000 UTC m=+4325.681528298" watchObservedRunningTime="2026-03-09 10:18:02.609972369 +0000 UTC m=+4325.695011760"
Mar 09 10:18:03 crc kubenswrapper[4861]: I0309 10:18:03.594530 4861 generic.go:334] "Generic (PLEG): container finished" podID="10447f6b-7b9a-4031-84b6-22514a21e220" containerID="391ef43b783aff88fc5929b2aa6c649d06e949288a70f71bb0cb9e260e7c0548" exitCode=0
Mar 09 10:18:03 crc kubenswrapper[4861]: I0309 10:18:03.594594 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550858-wpb7f" event={"ID":"10447f6b-7b9a-4031-84b6-22514a21e220","Type":"ContainerDied","Data":"391ef43b783aff88fc5929b2aa6c649d06e949288a70f71bb0cb9e260e7c0548"}
Mar 09 10:18:04 crc kubenswrapper[4861]: I0309 10:18:04.954753 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550858-wpb7f"
Mar 09 10:18:05 crc kubenswrapper[4861]: I0309 10:18:05.042991 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rpx6j\" (UniqueName: \"kubernetes.io/projected/10447f6b-7b9a-4031-84b6-22514a21e220-kube-api-access-rpx6j\") pod \"10447f6b-7b9a-4031-84b6-22514a21e220\" (UID: \"10447f6b-7b9a-4031-84b6-22514a21e220\") "
Mar 09 10:18:05 crc kubenswrapper[4861]: I0309 10:18:05.048437 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10447f6b-7b9a-4031-84b6-22514a21e220-kube-api-access-rpx6j" (OuterVolumeSpecName: "kube-api-access-rpx6j") pod "10447f6b-7b9a-4031-84b6-22514a21e220" (UID: "10447f6b-7b9a-4031-84b6-22514a21e220"). InnerVolumeSpecName "kube-api-access-rpx6j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 10:18:05 crc kubenswrapper[4861]: I0309 10:18:05.144988 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rpx6j\" (UniqueName: \"kubernetes.io/projected/10447f6b-7b9a-4031-84b6-22514a21e220-kube-api-access-rpx6j\") on node \"crc\" DevicePath \"\""
Mar 09 10:18:05 crc kubenswrapper[4861]: I0309 10:18:05.619389 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550858-wpb7f" event={"ID":"10447f6b-7b9a-4031-84b6-22514a21e220","Type":"ContainerDied","Data":"e93a037da6c5519e4a010a79ff3eb075c7a2465d5902acd332fcd72fd9c85bf9"}
Mar 09 10:18:05 crc kubenswrapper[4861]: I0309 10:18:05.619710 4861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e93a037da6c5519e4a010a79ff3eb075c7a2465d5902acd332fcd72fd9c85bf9"
Mar 09 10:18:05 crc kubenswrapper[4861]: I0309 10:18:05.619774 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550858-wpb7f"
Mar 09 10:18:05 crc kubenswrapper[4861]: I0309 10:18:05.669021 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550852-hspj2"]
Mar 09 10:18:05 crc kubenswrapper[4861]: I0309 10:18:05.677433 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550852-hspj2"]
Mar 09 10:18:07 crc kubenswrapper[4861]: I0309 10:18:07.668861 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6f55d8b-22d1-4b1a-99a4-9ae7ce5e6366" path="/var/lib/kubelet/pods/a6f55d8b-22d1-4b1a-99a4-9ae7ce5e6366/volumes"
Mar 09 10:18:24 crc kubenswrapper[4861]: I0309 10:18:24.606153 4861 patch_prober.go:28] interesting pod/machine-config-daemon-5g7gc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 09 10:18:24 crc kubenswrapper[4861]: I0309 10:18:24.606742 4861 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 09 10:18:24 crc kubenswrapper[4861]: I0309 10:18:24.606794 4861 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc"
Mar 09 10:18:24 crc kubenswrapper[4861]: I0309 10:18:24.607595 4861 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c600e1aa8a7ba1a360c635c31a448b7027fd4628a8566a50e85d312351e1d37c"} pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 09 10:18:24 crc kubenswrapper[4861]: I0309 10:18:24.607662 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" containerName="machine-config-daemon" containerID="cri-o://c600e1aa8a7ba1a360c635c31a448b7027fd4628a8566a50e85d312351e1d37c" gracePeriod=600
Mar 09 10:18:24 crc kubenswrapper[4861]: E0309 10:18:24.732507 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5g7gc_openshift-machine-config-operator(6f7875e3-174f-4c67-8675-d878de74aa4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f"
Mar 09 10:18:25 crc kubenswrapper[4861]: I0309 10:18:25.304738 4861 generic.go:334] "Generic (PLEG): container finished" podID="6f7875e3-174f-4c67-8675-d878de74aa4f" containerID="c600e1aa8a7ba1a360c635c31a448b7027fd4628a8566a50e85d312351e1d37c" exitCode=0
Mar 09 10:18:25 crc kubenswrapper[4861]: I0309 10:18:25.304790 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" event={"ID":"6f7875e3-174f-4c67-8675-d878de74aa4f","Type":"ContainerDied","Data":"c600e1aa8a7ba1a360c635c31a448b7027fd4628a8566a50e85d312351e1d37c"}
Mar 09 10:18:25 crc kubenswrapper[4861]: I0309 10:18:25.304828 4861 scope.go:117] "RemoveContainer" containerID="8fd7c92f085870944d490eb7e486a6ca47dfe81622ad08419017e4cba450475b"
Mar 09 10:18:25 crc kubenswrapper[4861]: I0309 10:18:25.305552 4861 scope.go:117] "RemoveContainer" containerID="c600e1aa8a7ba1a360c635c31a448b7027fd4628a8566a50e85d312351e1d37c"
Mar 09 10:18:25 crc kubenswrapper[4861]: E0309 10:18:25.305899 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5g7gc_openshift-machine-config-operator(6f7875e3-174f-4c67-8675-d878de74aa4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f"
Mar 09 10:18:37 crc kubenswrapper[4861]: I0309 10:18:37.665333 4861 scope.go:117] "RemoveContainer" containerID="c600e1aa8a7ba1a360c635c31a448b7027fd4628a8566a50e85d312351e1d37c"
Mar 09 10:18:37 crc kubenswrapper[4861]: E0309 10:18:37.666294 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5g7gc_openshift-machine-config-operator(6f7875e3-174f-4c67-8675-d878de74aa4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f"
Mar 09 10:18:39 crc kubenswrapper[4861]: I0309 10:18:39.697897 4861 scope.go:117] "RemoveContainer" containerID="162595f9247d08dd3b707d0360e464245380ea9d519c544096833e89f7ea5340"
Mar 09 10:18:39 crc kubenswrapper[4861]: I0309 10:18:39.821024 4861 scope.go:117] "RemoveContainer" containerID="bcff50287b12a70d33f0ccd64336df345f79598f2f3d1d475b8a0083a67ec703"
Mar 09 10:18:52 crc kubenswrapper[4861]: I0309 10:18:52.657735 4861 scope.go:117] "RemoveContainer" containerID="c600e1aa8a7ba1a360c635c31a448b7027fd4628a8566a50e85d312351e1d37c"
Mar 09 10:18:52 crc kubenswrapper[4861]: E0309 10:18:52.658497 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5g7gc_openshift-machine-config-operator(6f7875e3-174f-4c67-8675-d878de74aa4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f"
Mar 09 10:19:06 crc kubenswrapper[4861]: I0309 10:19:06.658474 4861 scope.go:117] "RemoveContainer" containerID="c600e1aa8a7ba1a360c635c31a448b7027fd4628a8566a50e85d312351e1d37c"
Mar 09 10:19:06 crc kubenswrapper[4861]: E0309 10:19:06.659291 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5g7gc_openshift-machine-config-operator(6f7875e3-174f-4c67-8675-d878de74aa4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f"
Mar 09 10:19:21 crc kubenswrapper[4861]: I0309 10:19:21.658863 4861 scope.go:117] "RemoveContainer" containerID="c600e1aa8a7ba1a360c635c31a448b7027fd4628a8566a50e85d312351e1d37c"
Mar 09 10:19:21 crc kubenswrapper[4861]: E0309 10:19:21.659777 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5g7gc_openshift-machine-config-operator(6f7875e3-174f-4c67-8675-d878de74aa4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f"
Mar 09 10:19:31 crc kubenswrapper[4861]: I0309 10:19:31.142487 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-ztbmg"]
Mar 09 10:19:31 crc kubenswrapper[4861]: E0309 10:19:31.143704 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10447f6b-7b9a-4031-84b6-22514a21e220" containerName="oc"
Mar 09 10:19:31 crc kubenswrapper[4861]: I0309 10:19:31.143726 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="10447f6b-7b9a-4031-84b6-22514a21e220" containerName="oc"
Mar 09 10:19:31 crc kubenswrapper[4861]: I0309 10:19:31.144062 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="10447f6b-7b9a-4031-84b6-22514a21e220" containerName="oc"
Mar 09 10:19:31 crc kubenswrapper[4861]: I0309 10:19:31.146470 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ztbmg"
Mar 09 10:19:31 crc kubenswrapper[4861]: I0309 10:19:31.151866 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ztbmg"]
Mar 09 10:19:31 crc kubenswrapper[4861]: I0309 10:19:31.239607 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shqr8\" (UniqueName: \"kubernetes.io/projected/9ad41b34-0f10-4e71-8533-7ff463c4b2a0-kube-api-access-shqr8\") pod \"certified-operators-ztbmg\" (UID: \"9ad41b34-0f10-4e71-8533-7ff463c4b2a0\") " pod="openshift-marketplace/certified-operators-ztbmg"
Mar 09 10:19:31 crc kubenswrapper[4861]: I0309 10:19:31.240031 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ad41b34-0f10-4e71-8533-7ff463c4b2a0-utilities\") pod \"certified-operators-ztbmg\" (UID: \"9ad41b34-0f10-4e71-8533-7ff463c4b2a0\") " pod="openshift-marketplace/certified-operators-ztbmg"
Mar 09 10:19:31 crc kubenswrapper[4861]: I0309 10:19:31.240180 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ad41b34-0f10-4e71-8533-7ff463c4b2a0-catalog-content\") pod \"certified-operators-ztbmg\" (UID: \"9ad41b34-0f10-4e71-8533-7ff463c4b2a0\") " pod="openshift-marketplace/certified-operators-ztbmg"
Mar 09 10:19:31 crc kubenswrapper[4861]: I0309 10:19:31.341521 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ad41b34-0f10-4e71-8533-7ff463c4b2a0-catalog-content\") pod \"certified-operators-ztbmg\" (UID: \"9ad41b34-0f10-4e71-8533-7ff463c4b2a0\") " pod="openshift-marketplace/certified-operators-ztbmg"
Mar 09 10:19:31 crc kubenswrapper[4861]: I0309 10:19:31.341567 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shqr8\" (UniqueName: \"kubernetes.io/projected/9ad41b34-0f10-4e71-8533-7ff463c4b2a0-kube-api-access-shqr8\") pod \"certified-operators-ztbmg\" (UID: \"9ad41b34-0f10-4e71-8533-7ff463c4b2a0\") " pod="openshift-marketplace/certified-operators-ztbmg"
Mar 09 10:19:31 crc kubenswrapper[4861]: I0309 10:19:31.341640 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ad41b34-0f10-4e71-8533-7ff463c4b2a0-utilities\") pod \"certified-operators-ztbmg\" (UID: \"9ad41b34-0f10-4e71-8533-7ff463c4b2a0\") " pod="openshift-marketplace/certified-operators-ztbmg"
Mar 09 10:19:31 crc kubenswrapper[4861]: I0309 10:19:31.342213 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ad41b34-0f10-4e71-8533-7ff463c4b2a0-utilities\") pod \"certified-operators-ztbmg\" (UID: \"9ad41b34-0f10-4e71-8533-7ff463c4b2a0\") " pod="openshift-marketplace/certified-operators-ztbmg"
Mar 09 10:19:31 crc kubenswrapper[4861]: I0309 10:19:31.342228 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ad41b34-0f10-4e71-8533-7ff463c4b2a0-catalog-content\") pod \"certified-operators-ztbmg\" (UID: \"9ad41b34-0f10-4e71-8533-7ff463c4b2a0\") " pod="openshift-marketplace/certified-operators-ztbmg"
Mar 09 10:19:31 crc kubenswrapper[4861]: I0309 10:19:31.711534 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shqr8\" (UniqueName: \"kubernetes.io/projected/9ad41b34-0f10-4e71-8533-7ff463c4b2a0-kube-api-access-shqr8\") pod \"certified-operators-ztbmg\" (UID: \"9ad41b34-0f10-4e71-8533-7ff463c4b2a0\") " pod="openshift-marketplace/certified-operators-ztbmg"
Mar 09 10:19:31 crc kubenswrapper[4861]: I0309 10:19:31.774679 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ztbmg"
Mar 09 10:19:32 crc kubenswrapper[4861]: I0309 10:19:32.296088 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ztbmg"]
Mar 09 10:19:32 crc kubenswrapper[4861]: I0309 10:19:32.948217 4861 generic.go:334] "Generic (PLEG): container finished" podID="9ad41b34-0f10-4e71-8533-7ff463c4b2a0" containerID="87209f647c1cbdbc39759f43ce337d14cbbe5c930172cdf89b3ab36e7b3ef31e" exitCode=0
Mar 09 10:19:32 crc kubenswrapper[4861]: I0309 10:19:32.948441 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ztbmg" event={"ID":"9ad41b34-0f10-4e71-8533-7ff463c4b2a0","Type":"ContainerDied","Data":"87209f647c1cbdbc39759f43ce337d14cbbe5c930172cdf89b3ab36e7b3ef31e"}
Mar 09 10:19:32 crc kubenswrapper[4861]: I0309 10:19:32.948512 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ztbmg" event={"ID":"9ad41b34-0f10-4e71-8533-7ff463c4b2a0","Type":"ContainerStarted","Data":"1039adcc3435a65882f25abb7547f130ea4db94302d78dbffb4dbedd3e2a0e85"}
Mar 09 10:19:32 crc kubenswrapper[4861]: I0309 10:19:32.950005 4861 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 09 10:19:33 crc kubenswrapper[4861]: I0309 10:19:33.959168 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ztbmg" event={"ID":"9ad41b34-0f10-4e71-8533-7ff463c4b2a0","Type":"ContainerStarted","Data":"179d478e40f3197ac01ec0cb55cce9e29d80a79c39947d8cd0df83acef13986e"}
Mar 09 10:19:34 crc kubenswrapper[4861]: I0309 10:19:34.971660 4861 generic.go:334] "Generic (PLEG): container finished" podID="9ad41b34-0f10-4e71-8533-7ff463c4b2a0" containerID="179d478e40f3197ac01ec0cb55cce9e29d80a79c39947d8cd0df83acef13986e" exitCode=0
Mar 09 10:19:34 crc kubenswrapper[4861]: I0309 10:19:34.972351 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ztbmg" event={"ID":"9ad41b34-0f10-4e71-8533-7ff463c4b2a0","Type":"ContainerDied","Data":"179d478e40f3197ac01ec0cb55cce9e29d80a79c39947d8cd0df83acef13986e"}
Mar 09 10:19:36 crc kubenswrapper[4861]: I0309 10:19:36.657835 4861 scope.go:117] "RemoveContainer" containerID="c600e1aa8a7ba1a360c635c31a448b7027fd4628a8566a50e85d312351e1d37c"
Mar 09 10:19:36 crc kubenswrapper[4861]: E0309 10:19:36.658599 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5g7gc_openshift-machine-config-operator(6f7875e3-174f-4c67-8675-d878de74aa4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f"
Mar 09 10:19:36 crc kubenswrapper[4861]: I0309 10:19:36.994272 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ztbmg" event={"ID":"9ad41b34-0f10-4e71-8533-7ff463c4b2a0","Type":"ContainerStarted","Data":"72d521475bec85115a65ace8d36955cbacd8943b017aa3db6bb877f7d4a558bf"}
Mar 09 10:19:37 crc kubenswrapper[4861]: I0309 10:19:37.021434 4861 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-ztbmg" podStartSLOduration=2.979058834 podStartE2EDuration="6.021408978s" podCreationTimestamp="2026-03-09 10:19:31 +0000 UTC" firstStartedPulling="2026-03-09 10:19:32.949787623 +0000 UTC m=+4416.034827024" lastFinishedPulling="2026-03-09 10:19:35.992137767 +0000 UTC m=+4419.077177168" observedRunningTime="2026-03-09 10:19:37.012446897 +0000 UTC m=+4420.097486308" watchObservedRunningTime="2026-03-09 10:19:37.021408978 +0000 UTC m=+4420.106448379"
Mar 09 10:19:41 crc kubenswrapper[4861]: I0309 10:19:41.775418 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-ztbmg"
Mar 09 10:19:41 crc kubenswrapper[4861]: I0309 10:19:41.775891 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-ztbmg"
Mar 09 10:19:41 crc kubenswrapper[4861]: I0309 10:19:41.822842 4861 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-ztbmg"
Mar 09 10:19:42 crc kubenswrapper[4861]: I0309 10:19:42.085810 4861 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-ztbmg"
Mar 09 10:19:42 crc kubenswrapper[4861]: I0309 10:19:42.145474 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ztbmg"]
Mar 09 10:19:44 crc kubenswrapper[4861]: I0309 10:19:44.055817 4861 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-ztbmg" podUID="9ad41b34-0f10-4e71-8533-7ff463c4b2a0" containerName="registry-server" containerID="cri-o://72d521475bec85115a65ace8d36955cbacd8943b017aa3db6bb877f7d4a558bf" gracePeriod=2
Mar 09 10:19:44 crc kubenswrapper[4861]: I0309 10:19:44.525654 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ztbmg"
Mar 09 10:19:44 crc kubenswrapper[4861]: I0309 10:19:44.711797 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ad41b34-0f10-4e71-8533-7ff463c4b2a0-catalog-content\") pod \"9ad41b34-0f10-4e71-8533-7ff463c4b2a0\" (UID: \"9ad41b34-0f10-4e71-8533-7ff463c4b2a0\") "
Mar 09 10:19:44 crc kubenswrapper[4861]: I0309 10:19:44.711945 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ad41b34-0f10-4e71-8533-7ff463c4b2a0-utilities\") pod \"9ad41b34-0f10-4e71-8533-7ff463c4b2a0\" (UID: \"9ad41b34-0f10-4e71-8533-7ff463c4b2a0\") "
Mar 09 10:19:44 crc kubenswrapper[4861]: I0309 10:19:44.712026 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-shqr8\" (UniqueName: \"kubernetes.io/projected/9ad41b34-0f10-4e71-8533-7ff463c4b2a0-kube-api-access-shqr8\") pod \"9ad41b34-0f10-4e71-8533-7ff463c4b2a0\" (UID: \"9ad41b34-0f10-4e71-8533-7ff463c4b2a0\") "
Mar 09 10:19:44 crc kubenswrapper[4861]: I0309 10:19:44.713042 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ad41b34-0f10-4e71-8533-7ff463c4b2a0-utilities" (OuterVolumeSpecName: "utilities") pod "9ad41b34-0f10-4e71-8533-7ff463c4b2a0" (UID: "9ad41b34-0f10-4e71-8533-7ff463c4b2a0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 10:19:44 crc kubenswrapper[4861]: I0309 10:19:44.777608 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ad41b34-0f10-4e71-8533-7ff463c4b2a0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9ad41b34-0f10-4e71-8533-7ff463c4b2a0" (UID: "9ad41b34-0f10-4e71-8533-7ff463c4b2a0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 10:19:44 crc kubenswrapper[4861]: I0309 10:19:44.814598 4861 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ad41b34-0f10-4e71-8533-7ff463c4b2a0-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 09 10:19:44 crc kubenswrapper[4861]: I0309 10:19:44.814635 4861 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ad41b34-0f10-4e71-8533-7ff463c4b2a0-utilities\") on node \"crc\" DevicePath \"\""
Mar 09 10:19:44 crc kubenswrapper[4861]: I0309 10:19:44.917494 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ad41b34-0f10-4e71-8533-7ff463c4b2a0-kube-api-access-shqr8" (OuterVolumeSpecName: "kube-api-access-shqr8") pod "9ad41b34-0f10-4e71-8533-7ff463c4b2a0" (UID: "9ad41b34-0f10-4e71-8533-7ff463c4b2a0"). InnerVolumeSpecName "kube-api-access-shqr8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 10:19:45 crc kubenswrapper[4861]: I0309 10:19:45.018312 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-shqr8\" (UniqueName: \"kubernetes.io/projected/9ad41b34-0f10-4e71-8533-7ff463c4b2a0-kube-api-access-shqr8\") on node \"crc\" DevicePath \"\""
Mar 09 10:19:45 crc kubenswrapper[4861]: I0309 10:19:45.068291 4861 generic.go:334] "Generic (PLEG): container finished" podID="9ad41b34-0f10-4e71-8533-7ff463c4b2a0" containerID="72d521475bec85115a65ace8d36955cbacd8943b017aa3db6bb877f7d4a558bf" exitCode=0
Mar 09 10:19:45 crc kubenswrapper[4861]: I0309 10:19:45.068336 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ztbmg" event={"ID":"9ad41b34-0f10-4e71-8533-7ff463c4b2a0","Type":"ContainerDied","Data":"72d521475bec85115a65ace8d36955cbacd8943b017aa3db6bb877f7d4a558bf"}
Mar 09 10:19:45 crc kubenswrapper[4861]: I0309 10:19:45.068381 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ztbmg" event={"ID":"9ad41b34-0f10-4e71-8533-7ff463c4b2a0","Type":"ContainerDied","Data":"1039adcc3435a65882f25abb7547f130ea4db94302d78dbffb4dbedd3e2a0e85"}
Mar 09 10:19:45 crc kubenswrapper[4861]: I0309 10:19:45.068401 4861 scope.go:117] "RemoveContainer" containerID="72d521475bec85115a65ace8d36955cbacd8943b017aa3db6bb877f7d4a558bf"
Mar 09 10:19:45 crc kubenswrapper[4861]: I0309 10:19:45.068413 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ztbmg"
Mar 09 10:19:45 crc kubenswrapper[4861]: I0309 10:19:45.088648 4861 scope.go:117] "RemoveContainer" containerID="179d478e40f3197ac01ec0cb55cce9e29d80a79c39947d8cd0df83acef13986e"
Mar 09 10:19:45 crc kubenswrapper[4861]: I0309 10:19:45.114683 4861 scope.go:117] "RemoveContainer" containerID="87209f647c1cbdbc39759f43ce337d14cbbe5c930172cdf89b3ab36e7b3ef31e"
Mar 09 10:19:45 crc kubenswrapper[4861]: I0309 10:19:45.117084 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ztbmg"]
Mar 09 10:19:45 crc kubenswrapper[4861]: I0309 10:19:45.125037 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-ztbmg"]
Mar 09 10:19:45 crc kubenswrapper[4861]: I0309 10:19:45.252384 4861 scope.go:117] "RemoveContainer" containerID="72d521475bec85115a65ace8d36955cbacd8943b017aa3db6bb877f7d4a558bf"
Mar 09 10:19:45 crc kubenswrapper[4861]: E0309 10:19:45.252842 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72d521475bec85115a65ace8d36955cbacd8943b017aa3db6bb877f7d4a558bf\": container with ID starting with 72d521475bec85115a65ace8d36955cbacd8943b017aa3db6bb877f7d4a558bf not found: ID does not exist" containerID="72d521475bec85115a65ace8d36955cbacd8943b017aa3db6bb877f7d4a558bf"
Mar 09 10:19:45 crc kubenswrapper[4861]: I0309 10:19:45.252878 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72d521475bec85115a65ace8d36955cbacd8943b017aa3db6bb877f7d4a558bf"} err="failed to get container status \"72d521475bec85115a65ace8d36955cbacd8943b017aa3db6bb877f7d4a558bf\": rpc error: code = NotFound desc = could not find container \"72d521475bec85115a65ace8d36955cbacd8943b017aa3db6bb877f7d4a558bf\": container with ID starting with 72d521475bec85115a65ace8d36955cbacd8943b017aa3db6bb877f7d4a558bf not found: ID does not exist"
Mar 09 10:19:45 crc kubenswrapper[4861]: I0309 10:19:45.252903 4861 scope.go:117] "RemoveContainer" containerID="179d478e40f3197ac01ec0cb55cce9e29d80a79c39947d8cd0df83acef13986e"
Mar 09 10:19:45 crc kubenswrapper[4861]: E0309 10:19:45.253245 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"179d478e40f3197ac01ec0cb55cce9e29d80a79c39947d8cd0df83acef13986e\": container with ID starting with 179d478e40f3197ac01ec0cb55cce9e29d80a79c39947d8cd0df83acef13986e not found: ID does not exist" containerID="179d478e40f3197ac01ec0cb55cce9e29d80a79c39947d8cd0df83acef13986e"
Mar 09 10:19:45 crc kubenswrapper[4861]: I0309 10:19:45.253277 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"179d478e40f3197ac01ec0cb55cce9e29d80a79c39947d8cd0df83acef13986e"} err="failed to get container status \"179d478e40f3197ac01ec0cb55cce9e29d80a79c39947d8cd0df83acef13986e\": rpc error: code = NotFound desc = could not find container \"179d478e40f3197ac01ec0cb55cce9e29d80a79c39947d8cd0df83acef13986e\": container with ID starting with 179d478e40f3197ac01ec0cb55cce9e29d80a79c39947d8cd0df83acef13986e not found: ID does not exist"
Mar 09 10:19:45 crc kubenswrapper[4861]: I0309 10:19:45.253296 4861 scope.go:117] "RemoveContainer" containerID="87209f647c1cbdbc39759f43ce337d14cbbe5c930172cdf89b3ab36e7b3ef31e"
Mar 09 10:19:45 crc kubenswrapper[4861]: E0309 10:19:45.253611 4861 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87209f647c1cbdbc39759f43ce337d14cbbe5c930172cdf89b3ab36e7b3ef31e\": container with ID starting with 87209f647c1cbdbc39759f43ce337d14cbbe5c930172cdf89b3ab36e7b3ef31e not found: ID does not exist" containerID="87209f647c1cbdbc39759f43ce337d14cbbe5c930172cdf89b3ab36e7b3ef31e"
Mar 09 10:19:45 crc kubenswrapper[4861]: I0309 10:19:45.253644 4861 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87209f647c1cbdbc39759f43ce337d14cbbe5c930172cdf89b3ab36e7b3ef31e"} err="failed to get container status \"87209f647c1cbdbc39759f43ce337d14cbbe5c930172cdf89b3ab36e7b3ef31e\": rpc error: code = NotFound desc = could not find container \"87209f647c1cbdbc39759f43ce337d14cbbe5c930172cdf89b3ab36e7b3ef31e\": container with ID starting with 87209f647c1cbdbc39759f43ce337d14cbbe5c930172cdf89b3ab36e7b3ef31e not found: ID does not exist"
Mar 09 10:19:45 crc kubenswrapper[4861]: I0309 10:19:45.669071 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ad41b34-0f10-4e71-8533-7ff463c4b2a0" path="/var/lib/kubelet/pods/9ad41b34-0f10-4e71-8533-7ff463c4b2a0/volumes"
Mar 09 10:19:48 crc kubenswrapper[4861]: I0309 10:19:48.657885 4861 scope.go:117] "RemoveContainer" containerID="c600e1aa8a7ba1a360c635c31a448b7027fd4628a8566a50e85d312351e1d37c"
Mar 09 10:19:48 crc kubenswrapper[4861]: E0309 10:19:48.658407 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5g7gc_openshift-machine-config-operator(6f7875e3-174f-4c67-8675-d878de74aa4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f"
Mar 09 10:20:00 crc kubenswrapper[4861]: I0309 10:20:00.142810 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550860-pwtcn"]
Mar 09 10:20:00 crc kubenswrapper[4861]: E0309 10:20:00.145648 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ad41b34-0f10-4e71-8533-7ff463c4b2a0" containerName="extract-content"
Mar 09 10:20:00 crc kubenswrapper[4861]: I0309 10:20:00.145811 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ad41b34-0f10-4e71-8533-7ff463c4b2a0" containerName="extract-content"
Mar 09 10:20:00 crc kubenswrapper[4861]: E0309 10:20:00.145957 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ad41b34-0f10-4e71-8533-7ff463c4b2a0" containerName="registry-server"
Mar 09 10:20:00 crc kubenswrapper[4861]: I0309 10:20:00.146071 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ad41b34-0f10-4e71-8533-7ff463c4b2a0" containerName="registry-server"
Mar 09 10:20:00 crc kubenswrapper[4861]: E0309 10:20:00.146197 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ad41b34-0f10-4e71-8533-7ff463c4b2a0" containerName="extract-utilities"
Mar 09 10:20:00 crc kubenswrapper[4861]: I0309 10:20:00.146316 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ad41b34-0f10-4e71-8533-7ff463c4b2a0" containerName="extract-utilities"
Mar 09 10:20:00 crc kubenswrapper[4861]: I0309 10:20:00.146821 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ad41b34-0f10-4e71-8533-7ff463c4b2a0" containerName="registry-server"
Mar 09 10:20:00 crc kubenswrapper[4861]: I0309 10:20:00.147995 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550860-pwtcn" Mar 09 10:20:00 crc kubenswrapper[4861]: I0309 10:20:00.150537 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 10:20:00 crc kubenswrapper[4861]: I0309 10:20:00.150834 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 10:20:00 crc kubenswrapper[4861]: I0309 10:20:00.151986 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k86l8" Mar 09 10:20:00 crc kubenswrapper[4861]: I0309 10:20:00.153564 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550860-pwtcn"] Mar 09 10:20:00 crc kubenswrapper[4861]: I0309 10:20:00.224775 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rvhc\" (UniqueName: \"kubernetes.io/projected/1f7bb3cd-e737-419c-9132-5b6246d90eb6-kube-api-access-7rvhc\") pod \"auto-csr-approver-29550860-pwtcn\" (UID: \"1f7bb3cd-e737-419c-9132-5b6246d90eb6\") " pod="openshift-infra/auto-csr-approver-29550860-pwtcn" Mar 09 10:20:00 crc kubenswrapper[4861]: I0309 10:20:00.327613 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rvhc\" (UniqueName: \"kubernetes.io/projected/1f7bb3cd-e737-419c-9132-5b6246d90eb6-kube-api-access-7rvhc\") pod \"auto-csr-approver-29550860-pwtcn\" (UID: \"1f7bb3cd-e737-419c-9132-5b6246d90eb6\") " pod="openshift-infra/auto-csr-approver-29550860-pwtcn" Mar 09 10:20:00 crc kubenswrapper[4861]: I0309 10:20:00.349707 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rvhc\" (UniqueName: \"kubernetes.io/projected/1f7bb3cd-e737-419c-9132-5b6246d90eb6-kube-api-access-7rvhc\") pod \"auto-csr-approver-29550860-pwtcn\" (UID: \"1f7bb3cd-e737-419c-9132-5b6246d90eb6\") " 
pod="openshift-infra/auto-csr-approver-29550860-pwtcn" Mar 09 10:20:00 crc kubenswrapper[4861]: I0309 10:20:00.471465 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550860-pwtcn" Mar 09 10:20:00 crc kubenswrapper[4861]: I0309 10:20:00.931814 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550860-pwtcn"] Mar 09 10:20:01 crc kubenswrapper[4861]: I0309 10:20:01.214183 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550860-pwtcn" event={"ID":"1f7bb3cd-e737-419c-9132-5b6246d90eb6","Type":"ContainerStarted","Data":"1a7e5697ac13984669b6843844d9772cbd0aa9b30befcd88902a2c61eb1aa8a2"} Mar 09 10:20:01 crc kubenswrapper[4861]: I0309 10:20:01.658534 4861 scope.go:117] "RemoveContainer" containerID="c600e1aa8a7ba1a360c635c31a448b7027fd4628a8566a50e85d312351e1d37c" Mar 09 10:20:01 crc kubenswrapper[4861]: E0309 10:20:01.658973 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5g7gc_openshift-machine-config-operator(6f7875e3-174f-4c67-8675-d878de74aa4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" Mar 09 10:20:04 crc kubenswrapper[4861]: I0309 10:20:04.244420 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550860-pwtcn" event={"ID":"1f7bb3cd-e737-419c-9132-5b6246d90eb6","Type":"ContainerDied","Data":"c3f4dffe7a81d8e54fa7dba3dc1505c43a7602ae06f34c088ef686011089ca21"} Mar 09 10:20:04 crc kubenswrapper[4861]: I0309 10:20:04.244347 4861 generic.go:334] "Generic (PLEG): container finished" podID="1f7bb3cd-e737-419c-9132-5b6246d90eb6" containerID="c3f4dffe7a81d8e54fa7dba3dc1505c43a7602ae06f34c088ef686011089ca21" exitCode=0 
Mar 09 10:20:05 crc kubenswrapper[4861]: I0309 10:20:05.572242 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550860-pwtcn" Mar 09 10:20:05 crc kubenswrapper[4861]: I0309 10:20:05.635299 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7rvhc\" (UniqueName: \"kubernetes.io/projected/1f7bb3cd-e737-419c-9132-5b6246d90eb6-kube-api-access-7rvhc\") pod \"1f7bb3cd-e737-419c-9132-5b6246d90eb6\" (UID: \"1f7bb3cd-e737-419c-9132-5b6246d90eb6\") " Mar 09 10:20:05 crc kubenswrapper[4861]: I0309 10:20:05.641545 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f7bb3cd-e737-419c-9132-5b6246d90eb6-kube-api-access-7rvhc" (OuterVolumeSpecName: "kube-api-access-7rvhc") pod "1f7bb3cd-e737-419c-9132-5b6246d90eb6" (UID: "1f7bb3cd-e737-419c-9132-5b6246d90eb6"). InnerVolumeSpecName "kube-api-access-7rvhc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 10:20:05 crc kubenswrapper[4861]: I0309 10:20:05.737806 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7rvhc\" (UniqueName: \"kubernetes.io/projected/1f7bb3cd-e737-419c-9132-5b6246d90eb6-kube-api-access-7rvhc\") on node \"crc\" DevicePath \"\"" Mar 09 10:20:06 crc kubenswrapper[4861]: I0309 10:20:06.264152 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550860-pwtcn" event={"ID":"1f7bb3cd-e737-419c-9132-5b6246d90eb6","Type":"ContainerDied","Data":"1a7e5697ac13984669b6843844d9772cbd0aa9b30befcd88902a2c61eb1aa8a2"} Mar 09 10:20:06 crc kubenswrapper[4861]: I0309 10:20:06.264198 4861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1a7e5697ac13984669b6843844d9772cbd0aa9b30befcd88902a2c61eb1aa8a2" Mar 09 10:20:06 crc kubenswrapper[4861]: I0309 10:20:06.264229 4861 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550860-pwtcn" Mar 09 10:20:06 crc kubenswrapper[4861]: I0309 10:20:06.644899 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550854-hsrtq"] Mar 09 10:20:06 crc kubenswrapper[4861]: I0309 10:20:06.653133 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550854-hsrtq"] Mar 09 10:20:07 crc kubenswrapper[4861]: I0309 10:20:07.667682 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2af42883-8efc-4d48-920f-255783b3fe87" path="/var/lib/kubelet/pods/2af42883-8efc-4d48-920f-255783b3fe87/volumes" Mar 09 10:20:16 crc kubenswrapper[4861]: I0309 10:20:16.658041 4861 scope.go:117] "RemoveContainer" containerID="c600e1aa8a7ba1a360c635c31a448b7027fd4628a8566a50e85d312351e1d37c" Mar 09 10:20:16 crc kubenswrapper[4861]: E0309 10:20:16.658871 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5g7gc_openshift-machine-config-operator(6f7875e3-174f-4c67-8675-d878de74aa4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" Mar 09 10:20:31 crc kubenswrapper[4861]: I0309 10:20:31.659826 4861 scope.go:117] "RemoveContainer" containerID="c600e1aa8a7ba1a360c635c31a448b7027fd4628a8566a50e85d312351e1d37c" Mar 09 10:20:31 crc kubenswrapper[4861]: E0309 10:20:31.660768 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5g7gc_openshift-machine-config-operator(6f7875e3-174f-4c67-8675-d878de74aa4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" 
podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" Mar 09 10:20:39 crc kubenswrapper[4861]: I0309 10:20:39.900978 4861 scope.go:117] "RemoveContainer" containerID="1a9ecce6c458a9e7e278205d91d96832b49da37f5d18c2c356f0a230b7b8b468" Mar 09 10:20:45 crc kubenswrapper[4861]: I0309 10:20:45.658609 4861 scope.go:117] "RemoveContainer" containerID="c600e1aa8a7ba1a360c635c31a448b7027fd4628a8566a50e85d312351e1d37c" Mar 09 10:20:45 crc kubenswrapper[4861]: E0309 10:20:45.660958 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5g7gc_openshift-machine-config-operator(6f7875e3-174f-4c67-8675-d878de74aa4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" Mar 09 10:20:58 crc kubenswrapper[4861]: I0309 10:20:58.658678 4861 scope.go:117] "RemoveContainer" containerID="c600e1aa8a7ba1a360c635c31a448b7027fd4628a8566a50e85d312351e1d37c" Mar 09 10:20:58 crc kubenswrapper[4861]: E0309 10:20:58.660637 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5g7gc_openshift-machine-config-operator(6f7875e3-174f-4c67-8675-d878de74aa4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" Mar 09 10:21:10 crc kubenswrapper[4861]: I0309 10:21:10.658180 4861 scope.go:117] "RemoveContainer" containerID="c600e1aa8a7ba1a360c635c31a448b7027fd4628a8566a50e85d312351e1d37c" Mar 09 10:21:10 crc kubenswrapper[4861]: E0309 10:21:10.659028 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-5g7gc_openshift-machine-config-operator(6f7875e3-174f-4c67-8675-d878de74aa4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" Mar 09 10:21:23 crc kubenswrapper[4861]: I0309 10:21:23.658661 4861 scope.go:117] "RemoveContainer" containerID="c600e1aa8a7ba1a360c635c31a448b7027fd4628a8566a50e85d312351e1d37c" Mar 09 10:21:23 crc kubenswrapper[4861]: E0309 10:21:23.659512 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5g7gc_openshift-machine-config-operator(6f7875e3-174f-4c67-8675-d878de74aa4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" Mar 09 10:21:34 crc kubenswrapper[4861]: I0309 10:21:34.658021 4861 scope.go:117] "RemoveContainer" containerID="c600e1aa8a7ba1a360c635c31a448b7027fd4628a8566a50e85d312351e1d37c" Mar 09 10:21:34 crc kubenswrapper[4861]: E0309 10:21:34.658753 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5g7gc_openshift-machine-config-operator(6f7875e3-174f-4c67-8675-d878de74aa4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" Mar 09 10:21:48 crc kubenswrapper[4861]: I0309 10:21:48.659090 4861 scope.go:117] "RemoveContainer" containerID="c600e1aa8a7ba1a360c635c31a448b7027fd4628a8566a50e85d312351e1d37c" Mar 09 10:21:48 crc kubenswrapper[4861]: E0309 10:21:48.659839 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-5g7gc_openshift-machine-config-operator(6f7875e3-174f-4c67-8675-d878de74aa4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" Mar 09 10:22:00 crc kubenswrapper[4861]: I0309 10:22:00.158727 4861 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550862-9pb2j"] Mar 09 10:22:00 crc kubenswrapper[4861]: E0309 10:22:00.159818 4861 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f7bb3cd-e737-419c-9132-5b6246d90eb6" containerName="oc" Mar 09 10:22:00 crc kubenswrapper[4861]: I0309 10:22:00.159835 4861 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f7bb3cd-e737-419c-9132-5b6246d90eb6" containerName="oc" Mar 09 10:22:00 crc kubenswrapper[4861]: I0309 10:22:00.160097 4861 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f7bb3cd-e737-419c-9132-5b6246d90eb6" containerName="oc" Mar 09 10:22:00 crc kubenswrapper[4861]: I0309 10:22:00.161138 4861 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550862-9pb2j" Mar 09 10:22:00 crc kubenswrapper[4861]: I0309 10:22:00.168962 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550862-9pb2j"] Mar 09 10:22:00 crc kubenswrapper[4861]: I0309 10:22:00.170496 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 10:22:00 crc kubenswrapper[4861]: I0309 10:22:00.170726 4861 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 10:22:00 crc kubenswrapper[4861]: I0309 10:22:00.170944 4861 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-k86l8" Mar 09 10:22:00 crc kubenswrapper[4861]: I0309 10:22:00.233627 4861 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjgjr\" (UniqueName: \"kubernetes.io/projected/4a981366-c2b3-4d38-992e-40231193dfac-kube-api-access-rjgjr\") pod \"auto-csr-approver-29550862-9pb2j\" (UID: \"4a981366-c2b3-4d38-992e-40231193dfac\") " pod="openshift-infra/auto-csr-approver-29550862-9pb2j" Mar 09 10:22:00 crc kubenswrapper[4861]: I0309 10:22:00.334865 4861 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rjgjr\" (UniqueName: \"kubernetes.io/projected/4a981366-c2b3-4d38-992e-40231193dfac-kube-api-access-rjgjr\") pod \"auto-csr-approver-29550862-9pb2j\" (UID: \"4a981366-c2b3-4d38-992e-40231193dfac\") " pod="openshift-infra/auto-csr-approver-29550862-9pb2j" Mar 09 10:22:00 crc kubenswrapper[4861]: I0309 10:22:00.355458 4861 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjgjr\" (UniqueName: \"kubernetes.io/projected/4a981366-c2b3-4d38-992e-40231193dfac-kube-api-access-rjgjr\") pod \"auto-csr-approver-29550862-9pb2j\" (UID: \"4a981366-c2b3-4d38-992e-40231193dfac\") " 
pod="openshift-infra/auto-csr-approver-29550862-9pb2j" Mar 09 10:22:00 crc kubenswrapper[4861]: I0309 10:22:00.487247 4861 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550862-9pb2j" Mar 09 10:22:01 crc kubenswrapper[4861]: I0309 10:22:01.001096 4861 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550862-9pb2j"] Mar 09 10:22:01 crc kubenswrapper[4861]: W0309 10:22:01.003633 4861 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4a981366_c2b3_4d38_992e_40231193dfac.slice/crio-d8c6e5c59472d2672e1c9ef1a957f2da5b62152bf40657365b4a45f6f89b8dca WatchSource:0}: Error finding container d8c6e5c59472d2672e1c9ef1a957f2da5b62152bf40657365b4a45f6f89b8dca: Status 404 returned error can't find the container with id d8c6e5c59472d2672e1c9ef1a957f2da5b62152bf40657365b4a45f6f89b8dca Mar 09 10:22:01 crc kubenswrapper[4861]: I0309 10:22:01.415553 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550862-9pb2j" event={"ID":"4a981366-c2b3-4d38-992e-40231193dfac","Type":"ContainerStarted","Data":"d8c6e5c59472d2672e1c9ef1a957f2da5b62152bf40657365b4a45f6f89b8dca"} Mar 09 10:22:02 crc kubenswrapper[4861]: I0309 10:22:02.658481 4861 scope.go:117] "RemoveContainer" containerID="c600e1aa8a7ba1a360c635c31a448b7027fd4628a8566a50e85d312351e1d37c" Mar 09 10:22:02 crc kubenswrapper[4861]: E0309 10:22:02.659618 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5g7gc_openshift-machine-config-operator(6f7875e3-174f-4c67-8675-d878de74aa4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" Mar 09 10:22:03 crc 
kubenswrapper[4861]: I0309 10:22:03.442696 4861 generic.go:334] "Generic (PLEG): container finished" podID="4a981366-c2b3-4d38-992e-40231193dfac" containerID="50ddfcc795673d25fee27a21bd30563e6f8bcf4e587d157ab33695d13a6659d5" exitCode=0 Mar 09 10:22:03 crc kubenswrapper[4861]: I0309 10:22:03.442749 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550862-9pb2j" event={"ID":"4a981366-c2b3-4d38-992e-40231193dfac","Type":"ContainerDied","Data":"50ddfcc795673d25fee27a21bd30563e6f8bcf4e587d157ab33695d13a6659d5"} Mar 09 10:22:04 crc kubenswrapper[4861]: I0309 10:22:04.785170 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550862-9pb2j" Mar 09 10:22:04 crc kubenswrapper[4861]: I0309 10:22:04.925754 4861 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rjgjr\" (UniqueName: \"kubernetes.io/projected/4a981366-c2b3-4d38-992e-40231193dfac-kube-api-access-rjgjr\") pod \"4a981366-c2b3-4d38-992e-40231193dfac\" (UID: \"4a981366-c2b3-4d38-992e-40231193dfac\") " Mar 09 10:22:04 crc kubenswrapper[4861]: I0309 10:22:04.931970 4861 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a981366-c2b3-4d38-992e-40231193dfac-kube-api-access-rjgjr" (OuterVolumeSpecName: "kube-api-access-rjgjr") pod "4a981366-c2b3-4d38-992e-40231193dfac" (UID: "4a981366-c2b3-4d38-992e-40231193dfac"). InnerVolumeSpecName "kube-api-access-rjgjr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 10:22:05 crc kubenswrapper[4861]: I0309 10:22:05.028304 4861 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rjgjr\" (UniqueName: \"kubernetes.io/projected/4a981366-c2b3-4d38-992e-40231193dfac-kube-api-access-rjgjr\") on node \"crc\" DevicePath \"\"" Mar 09 10:22:05 crc kubenswrapper[4861]: I0309 10:22:05.463834 4861 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550862-9pb2j" event={"ID":"4a981366-c2b3-4d38-992e-40231193dfac","Type":"ContainerDied","Data":"d8c6e5c59472d2672e1c9ef1a957f2da5b62152bf40657365b4a45f6f89b8dca"} Mar 09 10:22:05 crc kubenswrapper[4861]: I0309 10:22:05.463879 4861 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d8c6e5c59472d2672e1c9ef1a957f2da5b62152bf40657365b4a45f6f89b8dca" Mar 09 10:22:05 crc kubenswrapper[4861]: I0309 10:22:05.463930 4861 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550862-9pb2j" Mar 09 10:22:05 crc kubenswrapper[4861]: I0309 10:22:05.860308 4861 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550856-f6c8f"] Mar 09 10:22:05 crc kubenswrapper[4861]: I0309 10:22:05.875815 4861 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550856-f6c8f"] Mar 09 10:22:07 crc kubenswrapper[4861]: I0309 10:22:07.683708 4861 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="411b28cc-1d68-43de-aab3-f97bc785c97e" path="/var/lib/kubelet/pods/411b28cc-1d68-43de-aab3-f97bc785c97e/volumes" Mar 09 10:22:16 crc kubenswrapper[4861]: I0309 10:22:16.658422 4861 scope.go:117] "RemoveContainer" containerID="c600e1aa8a7ba1a360c635c31a448b7027fd4628a8566a50e85d312351e1d37c" Mar 09 10:22:16 crc kubenswrapper[4861]: E0309 10:22:16.659138 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5g7gc_openshift-machine-config-operator(6f7875e3-174f-4c67-8675-d878de74aa4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" Mar 09 10:22:27 crc kubenswrapper[4861]: I0309 10:22:27.666405 4861 scope.go:117] "RemoveContainer" containerID="c600e1aa8a7ba1a360c635c31a448b7027fd4628a8566a50e85d312351e1d37c" Mar 09 10:22:27 crc kubenswrapper[4861]: E0309 10:22:27.667257 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5g7gc_openshift-machine-config-operator(6f7875e3-174f-4c67-8675-d878de74aa4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" Mar 09 10:22:40 crc kubenswrapper[4861]: I0309 10:22:40.039663 4861 scope.go:117] "RemoveContainer" containerID="1df8482be4ff633209cbcd2cc0ce78b24b2e71b2784e35adad797cf30491aeb5" Mar 09 10:22:40 crc kubenswrapper[4861]: I0309 10:22:40.658251 4861 scope.go:117] "RemoveContainer" containerID="c600e1aa8a7ba1a360c635c31a448b7027fd4628a8566a50e85d312351e1d37c" Mar 09 10:22:40 crc kubenswrapper[4861]: E0309 10:22:40.658830 4861 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-5g7gc_openshift-machine-config-operator(6f7875e3-174f-4c67-8675-d878de74aa4f)\"" pod="openshift-machine-config-operator/machine-config-daemon-5g7gc" podUID="6f7875e3-174f-4c67-8675-d878de74aa4f" var/home/core/zuul-output/logs/crc-cloud-workdir-crc-all-logs.tar.gz0000644000175000000000000000005515153517606024456 